Why and when can deep-but not shallow-networks avoid the curse of dimensionality: A review

Publication Type: Journal Article
Year of Publication: 2017
Authors: Poggio, T., Mhaskar, H., Rosasco, L., Miranda, B., Liao, Q.
Journal: International Journal of Automation and Computing
Pagination: 1-17
Date Published: 03/2017
Keywords: convolutional neural networks, deep and shallow networks, deep learning, function approximation, machine learning, neural networks
Abstract:

The paper reviews and extends an emerging body of theoretical results on deep learning, including the conditions under which it can be exponentially better than shallow learning. A class of deep convolutional networks represents an important special case of these conditions, though weight sharing is not the main reason for their exponential advantage. Implications of a few key theorems are discussed, together with new results, open problems and conjectures.
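The exponential gap the abstract refers to can be made concrete with a back-of-the-envelope calculation. The sketch below assumes the complexity rates stated in the paper for functions of smoothness m = 1: a shallow network needs on the order of eps^(-d) units to reach uniform accuracy eps on a generic d-variable function, while a deep network whose architecture mirrors a binary-tree compositional structure needs on the order of (d - 1) * eps^(-2) units. The script, its function names, and the unit constants are illustrative, not taken from the paper.

```python
# Back-of-the-envelope comparison of the approximation complexity rates
# discussed in the review. Constants are set to 1 and smoothness m = 1,
# purely for illustration; the actual bounds carry unspecified constants.

def shallow_units(d: int, eps: float) -> float:
    # Generic d-variable function: a shallow network needs O(eps^-d) units.
    return eps ** (-d)

def deep_units(d: int, eps: float) -> float:
    # Binary-tree compositional function: a deep network that mirrors the
    # tree needs O((d - 1) * eps^-2) units.
    return (d - 1) * eps ** (-2)

if __name__ == "__main__":
    eps = 0.1  # target uniform approximation accuracy
    for d in (2, 4, 8, 16):
        print(f"d={d:>2}  shallow ~ {shallow_units(d, eps):.1e}  "
              f"deep ~ {deep_units(d, eps):.1e}")
```

At eps = 0.1 the shallow count grows from 10^2 at d = 2 to 10^16 at d = 16, while the deep count stays in the hundreds, which is the sense in which the deep architecture avoids the curse of dimensionality for compositional targets.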

URL: http://link.springer.com/article/10.1007/s11633-017-1054-2?wt_mc=Internal.Event.1.SEM.ArticleAuthorOnlineFirst
DOI: 10.1007/s11633-017-1054-2


CBMM Relationship: 

  • CBMM Funded