Theory I: Why and When Can Deep Networks Avoid the Curse of Dimensionality?

Title: Theory I: Why and When Can Deep Networks Avoid the Curse of Dimensionality?
Publication Type: CBMM Memos
Year of Publication: 2016
Authors: Poggio, T., Mhaskar, H., Rosasco, L., Miranda, B., Liao, Q.
Date Published: 11/2016
Abstract

[formerly titled "Why and When Can Deep - but Not Shallow - Networks Avoid the Curse of Dimensionality: a Review"]

The paper reviews and extends an emerging body of theoretical results on deep learning, including the conditions under which deep learning can be exponentially better than shallow learning. A class of deep convolutional networks represents an important special case of these conditions, though weight sharing is not the main reason for their exponential advantage. Implications of a few key theorems are discussed, together with new results, open problems, and conjectures.
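For orientation, the headline comparison in the paper takes roughly the following form (a hedged restatement; the precise hypotheses, norms, and constants are in the arXiv version). For a generic function of n variables with smoothness m, a shallow network needs a number of units that grows exponentially in n, whereas for a compositional function assembled from constituent functions of two variables, a deep network whose architecture matches the compositional structure needs a number of units that grows only linearly in n:

\[
  N_{\text{shallow}} = \mathcal{O}\!\left(\varepsilon^{-n/m}\right),
  \qquad
  N_{\text{deep}} = \mathcal{O}\!\left((n-1)\,\varepsilon^{-2/m}\right),
\]

where \(\varepsilon\) is the target approximation accuracy. This is the sense in which deep, but not shallow, networks can avoid the curse of dimensionality.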

arXiv

https://arxiv.org/abs/1611.00740v5

DSpace@MIT

http://hdl.handle.net/1721.1/105443

CBMM Memo No: 058

CBMM Relationship: 

  • CBMM Funded