Theory of Deep Learning III: Generalization Properties of SGD

Title: Theory of Deep Learning III: Generalization Properties of SGD
Publication Type: CBMM Memos
Year of Publication: 2017
Authors: Zhang, C, Liao, Q, Rakhlin, A, Sridharan, K, Miranda, B, Golowich, N, Poggio, T
Date Published: 04/2017
Abstract

In Theory III we characterize, with a mix of theory and experiments, the generalization properties of Stochastic Gradient Descent (SGD) in overparametrized deep convolutional networks. We show that SGD selects with high probability solutions that 1) have zero (or small) empirical error, 2) are degenerate as shown in Theory II, and 3) have maximum generalization.
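The claim that SGD reaches zero (or small) empirical error in the overparametrized regime can be illustrated with a small, self-contained sketch. This is not the memo's code: it is a hypothetical example of plain per-example SGD on a two-layer tanh network with far more parameters than training examples, which typically fits even random binary labels exactly.

```python
# Illustrative sketch only (not the memo's code): per-example SGD on an
# overparametrized two-layer network, which typically drives the empirical
# (training) error to zero. All sizes and hyperparameters are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

# Tiny synthetic binary task: n examples, d features.
n, d, hidden = 20, 10, 200            # hidden * (d + 1) parameters >> n examples
X = rng.normal(size=(n, d))
y = rng.choice([-1.0, 1.0], size=n)   # random labels, so memorization is required

# Two-layer network f(x) = w2 . tanh(W1 x), deliberately overparametrized.
W1 = rng.normal(scale=1.0 / np.sqrt(d), size=(hidden, d))
w2 = rng.normal(scale=1.0 / np.sqrt(hidden), size=hidden)

lr, epochs = 0.05, 2000
for _ in range(epochs):
    for i in rng.permutation(n):      # stochastic updates, one example at a time
        h = np.tanh(W1 @ X[i])
        out = w2 @ h
        margin = y[i] * out
        if margin < 1.0:              # squared-hinge loss 0.5 * (1 - margin)**2
            grad_out = -y[i] * (1.0 - margin)
            grad_w2 = grad_out * h
            grad_W1 = grad_out * np.outer(w2 * (1.0 - h ** 2), X[i])
            w2 -= lr * grad_w2
            W1 -= lr * grad_W1

# Empirical (training) error after SGD: typically zero in this regime.
preds = np.sign(np.tanh(X @ W1.T) @ w2)
print("training error:", float(np.mean(preds != y)))
```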

DSpace@MIT: http://hdl.handle.net/1721.1/107841

CBMM Memo No: 067

CBMM Relationship: 

  • CBMM Funded