Complexity Control by Gradient Descent in Deep Networks

Title: Complexity Control by Gradient Descent in Deep Networks
Publication Type: Journal Article
Year of Publication: 2020
Authors: Poggio, T, Liao, Q, Banburski, A
Journal: Nature Communications
Volume: 11
Date Published: 02/2020
Abstract:

Overparametrized deep networks predict well despite the lack of an explicit complexity control during training, such as a regularization term. For exponential-type loss functions, we solve this puzzle by showing an effective regularization effect of gradient descent in terms of the normalized weights that are relevant for classification.
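The effect described in the abstract can be illustrated with a toy numerical sketch (not taken from the paper): plain gradient descent on a linear classifier with an exponential loss and no regularization term. The weight norm grows without bound, yet the normalized weight vector w/||w||, which is what determines the classification decision, converges in direction. The data, model, and hyperparameters below are illustrative assumptions, not the deep-network setting analyzed in the paper.

    # Minimal sketch (illustrative, not the paper's experiments):
    # gradient descent on a linear model with exponential loss and
    # NO explicit regularizer. ||w|| keeps growing, but the
    # normalized weights w/||w|| stabilize.
    import numpy as np

    rng = np.random.default_rng(0)

    # Linearly separable toy data with labels in {-1, +1}.
    n, d = 200, 2
    X = rng.normal(size=(n, d))
    y = np.sign(X[:, 0] + 0.5 * X[:, 1])
    y[y == 0] = 1.0

    w = rng.normal(size=d) * 0.01
    lr = 0.1

    for step in range(1, 20001):
        margins = y * (X @ w)
        # Gradient of (1/n) * sum_i exp(-y_i * w . x_i)
        grad = (-(y[:, None] * X) * np.exp(-margins)[:, None]).mean(axis=0)
        w -= lr * grad
        if step % 5000 == 0:
            w_hat = w / np.linalg.norm(w)
            print(f"step {step:6d}  ||w|| = {np.linalg.norm(w):8.3f}  "
                  f"w/||w|| = {np.round(w_hat, 3)}")

Running the sketch prints a steadily increasing norm while the printed direction w/||w|| settles down, which is the sense in which gradient descent supplies an implicit complexity control on the normalized weights.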

URL: https://www.nature.com/articles/s41467-020-14663-9
DOI: 10.1038/s41467-020-14663-9

CBMM Relationship: 

  • CBMM Funded