Publications
7 results
Filters: Author is Brando Miranda
Classical generalization bounds are surprisingly tight for Deep Networks. (2018).
CBMM-Memo-091.pdf (1.43 MB)
CBMM-Memo-091-v2.pdf (1.88 MB)


Theory III: Dynamics and Generalization in Deep Networks. (2018).
TheoryIII_ver2 (2.67 MB)
TheoryIII_ver11 (4.17 MB)
TheoryIII_ver12 (4.74 MB)
TheoryIII_ver13 (4.75 MB)
TheoryIII_ver14 (3.89 MB)
TheoryIII_ver15 (3.9 MB)
TheoryIII_ver20 (3.91 MB)
TheoryIII_ver22 (4.97 MB)
TheoryIII_ver25 (1.19 MB)
TheoryIII_ver28 (1.17 MB)
TheoryIII_ver29 (1.17 MB)
TheoryIII_ver30 (1.17 MB)
TheoryIII_ver31 (most typos and other errors corrected in main text) (1.18 MB)
TheoryIII_ver35 (more edits; regression note in appendix) (1.56 MB)
TheoryIII_ver39 (look at footnote 5) (2.14 MB)

Musings on Deep Learning: Properties of SGD. (2017).
CBMM Memo 067 v2 (revised 7/19/2017) (5.88 MB)
CBMM Memo 067 v3 (revised 9/15/2017) (5.89 MB)
CBMM Memo 067 v4 (revised 12/26/2017) (5.57 MB)

Theory of Deep Learning IIb: Optimization Properties of SGD. (2017).
CBMM-Memo-072.pdf (3.66 MB)

Theory of Deep Learning III: explaining the non-overfitting puzzle. (2017).
CBMM-Memo-073.pdf (2.65 MB)
CBMM Memo 073 v2 (revised 1/15/2018) (2.81 MB)
CBMM Memo 073 v3 (revised 1/30/2018) (2.72 MB)
CBMM Memo 073 v4 (revised 12/30/2018) (575.72 KB)

Why and when can deep-but not shallow-networks avoid the curse of dimensionality: A review. International Journal of Automation and Computing 1-17 (2017). doi:10.1007/s11633-017-1054-2
art:10.1007/s11633-017-1054-2.pdf (1.68 MB)

Theory I: Why and When Can Deep Networks Avoid the Curse of Dimensionality? (2016).
CBMM-Memo-058v1.pdf (2.42 MB)
CBMM-Memo-058v5.pdf (2.45 MB)
CBMM-Memo-058-v6.pdf (2.74 MB)