Publications
Author: Brando Miranda (9 results)
Dynamics & Generalization in Deep Networks - Minimizing the Norm. NAS Sackler Colloquium on Science of Deep Learning (2019).
Classical generalization bounds are surprisingly tight for Deep Networks. (2018). CBMM-Memo-091.pdf (1.43 MB) CBMM-Memo-091-v2.pdf (1.88 MB)
Theory III: Dynamics and Generalization in Deep Networks. (2018). Original and intermediate versions available upon request (2.67 MB) CBMM Memo 90 v12.pdf (4.74 MB) Theory_III_ver44.pdf (updated Hessian) (4.12 MB) Theory_III_ver48 (updated discussion of convergence to max margin) (2.56 MB) Fixing errors and sharpening some proofs (2.45 MB)
Musings on Deep Learning: Properties of SGD. (2017). CBMM Memo 067 v2 (revised 7/19/2017) (5.88 MB) CBMM Memo 067 v3 (revised 9/15/2017) (5.89 MB) CBMM Memo 067 v4 (revised 12/26/2017) (5.57 MB)
Theory of Deep Learning IIb: Optimization Properties of SGD. (2017). CBMM-Memo-072.pdf (3.66 MB)
Theory of Deep Learning III: explaining the non-overfitting puzzle. (2017). CBMM-Memo-073.pdf (2.65 MB) CBMM Memo 073 v2 (revised 1/15/2018) (2.81 MB) CBMM Memo 073 v3 (revised 1/30/2018) (2.72 MB) CBMM Memo 073 v4 (revised 12/30/2018) (575.72 KB)
Why and when can deep-but not shallow-networks avoid the curse of dimensionality: A review. International Journal of Automation and Computing 1-17 (2017). doi:10.1007/s11633-017-1054-2 art:10.1007/s11633-017-1054-2.pdf (1.68 MB)
Theory I: Why and When Can Deep Networks Avoid the Curse of Dimensionality? (2016). CBMM-Memo-058v1.pdf (2.42 MB) CBMM-Memo-058v5.pdf (2.45 MB) CBMM-Memo-058-v6.pdf (2.74 MB) Proposition 4 has been deleted (2.75 MB)