Publications
Compositional sparsity of learnable functions. Bulletin of the American Mathematical Society 61, 438-456 (2024).
Implicit dynamic regularization in deep networks. (2020).
Stable Foundations for Learning: a framework for learning theory (in both the classical and modern regime). (2020).
On efficiently computable functions, deep networks and sparse compositionality. (2025).
Deep Learning: mathematics and neuroscience. (2016).
Visual Cortex and Deep Networks: Learning Invariant Representations. 136 (The MIT Press, 2016). at <https://mitpress.mit.edu/books/visual-cortex-and-deep-networks>
I-theory on depth vs width: hierarchical function composition. (2015).
Complexity Control by Gradient Descent in Deep Networks. Nature Communications 11, (2020).
Deep Learning: Mathematics and Neuroscience. A Sponsored Supplement to Science: Brain-Inspired Intelligent Robotics: The Intersection of Robotics and Neuroscience, 9-12 (2016).
Theory II: Landscape of the Empirical Risk in Deep Learning. (2017).
From Associative Memories to Powerful Machines. (2021).
Computational role of eccentricity dependent cortical magnification. (2014).
Theory II: Deep learning and optimization. Bulletin of the Polish Academy of Sciences: Technical Sciences 66, 775-788 (2018).
Notes on Hierarchical Splines, DCLNs and i-theory. (2015).
An Overview of Some Issues in the Theory of Deep Networks. IEEJ Transactions on Electrical and Electronic Engineering 15, 1560-1571 (2020).
The History of Neuroscience in Autobiography, Volume 8 (Society for Neuroscience, 2014).
Compositional Sparsity of Learnable Functions. (2024).
Theoretical Issues in Deep Networks. (2019).
Is Research in Intelligence an Existential Risk? (2014).
Theory I: Deep networks and the curse of dimensionality. Bulletin of the Polish Academy of Sciences: Technical Sciences 66, 761-774 (2018).
Theoretical issues in deep networks. Proceedings of the National Academy of Sciences 117, 30039-30045 (2020). doi:10.1073/pnas.1907369117
Cervelli menti algoritmi [Brains, Minds, Algorithms]. 272 (Sperling & Kupfer, 2023). at <https://www.sperling.it/libri/cervelli-menti-algoritmi-marco-magrini>
Why and when can deep-but not shallow-networks avoid the curse of dimensionality: A review. International Journal of Automation and Computing 1-17 (2017). doi:10.1007/s11633-017-1054-2
Loss landscape: SGD has a better view. (2020).
Turing++ Questions: A Test for the Science of (Human) Intelligence. AI Magazine 37, 73-77 (2016).