Publications
Filters: Author is Poggio, Tomaso A
Deep Learning for Seismic Inverse Problems: Toward the Acceleration of Geophysical Analysis Workflows. IEEE Signal Processing Magazine 38, 89-119 (2021).
Distribution of Classification Margins: Are All Data Equal? (2021).
CBMM Memo 115.pdf (9.56 MB)

The Effects of Image Distribution and Task on Adversarial Robustness. (2021).
CBMM_Memo_116.pdf (5.44 MB)

From Associative Memories to Powerful Machines. (2021).
CBMM-Memo-114.pdf (1.01 MB)
Adding a second part (1.85 MB)

An analysis of training and generalization errors in shallow and deep networks. Neural Networks 121, 229-241 (2020).
Biologically Inspired Mechanisms for Adversarial Robustness. (2020).
CBMM_Memo_110.pdf (3.14 MB)

Complexity Control by Gradient Descent in Deep Networks. Nature Communications 11, (2020).
s41467-020-14663-9.pdf (431.68 KB)

CUDA-Optimized real-time rendering of a Foveated Visual System. Shared Visual Representations in Human and Machine Intelligence (SVRHM) workshop at NeurIPS 2020 (2020). at <https://arxiv.org/abs/2012.08655>
Foveated_Drone_SVRHM_2020.pdf (13.44 MB)
v1 (12/15/2020) (14.7 MB)

Dreaming with ARC. Learning Meets Combinatorial Algorithms workshop at NeurIPS 2020 (2020).
CBMM Memo 113.pdf (1019.64 KB)

For interpolating kernel machines, the minimum norm ERM solution is the most stable. (2020).
CBMM_Memo_108.pdf (1015.14 KB)
Better bound (without inequalities!) (1.03 MB)

Function approximation by deep networks. Communications on Pure & Applied Analysis 19, 4085-4095 (2020).
1534-0392_2020_8_4085.pdf (514.57 KB)

Hierarchically Local Tasks and Deep Convolutional Networks. (2020).
CBMM_Memo_109.pdf (2.12 MB)

Implicit dynamic regularization in deep networks. (2020).
TPR_ver2.pdf (2.29 MB)
Substantial edits (1.52 MB)
Edits that are extensive but minor in content (1.98 MB)
Extending theory, setting a post (2 MB)
Fine tuning (2.01 MB)
Corrections in Appendix about Neural Collapse (2.01 MB)
Small edits clarifying role of weight decay (2.39 MB)
Added: proof of NC for multiclass, plus a theorem on connected global minima (2.4 MB)

Loss landscape: SGD has a better view. (2020).
CBMM-Memo-107.pdf (1.03 MB)
Typos and small edits, ver11 (955.08 KB)
Small edits, corrected Hessian for spurious case (337.19 KB)

An Overview of Some Issues in the Theory of Deep Networks. IEEJ Transactions on Electrical and Electronic Engineering 15, 1560-1571 (2020).
Scale and translation-invariance for novel objects in human vision. Scientific Reports 10, (2020).
s41598-019-57261-6.pdf (1.46 MB)

Stable Foundations for Learning: a framework for learning theory (in both the classical and modern regime). (2020).
Original file (584.54 KB)
Corrected typos and details of "equivalence" CV stability and expected error for interpolating machines. Added Appendix on SGD. (905.29 KB)
Edited Appendix on SGD. (909.19 KB)
Deleted Appendix. Corrected typos etc (880.27 KB)
Added result about square loss and min norm (898.03 KB)

Theoretical issues in deep networks. Proceedings of the National Academy of Sciences 201907369 (2020). doi:10.1073/pnas.1907369117
PNASlast.pdf (915.3 KB)

An analysis of training and generalization errors in shallow and deep networks. (2019).
CBMM-Memo-098.pdf (687.36 KB)
CBMM Memo 098 v4 (08/2019) (2.63 MB)

Biologically-plausible learning algorithms can scale to large datasets. International Conference on Learning Representations (ICLR 2019) (2019).
gk7779.pdf (721.53 KB)

Deep Recurrent Architectures for Seismic Tomography. 81st EAGE Conference and Exhibition 2019 (2019).
Double descent in the condition number. (2019).
Fixing typos, clarifying error in y, best approach is cross-validation (837.18 KB)
Incorporated footnote in text plus other edits (854.05 KB)
Deleted previous discussion on kernel regression and deep nets: it will appear, extended, in a separate paper (795.28 KB)
RevisedPNASV2.pdf (261.24 KB)

Dynamics & Generalization in Deep Networks - Minimizing the Norm. NAS Sackler Colloquium on Science of Deep Learning (2019).
Eccentricity Dependent Neural Network with Recurrent Attention for Scale, Translation and Clutter Invariance. Vision Science Society (2019).