Publications

143 results (filtered by author: Tomaso Poggio)
2020
Mhaskar, H. & Poggio, T. Function approximation by deep networks. Communications on Pure & Applied Analysis 19, 4085-4095 (2020).
Deza, A., Liao, Q., Banburski, A. & Poggio, T. Hierarchically Local Tasks and Deep Convolutional Networks. (2020).
Poggio, T., Liao, Q. & Xu, M. Implicit dynamic regularization in deep networks. (2020).
Poggio, T. & Cooper, Y. Loss landscape: SGD has a better view. (2020).
Poggio, T. & Banburski, A. An Overview of Some Issues in the Theory of Deep Networks. IEEJ Transactions on Electrical and Electronic Engineering 15, 1560-1571 (2020).
Han, Y., Roig, G., Geiger, G. & Poggio, T. Scale and translation-invariance for novel objects in human vision. Scientific Reports 10 (2020).
Poggio, T. Stable Foundations for Learning: a framework for learning theory (in both the classical and modern regime). (2020).
Poggio, T., Banburski, A. & Liao, Q. Theoretical issues in deep networks. Proceedings of the National Academy of Sciences 201907369 (2020). doi:10.1073/pnas.1907369117
2019
Mhaskar, H. & Poggio, T. An analysis of training and generalization errors in shallow and deep networks. (2019).
Xiao, W., Chen, H., Liao, Q. & Poggio, T. Biologically-plausible learning algorithms can scale to large datasets. International Conference on Learning Representations (ICLR 2019) (2019).
Adler, A., Araya-Polo, M. & Poggio, T. Deep Recurrent Architectures for Seismic Tomography. 81st EAGE Conference and Exhibition 2019 (2019).
Poggio, T., Kur, G. & Banburski, A. Double descent in the condition number. (2019).
Banburski, A. et al. Dynamics & Generalization in Deep Networks - Minimizing the Norm. NAS Sackler Colloquium on Science of Deep Learning (2019).
Zhang, J., Han, Y., Poggio, T. & Roig, G. Eccentricity Dependent Neural Network with Recurrent Attention for Scale, Translation and Clutter Invariance. Vision Science Society (2019).
Han, Y., Roig, G., Geiger, G. & Poggio, T. Properties of invariant object recognition in human one-shot learning suggest a hierarchical architecture different from deep convolutional neural networks. Vision Science Society (2019). doi:10.1167/19.10.28d
Poggio, T., Banburski, A. & Liao, Q. Theoretical Issues in Deep Networks. (2019).
Liao, Q., Banburski, A. & Poggio, T. Theories of Deep Learning: Approximation, Optimization and Generalization. TECHCON 2019 (2019).
Banburski, A. et al. Weight and Batch Normalization implement Classical Generalization Bounds. ICML (2019).