Publications

109 results. Filters: Author is T. Poggio
2019
Mhaskar, H. N. & Poggio, T. An analysis of training and generalization errors in shallow and deep networks. CBMM Memo No. 098 (2019).
Xiao, W., Chen, H., Liao, Q. & Poggio, T. Biologically-plausible learning algorithms can scale to large datasets. International Conference on Learning Representations (ICLR 2019) (2019).
Adler, A., Araya-Polo, M. & Poggio, T. Deep Recurrent Architectures for Seismic Tomography. 81st EAGE Conference and Exhibition 2019 (2019).
Poggio, T., Kur, G. & Banburski, A. Double descent in the condition number. (2019).
Banburski, A. et al. Dynamics & Generalization in Deep Networks - Minimizing the Norm. NAS Sackler Colloquium on Science of Deep Learning (2019).
Han, Y., Roig, G., Geiger, G. & Poggio, T. Properties of invariant object recognition in human one-shot learning suggests a hierarchical architecture different from deep convolutional neural networks. Vision Science Society (2019).
Poggio, T., Banburski, A. & Liao, Q. Theoretical Issues in Deep Networks. CBMM Memo No. 100 (2019).
Liao, Q., Banburski, A. & Poggio, T. Theories of Deep Learning: Approximation, Optimization and Generalization. TECHCON 2019 (2019).
Banburski, A. et al. Weight and Batch Normalization implement Classical Generalization Bounds. ICML (2019).
2018
Mhaskar, H. & Poggio, T. An analysis of training and generalization errors in shallow and deep networks. CBMM Memo No. 076 (2018).
Xiao, W., Chen, H., Liao, Q. & Poggio, T. Biologically-plausible learning algorithms can scale to large datasets. CBMM Memo No. 092 (2018).
Villalobos, K. M. et al. Can Deep Neural Networks Do Image Segmentation by Understanding Insideness? CBMM Memo No. 095 (2018).
Liao, Q., Miranda, B., Hidary, J. & Poggio, T. Classical generalization bounds are surprisingly tight for Deep Networks. CBMM Memo No. 091 (2018).
Isik, L., Tacchetti, A. & Poggio, T. A fast, invariant representation for human action in the visual system. Journal of Neurophysiology (2018). doi:10.1152/jn.00642.2017
Tacchetti, A., Isik, L. & Poggio, T. Invariant Recognition Shapes Neural Representations of Visual Input. Annual Review of Vision Science 4, 403-422 (2018).
Arend, L. et al. Single units in a deep neural network functionally correspond with neurons in the brain: preliminary results. CBMM Memo No. 093 (2018).
Poggio, T. & Liao, Q. Theory I: Deep networks and the curse of dimensionality. Bulletin of the Polish Academy of Sciences: Technical Sciences 66, 761-774 (2018).
Poggio, T. & Liao, Q. Theory II: Deep learning and optimization. Bulletin of the Polish Academy of Sciences: Technical Sciences 66, 775-788 (2018).
Banburski, A. et al. Theory III: Dynamics and Generalization in Deep Networks. (2018).