Publications
From Associative Memories to Powerful Machines. (2021).
v1.0 (1.01 MB)
v1.3 (section on self-attention added August 6) (3.9 MB)
Theory I: Deep networks and the curse of dimensionality. Bulletin of the Polish Academy of Sciences: Technical Sciences 66, (2018).
02_761-774_00966_Bpast.No_.66-6_28.12.18_K1.pdf (1.18 MB)
Why and when can deep-but not shallow-networks avoid the curse of dimensionality: A review. International Journal of Automation and Computing 1-17 (2017). doi:10.1007/s11633-017-1054-2
art_10.1007_s11633-017-1054-2.pdf (1.68 MB)
Notes on Hierarchical Splines, DCLNs and i-theory. (2015).
CBMM Memo 037 (1.83 MB)
Turing++ Questions: A Test for the Science of (Human) Intelligence. AI Magazine 37, 73-77 (2016).
Turing_Plus_Questions.pdf (424.91 KB)
An Overview of Some Issues in the Theory of Deep Networks. IEEJ Transactions on Electrical and Electronic Engineering 15, 1560-1571 (2020).
Theory I: Why and When Can Deep Networks Avoid the Curse of Dimensionality? (2016).
CBMM-Memo-058v1.pdf (2.42 MB)
CBMM-Memo-058v5.pdf (2.45 MB)
CBMM-Memo-058-v6.pdf (2.74 MB)
Proposition 4 has been deleted (2.75 MB)
The History of Neuroscience in Autobiography, Volume 8 (Society for Neuroscience, 2014).
Volume Introduction and Preface (232.8 KB)
TomasoPoggio.pdf (1.43 MB)
Theoretical Issues in Deep Networks. (2019).
CBMM Memo 100 v1 (1.71 MB)
CBMM Memo 100 v3 (8/25/2019) (1.31 MB)
CBMM Memo 100 v4 (11/19/2019) (1008.23 KB)
Theory II: Landscape of the Empirical Risk in Deep Learning. (2017).
CBMM Memo 066_1703.09833v2.pdf (5.56 MB)
Theoretical issues in deep networks. Proceedings of the National Academy of Sciences 201907369 (2020). doi:10.1073/pnas.1907369117
PNASlast.pdf (915.3 KB)
What if. (2015).
What if.pdf (2.09 MB)
On Generalization Bounds for Neural Networks with Low Rank Layers. (2024).
CBMM-Memo-151.pdf (697.31 KB)
A Virtual Reality Experimental Approach for Studying How the Brain Implements Attentive Behaviors. Tri-Institute 2019 Gateways to the Laboratory Summer Program (2019).
Spatiotemporal dynamics of neocortical excitation and inhibition during human sleep. Proceedings of the National Academy of Sciences (2012). doi:10.1073/pnas.1109895109
SpatiotemporalDynamic.pdf (2.56 MB)
Individual Differences in Face Looking Behavior Generalize from the Lab to the World. Journal of Vision 16, (2016).
Real World Face Fixations, Journal of Vision article, 2016 (20.25 MB)
Eye movements and retinotopic tuning in developmental prosopagnosia. Journal of Vision 19, 7 (2019).
How does the primate brain combine generative and discriminative computations in vision? arXiv (2024). at <https://arxiv.org/abs/2401.06005>
Rapid Physical Predictions from Convolutional Neural Networks. Neural Information Processing Systems, Intuitive Physics Workshop (2016). at <http://phys.csail.mit.edu/papers/9.pdf>
Rapid Physical Predictions - NIPS Physics Workshop Poster (1.47 MB)
Oscillations, neural computations and learning during wake and sleep. Current Opinion in Neurobiology 44C, (2017).
Temporal Grounding Graphs for Language Understanding with Accrued Visual-Linguistic Context. Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence (IJCAI 2017) (2017).
Incentives Boost Model-Based Control Across a Range of Severity on Several Psychiatric Constructs. Biological Psychiatry 85, 425 - 433 (2019).
Spoken ObjectNet: A Bias-Controlled Spoken Caption Dataset. (2021).
CBMM-Memo-128.pdf (2.91 MB)