Publications
Stable Foundations for Learning: a framework for learning theory (in both the classical and modern regime). (2020).
Original file (584.54 KB)
Corrected typos and details of the "equivalence" between CV stability and expected error for interpolating machines. Added Appendix on SGD. (905.29 KB)
Edited Appendix on SGD. (909.19 KB)
Deleted Appendix; corrected typos, etc. (880.27 KB)
Added result about square loss and minimum norm. (898.03 KB)
Computational role of eccentricity dependent cortical magnification. (2014).
CBMM-Memo-017.pdf (1.04 MB)
Complexity Control by Gradient Descent in Deep Networks. Nature Communications 11 (2020).
s41467-020-14663-9.pdf (431.68 KB)
Why and when can deep-but not shallow-networks avoid the curse of dimensionality: A review. International Journal of Automation and Computing 1-17 (2017). doi:10.1007/s11633-017-1054-2
art:10.1007/s11633-017-1054-2.pdf (1.68 MB)
A Perspective: Sparse Compositionality and Efficiently Computable Intelligence. (2026).
Perspective_SPCOMP-9.pdf (170.23 KB)
Turing++ Questions: A Test for the Science of (Human) Intelligence. AI Magazine 37, 73-77 (2016).
Turing_Plus_Questions.pdf (424.91 KB)
Notes on Hierarchical Splines, DCLNs and i-theory. (2015).
CBMM Memo 037 (1.83 MB)
Compositional sparsity of learnable functions. Bulletin of the American Mathematical Society 61, 438-456 (2024).
Theory I: Why and When Can Deep Networks Avoid the Curse of Dimensionality? (2016).
CBMM-Memo-058v1.pdf (2.42 MB)
CBMM-Memo-058v5.pdf (2.45 MB)
CBMM-Memo-058-v6.pdf (2.74 MB)
Proposition 4 has been deleted. (2.75 MB)
On efficiently computable functions, deep networks and sparse compositionality. (2025).
Deep_sparse_networks_approximate_efficiently_computable_functions.pdf (223.15 KB)
From Associative Memories to Powerful Machines. (2021).
v1.0 (1.01 MB)
v1.3: Section on self-attention added August 6. (3.9 MB)
What if... (2015).
What if.pdf (2.09 MB)
On Generalization Bounds for Neural Networks with Low Rank Layers. (2024).
CBMM-Memo-151.pdf (697.31 KB)
A Virtual Reality Experimental Approach for Studying How the Brain Implements Attentive Behaviors. Tri-Institute 2019 Gateways to the Laboratory Summer Program (2019).
Spatiotemporal dynamics of neocortical excitation and inhibition during human sleep. Proceedings of the National Academy of Sciences (2012). doi:10.1073/pnas.1109895109
SpatiotemporalDynamic.pdf (2.56 MB)
Individual Differences in Face Looking Behavior Generalize from the Lab to the World. Journal of Vision 16 (2016).
Real World Face Fixations, Journal of Vision article, 2016 (20.25 MB)
Eye movements and retinotopic tuning in developmental prosopagnosia. Journal of Vision 19, 7 (2019).
How does the primate brain combine generative and discriminative computations in vision? arXiv (2024). at <https://arxiv.org/abs/2401.06005>
Rapid Physical Predictions from Convolutional Neural Networks. Neural Information Processing Systems, Intuitive Physics Workshop (2016). at <http://phys.csail.mit.edu/papers/9.pdf>
Rapid Physical Predictions - NIPS Physics Workshop Poster (1.47 MB)
Oscillations, neural computations and learning during wake and sleep. Current Opinion in Neurobiology 44C (2017).
Temporal Grounding Graphs for Language Understanding with Accrued Visual-Linguistic Context. Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence (IJCAI 2017) (2017).
Incentives Boost Model-Based Control Across a Range of Severity on Several Psychiatric Constructs. Biological Psychiatry 85, 425-433 (2019).
Spoken ObjectNet: A Bias-Controlled Spoken Caption Dataset. (2021).
CBMM-Memo-128.pdf (2.91 MB)