Publications
CUDA-Optimized real-time rendering of a Foveated Visual System. Shared Visual Representations in Human and Machine Intelligence (SVRHM) workshop at NeurIPS 2020 (2020). https://arxiv.org/abs/2012.08655
Foveated_Drone_SVRHM_2020.pdf (13.44 MB)
v1 (12/15/2020) (14.7 MB)
Biologically-plausible learning algorithms can scale to large datasets. International Conference on Learning Representations (ICLR 2019) (2019).
gk7779.pdf (721.53 KB)
Theories of Deep Learning: Approximation, Optimization and Generalization. TECHCON 2019 (2019).
Subtasks of Unconstrained Face Recognition. (2014).
Leibo_Liao_Poggio_subtasks_VISAPP_2014.pdf (268.69 KB)
Invariant representations for action recognition in the visual system. Vision Sciences Society 15, (2015).
Invariant representations for action recognition in the visual system. Computational and Systems Neuroscience (2015).
On the Human Visual System Invariance to Translation and Scale. Vision Sciences Society (2017).
Eccentricity Dependent Deep Neural Networks for Modeling Human Vision. Vision Sciences Society (2017).
View-tolerant face recognition and Hebbian learning imply mirror-symmetric neural tuning to head orientation. (2016).
faceMirrorSymmetry_memo_ver01.pdf (3.93 MB)
Unsupervised learning of invariant representations with low sample complexity: the magic of sensory cortex or a new framework for machine learning? (2014).
CBMM Memo No. 001 (940.36 KB)
Unsupervised learning of clutter-resistant visual representations from natural videos. (2014).
1409.3879v2.pdf (3.64 MB)
Theory of Deep Learning III: explaining the non-overfitting puzzle. (2017).
CBMM-Memo-073.pdf (2.65 MB)
CBMM Memo 073 v2 (revised 1/15/2018) (2.81 MB)
CBMM Memo 073 v3 (revised 1/30/2018) (2.72 MB)
CBMM Memo 073 v4 (revised 12/30/2018) (575.72 KB)
Theory of Deep Learning IIb: Optimization Properties of SGD. (2017).
CBMM-Memo-072.pdf (3.66 MB)
Theory III: Dynamics and Generalization in Deep Networks. (2018).
Original and intermediate versions are available upon request (2.67 MB)
CBMM Memo 90 v12.pdf (4.74 MB)
Theory_III_ver44.pdf (updated Hessian) (4.12 MB)
Theory_III_ver48 (Updated discussion of convergence to max margin) (2.56 MB)
Fixed errors and sharpened some proofs (2.45 MB)
Theory II: Landscape of the Empirical Risk in Deep Learning. (2017).
CBMM Memo 066_1703.09833v2.pdf (5.56 MB)
Theory I: Why and When Can Deep Networks Avoid the Curse of Dimensionality? (2016).
CBMM-Memo-058v1.pdf (2.42 MB)
CBMM-Memo-058v5.pdf (2.45 MB)
CBMM-Memo-058-v6.pdf (2.74 MB)
Proposition 4 has been deleted (2.75 MB)
Theoretical Issues in Deep Networks. (2019).
CBMM Memo 100 v1 (1.71 MB)
CBMM Memo 100 v3 (8/25/2019) (1.31 MB)
CBMM Memo 100 v4 (11/19/2019) (1008.23 KB)
System identification of neural systems: If we got it right, would we know? (2022).
CBMM-Memo-136.pdf (1.75 MB)
Symmetry Regularization. (2017).
CBMM-Memo-063.pdf (6.1 MB)
Streaming Normalization: Towards Simpler and More Biologically-plausible Normalizations for Online and Recurrent Learning. (2016).
CBMM-Memo-057.pdf (1.27 MB)
Stable Foundations for Learning: a framework for learning theory (in both the classical and modern regime). (2020).
Original file (584.54 KB)
Corrected typos and details of the "equivalence" between CV stability and expected error for interpolating machines. Added Appendix on SGD. (905.29 KB)
Edited Appendix on SGD. (909.19 KB)
Deleted Appendix. Corrected typos, etc. (880.27 KB)
Added result about square loss and min norm (898.03 KB)
Single units in a deep neural network functionally correspond with neurons in the brain: preliminary results. (2018).
CBMM-Memo-093.pdf (2.99 MB)
SGD Noise and Implicit Low-Rank Bias in Deep Neural Networks. (2022).
Implicit Rank Minimization.pdf (1.76 MB)
SGD and Weight Decay Provably Induce a Low-Rank Bias in Deep Neural Networks. (2023).
Low-rank bias.pdf (2.38 MB)
Self-Assembly of a Biologically Plausible Learning Circuit. (2024).
CBMM-Memo-152.pdf (1.84 MB)