All Publications
2024
CBMM Memo No. 145: “Compositional Sparsity of Learnable Functions”. 2024.
CBMM Funded
2023
CBMM Memo No. 144: “The Janus effects of SGD vs GD: high noise and low rank”. 2023. Updated with an appendix showing empirically that the main results extend to deep nonlinear networks.
CBMM Funded
“Norm-based Generalization Bounds for Sparse Neural Networks”, in NeurIPS 2023, New Orleans, 2023.
CBMM Funded
Cervelli menti algoritmi [Brains, Minds, Algorithms]. Sperling & Kupfer, 2023, p. 272.
CBMM Funded
CBMM Memo No. 143: “A Homogeneous Transformer Architecture”. 2023.
CBMM Funded
“System Identification of Neural Systems: If We Got It Right, Would We Know?”, in Proceedings of the 40th International Conference on Machine Learning, PMLR, vol. 202, pp. 12430-12444, 2023.
CBMM Funded
CBMM Memo No. 142: “Skip Connections Increase the Capacity of Associative Memories in Variable Binding Mechanisms”. 2023.
CBMM Funded
CBMM Memo No. 141: “Feature learning in deep classifiers through Intermediate Neural Collapse”. 2023.
CBMM Funded
“An empirical assay of view-invariant object learning in humans and comparison with baseline image-computable models”, bioRxiv, 2023.
CBMM Related
“For interpolating kernel machines, minimizing the norm of the ERM solution maximizes stability”, Analysis and Applications, vol. 21, no. 01, pp. 193-215, 2023.
CBMM Funded
“Dynamics in Deep Classifiers trained with the Square Loss: normalization, low rank, neural collapse and generalization bounds”, Research, 2023.
CBMM Funded
CBMM Memo No. 140: “SGD and Weight Decay Provably Induce a Low-Rank Bias in Deep Neural Networks”. 2023.
CBMM Funded
CBMM Memo No. 139: “Norm-Based Generalization Bounds for Compositionally Sparse Neural Networks”. 2023.
CBMM Funded
2022
“NeuroDecodeR: A package for neural decoding analyses in R”, bioRxiv, 2022.
CBMM Funded
“Towards an objective characterization of an individual's facial movements using Self-Supervised Person-Specific-Models”, arXiv, 2022.
CBMM Funded
“Image2Point: 3D Point-Cloud Understanding with 2D Image Pretrained Models”, in Computer Vision – ECCV 2022, Lecture Notes in Computer Science, vol. 13697. Cham: Springer Nature Switzerland, 2022, pp. 638-656.
CBMM Funded
“Primate Inferotemporal Cortex Neurons Generalize Better to Novel Image Distributions Than Analogous Deep Neural Networks Units”, in NeurIPS, 2022.
CBMM Related
“Representation Learning in Sensory Cortex: a theory”, IEEE Access, pp. 1-1, 2022.
CBMM Funded
CBMM Memo No. 137: “Understanding the Role of Recurrent Connections in Assembly Calculus”. 2022.
CBMM Funded
CBMM Memo No. 136: “System identification of neural systems: If we got it right, would we know?”. 2022.
CBMM Funded
“Aligning Model and Macaque Inferior Temporal Cortex Representations Improves Model-to-Human Behavioral Alignment and Adversarial Robustness”, bioRxiv, 2022.
CBMM Related
“Adversarially trained neural representations may already be as robust as corresponding biological neural representations”, arXiv, 2022.
CBMM Related
“A computational probe into the behavioral and neural markers of atypical facial emotion processing in autism”, The Journal of Neuroscience, pp. JN-RM-2229-21, 2022.
CBMM Related
“Neural Collapse in Deep Homogeneous Classifiers and the role of Weight Decay”, in IEEE International Conference on Acoustics, Speech and Signal Processing, Singapore, 2022.
CBMM Funded
“Brain-like functional specialization emerges spontaneously in deep neural networks”, Science Advances, vol. 8, no. 11, 2022.
CBMM Funded