Publication
Dynamics & Generalization in Deep Networks - Minimizing the Norm. NAS Sackler Colloquium on the Science of Deep Learning (2019).
Dynamics and Neural Collapse in Deep Classifiers trained with the Square Loss. (2021).
v1.0 (4.61 MB)
v1.4 Corrections to generalization section (5.85 MB)
v1.7 Small edits (22.65 MB)
Dreaming with ARC. Learning Meets Combinatorial Algorithms workshop at NeurIPS 2020 (2020).
CBMM Memo 113.pdf (1019.64 KB)
Double descent in the condition number. (2019).
Fixing typos, clarifying error in y; best approach is cross-validation (837.18 KB)
Incorporated footnote in text plus other edits (854.05 KB)
Deleted previous discussion on kernel regression and deep nets: it will appear, extended, in a separate paper (795.28 KB)
Correcting a bad typo (261.24 KB)
Deleted plot of condition number of kernel matrix: we cannot get a double descent curve (769.32 KB)
Do Deep Neural Networks Suffer from Crowding? (2017).
CBMM-Memo-069.pdf (6.47 MB)
Distribution of Classification Margins: Are All Data Equal? (2021).
CBMM Memo 115.pdf (9.56 MB)
arXiv version (23.05 MB)
Discriminative Template Learning in Group-Convolutional Networks for Invariant Speech Representations. INTERSPEECH-2015 (International Speech Communication Association (ISCA), 2015). at <http://www.isca-speech.org/archive/interspeech_2015/i15_3229.html>
Deep vs. shallow networks: An approximation theory perspective. Analysis and Applications 14, 829 - 848 (2016).
Deep vs. shallow networks: An approximation theory perspective. (2016).
Original submission, visit the link above for the updated version (960.27 KB)
A Deep Representation for Invariance and Music Classification. ICASSP 2014 - 2014 IEEE International Conference on Acoustics, Speech and Signal Processing (IEEE, 2014). doi:10.1109/ICASSP.2014.6854954
A Deep Representation for Invariance and Music Classification. (2014).
CBMM-Memo-002.pdf (1.63 MB)
Deep Recurrent Architectures for Seismic Tomography. 81st EAGE Conference and Exhibition 2019 (2019).
Deep Learning: mathematics and neuroscience. (2016).
Deep Learning- mathematics and neuroscience.pdf (1.25 MB)
Deep Learning for Seismic Inverse Problems: Toward the Acceleration of Geophysical Analysis Workflows. IEEE Signal Processing Magazine 38, 89 - 119 (2021).
Deep Learning: Mathematics and Neuroscience. A Sponsored Supplement to Science: Brain-Inspired Intelligent Robotics: The Intersection of Robotics and Neuroscience, 9-12 (2016).
Deep Convolutional Networks are Hierarchical Kernel Machines. (2015).
CBMM Memo 035_rev5.pdf (975.65 KB)
CUDA-Optimized real-time rendering of a Foveated Visual System. Shared Visual Representations in Human and Machine Intelligence (SVRHM) workshop at NeurIPS 2020 (2020). at <https://arxiv.org/abs/2012.08655>
Foveated_Drone_SVRHM_2020.pdf (13.44 MB)
v1 (12/15/2020) (14.7 MB)
Computational role of eccentricity dependent cortical magnification. (2014).
CBMM-Memo-017.pdf (1.04 MB)
Compression of Deep Neural Networks for Image Instance Retrieval. (2017). at <https://arxiv.org/abs/1701.04923>
1701.04923.pdf (614.33 KB)
Compositional sparsity of learnable functions. Bulletin of the American Mathematical Society 61, 438-456 (2024).
Compositional Sparsity of Learnable Functions. (2024).
This is an update of the AMS paper (230.72 KB)
Complexity Control by Gradient Descent in Deep Networks. Nature Communications 11, (2020).
s41467-020-14663-9.pdf (431.68 KB)
CNS (“Cortical Network Simulator”): a GPU-based framework for simulating cortically-organized networks. (2010).
cns.tar (1.46 MB)
MIT-CSAIL-TR-2010-013.pdf (389.38 KB)
(Last version before switch to classdef syntax) (1.05 MB)
Classical generalization bounds are surprisingly tight for Deep Networks. (2018).
CBMM-Memo-091.pdf (1.43 MB)
CBMM-Memo-091-v2.pdf (1.88 MB)
Cervelli menti algoritmi. 272 (Sperling & Kupfer, 2023). at <https://www.sperling.it/libri/cervelli-menti-algoritmi-marco-magrini>