Publications
For interpolating kernel machines, the minimum norm ERM solution is the most stable. (2020).
CBMM_Memo_108.pdf (1015.14 KB)
Better bound (without inequalities!) (1.03 MB)


Dynamics and Neural Collapse in Deep Classifiers trained with the Square Loss. (2021).
v1.0 (4.61 MB)
v1.4 Corrections to generalization section (5.85 MB)
v1.7 Small edits (22.65 MB)


Neural Collapse in Deep Homogeneous Classifiers and the Role of Weight Decay. IEEE International Conference on Acoustics, Speech and Signal Processing (2022).

Feature learning in deep classifiers through Intermediate Neural Collapse. (2023).
Feature_Learning_memo.pdf (2.16 MB)

On Generalization Bounds for Neural Networks with Low Rank Layers. (2024).
CBMM-Memo-151.pdf (697.31 KB)
