Emergence of Sparse Representations from Noise

Title: Emergence of Sparse Representations from Noise
Publication Type: Conference Poster
Year of Publication: 2023
Authors: Bricken, T., Schaeffer, R., Olshausen, B., Kreiman, G.
Conference Name: ICML 2023
Date Published: 07/2023
Place Published: Honolulu, HI

A hallmark of biological neural networks, which distinguishes them from their artificial counterparts, is the high degree of sparsity in their activations. This discrepancy raises three questions our work helps to answer: (i) Why are biological networks so sparse? (ii) What are the benefits of this sparsity? (iii) How can these benefits be utilized by deep learning models? Our answers to all of these questions center around training networks to handle random noise. Surprisingly, we discover that noisy training introduces three implicit loss terms that result in sparsely firing neurons specializing to high-variance features of the dataset. When trained to reconstruct noisy-CIFAR10, neurons learn biological receptive fields. More broadly, noisy training presents a new approach to potentially increase model interpretability, with additional benefits to robustness and computational efficiency.
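The "noisy training" setup described above can be illustrated as a denoising reconstruction task: corrupt the input with random noise, then train the network to recover the clean signal. The sketch below is a minimal illustration of that idea, not the authors' implementation — the synthetic dataset, layer sizes, noise level, and learning rate are all placeholder assumptions standing in for the noisy-CIFAR10 experiments.

```python
import numpy as np

# Minimal denoising-autoencoder sketch (assumed setup, not the poster's code):
# a one-hidden-layer ReLU network reconstructs CLEAN inputs from
# noise-corrupted copies; sparsity is measured as the fraction of
# hidden units that are exactly zero after ReLU.
rng = np.random.default_rng(0)
d, h, n = 64, 128, 256                  # input dim, hidden units, samples (assumed)
X = rng.standard_normal((n, d))         # stand-in dataset (not CIFAR10)
W1 = rng.standard_normal((d, h)) * 0.1  # encoder weights
W2 = rng.standard_normal((h, d)) * 0.1  # decoder weights
lr, sigma = 1e-2, 0.5                   # learning rate, noise std (assumed)

losses = []
for step in range(300):
    Xn = X + sigma * rng.standard_normal(X.shape)  # corrupt the input
    A = np.maximum(Xn @ W1, 0.0)                   # ReLU hidden activations
    Xhat = A @ W2                                  # reconstruction
    err = Xhat - X                                 # target is the CLEAN input
    losses.append(float(np.mean(err ** 2)))
    # Full-batch gradient descent through both layers.
    gW2 = A.T @ err / n
    gA = (err @ W2.T) * (A > 0)
    gW1 = Xn.T @ gA / n
    W1 -= lr * gW1
    W2 -= lr * gW2

# Fraction of silent hidden units on the last batch.
sparsity = float(np.mean(A == 0.0))
```

Because the noise is resampled every step while the target stays clean, the network is discouraged from passing noise through, which is the mechanism the abstract credits for sparse, high-variance-feature-selective units.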

Citation Key: 5539

CBMM Relationship: 

  • CBMM Funded