Title | Emergence of Sparse Representations from Noise |
Publication Type | Conference Poster |
Year of Publication | 2023 |
Authors | Bricken, T, Schaeffer, R, Olshausen, B, Kreiman, G |
Conference Name | ICML 2023 |
Date Published | 07/2023 |
Place Published | Honolulu, HI |
Abstract | A hallmark of biological neural networks, which distinguishes them from their artificial counterparts, is the high degree of sparsity in their activations. This discrepancy raises three questions our work helps to answer: (i) Why are biological networks so sparse? (ii) What are the benefits of this sparsity? (iii) How can these benefits be utilized by deep learning models? Our answers to all of these questions center around training networks to handle random noise. Surprisingly, we discover that noisy training introduces three implicit loss terms that result in sparsely firing neurons specializing to high variance features of the dataset. When trained to reconstruct noisy-CIFAR10, neurons learn biological receptive fields. More broadly, noisy training presents a new approach to potentially increase model interpretability with additional benefits to robustness and computational efficiency. |
URL | https://openreview.net/pdf?id=cxYaBAXVKg |
Citation Key | 5539 |
Associated Module:
CBMM Relationship:
- CBMM Funded