Publication
Measuring and modeling the perception of natural and unconstrained gaze in humans and machines. (2016).
CBMM-Memo-059.pdf (1.71 MB)
Loss landscape: SGD has a better view. (2020).
CBMM-Memo-107.pdf (1.03 MB)
Typos and small edits, ver11 (955.08 KB)
Small edits, corrected Hessian for spurious case (337.19 KB)
Learning Mid-Level Auditory Codes from Natural Sound Statistics. (2017).
MlynarskiMcDermott_Memo060.pdf (7.11 MB)
Learning Functions: When Is Deep Better Than Shallow. (2016). at <https://arxiv.org/pdf/1603.00988v4.pdf>
Learning An Invariant Speech Representation. (2014).
CBMM-Memo-022-1406.3884v1.pdf (1.81 MB)
Learning a natural-language to LTL executable semantic parser for grounded robotics. (2020). doi:10.48550/arXiv.2008.03277
CBMM-Memo-122.pdf (1.03 MB)
The Janus effects of SGD vs GD: high noise and low rank. (2023).
Updated with appendix showing empirically that the main results extend to deep nonlinear networks (2.95 MB)
Small updates...typos... (616.82 KB)
I-theory on depth vs width: hierarchical function composition. (2015).
cbmm_memo_041.pdf (1.18 MB)
The Invariance Hypothesis Implies Domain-Specific Regions in Visual Cortex. (2014). doi:10.1101/004473
CBMM Memo 004_new.pdf (2.25 MB)
On Invariance and Selectivity in Representation Learning. (2015).
CBMM Memo No. 029 (812.07 KB)
The infancy of the human brain. (2016). doi:10.1016/j.neuron.2015.09.026
CBMM-Memo-053.pdf (1.51 MB)
Incorporating Rich Social Interactions Into MDPs. (2022).
CBMM-Memo-133.pdf (1.68 MB)
Implicit dynamic regularization in deep networks. (2020).
v1.2 (2.29 MB)
v.59 Update on rank (2.43 MB)
Image interpretation by iterative bottom-up top-down processing. (2021).
CBMM-Memo-120.pdf (2.83 MB)
Image interpretation above and below the object level. (2018).
CBMM-Memo-089.pdf (2.06 MB)
Human-Machine CRFs for Identifying Bottlenecks in Holistic Scene Understanding. (2014).
CBMM-Memo-020.pdf (1.89 MB)
How Important is Weight Symmetry in Backpropagation? (2015).
1510.05067v3.pdf (615.32 KB)
How Deep Sparse Networks Avoid the Curse of Dimensionality: Efficiently Computable Functions are Compositionally Sparse. (2022).
v1.0 (984.15 KB)
v5.7, adding in-context learning, etc. (1.16 MB)
A Homogeneous Transformer Architecture. (2023).
CBMM Memo 143 v2 (1.1 MB)
Holographic Embeddings of Knowledge Graphs. (2015).
holographic-embeddings.pdf (677.87 KB)
Hippocampal Remapping as Hidden State Inference. (2019). doi:10.1101/743260
CBMM-Memo-101.pdf (12.78 MB)
Hierarchically Local Tasks and Deep Convolutional Networks. (2020).
CBMM_Memo_109.pdf (2.12 MB)
Group Invariant Deep Representations for Image Instance Retrieval. (2016).
CBMM-Memo-043.pdf (2.66 MB)
The Genesis Story Understanding and Story Telling System: A 21st Century Step toward Artificial Intelligence. (2014).
CBMM-Memo-019_StoryWhitePaper.pdf (894.38 KB)
On Generalization Bounds for Neural Networks with Low Rank Layers. (2024).
CBMM-Memo-151.pdf (697.31 KB)