Function approximation by deep networks. Communications on Pure & Applied Analysis 19, 4085-4095 (2020).
Gross means Great. Progress in Neurobiology 195, 101924 (2020).
Hierarchical neural network models that more closely match primary visual cortex tend to better explain higher level visual cortical responses. COSYNE (2020).
Hierarchical structure is employed by humans during visual motion perception. Proceedings of the National Academy of Sciences 117, 24581-24589 (2020).
Hierarchically Local Tasks and Deep Convolutional Networks. CBMM Memo No. 109 (2020).
Hippocampal remapping as hidden state inference. eLife 9 (2020).
Implicit dynamic regularization in deep networks. (2020).
Incorporating intrinsic suppression in deep neural networks captures dynamics of adaptation in neurophysiology and perception. Science Advances 6, eabd4205 (2020).
Infants represent 'like-kin' affiliation. Budapest Conference on Cognitive Development (2020).
Infants’ sensitivity to shape changes in 2D visual forms. Infancy 25, 618-639 (2020).
The inferior temporal cortex is a potential cortical precursor of orthographic processing in untrained monkeys. Nature Communications 11 (2020).
Integrative Benchmarking to Advance Neurally Mechanistic Models of Human Intelligence. Neuron 108, 413-423 (2020).
Learning a Natural-language to LTL Executable Semantic Parser for Grounded Robotics. Proceedings of the Conference on Robot Learning (CoRL 2020). at <https://corlconf.github.io/paper_385/>
Learning a natural-language to LTL executable semantic parser for grounded robotics. CBMM Memo No. 122 (2020). doi:10.48550/arXiv.2008.03277
Learning abstract structure for drawing by efficient motor program induction. Advances in Neural Information Processing Systems 33 (NeurIPS 2020). at <https://papers.nips.cc/paper/2020/hash/1c104b9c0accfca52ef21728eaf01453-Abstract.html>
Learning Compositional Rules via Neural Program Synthesis. Advances in Neural Information Processing Systems 33 (NeurIPS 2020). at <https://proceedings.neurips.cc/paper/2020/hash/7a685d9edd95508471a9d3d6fcace432-Abstract.html>
Learning from multiple informants: Children’s response to epistemic bases for consensus judgments. Journal of Experimental Child Psychology 192, 104759 (2020).
The logic of universalization guides moral judgment. Proceedings of the National Academy of Sciences 202014505 (2020). doi:10.1073/pnas.2014505117
Loss landscape: SGD has a better view. CBMM Memo No. 107 (2020).
Minimal videos: Trade-off between spatial and temporal information in human and machine vision. Cognition (2020). doi:10.1016/j.cognition.2020.104263
The neural mechanisms of face processing: cells, areas, networks, and models. Current Opinion in Neurobiology 60, 184-191 (2020).
A neural network trained for prediction mimics diverse features of biological neurons and perception. Nature Machine Intelligence 2, 210-219 (2020).
A neural network trained to predict future video frames mimics critical properties of biological neuronal responses and perception. Nature Machine Intelligence (2020).
Online Developmental Science to Foster Innovation, Access, and Impact. Trends in Cognitive Sciences 24, 675-678 (2020).