Home Page Spotlights

The paper briefly reviews several recent results on hierarchical architectures for learning from examples that may formally explain the conditions under which Deep Convolutional Neural Networks perform much better in function approximation problems ...
The human infant brain is the only known machine able to master a natural language and develop explicit, symbolic, and communicable systems of knowledge that deliver rich representations of the external world. ...
Resource could yield linguistic insights, practical applications for non-native English speakers. Article by Larry Hardesty, MIT News Office, July 29, 2016.
We introduce the Treebank of Learner English (TLE), the first publicly available syntactic treebank for English as a Second Language (ESL). The TLE provides manually annotated POS tags and Universal Dependency (UD) trees for 5,124 sentences ...
Watch Demis Hassabis's CBMM Special Seminar talk, "Towards General Artificial Intelligence," recorded on April 20, 2016, in MIT Seminar Room #54-100.
Figure 2 from CBMM Memo No. 51
Understanding language goes hand in hand with the ability to integrate complex contextual information obtained via perception. In this work, we present a novel task for grounded language understanding: disambiguating a sentence given a visual scene which ...
“If we imagine the brain as a computer, optogenetics is a key that allows us to send extremely precise commands. It is a tool whereby we can manipulate the brain with exquisite precision.” - Ed Boyden
Detail from special issue cover.
"The goal of this special issue was to explore some of the mathematical ideas and problems at the heart of deep learning. ..."
This work examines the impact of crosslinguistic transfer on grammatical errors in English as a Second Language (ESL) texts.
Research figure from memo.
The primate brain contains a hierarchy of visual areas, dubbed the ventral stream, which rapidly computes object representations that are both specific for object identity and relatively robust against identity-preserving transformations ...
Research image: Table 1 from CBMM Memo No. 048
"How do people learn about complex functional structure? Taking inspiration from other areas of cognitive science, we propose that this is accomplished by harnessing compositionality: complex structure is decomposed into simpler building blocks. ..."
The Matter of Minds
MIT's Campaign for a Better World recently kicked off, spotlighting the Center for Brains, Minds and Machines as a campaign priority.
Figure 2: Modeling the ventral stream of visual cortex using a multi-state fully recurrent neural network
The authors discuss relations between Residual Networks (ResNets), Recurrent Neural Networks (RNNs), and the primate visual cortex.
“On invariance and selectivity in representation learning,” Information and Inference: A Journal of the IMA 2016