September 17, 2019 - 4:00 pm
MIT Building 46-3002 (Singleton Auditorium)
Maia Fraser, Assistant Professor, University of Ottawa
Abstract: Hierarchical learning is found widely in biological organisms. There are several compelling arguments for the advantages of this structure. Modularity (reusable components) and function approximation are two for which theoretical support is readily available. Other, more statistical, arguments...
September 10, 2019 - 11:30 am
By Chrissy Sexton, staff writer. In a new study published by the Society for Research in Child Development, experts have found that children take notice of how adults face challenges and how much effort they are willing to put into reaching their goals. The research suggests that these actions, along with the words of encouragement adults may use, have a significant effect on persistence among children. “Our work shows that...
September 9, 2019 - 10:15 am
Study reveals brain regions that respond differently to the presence of background noise, suggesting the brain progressively homes in on and isolates sounds. by Sabbi Lall | McGovern Institute for Brain Research. In a busy coffee shop, our eardrums are inundated with sound waves — people chatting, the clatter of cups, music playing — yet our brains somehow manage to untangle relevant sounds, like a barista announcing that our “coffee is ready,”...
August 30, 2019 - 10:00 am
Marine Biological Laboratories
10:00am-10:30am   State of CBMM
                  Presented by Tomaso Poggio
10:30am-11:00am   CBMM beyond 2023
                  Presented by Tomaso Poggio
11:00am-11:15am   Learning Hub review...
IMAGE: Infant in a seat watching a video of a woman picking up a ball
August 21, 2019 - 10:15 am
We view ourselves and others as causal agents who pursue goals and act efficiently to make things happen, but where do these intuitions come from? In a new paper funded by the Center for Brains, Minds and Machines and published in PNAS, Harvard University researchers and co-authors Shari Liu, Neon Brooks, and Elizabeth Spelke show that three-month-old human infants, who do not yet reach for objects, nevertheless understand that when other people...
July 24, 2019 - 9:45 am
Bringing together artificial intelligence and neuroscience promises to yield benefits for both fields. Chethan Pandarinath wants to enable people with paralysed limbs to reach out and grasp with a robotic arm as naturally as they would their own. To help him meet this goal, he has collected recordings of brain activity in people with paralysis. His hope, which is shared by many other researchers, is that he will be able to identify the patterns...
July 21, 2019 - 9:00 am
“We have all heard that Patrick died yesterday in his sleep. I am writing with tears in my eyes. This is a sad day for me and also for MIT and for CBMM. For me, Patrick and MIT have always been together. When I arrived at MIT from Germany in 1981, he was the director of the AI Lab where my office was.  In the last decade, Patrick was for me one of the most important people at MIT, somebody I always trusted and relied upon for advice and...
July 18, 2019 - 4:00 pm
MIT 46-5165
Gemma Roig
Task-specific Vision DNN Models and Their Relation for Explaining Different Areas of the Visual Cortex
Deep Neural Networks (DNNs) are state-of-the-art models for many vision tasks. We propose an approach to assess the relationship between visual tasks and their task-specific...
June 17, 2019 - 9:15 am
Learning to code involves recognizing how to structure a program, and how to fill in every last detail correctly. No wonder it can be so frustrating. A new program-writing AI, SketchAdapt, offers a way out. Trained on tens of thousands of program examples, SketchAdapt learns how to compose short, high-level programs, while letting a second set of algorithms find the right sub-programs to fill in the details. Unlike similar approaches for...
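The division of labor described above — compose a high-level sketch, then let a search procedure fill in the details — can be illustrated with a toy example. This is a hypothetical, simplified sketch of the general sketch-and-fill idea, not SketchAdapt's actual neural architecture: here a "program" is a pipeline of named primitives, a sketch leaves `HOLE` markers, and a brute-force enumeration (standing in for the second set of algorithms) finds fillings that satisfy input-output examples.

```python
from itertools import product

# Toy DSL of primitive sub-programs that could fill a hole.
PRIMITIVES = {
    "inc": lambda x: x + 1,
    "dec": lambda x: x - 1,
    "double": lambda x: x * 2,
    "square": lambda x: x * x,
}

def fill_sketch(sketch, examples):
    """sketch: a list of fixed primitive names and 'HOLE' markers.
    Enumerate primitive fillings until one satisfies every
    (input, output) example; return the completed program or None."""
    holes = [i for i, op in enumerate(sketch) if op == "HOLE"]
    for filling in product(PRIMITIVES, repeat=len(holes)):
        candidate = list(sketch)
        for i, name in zip(holes, filling):
            candidate[i] = name

        def run(x):
            # Apply the pipeline of primitives left to right.
            for name in candidate:
                x = PRIMITIVES[name](x)
            return x

        if all(run(inp) == out for inp, out in examples):
            return candidate
    return None

# High-level sketch: "double, then <something>"; the search fills the hole.
program = fill_sketch(["double", "HOLE"], [(2, 5), (3, 7)])
# → ["double", "inc"]
```

In a real system the sketch itself would be proposed by a learned model rather than written by hand, and the hole-filler would be far smarter than exhaustive enumeration; the point here is only the two-stage structure.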
June 11, 2019 - 4:00 pm
Katharina Dobs & Ratan Murty
Murty talk title: Does face selectivity arise without visual experience with faces in the human brain?
Dobs talk title: Testing functional segregation of face and object processing in deep convolutional neural networks
IMAGE: Graphic of a brain hemisphere
June 10, 2019 - 1:30 pm
Results of a study involving primates suggest that speech and music may have shaped the human brain’s hearing circuits. NIH | National Institutes of Health. In the eternal search for understanding what makes us human, scientists found that our brains are more sensitive to pitch, the harmonic sounds we hear when listening to music, than those of our evolutionary relative, the macaque monkey. The study, funded in part by the National...
May 22, 2019 - 12:30 pm
ARLINGTON, VA, UNITED STATES. Story by Warren Duffie, Office of Naval Research. How can the neuroscience behind emotion lead to new concepts of artificial intelligence? How does the human brain make fast decisions in real-world scenarios—and what impact does this have on human-to-human and human-machine interactions? These are just some of the questions being pondered by the 10 scientists and engineers recently announced as members of the 2019...
May 14, 2019 - 4:00 pm
Duncan Stothers, Will Xiao, Nimrod Shaham
Duncan Stothers:
Title: Turing's Child Machine: A Deep Learning Model of Neural Development
Turing recognized development’s connection to intelligence when he proposed engineering a ‘child machine’ that becomes intelligent through a developmental process, instead of top-down hand-...
IMAGE: This figure shows natural images (right) and images evolved by neurons in the inferotemporal cortex of a monkey (left). Credit: Ponce, Xiao, and Schade et al./Cell
May 2, 2019 - 2:45 pm
To find out which sights specific neurons in monkeys "like" best, researchers designed an algorithm, called XDREAM, that generated images that made neurons fire more than any natural images the researchers tested. As the images evolved, they started to look like distorted versions of real-world stimuli. The work appears May 2 in the journal Cell. "When given this tool, cells began to increase their firing rate beyond levels we have seen before,...
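The closed loop described above — generate candidate images, score them by how strongly a neuron fires, and let the best ones seed the next generation — follows the shape of a standard evolutionary search. The following is a minimal, hypothetical sketch of that idea, not the actual XDREAM implementation: images are toy pixel vectors, and `neuron_response` is a stand-in scoring function in place of a recorded neuron.

```python
import random

random.seed(0)

IMG_LEN = 16        # toy "image": a vector of pixel intensities in [0, 1]
POP_SIZE = 20
GENERATIONS = 50

def neuron_response(img):
    # Stand-in for a recorded neuron's firing rate. This toy "neuron"
    # prefers a bright left half and a dark right half.
    half = IMG_LEN // 2
    return sum(img[:half]) - sum(img[half:])

def mutate(img, rate=0.1):
    # Jitter each pixel slightly, clamped to the valid range.
    return [min(1.0, max(0.0, p + random.uniform(-rate, rate))) for p in img]

def crossover(a, b):
    # Splice two parent images at a random cut point.
    cut = random.randrange(1, IMG_LEN)
    return a[:cut] + b[cut:]

# Start from a random population of images.
population = [[random.random() for _ in range(IMG_LEN)]
              for _ in range(POP_SIZE)]

for _ in range(GENERATIONS):
    scored = sorted(population, key=neuron_response, reverse=True)
    parents = scored[:POP_SIZE // 2]          # keep the strongest half
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children

best = max(population, key=neuron_response)
```

After a few dozen generations the surviving images drift toward whatever the scoring function rewards — the same dynamic by which, in the study, the evolved images came to resemble distorted versions of real-world stimuli the neurons preferred.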
A computer model of vision created by MIT neuroscientists designed these images that can stimulate very high activity in individual neurons.  Image: Pouya Bashivan
May 2, 2019 - 1:30 pm
Study shows that artificial neural networks can be used to drive brain activity. Anne Trafton | MIT News Office MIT neuroscientists have performed the most rigorous testing yet of computational models that mimic the brain’s visual cortex. Using their current best model of the brain’s visual neural network, the researchers designed a new way to precisely control individual neurons and populations of neurons in the middle of that network. In an...