Bringing together artificial intelligence and neuroscience promises to yield benefits for both fields.
Chethan Pandarinath wants to enable people with paralysed limbs to reach out and grasp with a robotic arm as naturally as they would their own. To help him meet this goal, he has collected recordings of brain activity in people with paralysis. His hope, which is shared by many other researchers, is that he will be able to identify the patterns of electrical activity in neurons that correspond to a person’s attempts to move their arm in a particular way, so that the instruction can then be fed to a prosthesis. Essentially, he wants to read their minds.
“It turns out, that’s a really challenging problem,” says Pandarinath, a biomedical engineer at Emory University and the Georgia Institute of Technology, both in Atlanta. “These signals from the brain — they’re really complicated.” In search of help, he turned to artificial intelligence (AI). He fed his brain-activity recordings to an artificial neural network, a computer architecture that is inspired by the brain, and tasked it with learning how to reproduce the data...
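The task described above — asking a network to learn to reproduce its own data — is the core idea of an autoencoder: the network is forced to compress the signals through a small bottleneck and reconstruct them, so it must capture the underlying structure rather than the noise. The sketch below is a minimal, hypothetical illustration of that idea using synthetic data as a stand-in for neural recordings; it is not the actual model or data used in the research described.

```python
import numpy as np

# Minimal linear autoencoder sketch. The "recordings" here are synthetic:
# 200 samples of 20-channel activity that secretly lies on a 3-dimensional
# latent structure, plus noise -- a toy stand-in for real brain signals.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 3))          # hidden low-dimensional drivers
mixing = rng.normal(size=(3, 20))           # how latents project to channels
data = latent @ mixing + 0.1 * rng.normal(size=(200, 20))

# Encoder maps 20 channels down to a 3-dimensional code; decoder maps back.
W_enc = 0.1 * rng.normal(size=(20, 3))
W_dec = 0.1 * rng.normal(size=(3, 20))
lr = 0.01

for step in range(500):
    code = data @ W_enc                     # compress
    recon = code @ W_dec                    # reconstruct
    err = recon - data                      # reconstruction error
    # Gradient descent on mean squared reconstruction error.
    grad_dec = code.T @ err / len(data)
    grad_enc = data.T @ (err @ W_dec.T) / len(data)
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

final_mse = np.mean((data @ W_enc @ W_dec - data) ** 2)
```

After training, the reconstruction error is far below the raw signal variance, meaning the bottleneck has recovered the low-dimensional structure hidden in the noisy channels — the same principle, in miniature, behind using such networks to denoise and interpret brain activity.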