Mind Control of Robot Arm

Two paralyzed patients successfully manipulate a robotic arm just by thinking about how they would move their own limbs if they could.

May 16, 2012
Jef Akst

Participant S3 drinking from a bottle using the DLR robotic arm (BRAINGATE2.ORG)

Two patients who lost the use of their limbs (and the ability to speak) following brainstem strokes successfully reached out and touched a foam ball, thanks to a small array of electrodes implanted on their motor cortices and a robotic arm that followed the commands of their neurons, according to a Nature paper published today (May 16).

“These results are the first peer-reviewed demonstrations of 3-dimensional reaching and grabbing tasks using direct brain control of a robotic device,” study coauthor Leigh R. Hochberg, who has appointments at Brown University, Harvard Medical School, and Providence VA Medical Center, said at a press conference yesterday. “I believe that these are milestones in brain-computer interface research with exciting implications for neuroscience and neural rehabilitation.”

The device that made these advances possible, called BrainGate, made headlines in 2006 when patients successfully controlled a computer cursor. Since then, the system has been refined and connected to a robotic arm that can actually carry out the commands of the motor cortex.

BrainGate array (BRAINGATE2.ORG)

The electrode array—the size of a baby aspirin—records signals from dozens of motor cortex neurons, whose activity patterns were calibrated to basic arm movements. The patients then simply think about moving their own arm, and computer algorithms translate their intentions into movements of the robotic arm in front of them. One of the patients was even able to take a sip of coffee (out of a bottle) on her own for the first time in 15 years (see video).
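The calibrate-then-decode loop described above can be sketched, very loosely, as fitting a linear map from neural firing rates to intended arm velocities. This is a hypothetical simplification for intuition only—the study's actual decoding algorithms are more sophisticated—and all names and numbers below are illustrative assumptions, not details from the paper.

```python
import numpy as np

# Illustrative sketch only: a simple linear decoder from firing rates to
# 2D velocity commands. Not the BrainGate system's actual algorithm.
rng = np.random.default_rng(0)

n_neurons, n_samples = 20, 500
true_tuning = rng.normal(size=(n_neurons, 2))  # each neuron's hypothetical tuning

# Calibration phase: record firing rates while the intended velocity is known
# (e.g., the patient imagines tracking a moving target).
rates = rng.poisson(lam=5.0, size=(n_samples, n_neurons)).astype(float)
velocities = rates @ true_tuning + rng.normal(scale=0.5, size=(n_samples, 2))

# Fit decoder weights by least squares: rates -> velocities.
weights, *_ = np.linalg.lstsq(rates, velocities, rcond=None)

# Decoding phase: translate new neural activity into a velocity command
# that would be sent to the robotic arm.
new_rates = rng.poisson(lam=5.0, size=(1, n_neurons)).astype(float)
command = new_rates @ weights  # shape (1, 2): a 2D velocity command
```

In this toy setup the least-squares fit recovers the simulated tuning closely; a real system must cope with nonstationary neural signals, which is part of why ongoing recalibration matters.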

“The smile on her face when she did this is something that I, and I know our whole research team, will never forget,” said Hochberg.

Ultimately, such technology may allow patients to control prosthetic limbs, or even their own paralyzed limbs, though “there’s undoubtedly still much work to do,” Hochberg said. For example, the researchers hope to eventually make the implants wireless, so the patients do not have to be “plugged in” to use their limbs. Furthermore, the neural-interface system may need to be coupled with some sort of sensory feedback, to allow patients to sense how tightly they are grabbing something.

And, of course, there is the question of cost. “It remains to be seen whether a neural-interface system that will be of practical use to patients with diverse clinical needs can become a com­mercially viable proposition,” Andrew Jackson of the Institute of Neuroscience at Newcastle University wrote in an accompanying Nature commentary. “Nevertheless, the delight of a participant in Hochberg and colleagues’ study as she succeeds in drink­ing from a bottle for the first time in years should act as a powerful incentive for all in the field to address these challenges.”

L.R. Hochberg et al., "Reach and grasp by people with tetraplegia using a neurally controlled robotic arm," Nature, 485:372-75, doi:10.1038/nature11076, 2012.