Neuroprosthetic research began long before it solidified as an organized academic field of study. In 1973, University of California, Los Angeles, computer scientist Jacques Vidal observed modulations of signals in the electroencephalogram of a patient and wrote in Annual Review of Biophysics and Bioengineering: “Can these observable electrical brain signals be put to work as carriers of information in man-computer communication or for the purpose of controlling such external apparatus as prosthetic devices or spaceships?”1 While we don’t yet have mind-controlled spaceships, neural control of a prosthetic device for medical applications is now becoming commonplace in labs around the world.
Neuroprosthetics can be categorized as output neural interfaces, which convert the brain’s intentions to external actions, or as input neural interfaces, which take information from the environment and convert it into perceptions (think cochlear implant and bionic eye).
In its simplest form, a neuroprosthetic is a device that supplants or supplements the input and/or output of the nervous system. For decades, researchers have eyed neuroprosthetics as ways to bypass neural deficits caused by disease, or even to augment existing function for improved performance. Today, several different types of surgical brain implants are being tested for their ability to restore some level of function in patients with severe sensory or motor disabilities. In a very different vein, a company called Foc.us recently started selling simple, noninvasive brain stimulators to improve normal people’s attention while gaming. And perhaps the most visible recent demonstration of the power of neuroprosthetics was a spinal cord–injured patient using a brain-controlled exoskeleton to kick off the 2014 World Cup in Brazil. In short, tinkering with the brain has begun in earnest.
When connecting an external device to the human nervous system, researchers have traditionally used a setup that records brain signals from the user, computationally analyzes those signals to infer the user’s intentions, and then relays the information to an external effector that acts on those intentions. Inputs can be the firing of individual neurons in the brain, the cumulative voltages across areas of cortex encompassing millions of neurons, or the action potentials conducted by peripheral nerves anywhere in the body. In terms of output effectors, researchers have demonstrated that brain or nerve signals can be used to control computer cursor movements and robotic arms, or enable the reanimation of paralyzed limbs.
Reversing the direction of information transfer, feeding sensation back into the nervous system rather than reading intentions out of it, can also benefit limb prostheses. Under normal circumstances, meaningful movements of the body can only be accomplished in conjunction with appropriate sensation of the limb or body part. It is this sense of pressure and joint position that enables a person to lift a paper cup without crushing it, for example, or allows someone to walk without injuring the joints of their feet. An absence of sensory feedback leads to significant clumsiness and inefficiency of movement; building such feedback into brain-controlled devices, whether visual, tactile, or proprioceptive in nature, enables a more elegant and effective control of the device. While this area of research is still young, researchers are beginning to create “bidirectional” brain-computer interfaces. And by linking a device to muscles in a patient’s chest to which both motor and sensory nerves from the shoulder of a lost arm have been rerouted, researchers have even developed a robotic arm prosthesis that incorporates tactile feedback. (See “Missing Touch,” The Scientist, September 2012.)
Although computer speeds initially limited control capabilities, devices keep increasing in power while decreasing in size, and microprocessor design and digital signal analysis now outpace neuroprosthetics’ requirements. Hardware is no longer a barrier to the development of more sophisticated devices. Technological advances in software and biomaterials are also enabling an accelerated pace of device development. Today, neuroprosthetics researchers are exploring applications in motor, sensory, visual, auditory, and speech areas, and devices range from early lab prototypes to implants in preliminary clinical trials involving patients most in need.
Brain signals from the scalp
Electroencephalography (EEG) is commonly used to study cortical electrophysiology, due to its noninvasive nature and ease of use in the clinical setting. Through electrodes placed on a subject’s scalp, an investigator is able to record the electrical rhythms resulting from complex interactions of neurons and support cells such as glia, averaged over several centimeters of the cortex. Such cortical activity can then be correlated with a cognitive action. When an individual repeatedly opens and closes his or her left hand, for example, there is a decrease in amplitude of a specific frequency band over the right motor cortex. (See table below.) A computer can identify these changes in amplitude, or spectral power, and correctly infer that the person intends to move his or her hand. The computer can then convert that detected intention into some type of output, such as controlling a cursor on a screen or opening and closing a robotic hand.
|EEG Band Type|Frequency|Associated Brain Region and Intentions|
|---|---|---|
|Delta|0.5–4 Hz|Deeper regions such as the thalamus|
|Theta|4–8 Hz|Hippocampus; memory|
|Mu (a.k.a. Alpha)|8–13 Hz|Motor cortex; motor intentions|
|Beta|18–24 Hz|Motor cortex; motor intentions|
|Gamma|>30 Hz|Local neural circuits in cortex; motor intentions, auditory processing, and speech production|
MIND READING: Electrical activity measured through electrodes placed on the scalp is categorized into different frequency bands. Changes in these electrical rhythms, averaged over several centimeters of the cortex, can be correlated with cognitive action and used as directives for neuroprosthetic devices.
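The detection step described above can be sketched in a few lines of signal processing. The following Python sketch (function names, sampling rate, and the 50-percent-drop threshold are illustrative choices, not taken from any particular system) estimates mu-band (8–13 Hz) spectral power with Welch's method and flags the drop below a resting baseline that accompanies real or imagined hand movement:

```python
import numpy as np
from scipy.signal import welch

def bandpower(signal, fs, band):
    """Average spectral power of `signal` within a frequency band (Hz)."""
    freqs, psd = welch(signal, fs=fs, nperseg=min(len(signal), fs))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def detect_motor_intent(window, baseline_power, fs=250, mu_band=(8, 13), drop=0.5):
    """Flag intended movement when mu-band power falls well below the
    resting baseline (event-related desynchronization)."""
    return bandpower(window, fs, mu_band) < drop * baseline_power

# Synthetic demo: a strong 10-Hz "idle" rhythm that attenuates during imagery.
fs = 250
t = np.arange(fs) / fs  # one second of data
rng = np.random.default_rng(0)
idle = 5 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.5, fs)
imagery = 1 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.5, fs)

baseline = bandpower(idle, fs, (8, 13))
print(detect_motor_intent(idle, baseline))     # → False (no movement intended)
print(detect_motor_intent(imagery, baseline))  # → True (imagined movement)
```

A real system would calibrate the baseline and threshold for each user and update the estimate on short sliding windows many times per second.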
EEG modulations are useful as a potential input for simple neuroprostheses because actual movements and imagined movements result in similar brain activity changes, as demonstrated almost 15 years ago by Dennis McFarland, a research scientist with the New York State Department of Health’s Wadsworth Center in Albany.2 Thankfully, many individuals affected by neurologic disorders such as amyotrophic lateral sclerosis (ALS), stroke-induced locked-in syndromes, or spinal-cord injury are still able to imagine movements, providing a basis for EEG-based devices to aid the movement-impaired. McFarland, Wadsworth’s Jonathan Wolpaw, and colleagues, for example, demonstrated the transformation of this brain signal to machine output by training subjects to control a virtual cursor displayed on a computer screen. Essentially, cursor velocity was derived in real time from changes in the EEG mu band, 8- to 13-Hz patterns of electrical activity that participants volitionally modulated by imagining up and down movements.3
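In the Wolpaw-McFarland paradigm, the translation from brain signal to cursor movement was essentially a linear equation applied many times per second. A minimal sketch of that idea (the function name and gain value here are hypothetical, not taken from the study):

```python
def cursor_velocity(mu_power, baseline_power, gain=20.0):
    """Translate mu-band (8-13 Hz) spectral power into vertical cursor
    velocity: power suppressed below the user's resting baseline (imagined
    movement) drives the cursor one way, power above baseline the other.
    Output is in arbitrary screen units per update."""
    return gain * (baseline_power - mu_power) / baseline_power

# Imagined movement suppresses the mu rhythm, moving the cursor up:
print(cursor_velocity(mu_power=0.5, baseline_power=1.0))  # → 10.0
# Relaxation restores the rhythm, moving the cursor down:
print(cursor_velocity(mu_power=1.5, baseline_power=1.0))  # → -10.0
```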
Working with healthy volunteers in the lab or in clinical trials involving patients with ALS or spinal-cord injury, EEG can enable people to control cursors and select letters of the alphabet for communication, among many other practical tasks. One major shortcoming of the approach, however, is the physical separation of the cortical source signal and the scalp-based recording electrodes. This distance is occupied by the scalp, skull, and membranes covering the brain, which attenuate the signal. In order for scalp-based electrodes to record measurable signals from the cortex, electrical potentials must be summated across an area of cortex of at least 6 cm². The spatial resolution of separable brain signals is thereby limited. Similarly, destructive averaging of temporally overlapping signals can cancel out higher frequencies associated with smaller cortical activations. In sum, while electroencephalography is easy to study and use, the limited signal resolution restricts EEG-based neuroprostheses to relatively simple control options.
Inside the brain

At the most invasive end of the spectrum, electrode arrays implanted directly into brain tissue, so-called intraparenchymal electrodes, record the firing of individual neurons. Beyond the ability to infer intentions about limb and hand movement from this neuronal activity, the signals picked up by intraparenchymal electrodes can be used to control a robotic limb. Research teams at the University of Pittsburgh, Stanford, Duke, and Brown have each performed real-time control of robotic arms based on the brain activity of macaque monkeys. By implanting electrode arrays into the animals’ motor cortex, for example, Andrew Schwartz of the University of Pittsburgh and colleagues found that the monkeys could manipulate a robotic arm well enough to feed themselves. By tapping into the motor cortex, the researchers recorded the primates’ movement intentions and used the information to control the movement of the robotic arm.4
In the past several years, this research has been translated to clinical trials led by Schwartz, Leigh Hochberg of Massachusetts General Hospital, Stanford’s Krishna Shenoy, and others. As in the monkey studies, the level of control achieved by humans has steadily increased in complexity and capability. The initial results showed that people fitted with intraparenchymal electrodes were capable of controlling simple three-dimensional movements; more recently, patients have proved able to drink from cups and pick up eggs.
The middle road
Another type of neural recording, known as electrocorticography (ECoG), measures brain activity from the surface of the cortex. Although invasive in nature, these signals may achieve the best balance of signal quality, durability, and reliability to enable a neuroprosthetic solution for the future. Because ECoG electrodes are placed directly on the cortical surface, the signals have excellent spatial and spectral resolution compared with EEG, which must pick up activity through the skull amid ambient muscle and environmental noise. And because the electrodes do not penetrate the brain tissue, there is little inflammatory or gliotic response to the implant. In addition to its mechanical and structural benefits, ECoG has a long history of use in clinical neurosurgery. These subdural electrode constructs have been employed for decades by neurosurgeons seeking to identify the location of seizures in the treatment of epilepsy.
In 2004, our group at Washington University in St. Louis demonstrated neuroprosthetic control by epilepsy patients undergoing invasive intracranial monitoring, which involved the implantation of electrodes on the surface of their frontal and temporal lobes to locate the source of their seizures. Piggybacking on the clinical procedure, we were able to directly measure human brain signals for neuroprosthetic application. The findings were dramatic: within minutes, patients acquired effective control of a computer cursor.
Using these ECoG techniques, we and other researchers have monitored brain activity during cognitive operations not possible in animal models, such as speech. In collaboration with Gerwin Schalk of the Wadsworth Center, we have demonstrated that it is possible to decode phonemes associated with real and imagined speech, and have shown that these speech-related signals can be used for simple device control.5 Instead of imagining a movement to control a cursor, patients simply imagine saying different phonemes to complete a two-choice task, such as choosing between two different highlighted targets. Patient control was nearly instantaneous, thanks in part to the fact that people routinely imagine talking to themselves and that the act of imagining word articulations comes very easily.
ECoG also has several signal-analysis advantages over noninvasive EEG, which has limited resolution for signals of 30–40 Hz. These higher-frequency rhythms, known as gamma rhythms, have been shown to carry substantial information for a number of cognitive operations, including the ability to decode specific elements of motor movements (such as which direction you want to move a joystick), auditory processing (reconstructing words being heard), and speech production (reconstructing the parts of words being spoken or imagined). With ECoG, all such cognitive operations have been used to demonstrate neuroprosthetic control, largely overcoming the limited resolution of EEG. While under my (Leuthardt’s) tutelage at Washington University, graduate student Charles Gaona identified different sub-bands in the high-gamma range. These sub-bands correlate with the performance of various cognitive tasks and could further expand the signal repertoire to decode human intentions.
In addition to providing finer frequency resolution, placing electrodes directly on the brain surface boosts spatial resolution, translating to improved discrimination of cognitive tasks, including individual finger movement (versus whole-hand movement) and phoneme articulation (versus general speaking).
Wei Wang of the University of Pittsburgh recently demonstrated the utility of an ECoG-based neuroprosthesis in a patient who had lost all motor control of his arms and legs after a spinal-cord injury. Wang’s group implanted a custom high-density ECoG array over the primary motor cortex, and, over the course of the four-week implant period, the patient rapidly progressed through a series of experiments demonstrating 2-D cursor, 3-D cursor, and, ultimately, 3-D robotic arm control.6 Such success points to a future in which neuroprosthetics provide new function and means of communication for diverse patient populations.
Outside the brain
Peripheral nerves outside of the brain and spinal cord offer additional attractive connection sites for neuroprosthetic devices. Because peripheral nerves are more easily exposed, devices that link to them carry less surgical risk, and they provide direct access to nerves involved in both sensory perception and motor intention while leaving the central nervous system untouched.
Penetrating electrode arrays inserted into the nerve itself offer highly selective access to individual nerve fibers. But, similar to intracranial depth electrodes, these penetrating electrodes are subject to fibrosis and signal degradation. An alternative intraneural design is the sieve, or regenerative, electrode, which provides a stable and highly specific interface without long-term signal decay, thanks to nerve regeneration through small holes in the array. However, unlike penetrating electrode arrays, sieve electrodes require transecting the nerve to position the electrode across the cut cross section and encourage regenerating axons to grow through the device.
Extraneural designs avoid penetrating individual nerves, instead placing electrodes or other types of conducting materials around the nerve. While less invasive, this comes at the cost of reduced selectivity for different fibers within the nerve, making such devices poorly suited for the precise stimulation required to achieve fine motor control. This type of approach could enable a simple hand contraction, for example, but would be unlikely to control individual muscle groups in the hands. These designs may also result in compression injury, restricted blood flow, and poor contact properties. Nevertheless, extraneural designs have shown promise for applications requiring stimulation of the whole nerve.
Researchers have also used electromyography (EMG) to monitor the electrical signature generated by muscle contractions, which serves as the source signal for device control. By monitoring the uninjured volitional control of muscles in the chest, neck, or shoulders, for example, scientists can convert the EMG activity into control signals for paralyzed or prosthetic limbs. In doing so, simple movements such as a shoulder shrug may be used to restore functional actions such as elbow flexing and hand grasping, or to directly control robotic prostheses.
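A common first step in such systems is to convert raw EMG into a smooth amplitude envelope and threshold it. This Python sketch (window length, threshold, and the synthetic data are illustrative assumptions) rectifies and smooths a signal, then maps contraction strength above threshold to a binary grasp command:

```python
import numpy as np

def emg_envelope(emg, fs, window_ms=150):
    """Rectify the raw EMG and smooth it with a moving average to obtain
    an amplitude envelope proportional to contraction strength."""
    n = max(1, int(fs * window_ms / 1000))
    return np.convolve(np.abs(emg), np.ones(n) / n, mode="same")

def grasp_command(envelope, threshold):
    """Map envelope amplitude to a binary command: a contraction above
    threshold (e.g., a shoulder shrug) closes the prosthetic hand."""
    return envelope > threshold

# Synthetic demo: one second of rest followed by one second of contraction.
fs = 1000
rng = np.random.default_rng(1)
rest = rng.normal(0, 0.05, fs)   # relaxed muscle: low-amplitude noise
burst = rng.normal(0, 1.0, fs)   # voluntary contraction: high amplitude
signal = np.concatenate([rest, burst])

env = emg_envelope(signal, fs)
cmd = grasp_command(env, threshold=0.3)
print(bool(cmd[500]), bool(cmd[1500]))  # → False True
```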
Peripheral neuroprosthetic devices can also help recover sensory and autonomic functions. Silvestro Micera’s group at the École Polytechnique Fédérale de Lausanne’s Center for Neuroprosthetics in Switzerland demonstrated the success of intraneural electrodes interfaced with the median and ulnar nerves for restoring sensory feedback to an individual who had lost an arm 10 years earlier. Motor control was achieved by translating EMG activity from muscles more proximal to the site of amputation. To add a sense of touch, the researchers first tested electrical stimuli on the skin of the opposite, intact limb across a range of sensations from light touch to pain, then used these measurements to “translate” pressure recordings from the attached neuroprosthetic device into electrical signals transmitted directly into the nerves through the electrodes. After some training, the user was able to accurately control the motor movements of a robotic arm and to receive sensory feedback from mechanical touch sensors.
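The "translation" step in such a bidirectional system can be pictured as a calibrated mapping from sensor reading to stimulation amplitude. The sketch below is purely illustrative: the function name, units, and ranges are hypothetical, and in the actual study the mapping was calibrated from patient-specific sensory testing rather than fixed constants.

```python
def touch_to_stimulation(pressure_n, p_max=10.0, amp_min=0.1, amp_max=1.0):
    """Clamp a fingertip pressure reading (newtons, hypothetical range) to
    the sensor's span, then map it linearly onto a stimulation amplitude
    between the user's sensory threshold and maximum comfortable level
    (arbitrary units)."""
    p = min(max(pressure_n, 0.0), p_max)
    return amp_min + (amp_max - amp_min) * p / p_max

print(touch_to_stimulation(0.0))   # → 0.1 (no touch: threshold amplitude)
print(touch_to_stimulation(25.0))  # → 1.0 (saturates at the comfort ceiling)
```

Clamping at both ends matters in practice: stimulation below sensory threshold is imperceptible, while stimulation above the comfort ceiling can be painful.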
Currently, neural engineering is pushing the frontier of neuroprosthetics. As research yields additional insights into how neurons in the brain and peripheral nerves underpin human intention and perception, new ways to effectively interface with the human nervous system will emerge. This evolution of technical and clinical capability will involve numerous disciplines, including basic neuroscience, engineering, computer science, and neurosurgery. It is an exciting time for neural engineers, who stand poised to significantly impact patients who have suffered neurologic injury.
Eric C. Leuthardt is an associate professor at Washington University School of Medicine in St. Louis, where he directs the Center for Innovation in Neuroscience and Technology. Leuthardt is also author of the thriller RedDevil 4. Jarod L. Roland is a fourth-year neurosurgery resident and Wilson Z. Ray is an assistant professor of neurosurgery at Washington University in St. Louis.
- J.J. Vidal, “Toward direct brain-computer communication,” Ann Rev Biophys Bioeng, 2:157-80, 1973.
- D.J. McFarland et al., “Mu and beta rhythm topographies during motor imagery and actual movements,” Brain Topogr, 12:177-86, 2000.
- J.R. Wolpaw et al., “An EEG-based brain-computer interface for cursor control,” Electroencephalogr Clin Neurophysiol, 78:252-59, 1991.
- M. Velliste et al., “Cortical control of a prosthetic arm for self-feeding,” Nature, 453:1098-1101, 2008.
- E.C. Leuthardt et al., “Using the electrocorticographic speech network to control a brain-computer interface in humans,” J Neural Eng, 8:036004, 2011.
- W. Wang et al., “An electrocorticographic brain interface in an individual with tetraplegia,” PLOS ONE, 8:e55344, 2013.