
Imagine controlling a computer with just your mind. It sounds like a frivolous and futuristic convenience, but such technology could provide disabled or "locked in" patients the ability to communicate and gain control over their environments. A number of companies and researchers are developing these so-called brain-machine interfaces (BMIs, also called brain-computer interfaces, or BCIs), and though the technology is in its infancy, progress has been made.

Last December, Foxborough, Mass.-based Cyberkinetics Neurotechnology Systems announced that its BrainGate implantable electrode array enabled a patient with quadriplegia to control a computer and television set using only his thoughts. The 24-year-old man, the first of five patients approved by the FDA to test BrainGate in clinical trials, can also read e-mail and play video games using the device.

Others are using similar technologies to understand how brain signals control movement or respond to internal and external stimuli. Such experiments have the...

NEURAL IMPLANTS

Cyberkinetics' 16-mm² BrainGate chip, based on technology developed by Richard Normann at the University of Utah, contains 100 needle-like silicon electrodes, which are inserted into the motor cortex and connected to wires that exit the brain through the scalp. Signals picked up by the electrodes are amplified and sent to an analog-to-digital converter, turning the simultaneous firings of 100 individual neurons into a stream of digital data, says Cyberkinetics CEO Tim Surgenor. When the subject thinks about moving a limb in a particular direction, the system translates the resulting signal into an actual motion – in this case, the movement of a cursor on a computer screen.
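The article doesn't spell out the decoding math, but the pipeline Surgenor describes (binned spike counts in, cursor motion out) is commonly realized as a linear decoder fit during a calibration session. Here is a minimal Python sketch of that idea, with simulated data and an assumed 100-ms bin width; it is not Cyberkinetics' actual algorithm:

```python
import numpy as np

N_CHANNELS = 100          # electrodes in the array
BIN_MS = 100              # spike-count bin width (an assumed value)

rng = np.random.default_rng(0)

# Simulated calibration data: firing rates X (trials x channels) and the
# cursor velocities Y (trials x 2) the patient was asked to imagine.
X = rng.poisson(lam=5.0, size=(500, N_CHANNELS)).astype(float)
Y = rng.normal(size=(500, 2))

# Fit a linear decoder W by ordinary least squares: Y ~ X @ W.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

def decode_velocity(spike_counts: np.ndarray) -> np.ndarray:
    """Map one bin of spike counts (length 100) to a 2-D cursor velocity."""
    return spike_counts @ W

new_bin = rng.poisson(lam=5.0, size=N_CHANNELS).astype(float)
vx, vy = decode_velocity(new_bin)
print(f"cursor velocity: ({vx:.3f}, {vy:.3f})")
```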

Looking ahead, the company is working to provide patients with still greater control. "One milestone for us might be to learn how to interpret five or six hand gestures that a person might make with their hand as a 'button,'" says Surgenor. At Duke University in Durham, N.C., Miguel Nicolelis' laboratory has demonstrated that macaque monkeys bearing implants to the parietal and primary motor cortices can learn, via visual feedback, to make a robotic arm reach, grasp, and grip.1

But the monkeys' control is imperfect; sometimes when monkeys attempt to grasp an object with a robotic arm, they knock it over instead. Mandayam Srinivasan, who collaborates with Nicolelis, attributes this to the relatively small number of neurons sampled by the device. So Srinivasan, director of the Touch Lab at the Massachusetts Institute of Technology, is working on an algorithm called "continuous shared control," in which the robots contain tactile sensors that provide local intelligence – akin to a cat's whiskers. Controlling the robot via a combination of local intelligence and monkey brainpower, says Srinivasan, "seems to make the tasks a lot more reliable than having solely brain-based control."
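Srinivasan's "continuous shared control" can be pictured as a weighted blend of the brain-decoded command and a locally sensed correction. The sketch below is a toy illustration of that blending; the weight alpha and the sensor values are assumptions for illustration, not numbers from the published work:

```python
def shared_control(brain_cmd, sensor_correction, alpha=0.7):
    """Blend brain-derived and sensor-derived commands.

    alpha = 1.0 -> pure brain control; alpha = 0.0 -> pure local reflex.
    """
    return tuple(alpha * b + (1.0 - alpha) * s
                 for b, s in zip(brain_cmd, sensor_correction))

# The decoded intent says "move right and down fast," but the gripper's
# tactile sensors detect the object slipping and suggest easing off.
brain_cmd = (1.0, -0.8)          # decoded (vx, vy)
sensor_correction = (0.2, 0.0)   # local "whisker" reflex
print(shared_control(brain_cmd, sensor_correction))  # (0.76, -0.56)
```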

Cyberkinetics cofounder John Donoghue, a neurobiologist at Brown University, has even more ambitious plans. He would like to combine BrainGate with functional electrical stimulation (FES), in which partially paralyzed patients achieve some control of paralyzed limbs via stimulating electrodes placed in the muscles of those limbs. Current FES systems rely on external signals, Donoghue says, "but now imagine if we could hook up the sensor directly to this FES system. Now by thought alone these people could be controlling their arm muscles."

NONINVASIVE ALTERNATIVES

Donoghue's enthusiasm notwithstanding, not every neuroscientist favors implantable devices. "There seems to have been in the media and even among scientists to some extent recently an infatuation with sticking things into the brain, which is interesting and understandable unless you take into account whose brain it is. If it's your own brain it may not be quite as desirable," says neurologist Jonathan Wolpaw of the Wadsworth Center of the New York State Department of Health.

Wolpaw, one of a growing number of researchers developing noninvasive alternatives to neural implants, recently published evidence that human subjects, both uninjured and spinal-injured, could move a cursor on a computer screen by controlling their own electroencephalogram (EEG) patterns, specifically the EEG features known as mu and beta rhythms.2
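In rough terms, such a system estimates the power of the mu (about 8-12 Hz) and beta (about 18-26 Hz) bands in the ongoing EEG and lets the user drive the cursor by modulating them. A hedged sketch, with an assumed sampling rate and band edges, using a standard Welch power-spectrum estimate:

```python
import numpy as np
from scipy.signal import welch

FS = 256  # sampling rate in Hz (assumed)

def band_power(eeg: np.ndarray, lo: float, hi: float) -> float:
    """Average spectral power of `eeg` between lo and hi Hz (Welch PSD)."""
    freqs, psd = welch(eeg, fs=FS, nperseg=FS)
    mask = (freqs >= lo) & (freqs <= hi)
    return float(psd[mask].mean())

rng = np.random.default_rng(1)
eeg = rng.normal(size=2 * FS)  # 2 s of simulated single-channel EEG

mu = band_power(eeg, 8, 12)
beta = band_power(eeg, 18, 26)
# In a BCI, the cursor would move when mu power crosses a calibrated
# threshold that the user learns to modulate.
print("mu power:", mu, "beta power:", beta)
```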

Wolpaw notes that the level of control displayed in this experiment far surpasses that of a similar 1994 study.3 He attributes this advance in part to improvements in signal processing, but mostly to a new adaptive algorithm that adjusts the EEG features on which it focuses based on the user's past performance.
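One simple way to realize such an adaptive algorithm is a least-mean-squares update that re-weights each EEG feature according to how far the produced cursor movement fell from the intended one. The following is only an illustrative stand-in for the Wadsworth group's actual method; the features, learning rate, and simulated trials are all assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
w = np.zeros(2)                        # weights for [mu, beta] band power

def lms_update(w, features, produced, intended, lr=0.05):
    """Nudge the feature weights so the produced move tracks the intent."""
    return w + lr * (intended - produced) * features

for _ in range(100):                   # simulated practice trials
    features = rng.normal(size=2) + np.array([1.0, 0.5])
    produced = float(w @ features)     # cursor displacement this trial
    intended = 1.0                     # the user meant "move up one step"
    w = lms_update(w, features, produced, intended)

print("adapted feature weights:", w)
```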

MONKEY SEE, ROBOT DO: Experimental setup of Miguel Nicolelis' closed-loop BMI. Using visual feedback, monkeys learned to control a robot arm without moving their hands. Other researchers are adding tactile feedback to the system to improve robot dexterity. (Courtesy of PLOS Biology)

While at the University of Illinois in the 1980s, Emanuel Donchin, now professor of psychology at the University of South Florida, developed another noninvasive alternative to neural implants, based on the P300 component of event-related brain potentials. P300s are elicited by stimuli that are rare or otherwise meaningful to the subject – for example, seeing his or her own name on a list of 100 unfamiliar names. Donchin and then-student Larry Farwell demonstrated that the P300 can be harnessed to help a BCI to identify which of 36 characters and symbols displayed on a screen a disabled user has selected.4
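The selection logic of such a speller is easy to state: the 36 symbols sit in a 6 x 6 grid, rows and columns flash in random order, and the attended row and column evoke the largest averaged P300; their intersection identifies the symbol. A sketch with simulated ERP scores (a real system would average EEG epochs following each flash):

```python
import numpy as np

# 6x6 grid of the 36 selectable characters and symbols.
GRID = np.array(list("ABCDEFGHIJKLMNOPQRSTUVWXYZ1234567890")).reshape(6, 6)

rng = np.random.default_rng(3)
target_row, target_col = 2, 4          # user is attending to "Q"

# Simulated average P300 amplitude per row/column flash: the attended
# row and column show a larger averaged response.
row_scores = rng.normal(1.0, 0.3, size=6)
col_scores = rng.normal(1.0, 0.3, size=6)
row_scores[target_row] += 2.0          # attended flashes evoke a P300
col_scores[target_col] += 2.0

decoded = GRID[row_scores.argmax(), col_scores.argmax()]
print("decoded symbol:", decoded)      # -> Q
```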

Patients won't be using these systems to type their memoirs anytime soon. According to a 2000 study, Donchin's system generates just 7.8 characters per minute with 80% accuracy.5 And, like their implantable counterparts, EEG-based methods can be cumbersome, so widespread adoption will likely require wireless technologies.
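For scale, that selection rate can be converted into an information transfer rate using the formula standard in the BCI literature (assuming errors are spread evenly over the other symbols): 36 symbols at 80% accuracy and 7.8 selections per minute works out to roughly 27 bits per minute.

```python
from math import log2

# Bits per selection for N equally likely symbols chosen with accuracy P,
# errors spread evenly over the remaining N-1 symbols. This is the
# standard BCI bit-rate formula, not a figure from the article itself.
N, P, rate = 36, 0.80, 7.8  # symbols, accuracy, selections per minute

bits = log2(N) + P * log2(P) + (1 - P) * log2((1 - P) / (N - 1))
print(f"{bits:.2f} bits/selection")       # ~3.42
print(f"{bits * rate:.1f} bits/minute")   # ~26.7
```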

"A lot of the patients we're working with ... don't necessarily want to have an unsightly electrode cap on their head when they go out in public," says biomedical engineer Dawn Taylor of Case Western Reserve University in Cleveland. Her work includes both invasive and noninvasive methods. (Donchin notes that his P300 BCI can be implemented in a device the size of a Palm Pilot).

Proponents of invasive technologies argue that noninvasive scalp recordings are surrogates that do not actually read motion intent from neurons. "You can't record or get your movement intentions from recording from the scalp; what you get is the overall brain state," says Brown's Donoghue. "The implantable technology actually goes after and records the brain cells and the very information that's related to what you want to do."

Wolpaw counters that neither method directly taps into the neuronal activity associated with an actual movement. That is, they do not "read minds" but instead allow the mind to control something external to it. "Once you make that activity directly responsible for an artificial output, performance changes. The brain adapts."

Researchers familiar with both technologies say scalp recordings may ultimately be the more limited of the two. "I think improvements in the decoding algorithms and the adaptive-training algorithms are going to continue, but in the long run, it's likely that the intracortical electrodes that get individual action potential from multiple cells will most likely be able to relay more complex information, and be used for more complex tasks," says Taylor.

TOWARD BIMODAL CHIPS

At the moment, BMI technologies in use in intact brains generally operate in one of two modes, says neurobiologist Naweed Syed of the University of Calgary: "Either you are able to stimulate brain cells or to record their collective activities." Syed is one of a group of researchers developing technologies that can do both.

Syed has grown snail neurons on a silicon chip and induced them to form a network whose activity was manipulated at the single-cell level through the chip.6 A capacitor located underneath one cell induces activity in that cell, while a transistor attached to the chip detects electrical activity in a second cell, providing evidence that the chip can "talk" to the cell and vice versa. "That is a critical step for a successful interfacing of electronic devices with the brain cells, whereby the machine talks to the brain and the brain can talk back to the machine," he says.
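Conceptually, the chip closes a loop: a capacitor pulse under one cell can trigger a spike, and a transistor under a second cell reports whether the signal crossed the network. The toy simulation below illustrates only that stimulate-propagate-record loop; the threshold and synaptic-reliability numbers are invented:

```python
import random

def stimulate_cell_a(voltage: float) -> bool:
    """Capacitor pulse: cell A fires if the pulse crosses its threshold."""
    return voltage > 0.5

def record_cell_b(a_fired: bool) -> bool:
    """Transistor readout: cell B fires when A's spike crosses the synapse
    (with some assumed synaptic unreliability)."""
    return a_fired and random.random() < 0.9

random.seed(0)
for pulse in (0.2, 0.8):
    a = stimulate_cell_a(pulse)
    b = record_cell_b(a)
    print(f"pulse={pulse}: cell A fired={a}, chip recorded B firing={b}")
```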

Syed currently is developing chips interfaced with 10,000 to 16,000 neurons that can be used to study how various stimuli affect neural circuits. For example, he wants to understand how large numbers of interconnected neurons process information, and how the network responds to drugs or conditions that elicit an epileptic seizure. He cautions, though, that in its present state the technology cannot be used to interface an intact brain with a prosthetic device, as it requires a clean seal between chip and neuron. Use in human patients will require chips that "float" and do not require direct contact with brain cells, and that can be accessed wirelessly.

At the University of Florida, biomedical engineer Thomas DeMarse cultures rat neurons on 60-channel multielectrode arrays that, like Syed's, can be used both to record from and stimulate the cells. DeMarse recently garnered considerable media attention by interfacing his system with an F-22 fighter jet simulator and "training" the cells to manipulate the plane.

To turn a Petri dish into a pilot, he translated the plane's pitch and roll angles into high- and low-frequency stimulation pulses. For instance, suppose the aircraft's nose is pointing down. High-frequency pulses are sent to the neurons, driving their firing rate above a baseline level and pitching the nose back up. Should the control surface move too far, low-frequency pulses are added to nudge the firing rate, and the plane, back down.
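Put as a loop, the controller reads the plane's attitude, encodes the error as stimulation frequency, and feeds the culture's firing rate back to the control surface. All gains, the baseline rate, and the toy "plasticity" rule below are invented for illustration; the article does not publish DeMarse's actual mapping:

```python
BASELINE_HZ = 5.0     # assumed baseline firing rate of the culture

def encode_stimulation(pitch_error: float) -> str:
    """Nose down (negative error) -> high-frequency pulses; overshoot the
    other way -> low-frequency pulses."""
    return "high" if pitch_error < 0 else "low"

def culture_response(stim: str, rate: float) -> float:
    """Toy rule: high-frequency stimulation raises the firing rate above
    baseline, low-frequency stimulation lowers it."""
    return rate + (1.0 if stim == "high" else -1.0)

def control_surface(rate: float) -> float:
    """Firing rate above baseline pitches the nose up, below pitches down."""
    return 0.2 * (rate - BASELINE_HZ)

pitch, rate = -10.0, BASELINE_HZ          # start with the nose down
for step in range(8):
    stim = encode_stimulation(pitch)
    rate = culture_response(stim, rate)
    pitch += control_surface(rate)
    print(f"step {step}: stim={stim:4s} rate={rate:.1f} pitch={pitch:+.2f}")
```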

NEURONS IN THE COCKPIT: Thomas DeMarse holds his "brain-in-a-dish," a collection of approximately 25,000 rat cortical neurons cultured on a 60-channel multielectrode array. In one breakthrough application, the cells were trained to pilot a flight simulator. (Courtesy of David Blankenship)

The point of this experiment is not to teach cells to fly, of course, but to learn something about neuronal networks. Using a combination of microscopy and direct recordings, DeMarse can study how the neurons perform computations and change connectivity when stimulated. DeMarse notes that the arrays, though not a whole brain, closely approximate the connectivity between neurons in the intact brain and allow researchers to study networks at higher spatial and temporal resolution than functional magnetic resonance imaging, for example. "The arrays that we use are sort of a compromise where you can look at, in great detail and in real time, what's happening within the neural networks in terms of the individual components," he says.

Now, in collaboration with University of Florida neurologist Paul Carney, DeMarse is investigating neurophysiological network changes leading to epilepsy in live rats. The team combines microelectrode recordings from the hippocampi of live rats with recordings from cultured neurons, which provide histological data and give better resolution. DeMarse says he and Carney hope eventually to identify abnormal patterns of brain activity that lead to an epileptic episode, and to stimulate neurons to revert to normal patterns.

At the University of Washington in Seattle, Tom Daniel's team implants small computer chips in the hawk moth, Manduca sexta, to study how the insect's nervous system controls flight. "What we've learned specifically in the case of hawk moths is that they need a considerable amount of sensory information to regulate flight, but they do it with lots of different modes, both mechanical sensors and visual sensors," explains Daniel.

Like DeMarse's sensors, Daniel's chips can both record and stimulate neural responses. "We can actually send signals to some of the nerve bundles to cause the muscles to contract ... [and] monitor the signals that the brain sends to the wings [and other areas], to understand how the animal responds to those stimuli," says computer scientist Chris Diorio, who collaborates with Daniel.

Ultimately, of course, scientists and companies looking to aid disabled patients must find ways to translate these clever proofs of concept into practical, real-world applications. In their current configuration, implantable technologies such as BrainGate will likely never achieve widespread use. For one thing, they are unwieldy. BrainGate, says Donoghue, is "a fairly substantial cart with computers and signal processors." He and Surgenor aim to make the device fully implantable and wireless.

Neural implants also require major surgery. Cyberkinetics' chip is implanted during a three-hour procedure. A competing but similar product from Atlanta-based Neural Signals, called the Brain Communicator, requires 10 hours, according to the company's Web site.

Though consensus appears to favor implantable devices, economics (i.e., whether insurance companies will cover the cost of implantation) could dictate whether they or noninvasive devices become dominant. Says Donchin: "All the science in the world is not going to change the fact that if somebody has developed a wonderful HIV drug, it is going to be of no use to the sick poor in Nigeria. And the same is going to be true for paralyzed ALS patients who do not have the thousands of dollars to pay for an invasive procedure for implanting a prosthesis."
