Tapping the Mind

Science magazine, 24 January 2003

Amyotrophic lateral sclerosis (ALS) can trap the mind inside an immobile body. It destroys the nerves that control muscles, eventually leaving patients without the ability to speak or even flick their eyes to one side. In the past few years, however, researchers have started to equip a few such “locked-in” patients, including those paralyzed by stroke or other diseases, with communication devices that unlock their minds.

For decades, science-fiction writers have envisioned computers that communicate directly with the brain. Now a rapidly expanding cadre of researchers is making that vision a reality. A few laboratories started developing these so-called brain-computer interfaces (BCIs) in the 1980s and have been refining them since then (Science, 29 October 1999, p. 888). Today several dozen teams have entered the field. Together they’re improving upon early BCI models and coming up with new ways to read brain signals. According to BCI pioneer Jonathan Wolpaw of the Wadsworth Center, part of the New York State Department of Health in Albany, “it’s a very exciting time; a lot of people are getting involved.”

Most BCIs read brain waves, the electrical impulses created by neural activity that can be detected—albeit fuzzily—through the scalp. By diligently controlling their mental activity, patients can choose letters to spell words, guide a cursor, or direct crude robots. But a rival circle of scientists is rapidly advancing a type of BCI that is implanted inside the brain. Such devices tap into the more detailed neural signals relayed by individual neurons. The most sophisticated of these implanted BCIs have recently enabled monkeys to play video games and even manipulate robotic arms. Whether the implanted devices will actually lead to more versatile and workable BCIs than the external type is a matter of fierce debate.

In the past few years, brain-wave BCI technologies have been advancing rapidly, providing faster spelling, better cursor control, and headway into prosthetics, environmental-control devices, and smart wheelchairs. The advances are fueled in part by cheaper and more sophisticated computer hardware and software, which have given BCI researchers access to portable machines that perform complex mathematical manipulations on the fly. Good old-fashioned funding helps, too: The National Institutes of Health awarded $3.3 million in late 2002 to a partnership headed by the Wadsworth group to further develop software that can test several BCI systems to see which is best for a patient. Researchers can also use the software to build and test their own brain-tapping technologies.

In addition, the Defense Advanced Research Projects Agency (DARPA) recently awarded a Duke University research team $26 million to improve its implanted BCI technique. A DARPA spokesperson says the agency is interested in technology that might, for example, enable soldiers to push buttons with their brains, giving them speedier control of submarines and aircraft and enabling them to more adeptly manipulate robotic arms that move munitions.

So far, very few patients have had access to a BCI. A few scattered labs have conducted tests on one or more severely disabled patients, and one research team has tried its BCI technology on a record 11 patients. Most of the experimental subjects tested during the development of various BCIs have been healthy individuals. However, advancing technology and software innovations have put BCIs on the cusp of becoming more widely available.

Developing an array of useful applications, in particular, is critical to bringing BCIs past the experimental stage and into regular use in people’s homes—the field’s next big challenge. “A lot of good work has been done in the past 15 years to build a foundation” for BCIs, says computer scientist Melody Moore of Georgia State University in Atlanta. “Now, it’s time to build the house.”

Tens of thousands stand to benefit from BCI technologies. Initial beneficiaries will be people who are almost totally paralyzed: some ALS patients, who number 30,000 in the United States alone; people with severe forms of cerebral palsy; and patients who have suffered severe strokes or accidents, among others. As BCI technology improves, it is expected to become useful to people who are less severely disabled, such as quadriplegics who want to operate a wheelchair or a robot.

Surfing brain waves

In the brain, billions of neurons are continuously sucking in and spewing out ions, creating tiny electrical currents. A detector called an electroencephalograph (EEG) can measure the sum of these subtle sparks, millions at a time, by means of electrodes affixed to the scalp.

Niels Birbaumer, a psychologist at the University of Tübingen in Germany, was one of the first to find that people can control certain brain waves. He focused on so-called slow cortical potentials: gradual voltage shifts that arise in the brain’s outer layer, the cerebral cortex, and unfold over seconds. In the early 1990s, Birbaumer and his team created a speller that patients learn to control using positive or negative slow waves to choose between two banks of letters. Once selected, a bank splits in two, continuing a process of elimination to reveal the wished-for letter. In March 1999, Birbaumer and his colleagues reported that after 2 months of training for about an hour a day, two ALS patients on respirators learned to write messages at a rate of about two characters per minute.
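For readers who think in code, the selection scheme is essentially a binary search over the alphabet. Here is a minimal sketch in Python; the signal-reading step is a stand-in for Birbaumer's actual slow-wave processing, and the function names are illustrative:

```python
# Sketch of the split-and-choose speller logic described above.
# Each round, the user's slow cortical potential (reduced elsewhere to
# +1 for a positive shift, -1 for a negative shift) picks one of two
# banks of letters; the chosen bank is split again until one remains.

def select_letter(alphabet, read_slow_wave):
    """Narrow a bank of letters to a single character by repeated halving.

    read_slow_wave() stands in for the real signal-processing step:
    it should return +1 (positive potential) or -1 (negative potential).
    """
    bank = list(alphabet)
    while len(bank) > 1:
        mid = len(bank) // 2
        bank = bank[:mid] if read_slow_wave() > 0 else bank[mid:]
    return bank[0]

# Example with a scripted sequence of "brain responses":
responses = iter([+1, -1, +1, -1, +1])
print(select_letter("ABCDEFGHIJKLMNOPQRSTUVWXYZ", lambda: next(responses)))
```

Five such binary choices suffice to pin down one of 26 letters, so at several seconds per slow-wave response, a rate of roughly two characters per minute is about what the arithmetic predicts.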

Wolpaw trained his eye on another set of brain currents: mu waves, EEG rhythms with frequencies between 8 and 12 hertz, and beta rhythms, which oscillate at roughly double that frequency. Both emanate from the part of the brain’s surface that mediates sensation and movement, the sensorimotor cortex. Wolpaw and his colleagues, including Wadsworth psychologist Dennis McFarland and program coordinator Theresa Vaughan, developed a system that enables a person to move a cursor up or down by raising or lowering the amplitude of a mu or beta rhythm. Usually a person first learns to do this by imagining moving a hand or other body part up or down. Healthy subjects, the team reported in 1994, could use mu and beta rhythms to direct a cursor—somewhat crudely, but with up to 70% accuracy—in two dimensions to one of four large targets at the corners of a computer screen.
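In signal-processing terms, the control signal is the power of the mu (or beta) band, mapped onto cursor movement. A minimal sketch, assuming a single scalp electrode and illustrative gain and baseline values; the Wadsworth group's actual spectral estimation and per-user tuning are more sophisticated:

```python
import numpy as np

def band_power(eeg_window, fs, lo=8.0, hi=12.0):
    """Power in the mu band (8-12 Hz) of one EEG window, via a plain FFT.

    eeg_window: 1-D array of voltage samples from a sensorimotor electrode.
    fs: sampling rate in Hz. The band edges follow the mu-rhythm
    definition above; widen or shift them for a beta band.
    """
    spectrum = np.abs(np.fft.rfft(eeg_window)) ** 2
    freqs = np.fft.rfftfreq(len(eeg_window), d=1.0 / fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return spectrum[mask].sum()

def cursor_step(eeg_window, fs, baseline, gain=0.01):
    """Translate mu-band power above or below a per-user baseline into
    an up or down cursor displacement. baseline and gain are
    illustrative; a real system fits them to each user."""
    return gain * (band_power(eeg_window, fs) - baseline)
```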

Since then, the Wadsworth team has improved its techniques for homing in on the desired EEG frequencies, translating those signals into cursor movements, and tuning the BCI to individual users. These advances have given users much more precise control of the cursor. In work now in press, McFarland, Wolpaw, and their colleagues show that college students can use the BCI to nudge a cursor a precise distance up or down to land on one of four icons, a step toward the goal of a brain-wave mouse. “Once you can control a mouse, the whole world of software opens up to you,” Wolpaw says. Wadsworth neuroscientist Irina Goncharova is now leading an effort to test this BCI on patients with mild or moderate ALS at Drexel University Hospital in Philadelphia.

But both Wolpaw’s and Birbaumer’s techniques require weeks to months of training before users can reliably control their brain waves. In contrast, a BCI control technique developed by psychologist Emanuel Donchin, now at the University of South Florida in Tampa, requires almost no training. In the mid-1980s, Donchin and his graduate student Larry Farwell, then at the University of Illinois, Urbana-Champaign, based their prototype BCI on the so-called P300 wave, a brief voltage increase that peaks about 300 milliseconds after the onset of certain surprising or unexpected events.

Donchin and Farwell devised a grid containing the letters of the alphabet and typing functions such as space and backspace, arranged in rows and columns that flash randomly on the screen. A person focuses on a letter in the grid and mentally notes “that’s it!” whenever the row or column containing the letter lights up. That happens on only about 1 of every 6 flashes, making the event somewhat surprising and therefore likely to elicit a P300 wave. The computer then identifies the letter by finding the intersection of the row and column that produced the wave.
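Because any single flash is noisy, real systems repeat the flashes and average the evidence before committing to a letter. A hedged sketch of that tallying step; the grid contents below are illustrative (the actual Farwell-Donchin grid also carried typing functions):

```python
import numpy as np

# Illustrative 6-by-6 grid of selectable characters.
GRID = [list("ABCDEF"), list("GHIJKL"), list("MNOPQR"),
        list("STUVWX"), list("YZ1234"), list("56789_")]

def decode_p300(scores):
    """Pick the attended letter from P300 evidence gathered per flash.

    scores: dict mapping ("row", i) or ("col", j) to a list of P300
    detector outputs, one per flash of that row or column. Averaging
    over repeated flashes suppresses noise; the letter sits at the
    intersection of the strongest row and strongest column.
    """
    best_row = max(range(6), key=lambda i: np.mean(scores[("row", i)]))
    best_col = max(range(6), key=lambda j: np.mean(scores[("col", j)]))
    return GRID[best_row][best_col]
```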

Recent tests on college students indicate that the system could be used to “type” nearly eight characters per minute with 80% accuracy. Donchin and his team are now starting to test their BCI with severely disabled patients.

Another BCI gives users split-second control over a mobile robot. Instead of analyzing a particular EEG component, such as mu rhythms or slow waves, José del R. Millán at the Dalle Molle Institute for Perceptual Artificial Intelligence in Martigny, Switzerland, developed a BCI that analyzes overall EEG signals at eight scalp locations. It relies on the fact that vastly different mental tasks produce distinct EEG patterns. Using a neural network algorithm, a computer learns to distinguish among three such thoughts—say, mental arithmetic, visualizing a spinning cube, or imagining arm movements—and is programmed to perform a specific command based on the mental pattern it detects.
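A rough sketch of that train-then-dispatch loop, using a generic off-the-shelf neural network as a stand-in for Millán's classifier; the features, labels, and command names are all illustrative, and since the article describes four robot behaviors driven by three thoughts, the mapping here is a simplification:

```python
from sklearn.neural_network import MLPClassifier

# Each recognized mental task maps to one robot command (illustrative).
COMMANDS = {0: "forward", 1: "turn_left", 2: "turn_right"}

def train_classifier(X_train, y_train):
    """X_train: (n_trials, n_features) EEG feature vectors, e.g. band
    power at the eight scalp sites; y_train: task labels 0, 1, or 2."""
    clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000)
    clf.fit(X_train, y_train)
    return clf

def issue_command(clf, features, robot):
    """Called twice a second, as in the article: classify the current
    EEG features and dispatch the matching command to the robot."""
    task = int(clf.predict([features])[0])
    getattr(robot, COMMANDS[task])()  # e.g., robot.forward()
```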

In unpublished work this past spring, two healthy individuals learned to use Millán’s BCI to manipulate a pocket-sized wheeled robot, a stand-in for a smart wheelchair. They could make it scoot forward, turn right, turn left, or stop, and thus were able to direct it through a model house with surprising speed. “The striking finding is that subjects can do this with brain control in only 35% more time than it would take if they were simply pressing a key,” Millán says. Millán programmed his BCI to issue commands twice a second, so users can make decisions about where to go on the fly and, say, avoid overshooting an entryway.

Computer communication

One big drawback of these technologies is that they are incompatible, making them difficult to combine and slowing their development. Most of the systems started out so inflexible that adding a new feature was agonizingly difficult. For instance, Wolpaw and his team, including software engineer Gerwin Schalk, found an EEG signal that appeared when a person made a mistake while using their BCI, a discovery that could be used to design a quick “erase” option for their system. But adding such an option would have meant extensive reprogramming.

So in February 2000, Wolpaw, Schalk, and McFarland teamed up with Birbaumer and Tübingen software engineer Thilo Hinterberger to build BCI2000, a flexible, universal BCI platform on which various brain waves could be selected and new applications could be easily built. They share it widely, making it easier for newcomers to enter the field. Ultimately, they hope that nurses or family caregivers will use the Windows-based system with minimal training, eliminating the need for scarce experts to accompany BCI software. “We need a worldwide user-friendly system that a reasonably intelligent person can download from the Internet for free,” Birbaumer says.

The basic framework for BCI2000 is now complete, and Wolpaw expects details to be published soon. It has four easily adaptable modules that handle the four essential functions of a BCI. One takes the raw brain signal, amplifies it, and encodes it digitally. Another extracts the desired features of the brain signal, such as a mu rhythm or P300 signal, and translates that signal into a command, such as movement of a cursor in a certain direction. The third controls a device, say, one that navigates the Internet or operates a prosthetic arm. And the fourth allows a user to start and stop the BCI and to specify details, such as the speed, of its operation.
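The division of labor can be pictured as a four-stage pipeline. The skeleton below mirrors that modular design in spirit only; the real BCI2000 interfaces are defined by its authors and are not reproduced here:

```python
# A schematic of the four-module division of labor described above.

class SignalAcquisition:
    """Module 1: take the raw brain signal, amplify it, digitize it."""
    def read(self):
        raise NotImplementedError  # e.g., return one block of EEG samples

class SignalProcessing:
    """Module 2: extract features (mu rhythm, P300, ...) and translate
    them into a device command."""
    def translate(self, raw):
        raise NotImplementedError  # e.g., mu-band power -> cursor delta

class Application:
    """Module 3: drive a device, such as a speller, browser, or prosthesis."""
    def act(self, command):
        raise NotImplementedError

class Operator:
    """Module 4: let the user start, stop, and configure the system."""
    running = True

def run_bci(source, processor, app, operator):
    while operator.running:
        app.act(processor.translate(source.read()))
```

Swapping in a new brain signal or a new application then means replacing one module rather than reprogramming the whole system.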

The software has already had an impact on the field. In spring 2002, South Florida’s Donchin needed to upgrade his BCI, which was incompatible with state-of-the-art PCs, so he could start testing disabled patients. Donchin met with Wolpaw and Schalk at a BCI conference in Rensselaerville, New York, in June 2002 and described his predicament. Schalk volunteered to do the necessary programming on BCI2000. Within 2 weeks, Schalk managed to get Donchin’s BCI up and running, enabling Donchin to bring it to New York City to test it on his first patient (this author’s father) in September.

Georgia State’s Moore and her colleagues are using BCI2000 to develop an environmental-control system that allows a user to turn on and off lights, a television set, and a radio with brain waves. They’ve also built a communication system in which a person can select words from a list of nouns, verbs, and objects, and that will predict words and even conversations, potentially providing faster communication than traditional spellers allow. Their Web browser causes a cursor to hop from one Web link to the next in response to altered brain signals.

So far these prototype applications have been tested largely on simulated brain signals, but Moore has just started testing them on healthy volunteers and will soon include patients with spinal cord injuries or early-stage ALS. Eventually, Moore plans to run all of these applications on a laptop mounted on a smart wheelchair, now under development, that will also be controlled by brain waves.

Direct line from the brain

Many researchers believe that BCIs that rely on fuzzy brain-wave signals are of limited value. These devices listen in on the accumulated hums of millions of neurons after they merge and pass through the skull, akin to listening to a crowd in a baseball stadium from the parking lot. EEG-based BCIs “do not extract the actual information in our brains—for example, our concept of a word,” says neuroscientist John Chapin of the State University of New York Health Sciences Center in Brooklyn.

In contrast, a relatively new generation of BCI researchers implants electrodes inside the brain to pick up the chatter of single neurons, something like eavesdropping on the conversation of a couple inside the stadium from a few seats away. These signals, Chapin and others contend, constitute the actual brain code for movement and thought.

Neurologist Philip Kennedy, head of Atlanta-based Neural Signals, and his neurosurgeon colleagues Roy Bakay and Princewill Ehirim are so far the only team to create a BCI with electrodes implanted in a human. Their most successful patient, a Georgia drywall contractor named Johnny Ray who had suffered a massive stroke that left him almost totally paralyzed, learned to tune his neural signals to operate a cursor, enabling him to spell and hit icons for statements such as “I’m hungry.” (Ray, whom Kennedy calls the first human cyborg, used the BCI for 4 years before his death in spring 2002.)

Ray communicated through a novel electrode technology Kennedy invented: In the brain, chemicals lining a glass cone coax neurons to grow through the electrode and link to recording wires. This anchors the electrode and allows stable recording. In the current design, Kennedy and his colleagues implant two electrodes into their patients’ brains; they are working toward implanting eight at a time. Other research teams are testing devices that extract information from dozens to hundreds of brain cells at a time. More massive arrays, some researchers believe, are the key to advanced prosthetics.

In the mid-1990s, Chapin, then at the Medical College of Pennsylvania-Hahnemann School of Medicine in Philadelphia, and Miguel Nicolelis, at Duke, threaded 46 hair-thin wires inside a rat’s brain and taught the animal to use its thoughts alone to tip a lever and receive a drop of water. A computer depressed the lever whenever the nerve signals picked up by the microwires displayed a pattern like that present when the rat moved the lever with its paw.
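The trigger amounts to template matching: compare the moment-to-moment firing pattern across the wires with a stored paw-movement pattern, and fire the lever when they line up. A minimal sketch, with an assumed correlation measure and threshold rather than the published analysis:

```python
import numpy as np

def lever_trigger(rates_now, paw_template, threshold=0.8):
    """Depress the lever when the instantaneous firing pattern across
    the 46 microwires correlates strongly with the stored pattern seen
    during real paw presses. The correlation measure and threshold are
    assumptions for illustration."""
    r = np.corrcoef(rates_now, paw_template)[0, 1]
    return r >= threshold
```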

In spring 2000, Nicolelis, Chapin, and their colleagues implanted a more extensive array inside the brains of two owl monkeys. They taught the monkeys to operate a joystick with their hands, maneuvering a cursor, or to reach out with their arms to grab a piece of fruit and bring it to their mouths. A simple formula, the team discovered, could predict a monkey’s hand position milliseconds later from the electrical activity of 100 neurons. They translated these natural neuronal patterns into instructions for a robot arm—and watched the robot obediently mimic the monkey’s arm movements.
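The “simple formula” is, in essence, a linear model: upcoming hand position as a weighted sum of recent firing rates. A sketch fit by ordinary least squares; the binning, lag, and data layout are assumptions, not the Duke team's published settings:

```python
import numpy as np

def fit_linear_decoder(rates, hand_pos):
    """Fit weights so that hand position ~ weighted sum of firing rates.

    rates: (n_timesteps, n_neurons) binned firing rates (~100 neurons).
    hand_pos: (n_timesteps, 2) x,y hand position a fixed lag later.
    """
    X = np.hstack([rates, np.ones((len(rates), 1))])  # add an intercept
    W, *_ = np.linalg.lstsq(X, hand_pos, rcond=None)
    return W

def predict_position(W, rates_now):
    """Predict the upcoming (x, y) hand position from current rates."""
    return np.append(rates_now, 1.0) @ W
```

Once fit, the same weights can stream predicted positions to a robot arm instead of, or alongside, the on-screen cursor.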

Such systems have limitations. The monkey has to move its arm to produce the correct brain signal pattern, which won’t work for paralyzed people. In addition, the monkey has no idea that its brain signals are controlling a machine, and so it cannot learn to improve its robotic performance.

Recently, a group led by neuroscientist Andrew Schwartz, formerly at Arizona State University, addressed these issues. They tied monkeys’ hands down and had them play a game in which their brain signals directly controlled a cursor. After 2 to 3 weeks of practice, one of the monkeys could hit the correct target nearly every time. The researchers linked observable changes in the firing patterns of 64 neurons to the animal’s improved skills, indicating that practicing the brain-wave game was honing the responses of the cells (Science, 7 June 2002, p. 1829). “A monkey can squeeze a lot of information from a minimal neuronal signal,” Schwartz says. He emphasizes that learning, guided by the game’s visual feedback, was key to this ability.

Similarly, in work presented at the 2002 Society for Neuroscience meeting in November, Nicolelis’s team, while collecting data from 86 motor cortex neurons, taught a macaque monkey to use a joystick to quickly position a cursor inside a target. The scientists then disconnected the joystick—although the monkey could still handle it—and ran the game off decoded neural signals. The monkey appeared to learn to manipulate the cursor just by thinking and eventually stopped moving its hands altogether.

Recently, the Duke researchers have brought a new robot with a gripper hand into the loop. When the monkey moves the cursor toward the on-screen target, the robot will reach for an object. The monkey will also receive tactile feedback—from small vibrators attached to its skin—to indicate the force with which the gripper grasps the object. The faster the vibrations, the higher the force. The researchers hope that this will enable a monkey to learn to pick up an object without crushing or dropping it. This experiment, says Nicolelis, should have “a tremendous impact on what you do to control prosthetic devices.” Tactile feedback would be particularly useful for ALS patients, who retain some sensation even after most motor neurons are destroyed.
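The feedback mapping itself is simple: the harder the grip, the faster the buzz. A toy version, with an invented force range and frequency ceiling standing in for the Duke team's actual calibration:

```python
def vibration_rate(grip_force, max_force=10.0, max_hz=250.0):
    """Map the gripper's measured force onto a vibration frequency for
    the skin-mounted vibrators: the faster the buzz, the harder the
    grip. The force range and frequency ceiling are invented here;
    the article does not give the real calibration."""
    force = min(max(grip_force, 0.0), max_force)
    return max_hz * force / max_force
```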

In a similar effort, Schwartz, now at the University of Pittsburgh, and his collaborators have hooked a monkey’s brain up to a robotic arm. If the monkey is allowed to view the robot and the food on the computer screen, it can get the robot to reach out and retrieve the food.

In both Schwartz’s and Nicolelis’s experiments, the monkeys were first trained by practicing the movement with their hands, something paralyzed people cannot do. But Nicolelis is optimistic that this hurdle can be overcome, because people can be trained with verbal instructions. “We hope that we can show a visual trajectory to a human or tell him just to think about executing a movement,” and that thought alone will elicit coherent patterns of neuronal activity in the motor cortex, Nicolelis says.

Many other hurdles must be overcome before implanting such arrays in humans—not least of them establishing that implanted electrodes are safe. As for stability, the microwires in the macaques were still picking up signals from the vast majority of the initial crop of neurons 1 year later. In Schwartz’s case, some of the electrodes have lasted up to 3 years. “But we need to know that’s the rule,” Schwartz says.

Meanwhile, Schwartz and a University of Michigan team are developing electrodes designed to be easier to implant and to interact more securely and safely with natural brain tissue. Brown University’s John Donoghue is also working with researchers at Cyberkinetics of Providence, Rhode Island, on a novel silicon array of 100 microelectrodes that he says will make extracting neural signals much easier.

But some researchers in the field wonder if trying to implant such arrays in the human brain to get motor instructions might be overkill. When operating a prosthetic limb, for instance, the user could just tell it to lift, lower, or open or close its hand—or even grasp an object at a certain location—and let robotics do the rest. “BCIs just need to convey intent,” Wolpaw contends, and not the details of how a brain would coordinate movements. Indeed, neuroscientist Gert Pfurtscheller and his team at the Graz University of Technology in Austria have already demonstrated that a quadriplegic patient fitted with a prosthetic left hand learned to use mental imagery along with a scalp-based BCI to open and close the hand. After 5 months of training, the patient picked up an apple with his new hand and ate it.