
Monkeys Control Robot Arm Via Brain Signals

Contact

Duke Health News
919-660-1306

DURHAM, N.C. - Duke University Medical Center researchers and their
colleagues have tested a neural system on monkeys that enabled the
animals to use their brain signals, as detected by implanted
electrodes, to control a robot arm to reach for a piece of food. The
scientists even transmitted the brain signals over the Internet,
remotely controlling a robot arm 600 miles away.

According to the scientists, their recording and analysis
system, in which the electrodes remained implanted for two years in one
animal, could form the basis for a brain-machine interface that would
allow paralyzed patients to control the movement of prosthetic limbs.
Their finding also supports new thinking about how the brain encodes
information, by spreading it across large populations of neurons and by
rapidly adapting to new circumstances.

In an article in the Nov. 16, 2000, Nature, Miguel Nicolelis,
associate professor of neurobiology, and his colleagues described how
they tested their system on two owl monkeys - implanting arrays of as
many as 96 electrodes, each less than the diameter of a human hair,
into the monkeys' brains.

The technique they used, called "multi-neuron population
recordings," was developed by co-author John Chapin and Nicolelis. It
allows large numbers of single neurons to be recorded separately;
their signals are then combined using a computer coding algorithm.
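
As a rough sketch of what such combining might look like in practice -
purely illustrative Python, not the authors' software, with all names
and parameters invented here - the spike times recorded on each
electrode can be binned into a single matrix of population firing
rates for later analysis:

```python
import numpy as np

def bin_population_spikes(spike_times, duration_s, bin_s=0.1):
    """Turn per-electrode spike-time arrays into a (time bins x electrodes)
    matrix of firing rates. `spike_times` is a list with one array of spike
    times (in seconds) per electrode. Illustrative helper, not from the paper."""
    n_bins = int(duration_s / bin_s)
    rates = np.zeros((n_bins, len(spike_times)))
    for ch, times in enumerate(spike_times):
        counts, _ = np.histogram(times, bins=n_bins, range=(0.0, duration_s))
        rates[:, ch] = counts / bin_s  # spikes per second in each time bin
    return rates
```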

The scientists implanted the electrodes in multiple regions of
the brain's cortex, including the motor cortex, from which movement is
controlled. The scientists then recorded the output of these electrodes
as the animals learned reaching tasks, including reaching for small
pieces of food.

The scientists fed the mass of neural signal data generated
during many repetitions of these tasks into a computer, which analyzed
the brain signals to determine whether it was possible to predict the
trajectory of the monkey's hand from the signals. In this analysis, the
scientists used simple mathematical methods to predict hand
trajectories in real time as the monkeys learned to make different
types of hand movements.
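
One simple mathematical method of the kind the article alludes to is a
linear mapping from recent population activity to hand position, fit
by least squares. The sketch below is hypothetical code - the study's
actual models may differ in detail - that builds such a decoder from
binned firing rates and recorded hand trajectories:

```python
import numpy as np

def fit_linear_decoder(rates, hand_xyz, lags=10):
    """Fit weights that predict 3-D hand position from the most recent `lags`
    bins of population firing rates. `rates` is (T x electrodes), `hand_xyz`
    is (T x 3). Illustrative only."""
    T = rates.shape[0]
    X = np.hstack([rates[lags - k:T - k] for k in range(lags)]
                  + [np.ones((T - lags, 1))])  # lagged rates plus constant term
    W, *_ = np.linalg.lstsq(X, hand_xyz[lags:], rcond=None)
    return W

def predict_trajectory(rates, W, lags=10):
    """Apply fitted weights to new population activity, one prediction per bin."""
    T = rates.shape[0]
    X = np.hstack([rates[lags - k:T - k] for k in range(lags)]
                  + [np.ones((T - lags, 1))])
    return X @ W
```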

Said Chapin, who is at the State University of New York Health
Science Center, "In a previous paper [published in the July 1, 1999,
Nature Neuroscience], we found that rats were able to use their
neuronal population activity to control a robot arm, which they used to
bring water to their mouths. At the beginning of the experiments, the
animals had to press down a lever to generate the brain activity needed
to move the robot arm. Over continued training, however, their lever
movements diminished while their brain activity remained the same."

Said Nicolelis, "We found two amazing things, both in the
earlier rat studies and in our new studies on these primates. One is
that the brain signals denoting hand trajectory show up simultaneously
in all the cortical areas we measured. This finding has important
implications for the theory of brain coding, which holds that
information about trajectory is distributed over large territories in
each of these areas - even though the information is slightly
different in each area.

"The second remarkable finding is that the functional unit in
such processing does not seem to be a single neuron," Nicolelis said.
"Even the best single-neuron predictor in our samples still could not
perform as well as an analysis of a population of neurons. So, this
provides further support to the idea that the brain very likely relies
on huge populations of neurons distributed across many areas in a
dynamic way to encode behavior."
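
That comparison can be made concrete by scoring a decoder built from
the single best electrode against one built from the whole population.
The sketch below reuses the illustrative fit_linear_decoder and
predict_trajectory helpers from above and is likewise hypothetical:

```python
import numpy as np

def r_squared(pred, actual):
    """Fraction of trajectory variance explained by a prediction."""
    ss_res = np.sum((actual - pred) ** 2)
    ss_tot = np.sum((actual - actual.mean(axis=0)) ** 2)
    return 1.0 - ss_res / ss_tot

def best_single_vs_population(rates, hand_xyz, lags=10):
    """Return the best single-electrode score and the full-population score."""
    target = hand_xyz[lags:]
    single = []
    for ch in range(rates.shape[1]):
        one = rates[:, [ch]]
        W = fit_linear_decoder(one, hand_xyz, lags)
        single.append(r_squared(predict_trajectory(one, W, lags), target))
    W_all = fit_linear_decoder(rates, hand_xyz, lags)
    population = r_squared(predict_trajectory(rates, W_all, lags), target)
    return max(single), population
```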

Once the scientists demonstrated that the computer analysis
could reliably predict hand trajectory from brain signal patterns, they
then used the brain signals from the monkeys - as processed by the
computer - to allow the animals to control a robot arm moving in three
dimensions. They even tested whether the signals could be transmitted
over a standard Internet connection, controlling a similar arm in MIT's
Laboratory for Human and Machine Haptics - informally known as the
Touch Lab.

Said co-author Mandayam Srinivasan, director of the MIT
laboratory, "When we initially conceived the idea of using monkey brain
signals to control a distant robot across the Internet, we were not
sure how variable delays in signal transmission would affect the
outcome. Even with a standard TCP/IP connection, it worked out
beautifully. It was an amazing sight to see the robot in my lab move,
knowing that it was being driven by signals from a monkey brain at
Duke. It was as if the monkey had a 600-mile-long virtual arm."
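
The remote demonstration amounts to streaming the decoded coordinates
from the lab at Duke to the robot controller at MIT over an ordinary
TCP connection. A minimal sketch of that transport follows; the host
name, port, and message format are invented for illustration and do
not reflect either lab's actual software:

```python
import json
import socket

def stream_positions(positions, host="robot-controller.example.edu", port=9000):
    """Send a sequence of decoded (x, y, z) hand positions to a remote robot
    controller as newline-delimited JSON over TCP. Hypothetical protocol."""
    with socket.create_connection((host, port)) as sock:
        for x, y, z in positions:
            message = json.dumps({"x": x, "y": y, "z": z}) + "\n"
            sock.sendall(message.encode("utf-8"))
```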

Besides Nicolelis, Srinivasan and Chapin, other co-authors of
the paper were, from Duke, Johan Wessberg, Christopher Stambaugh,
Jerald Kralik, Pamela Beck and Mark Laubach; and from MIT, Jung Kim and
James Biggs. The scientists' work is supported by the National
Institutes of Health, National Science Foundation, Defense Advanced
Research Projects Agency and the Office of Naval Research.

"The reliability of this system and the long-term viability of
the electrodes lead us to believe that this paradigm could eventually
be used to help paralyzed people restore some motor function,"
Nicolelis said.

"This system also offers a new paradigm to study basic
questions of how the brain encodes information. For example, now that
we've used brain signals to control an artificial arm, we can progress
to experiments in which we change the properties of the arm or provide
visual or tactile feedback to the animal, and explore how the brain
adapts to it. Understanding such adaptation will allow us to make
inferences about how the brain normally encodes information."

Nicolelis and his colleagues will soon begin such "closed-loop"
experiments, in which movement of the robot arm generates tactile
feedback signals in the form of pressure on the animals' skin. Also,
they are providing visual feedback by allowing the animal to watch the
movement of the arm. The scientists' experiments with learning in
rats, reported in Nature Neuroscience in July 1999, have already
indicated that the analysis system can detect adaptive brain changes
associated with learning.
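
Structurally, such a closed loop repeats a simple cycle: decode the
latest population activity, move the arm, and return what the arm
feels to the animal. The sketch below shows only that structure; the
four callables are placeholders an experimenter would supply, not real
APIs:

```python
def run_closed_loop(read_rates, decode, move_arm, give_feedback, n_steps=1000):
    """Schematic closed-loop cycle: acquire binned population activity, decode
    it into a target position, move the robot arm, and feed the resulting
    contact force back to the animal (e.g. as pressure on the skin)."""
    for _ in range(n_steps):
        rates = read_rates()     # latest binned population firing rates
        xyz = decode(rates)      # decoded 3-D arm target
        force = move_arm(xyz)    # actuate the arm; returns contact force
        give_feedback(force)     # tactile feedback to the animal
```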

Such feedback studies could also potentially improve the
ability of paralyzed people to use such a brain-machine interface to
control prosthetic appendages, said Nicolelis. In fact, he said, the
brain could prove extraordinarily adept at using feedback to adapt to
such an artificial appendage.

"One most provocative, and controversial, question is whether
the brain can actually incorporate a machine as part of its
representation of the body," he said. "I truly believe that it is
possible. The brain is continuously learning and adapting, and previous
studies have shown that the body representation in the brain is
dynamic. So, if you created a closed feedback loop in which the brain
controls a device and the device provides feedback to the brain, I
would predict that as people or animals learn to use the device, their
brains will basically dedicate neuronal space to represent that device.

"If such incorporation of artificial devices works, it would
quite likely be possible to augment our bodies in virtual space in ways
that we never thought possible," Nicolelis said. "For example, in our
modest experiment in using brain wave patterns to control the robot arm
over the Internet, if we extended the capabilities of the arm by
engineering in feedback - such as visual, force or texture - such
closed-loop control might result in the remote arm being incorporated
into the body's representation in the brain. Once you establish a
closed loop, you're basically telling the brain that the external
device is part of the body representation. The major question in my
mind now is what is the limit of such incorporation."

Besides experimenting with such feedback systems, Nicolelis and
his colleagues are planning to increase the number of implanted
electrodes, with the aim of achieving 1,000-electrode arrays. They are
also developing a "neurochip" that will greatly reduce the size of the
circuitry required for sampling and analysis of brain signals.

"We envision that this neurochip can become an essential
component of the type of hybrid-brain-machine interfaces that may one
day be used to restore motor function in paralyzed patients," said
Nicolelis. "These activities will serve as the backbone of a new Center
for Neural Analysis and Engineering currently being created at Duke."
