ABOVE: Participant Joe Hamilton uses his mind to control a prosthetic hand to pick up a small block.
EVAN DOUGHERTY, UNIVERSITY OF MICHIGAN ENGINEERING

Traditional upper limb prosthetics, which often consist of two hooks controlled by a cable to another body part, require people who've lost a hand or arm to learn to manipulate a tool that's connected to their body, rather than to control a prosthetic built to work as their lost limb once did. Even modern strategies that more intuitively harness signals from remaining nerves and muscles to move a robotic prosthesis might not distinguish between different movement signals, such as those for the first versus the little finger, or may require frequent recalibration.

In a study published this week (March 4) in Science Translational Medicine, researchers describe a strategy that creates a so-called regenerative peripheral nerve interface using muscle grafts connected to amputees' remaining peripheral nerves, allowing real-time control of an artificial hand.


“Their use of taking the nerve and putting it into a little piece of muscle to use the muscle to help amplify the signal, so that you can get a robust signal to control your arm reliably for long periods of time—that’s just brilliant,” says Bradley Greger, a neuroscientist and engineer at Arizona State University who did not participate in the study. “This is the path to really helping people,” he adds. “Getting something to work in the lab for a month is great, and it helps us understand. But getting . . . a limb that will help a person for years in the real world, that’s a different problem or a harder problem, and they’re making inroads into that problem.”

This project began about 12 years ago, when Paul Cederna, a plastic surgeon at the University of Michigan, and colleagues got a grant from the United States Department of Defense to develop biointegrated prosthetic devices to enhance functional recovery after limb loss. The researchers proposed to incorporate patients’ remaining bones with prosthetics and generate peripheral nerve interfaces in the same way that others had done in the past: by passing a probe into the nerves. But at a meeting soon after the grant was awarded, the agency’s program director said that he wanted something other than “stuffing nails in nerves,” Cederna recalls. “It was that moment that had us start to begin thinking differently about how to interface with the peripheral nervous system, and that’s where the idea of the regenerative peripheral nerve interface started,” Cederna says.

They tried a variety of things, including conductive polymers, tissue engineering, and cell-based approaches, but eventually simplified their strategy. The current regenerative peripheral nerve interface (RPNI) consists of a small piece of muscle from the patient’s thigh wrapped around the end of a divided peripheral nerve, where the axons have been separated into bundles. The nerves then grow into and reinnervate the muscle grafts over about three months. The technique has the benefit of preventing phantom pain and neuroma pain, which is caused by nerves that grow unchecked into a sensitive bundle after amputation. When a signal comes down the nerve, it makes the muscle contract, and because muscle contractions create large electrical signals, the tiny signal gets amplified 10- to 100-fold in the muscle.

A few years into the project, Cederna’s group started to collaborate with Cindy Chestek, a biomedical engineer at the University of Michigan. Chestek and her team work on brain-machine interfaces. Electrical signals coming directly from the brain tend to be larger than those coming from peripheral nerves, but amplification via the RPNI made the signals large enough that they could apply similar algorithms for interpreting the activity of brain neurons in prosthetics to signals from peripheral nerves instead. After successful experiments detecting input from the peripheral nerves in macaques, the research team was ready to move into people.


Participant Karen Sussex uses her mind to control a prosthetic hand to pick up a can of tomato paste.
Robert Coelius, University of Michigan Engineering

Four patients with varying degrees of hand or arm amputation agreed to undergo RPNI implantation to treat their pain. They had between three and nine RPNIs placed during surgery, most of them in the ulnar, median, and radial nerves. The researchers could detect muscle contractions in the RPNIs visually via ultrasound when they asked the subjects to imagine moving their fingers. In two of the subjects, the team also detected electrical activity in the implanted muscles corresponding to stimulation by the nerve.

The other two subjects also agreed to have wires placed through the skin to test the control of a robotic hand. To train the algorithms to associate muscle activity with movement, patients first performed a series of mirrored tasks while wearing a glove on their intact hand; the glove recorded their finger positions while electrodes simultaneously recorded the electrical activity of the muscles in their affected limb. This allowed the computer to learn the relationship between that electrical activity and the intended movement.

“We ask them to follow along a small number of movements and we have everything we need to let them start controlling their fingers, so all of the learning is in the algorithm,” says Chestek. “We don’t ask them to learn how to repurpose movements in order to control that hand.”
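The calibration described above amounts to fitting a decoder that maps recorded muscle activity to intended finger positions. The sketch below is a minimal, hypothetical illustration of that idea using a ridge-regularized linear map on simulated data; the channel counts, feature representation, and regression choice are assumptions for illustration, not the study's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: a handful of RPNI/EMG channels, two finger
# degrees of freedom tracked by the glove on the intact hand.
n_samples, n_channels, n_fingers = 500, 4, 2

# Simulated calibration data: glove records finger positions while
# electrodes record muscle (EMG) features from the affected limb.
true_weights = rng.normal(size=(n_channels, n_fingers))
emg = rng.normal(size=(n_samples, n_channels))  # EMG features per time bin
fingers = emg @ true_weights + 0.05 * rng.normal(size=(n_samples, n_fingers))

# Fit a ridge-regularized linear decoder in closed form:
#   W = (X'X + alpha*I)^-1 X'Y
alpha = 1.0
W = np.linalg.solve(emg.T @ emg + alpha * np.eye(n_channels), emg.T @ fingers)

# At run time, fresh muscle activity is decoded into finger positions,
# so the learning lives in the algorithm, not in the user.
predicted = emg @ W
residual = np.sum((fingers - predicted) ** 2)
total = np.sum((fingers - fingers.mean(axis=0)) ** 2)
r2 = 1 - residual / total
print(f"decoder fit R^2 = {r2:.3f}")
```

Once fitted, the same weights can be applied to each incoming window of muscle activity, which is consistent with the article's observation that a decoder trained on day one kept working without recalibration.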

Then, researchers attached a robotic hand to the two subjects’ amputated limbs and asked them to move the prostheses in a variety of ways, including flexing the thumb, making a fist, and touching a target. Just by thinking about it, both subjects could do all the tasks with more than 94 percent accuracy on the first day they tried it. Plus, the algorithms they trained on day one worked up to 300 days later without any retraining or additional calibration.

“They think about moving their biological hand . . . just the way they did for twenty years when they used to have it, and researchers or clinicians can record or interpret the electrical signals that are being generated ultimately by that thought and translate them into action,” says Gregory Clark, a biomedical engineer at the University of Utah who did not participate in the work. Possible next steps, he says, include making the set-up wireless, which would eliminate the need for wires to pass through the skin to communicate with the prosthesis, and combining motor control with sensory integration, so that a prosthetic hand could not only move, but also help the user explore the world just like a biological hand.

“We have been able to provide patients sensory feedback through the regenerative peripheral nerve interfaces” in previous work, Cederna says, so incorporating both sensing and movement is not out of the realm of possibility. Even without that integration, he says, just upon seeing the hand move, patients start to think of it differently, as though it is actually theirs and not “some tool strapped to their body. It’s actually become part of them, and so we’re really excited about the opportunity of adding the sensory feedback to that because that will only incorporate that device even more into their person.”


P.P. Vu et al., “A regenerative peripheral nerve interface allows real-time control of an artificial hand in upper limb amputees,” Science Translational Medicine, doi:10.1126/scitranslmed.aay2857, 2020.

Abby Olena is a freelance journalist based in Alabama. Find her on Twitter @abbyolena.
