Touching moments in prosthetics: New bionic limbs that can ‘feel’
By Payal Dhar for STUFF
Doctoral student Jacob George, left, and professor Greg Clark examine the LUKE Arm, a motorised and sensorised prosthetic that has been in development for more than 15 years.
Phantom pain was all that Keven Walgamott had left of the limb he lost in an accident over a decade ago – until he tried on the LUKE Arm for the first time in 2017, and told researchers that he could “feel” again. The arm is a motorised and sensorised prosthetic that has been in development for over 15 years by a team at the University of Utah in the US.
Researchers around the world have been developing prosthetics that closely mimic the body parts they replace. This goes beyond the cosmetic and even the functional; these are bionic body parts that can touch and feel, and even learn new things.
“Touch isn’t a single sense,” said Gregory Clark, associate professor of biomedical engineering at the University of Utah and lead researcher of the study. “When you first touch objects with a natural hand, there’s an extra burst of neural impulses.”
The brain then “translates” these into characteristics such as firmness, texture and temperature, all of which are crucial in deciding how to interact with the object, he said. In other words, by using the LUKE Arm (named after the “Star Wars” hero Luke Skywalker, and manufactured by Deka), Walgamott, of West Valley City, Utah, was able to “feel” the fragility of a mechanical egg, just as he would have with a natural limb. He could pick it up and transfer it without damaging it.
As he performed everyday activities with the prosthetic – such as holding his wife’s hand, sending a text message and plucking grapes from a bunch – Walgamott told researchers that it felt like he had his arm back. Even his phantom pain was reduced.
“When the prosthetic hand starts to feel like the user’s real hand, the brain is tricked into thinking that it actually is real,” Clark said. “Hence, the phantom limb doesn’t have a place to live in the brain anymore. So it goes away – and with it, goes the phantom pain.”
Clark’s team were able to achieve these results by stimulating the sensory nerve fibres in a “biologically realistic” manner, he said. Using a computer algorithm as a go-between, they delivered electrical pulses that closely resemble the signals the brain normally receives from a native arm.
“Participants can feel over 100 different locations and types of sensation coming from their missing hand,” Clark said. “They can also feel the location and the contraction force of their muscles – even when muscles aren’t there. That’s because we can send electrical signals up the sensory fibres from the muscles, so the brain interprets them as real.”
The critical component of a prosthetic powered by thought is the communication channel between the brain and a robotic body part – known as the brain-computer interface (BCI).
The LUKE Arm uses a neural interface, but in other mind-controlled prosthetics, brain implants are used to send instructions to a robotic limb, much like how neurons transmit messages from the brain to a muscle. But this means precision brain surgery and all the attendant risks, not to mention the expense and recovery time.
This might be about to change.
Bin He, professor and head of biomedical engineering at Carnegie Mellon University, and his colleagues have been working on a noninvasive high-precision BCI, and reported a breakthrough in June: a “mind-controlled robotic arm … that demonstrates for the first time, to our knowledge, the capability for humans to continuously control a robotic device using noninvasive EEG signals.”