New prosthetic limbs go beyond the functional to allow people to ‘feel’ again

Are these guys applying PCT without knowing it? This was reprinted in today’s Albuquerque Journal.

Ted

From the Washington Post:



[New prosthetic limbs go beyond the functional to allow people to ‘feel’ again](https://www.washingtonpost.com/health/new-prosthetic-limbs-go-beyond-the-functional-to-allow-people-to-feel-again/2019/12/13/ac2fac10-d4ca-11e9-86ac-0f250cc91758_story.html)

(Stephan Walter for The Washington Post)

By Payal Dhar

Dec. 14, 2019 at 6:00 a.m. MST

Phantom pain was all that Keven Walgamott had left of the limb he lost in an accident over a decade ago — until he tried on the LUKE Arm for the first time in 2017, and told researchers that he could “feel” again. The arm is a motorized and sensorized prosthetic that has been in development for over 15 years by a team at the University of Utah.

Researchers around the world have been developing prosthetics that closely mimic the part of the human body they would replace. This goes beyond the cosmetic and even the functional; these are bionic body parts that can touch and feel, and even learn new things.

“Touch isn’t a single sense,” said Gregory Clark, associate professor of biomedical engineering at the University of Utah and lead researcher of the study. “When you first touch objects with a natural hand, there’s an extra burst of neural impulses.”


The brain then “translates” these into characteristics such as firmness, texture and temperature, all of which are crucial in deciding how to interact with the object, he said. In other words, by using the LUKE Arm (named after the “Star Wars” hero Luke Skywalker, and manufactured by Deka), Walgamott, of West Valley City, Utah, was able to “feel” the fragility of a mechanical egg, just as he would have with a natural limb. He could pick it up and transfer it without damaging it.

Doctoral student Jacob George, left, and professor Greg Clark examine the LUKE Arm, a motorized and sensorized prosthetic that has been in development for over 15 years. (Dan Hixson/University of Utah College of Engineering)

As he performed everyday activities with the prosthetic — such as holding his wife’s hand, sending a text message and plucking grapes from a bunch — Walgamott told researchers that it felt like he had his arm back. Even his phantom pain was reduced.

“When the prosthetic hand starts to feel like the user’s real hand, the brain is tricked into thinking that it actually is real,” Clark said. “Hence, the phantom limb doesn’t have a place to live in the brain anymore. So it goes away — and with it, goes the phantom pain.”


Clark’s team was able to achieve these results by stimulating the sensory nerve fibers in a “biologically realistic” manner, he said. Using a computer algorithm as a go-between, they were able to provide a more biologically realistic digital pulse, similar to what the brain normally receives from a native arm.
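To make that idea concrete, here is a minimal, purely hypothetical sketch (not the Utah team's actual algorithm) of how a go-between program might translate a prosthetic pressure sensor into nerve-stimulation pulse rates, including the extra burst of impulses Clark describes at first contact. The gains and sensor values are invented for illustration.

```python
# Hypothetical illustration only: not the Utah team's actual algorithm.
# Maps a prosthetic pressure-sensor reading to a nerve-stimulation pulse rate,
# adding an extra onset burst when contact begins, loosely mimicking the burst
# of neural impulses that occurs when a natural hand first touches an object.

def stimulation_rate(pressure_now: float, pressure_prev: float,
                     baseline_gain: float = 60.0,
                     onset_gain: float = 200.0) -> float:
    """Return a pulse rate (pulses/sec) for one sensor sample.

    pressure_now / pressure_prev are normalized sensor readings in [0, 1].
    """
    # Sustained component: pulse rate scales with how hard the object is pressed.
    sustained = baseline_gain * pressure_now

    # Transient component: extra pulses while pressure is rising (first contact),
    # a crude stand-in for rapidly adapting touch fibers.
    rising = max(pressure_now - pressure_prev, 0.0)
    transient = onset_gain * rising

    return sustained + transient


if __name__ == "__main__":
    # Simulated fingertip trace: no contact, then grasp, then steady hold, then release.
    trace = [0.0, 0.0, 0.3, 0.6, 0.6, 0.6, 0.0]
    prev = 0.0
    for p in trace:
        print(f"pressure={p:.1f} -> {stimulation_rate(p, prev):6.1f} pulses/s")
        prev = p
```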

“Participants can feel over 100 different locations and types of sensation coming from their missing hand,” Clark said. “They can also feel the location and the contraction force of their muscles — even when muscles aren’t there. That’s because we can send electrical signals up the sensory fibers from the muscles, so the brain interprets them as real.”

The critical component of a prosthetic powered by thought would be the communication between the brain and a robotic body part — called the brain-computer interface (BCI).


The LUKE Arm uses a neural interface, but in other mind-controlled prosthetics, brain implants are used to send instructions to a robotic limb, much like how neurons transmit messages from the brain to a muscle. But
this means precision brain surgery and all the attendant risks, not to mention the expense and recovery time.

This might be about to change.

Bin He, professor and head of biomedical engineering at Carnegie Mellon University, and his colleagues have been working on a noninvasive high-precision BCI, and reported a breakthrough in June: a “mind-controlled robotic arm . . . that demonstrates for the first time, to our knowledge, the capability for humans to continuously control a robotic device using noninvasive EEG signals.”

The term noninvasive is key. Noninvasive BCIs have shown promising results but only in performing distinct actions — for example, pushing a button. When it comes to a sustained, continuous action such as tracking a cursor on a computer screen, noninvasive BCIs have resulted in jerky, disjointed movements of the robotic prosthesis. In He and his team’s demonstration, the subject used their mind to control a robotic arm tracking a cursor on a computer screen, and the prosthetic finger was able to follow the cursor in a smooth, continuous path — just as a real finger would. What is more interesting is that, while they used a computer-wired EEG cap on the subject in the lab, He said that it is not necessary.
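As an illustration only (not He's actual decoder), the sketch below shows the basic shape of continuous noninvasive control: a linear map from EEG band-power features to a 2-D velocity command, smoothed over time so the commanded movement stays continuous rather than jerky. The channel count, weights, update rate and smoothing factor are all assumptions.

```python
# Hypothetical sketch of continuous EEG-based control; not the CMU system.
import numpy as np

rng = np.random.default_rng(0)

N_CHANNELS = 8            # assumed number of EEG channels
N_FEATURES = N_CHANNELS   # one band-power feature per channel, for simplicity

# Decoder weights would normally be fit to calibration data; random here.
W = rng.normal(scale=0.1, size=(2, N_FEATURES))   # features -> (vx, vy)

def decode_velocity(features: np.ndarray, prev_velocity: np.ndarray,
                    smoothing: float = 0.8) -> np.ndarray:
    """Map one feature vector to a velocity command, exponentially smoothed
    over time so the trajectory is continuous instead of jerky."""
    raw = W @ features
    return smoothing * prev_velocity + (1.0 - smoothing) * raw

if __name__ == "__main__":
    velocity = np.zeros(2)
    position = np.zeros(2)
    for t in range(5):
        eeg_features = rng.normal(size=N_FEATURES)   # stand-in for real band power
        velocity = decode_velocity(eeg_features, velocity)
        position += velocity * 0.1                   # integrate at an assumed 10 Hz
        print(f"t={t}: velocity={velocity.round(3)}, position={position.round(3)}")
```

The point of the smoothing term is simply to show why continuous decoding differs from issuing discrete, button-press-style commands: each new EEG sample only nudges the ongoing movement.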


A smartphone app programmed with EEG recordings and wireless electrodes could streamline the process for everyday use, He said.

This could pave the way for thought-controlled robotic devices by decoding “intention signals” from the brain without needing invasive and risky brain surgery, He said.

Just as our native limbs are trained to perform various actions — from basic ones such as grasping or walking to precision ones such as neurosurgery or ballet — prosthetics, too, have to be calibrated for specific uses. Engineers at the lab of Joseph Francis, associate professor of biomedical engineering at the University of Houston, have been working on a BCI that can autonomously update using implicit feedback from the user.

“We are moving toward an autonomous system that will learn to perform new actions as per the user’s intentions with the least supervision from outside, and enable the user to control the prosthetic more independently,” said Taruna Yadav, a PhD student who is part of Francis’s team.
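A hedged sketch of the general idea (not the Francis lab's actual system): a decoder whose weights are nudged after each movement by an implicit right/wrong signal, so it keeps adapting without an explicit recalibration session. The feature sizes, learning rate and feedback signal below are invented for illustration.

```python
# Hypothetical sketch of a decoder that adapts from implicit user feedback.
import numpy as np

rng = np.random.default_rng(1)
n_features, n_actions = 16, 4
W = rng.normal(scale=0.01, size=(n_actions, n_features))  # decoder weights

def choose_action(features: np.ndarray) -> int:
    """Pick the prosthetic action whose decoder output is largest."""
    return int(np.argmax(W @ features))

def update_from_feedback(features: np.ndarray, action: int, reward: float,
                         lr: float = 0.05) -> None:
    """Nudge the weights for the chosen action up or down depending on an
    implicit feedback signal (e.g., +1 if the movement matched the user's
    intention, -1 if it did not), with no supervised retraining session."""
    W[action] += lr * reward * features

if __name__ == "__main__":
    for trial in range(3):
        neural_features = rng.normal(size=n_features)  # stand-in for recorded activity
        act = choose_action(neural_features)
        implicit_reward = rng.choice([1.0, -1.0])      # stand-in for an error-related signal
        update_from_feedback(neural_features, act, implicit_reward)
        print(f"trial {trial}: action={act}, feedback={implicit_reward:+.0f}")
```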


In 2018, a bionic hand developed in a collaboration between the Imperial College London and the University of Göttingen used a human-machine interface that interpreted the wearer’s intentions and sent commands to the artificial limb. The team used machine learning-based techniques to interpret neural signals from the brain to improve the performance of prosthetic hands.

“Our main goal is to let patients control the prosthetic as though they were their biological limbs,” said Dario Farina, lead author of the paper about their findings. “This new technology takes us a step closer to achieving this.”

Would there, then, be a difference in the way that bionic body parts would work for an amputee vs. a paralyzed person? No, Yadav said.

“In both cases, it will still read the user’s neural activity and generate a command to control a prosthetic limb,” she said. “However, the time and effort required to learn to control the BCI output may differ.”


Whereas paralysis may present a higher likelihood of nerve or spinal cord damage, He said that “a noninvasive BCI should apply to both, as long as the subject’s cognitive function is intact. The details of the BCI system can be tailored to the particular needs of the situation.”

And the next frontier? If BCIs and other neural interfaces provide a means to connect our brains with external devices that extend the function of our body, and if they can make paralyzed patients walk again or restore
a body part that has been lost to disease or injury, would it be theoretically possible to develop bionic add-ons that could bestow superhuman abilities?

“In a sense, yes,” Clark said. “Indeed, we already do. Glasses restore normal vision to the nearsighted. But telescopes and microscopes allow us to see what would be otherwise unseeable. Canes assist in walking after injury, but fiberglass vaulting poles allow us to clear superhuman heights.


“In clinical applications, exoskeletons provide important assistive technologies after spinal cord injury or stroke,” Clark said. “Yet they can be used to increase the power and endurance of intact individuals.”

But in other ways, bionic parts are no match for nature.

“For all its merits, the LUKE Arm contains only 19 sensors, and generates six different types of movements. Similarly, the neural interface we use can capture or convey hundreds of different electrical signals from or to the brain,” Clark said. “That’s a lot, but both are impoverished compared with the thousands of motor and sensory channels of the human body, or its natural functional capabilities.”

The field of biomedical engineering, Clark added, exists to improve nature when it goes awry. “But we also try to understand and use nature to improve engineering — and ourselves,” he said.

Unite Humanity

Against

Climate Calamity!