New prosthetic limbs go beyond the functional to allow people to ‘feel’ again

Thanks, Rick.

“There is no question that knowing PCT could help organize their efforts and make the development of such prostheses much more coherent and efficient.”

Now, how can we get them to understand this? IOW, all, how can we get PCT out of our silo?

Best

Ted

Unite Humanity
Against
Climate Calamity!

From: Richard Marken
Sent: Thursday, December 26, 2019 10:22 AM
To: csgnet csgnet@lists.illinois.edu
Subject: Re: New prosthetic limbs go beyond the functional to allow people to ‘feel’ again

[Rick Marken 2019-12-26_09:16:08]

[Bruce Nevin 20191225.16:30ET]

Ted Cloak Dec 24, 2019, 6:13 PM –

Are these guys applying PCT without knowing it? This was reprinted in today’s Albuquerque Journal.

Richard Marken Dec 24, 2019, 6:48 PM –

Yes!

BN: The input side has nice subtlety, and they
talk about “intention signals”, but they have it “read the user’s neural activity and generate a command to control a prosthetic limb,” and they’re putting a neural net into that limb to perform functions of the spinal cord and motor areas of the brain:

RM: It’s hard to tell exactly what they are doing based on the description in the article. But I do think they are “applying PCT without knowing it” simply because they are providing sensory feedback based on motor output. Based on this
quote from the article:

“Participants can feel over 100 different locations and types of sensation coming from their missing hand,” Clark said. “They can also feel the location and the contraction force of their muscles — even when muscles aren’t there. That’s because we can send electrical signals up the sensory fibers from the muscles, so the brain interprets them as real.”

It looks like they are providing not only sensory feedback about the actual consequences of the motor “commands” (how the prosthetic arm has moved) but also simulated sensory feedback about the muscle contraction forces that would have produced that result if the muscles were actually there.
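
RM: To make that concrete, here is a minimal Python sketch of what providing those two kinds of feedback might look like. Everything in it (the names, the linear force model, the gain) is my own assumption for illustration, not anything described in the article:

def sensory_feedback(arm_angle, motor_command, force_gain=0.8):
    """Return two signals with which to drive the sensory fibers.

    arm_angle     -- measured position of the prosthetic joint
                     (the actual consequence of the motor command)
    motor_command -- the neural output signal driving the motor
    """
    # Actual consequence: proprioceptive-style position feedback
    position_signal = arm_angle
    # Simulated consequence: the contraction force the missing
    # muscle would have produced for this command
    simulated_force = force_gain * motor_command
    return position_signal, simulated_force

position, force = sensory_feedback(arm_angle=0.3, motor_command=1.2)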

RM: I think they are unknowingly applying the basic principle of PCT which is that behavior is the control of perception. Think about it from the point of view of the wearer of the prosthetic. Before the development of this device all the
wearer of an arm prosthesis could do was try to control the visual position of the arm. The prosthesis just generated movement in proportion to neural output signals; there were no proprioceptive sensory consequences of that output, which are the main perceptions
we control when we move our arms.

RM: This new device allows the person wearing the prosthesis to control proprioceptive perceptions of their arm, which is the usual way it’s done. Of course, they have to do some engineering to make sure the variation in these perceptual signals has the “correct” polarity relative to the effect of the neural output signals on movement of the prosthesis. This is necessary so that the feedback loop is negative: when I intend for the perception to increase, it increases, and vice versa. I believe this kind of “tuning” of the loop is what is going on with the development of the neural net architectures. But apparently they have the device tuned pretty well because, as they say, in a tracking task “the prosthetic finger was able to follow the cursor in a smooth, continuous path”, something that would be impossible if all that could be controlled was the visual perception of finger position – which was all they could control when the prosthesis provided no proprioceptive sensory feedback.
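
RM: The polarity point is easy to demonstrate in simulation. Below is a toy Python sketch of this kind of tracking loop. It is not their system; the gain, the limb dynamics, and the target movement are all my assumptions. With the correct polarity the simulated finger follows the cursor; flip the sign of the perceptual signal and the same loop runs away:

import math

# Toy PCT-style tracking loop: vary output so that the perception
# (finger position) matches the reference (cursor position).
def track(polarity=+1.0, steps=1000, dt=0.01, gain=50.0):
    finger = 0.0   # prosthetic finger position (environment)
    output = 0.0   # neural output signal driving the motor
    for t in range(steps):
        cursor = math.sin(2 * math.pi * 0.2 * t * dt)  # moving target
        perception = polarity * finger  # proprioceptive feedback
        error = cursor - perception     # reference minus perception
        output += gain * error * dt     # integrating output function
        finger += (output - finger) * dt  # sluggish limb dynamics
    return abs(cursor - finger)  # final tracking error

print("correct polarity, final error:", track(polarity=+1.0))
print("flipped polarity, final error:", track(polarity=-1.0))

RM: The flipped-polarity run returns an astronomically large error, which is just what a positive feedback loop does; the correct-polarity run settles into smooth, continuous tracking.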

RM: There is no question that knowing PCT could help organize their efforts and make the development of such prostheses much more coherent and efficient. But they are doing pretty well without it.

Best

Rick

Just as our native limbs are trained to perform various actions — basic ones such as grasping or walking, to precision ones such as neurosurgery or ballet — prosthetics, too,
have to be calibrated for specific uses. Engineers at the lab of Joseph Francis, associate professor of biomedical engineering at the University of Houston, have been working on a BCI that can autonomously update using implicit feedback from the user.

It’s pretty clear that updating the BCI is done by training a neural net. The 2018 London/Göttingen hand “interpreted the wearer’s intentions and sent commands to the artificial
limb… it used machine learning-based techniques
[=neural nets] to interpret neural signals from the brain to improve the performance of prosthetic hands.” This is why they say the same prosthetic would serve an amputee and someone with incapacitated
motor neural functions.
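
For concreteness, here is a toy sketch of that kind of decoder: a supervised mapping from recorded neural channels to intended movement, fit from data. Everything in it (the linear model, the channel and joint counts, the synthetic data) is an assumption for illustration, not from their papers:

import numpy as np

# Toy intention decoder: learn a map from multi-channel neural
# activity to intended joint commands. Real BCIs use richer models
# (Kalman filters, neural nets), but the input/output is the same.
rng = np.random.default_rng(0)
n_channels, n_joints, n_samples = 96, 6, 5000

true_map = rng.normal(size=(n_channels, n_joints))  # unknown mapping
neural = rng.normal(size=(n_samples, n_channels))   # recorded activity
intent = neural @ true_map + 0.1 * rng.normal(size=(n_samples, n_joints))

# Fit decoder weights W by least squares: neural @ W ~ intent
W, *_ = np.linalg.lstsq(neural, intent, rcond=None)

# At run time, decode a new neural sample into six motor commands
command = rng.normal(size=(1, n_channels)) @ W
print(command.shape)  # (1, 6): one command per movement type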

If they make more of that subtlety available on the output side and let the wearer’s brain do the reorganizing they’d be closer.

And however fine and diverse and subtle the inputs and outputs, it’s still a very pixelated channel:

“For all its merits, the LUKE Arm contains only 19 sensors, and generates six different types of movements. Similarly, the neural interface we use can capture or convey hundreds of different electrical signals from or to the brain,” Clark said. “That’s a lot, but both are impoverished compared with the thousands of motor and sensory channels of the human body, or its natural functional capabilities.”

I’ve attached a PDF that those of us without gigantic screens can read.

/Bruce

Richard S. Marken

"Perfection is achieved not when you have nothing more to add, but when you

have nothing left to take away.�

                            --Antoine de Saint-Exupery