New prosthetic limbs go beyond the functional to allow people to ‘feel’ again

Yes!


···

Richard S. Marken

"Perfection is achieved not when you have nothing more to add, but when you
have nothing left to take away."
                --Antoine de Saint-Exupery

[Bruce Nevin 20191225.16:30ET]

Ted Cloak Dec 24, 2019, 6:13 PM --

 Are these guys applying PCT without knowing it? This was reprinted in today’s Albuquerque Journal.

Richard Marken Dec 24, 2019, 6:48 PM –

Yes!

The input side has nice subtlety, and they talk about “intention signals”, but they have it “read the user’s neural activity and generate a command to control a prosthetic limb,” and they’re putting a neural net into that limb to perform functions of the spinal cord and motor areas of the brain:

Just as our native limbs are trained to perform various actions — basic ones such as grasping or walking, to precision ones such as neurosurgery or ballet — prosthetics, too, have to be calibrated for specific uses. Engineers at the lab of Joseph Francis, associate professor of biomedical engineering at the University of Houston, have been working on a BCI that can autonomously update using implicit feedback from the user.

It’s pretty clear that updating the BCI is done by training a neural net. The 2018 London/Göttingen hand “interpreted the wearer’s intentions and sent commands to the artificial limb… it used machine learning-based techniques [=neural nets] to interpret neural signals from the brain to improve the performance of prosthetic hands.” This is why they say the same prosthetic would serve an amputee and someone with incapacitated motor neural functions.
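The decoding idea (learning a mapping from recorded neural signals to motor commands) can be sketched in miniature. The following is a hypothetical illustration with simulated data, using a plain least-squares fit as a stand-in for the machine-learning techniques the article describes; all names and numbers here are made up:

```python
import numpy as np

# Hypothetical sketch of the decoding step: learn a mapping from
# neural-signal features to motor commands from example data.
# Simulated data and a linear fit stand in for the actual system's
# neural-net training on real recorded signals.

rng = np.random.default_rng(0)

n_channels, n_commands, n_samples = 8, 2, 200
true_map = rng.normal(size=(n_channels, n_commands))  # unknown to the decoder

# Simulated training data: signal features and the commands they encode.
signals = rng.normal(size=(n_samples, n_channels))
commands = signals @ true_map + 0.01 * rng.normal(size=(n_samples, n_commands))

# Fit the decoder by least squares (a linear stand-in for net training).
decoder, *_ = np.linalg.lstsq(signals, commands, rcond=None)

# Decode a fresh signal into a motor command for the limb.
new_signal = rng.normal(size=n_channels)
predicted_command = new_signal @ decoder
```

In this sketch, "autonomously updating" the BCI would amount to refitting the mapping as new signal/outcome data arrives from the user.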

If they make more of that subtlety available on the output side and let the wearer’s brain do the reorganizing they’d be closer.

And however fine and diverse and subtle the inputs and outputs, it’s still a very pixelated channel:

“For all its merits, the LUKE Arm contains only 19 sensors, and generates six different types of movements. Similarly, the neural interface we use can capture or convey hundreds of different electrical signals from or to the brain,” Clark said. “That’s a lot, but both are impoverished compared with the thousands of motor and sensory channels of the human body, or its natural functional capabilities.”

I’ve attached a PDF that those of us without gigantic screens can read.

prosthetic-arm.pdf (454 KB)

···

/Bruce

[Rick Marken 2019-12-26_09:16:08]

[Bruce Nevin 20191225.16:30ET]

Ted Cloak Dec 24, 2019, 6:13 PM --

 Are these guys applying PCT without knowing it? This was reprinted in today’s Albuquerque Journal.

Richard Marken Dec 24, 2019, 6:48 PM –

Yes!

BN: The input side has nice subtlety, and they talk about “intention signals”, but they have it “read the user’s neural activity and generate a command to control a prosthetic limb,” and they’re putting a neural net into that limb to perform functions of the spinal cord and motor areas of the brain:

RM: It’s hard to tell exactly what they are doing based on the description in the article. But I do think they are “applying PCT without knowing it” simply because they are providing sensory feedback based on motor output. Consider this quote from the article:

“Participants can feel over 100 different locations and types of sensation coming from their missing hand,” Clark said. “They can also feel the location and the contraction force of their muscles — even when muscles aren’t there. That’s because we can send electrical signals up the sensory fibers from the muscles, so the brain interprets them as real.”

It looks like they are providing sensory feedback regarding the actual consequences of the motor “commands” (how the prosthetic arm has moved) but also simulated sensory feedback about the muscle contraction forces that, if the muscles were actually there, would produce that result.

RM: I think they are unknowingly applying the basic principle of PCT, which is that behavior is the control of perception. Think about it from the point of view of the wearer of the prosthetic. Before the development of this device, all the wearer of an arm prosthesis could do was try to control the visual position of the arm. The prosthesis just generated movement in proportion to neural output signals; there were no proprioceptive sensory consequences of that output, which are the main perceptions we control when we move our arms.

RM: This new device allows the person wearing the prosthesis to control proprioceptive perceptions of their arm, which is the usual way it’s done. Of course, they have to do some engineering to make sure that the variation in these perceptual signals is of the “correct” polarity relative to the effect of the neural output signals on movement of the prosthesis. This is necessary so that the feedback loop is negative: when I intend for the perception to increase it increases, and vice versa. I believe this kind of “tuning” of the loop is what is going on in the development of the neural net architectures. But apparently they have the device tuned pretty well because, as they say, in a tracking task “the prosthetic finger was able to follow the cursor in a smooth, continuous path”, something that would be impossible if all that could be controlled was the visual perception of finger position – which was all they could control when the prosthesis provided no proprioceptive sensory feedback.
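The polarity point can be illustrated with a minimal simulation. This is a hypothetical sketch of a proportional control loop, not the device’s actual control law:

```python
# Minimal negative-feedback loop sketch (hypothetical illustration,
# not the prosthesis's actual control law). A perceptual signal p is
# driven toward a reference r by output proportional to the error;
# `polarity` models whether the output's effect on the perception has
# the sign the loop was tuned for.

def run_loop(polarity, steps=50, gain=0.3):
    r = 1.0  # reference: the intended perception
    p = 0.0  # current perception (e.g., proprioceptive arm position)
    for _ in range(steps):
        error = r - p
        output = gain * error       # motor output proportional to error
        p += polarity * output      # environment: output's effect on p
    return p

# Correct polarity: the loop is negative and p settles at the reference.
# Flipped polarity: the same loop is positive feedback and p runs away.
print(run_loop(+1), run_loop(-1))
```

With the correct polarity the perception converges on the reference; with the polarity flipped the loop becomes positive feedback and the perception diverges, which is why this kind of tuning is necessary before smooth tracking is possible.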

RM: There is no question that knowing PCT could help organize their efforts and make the development of such prostheses much more coherent and efficient. But they are doing pretty well without it.

Best

Rick


[Bruce Nevin 20191226.16:58 ET]

Martin Taylor 2019.12.26.11.29 –

Yes, point well taken, and I agree: functionally it doesn’t matter where motor output signals are generated, and putting it in the prosthetic hand means it works for both classes of patients – bypassing intact neural functions in an amputee, and damaged ones in a paraplegic.

···

/Bruce

On Thu, Dec 26, 2019 at 11:35 AM Martin Taylor csgnet@lists.illinois.edu wrote:

[Martin Taylor 2019.12.26.11.29]

[Bruce Nevin 20191225.16:30ET]

It’s pretty clear that updating the BCI is done by training a neural net. The 2018 London/Göttingen hand “interpreted the wearer’s intentions and sent commands to the artificial limb… it used machine learning-based techniques [=neural nets] to interpret neural signals from the brain to improve the performance of prosthetic hands.” This is why they say the same prosthetic would serve an amputee and someone with incapacitated motor neural functions.

If they make more of that subtlety available on the output side and let the wearer’s brain do the reorganizing they’d be closer.

Bruce, given that the organic arm’s local control systems participate in a normal reorganization process as the organism (human) matures, why would you want to deny the prosthetic arm the same possibility? Or maybe you have an alternative to a neural network in mind?

BTW, I have not read the subject article, which may well contain a good answer. If so, I apologize for butting in.

Martin

[Bruce Nevin 20191226.17:21 ET]

[Rick Marken 2019-12-26_09:16:08]

I may have sounded like I was doubting or disagreeing with your recognition. Just trying to suss it out. Yes, I agree, the need for feedback through sensors in the prosthetic hand is front and center in their account.

And Ted, I meant to say, it’s terrific hearing from you again!

···

/Bruce
