Computational Cost of PCT

[Ben Hawker 2017.10.16 14:35]

Hi all,

I recall mentions of claims of PCT being a computationally efficient method of control.

Does anyone have a reference for this? I am struggling to find such.

Thanks,

Ben



[From Dag Forssell 2017.10.16 0942 EDT]

Ben, what does "compute" mean to you?

Many scientists, cognitive and otherwise, are seduced by digital computers, computers that do indeed "compute". Some people think our brains must be supercomputers.

Much of Powers' insight came from his familiarity with analog computers, devices that do not "compute" at all. In B:CP, Powers laid out how neurons and groups of neurons, by combining, splitting, and comparing neural currents, could carry out all the necessary functions with no digital computing at all.

Perhaps the efficiency you refer to means no computing at all.

Models of PCT are cleverly programmed for digital computers to simulate analog computing. Powers clearly spelled out how to do that.
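For what it's worth, here is a minimal sketch of that idea: an analog integrator approximated digitally by repeatedly adding a small fraction of its input. The gain, step size and slowing values are illustrative only, not Powers' own, though his demo programs use the same leaky-integrator-with-slowing-factor pattern for their output functions.

```python
# Minimal sketch of digitally simulating analog computation: an analog
# integrator is approximated by an Euler step that adds a small fraction
# of its input on every pass. Constants are illustrative, not Powers' own.

dt = 0.01          # step size of the digital approximation (seconds)
slowing = 0.1      # time constant of the simulated analog integrator (s)
gain = 10.0

output = 0.0
error = 1.0        # a constant input, just to watch the integrator settle

for _ in range(500):
    # Output drifts toward gain * error at a rate set by the slowing constant.
    output += (gain * error - output) * dt / slowing

print(round(output, 3))   # approaches gain * error = 10.0
```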

Did you search CSGnet for several key words at the same time in order to possibly find a discussion of this topic?

Best, Dag


[Ben Hawker 2017.10.16 15:24]

···

On 16 October 2017 at 14:51, Dag Forssell csgarchive@pctresources.com wrote:

[From Dag Forssell 2017.10.16 0942 EDT]

Ben, what does “compute” mean to you?

Many scientists, cognitive and otherwise, are seduced by digital computers, computers that do indeed “compute”. Some people think our brains must be supercomputers.

BH: I completely agree, and a good portion of my research is finding that biomimetic approaches to the physical aspects of a system (as well as the cognitive elements) are necessary. However, I wasn't the one who made the claim! It's simply a claim I've heard going around, which would imply that PCT has a lower computational cost when simulated than other solutions.

Much of Powers' insight came from his familiarity with analog computers, devices that do not "compute" at all. In B:CP, Powers laid out how neurons and groups of neurons, by combining, splitting, and comparing neural currents, could carry out all the necessary functions with no digital computing at all.

BH: Then perhaps the idea of "computational efficiency" came from someone claiming that, on an analog computer, PCT would consume fewer of the machine's resources over time than other typical solutions? Maybe that's where the claim originated.

Perhaps the efficiency you refer to means no computing at all.

Models of PCT are cleverly programmed for digital computers to simulate analog computing. Powers clearly spelled out how to do that.

Did you search CSGnet for several key words at the same time in order to possibly find a discussion of this topic?

BH: I feel it's best not to get lost in the semantics of the term "compute". Computational efficiency is a term used to describe the effort a system must expend in order to solve a problem. On a digital computer that is roughly how much processing it takes over time to solve the problem; the metric would be different on an analog computer, but the comparison is still possible if the two approaches are solving the same problem. So whether or not one disputes that we really compute, or that computers (digital or analog) compute, the broader term of computational efficiency still applies.

BH: I'm assuming that you haven't heard of this claim before? Any pointers to academics or others who have made, or may have made, this claim would be very helpful!

Thanks,

Ben

[Ben hawker 2017.10.16 3:58]

I just remembered the claim is present on Wikipedia:

https://en.wikipedia.org/wiki/Perceptual_control_theory

“Engineering control theory is computationally demanding, but as the preceding section shows, PCT is not.”


[From Rick Marken (2017.10.16.0815)]


RM: The reason PCT is considered computationally efficient compared to other approaches to control is that it does not involve computation of estimated future states of the environment. Most non-PCT approaches involve predictive control, which requires computation of inverse kinematics. This is quite computationally intensive, since it involves continuous inversion of what are often substantially sized matrices. It is not only computationally intensive, whether done digitally or in analog, but also infeasible and unnecessary. This is discussed by Powers in the first chapter of LCS III.
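To make that concrete, here is a minimal sketch of a single control loop tracking a moving reference through an unpredicted disturbance. The parameters and the first-order environment are illustrative (this is not code from LCS III); the point is that nothing in it inverts the environment function or predicts a future state, since the output is just the integrated error.

```python
# A single PCT-style loop: perceive, compare, integrate the error, act.
# There is no inverse model and no prediction of future states anywhere.
# Parameters and the toy environment are illustrative only.

import math

dt = 0.01          # integration step (seconds)
gain = 20.0        # output gain
leak = 0.1         # small leak keeps the integrating output bounded

position = 0.0     # controlled environmental variable (e.g., cursor position)
output = 0.0       # the system's output acting on the environment

for step in range(3000):                 # 30 simulated seconds
    t = step * dt
    reference = math.sin(0.5 * t)        # where the cursor should be
    disturbance = 0.3 * math.sin(3 * t)  # an unpredicted push on the cursor

    perception = position                # perceptual input function (identity)
    error = reference - perception       # comparator
    output += (gain * error - leak * output) * dt   # integrating output function

    # Environment: the cursor moves under the combined influence of the
    # system's output and the disturbance (simple first-order dynamics).
    position += (output + disturbance - position) * dt

print(f"final tracking error: {reference - position:+.4f}")
```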

Best

Rick


[From Adam Matic]

There is the arm model paper (PDF attached) where cascaded negative feedback loops are claimed to be a simpler solution to arm control than inverse dynamics and inverse kinematics. No formal comparison, though.

I'd argue against the Wikipedia example as evidence of computational simplicity. There are non-PCT inverted pendulum control schemes identical to the one in LCS III; that particular example on the wiki is just more complex than it needs to be.

1990 W.T. Powers, A model of kinesthetically and visually controlled arm movement.pdf (162 KB)
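As a rough illustration of the cascade idea, in the spirit of the 1990 arm paper but not its actual model or parameters, here is a sketch in which a higher loop controls a perceived hand height by adjusting the reference of a lower loop that controls a perceived joint angle. No inverse kinematics is computed anywhere; in particular, nothing ever takes an arcsine.

```python
# Two cascaded negative feedback loops (illustrative constants only).
# The higher loop varies the *reference* of the lower loop; the lower loop
# drives the joint. The forward kinematics sin() lives in the environment,
# and no inverse of it is ever computed.

import math

dt = 0.001
L = 1.0                    # arm segment length (m), hypothetical
angle = 0.2                # joint angle (rad), state of the environment
angle_ref = 0.0            # reference sent down by the higher loop

height_ref = 0.7           # where the higher system wants the hand (m)
k_high, k_low = 2.0, 20.0  # loop gains, chosen only for a stable demo

for _ in range(20000):     # 20 simulated seconds
    # Environment / forward kinematics: hand height follows from the angle.
    height = L * math.sin(angle)

    # Higher loop: perceives hand height, adjusts the lower loop's reference.
    angle_ref += k_high * (height_ref - height) * dt

    # Lower loop: perceives the joint angle and drives it toward its
    # reference (a crude stand-in for the muscle and joint dynamics).
    angle += k_low * (angle_ref - angle) * dt

print(f"hand height = {L * math.sin(angle):.3f} (reference {height_ref})")
```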


[from Rick Marken (2017.10.16.0830)]


RM: I agree. Computational simplicity is not what uniquely recommends PCT as a model of human behavior. It is certainly a necessary characteristic of any theory of behavior, since the theory has to propose a mechanism that can plausibly be carried out by the nervous system, but it is certainly not sufficient.

Best

Rick


[Martin Taylor 2017.10.16.11.03]

Not me. But I would like to know what you think that claim might mean, because, as I was told in school, sometimes a good way to answer a question is to ask the right question. A good question often does have an answer, perhaps an obvious one, whereas a poor question invites propaganda in place of an answer.

When you say "efficiency" you imply there is some kind of standard, as there is for thermodynamic efficiency, or as there is when a subject's perceptual ability is compared with the ability of a theoretical mathematical "ideal observer". Or maybe "efficiency" implies some metric of work done by one method as opposed to the work that would be done by another method performing the same task.

If the task is to control some variable, do you include the work of setting up the ability (reorganization, for PCT), or do you consider only the work done in the specific task, on the grounds that the already performed reorganization is reusable for many other tasks? Others have mentioned the Arm2 demo, showing that reorganization can lead to smooth coordination, but that coordination is the "task" of the Arm, which acts on nothing in its environment. Nevertheless, even though the reorganization of the Arm is done with no specific task in mind, the reorganized arm could readily be incorporated into systems that do act on the environment in order to control perceptions of properties of the environment.
PCT fundamentally describes a massively parallel structure with thousands or millions of little computing units, each of which does nothing very complicated. Is that a waste of resources compared to a few big powerful computing units, or is it an efficient way to avoid having to learn how to build the monsters and then actually build them? Indeed, it is a complex problem for a neural system to do the simple arithmetic that is trivially easy for computers. PCT isn't very efficient at controlling, say, the answer to "what is the value of x if 3x² + 2x + 1 = 0?". Computers are much more efficient than perceptual control systems for that kind of task. Neural systems are much better than computers at tasks such as looking at two people and choosing the more handsome. Who (or which) is more efficient?

If you talk about simulation, again you have an issue of scale and prior organization, and must ask what the index of "efficiency" is. Is it computational complexity, in which a method that uses O(log N) operations to perform a task with N entities is more efficient than one that uses O(N) operations, and much more efficient than one that uses O(N²) operations, even though for N < K the ordering of the number of operations might be reversed for some small K?
Let's think about that last. Any time a variable in the environment changes value, there is some effect on some other variables in the environment. In the case of PCT, these are called side-effects when the changing variable is the magnitude of output to the environment. When N variables are controlled, a PCT simulation doesn't compute the influences of those side-effects pairwise. Some other approach (say Predictive Coding) may need to do so. PCT would then have a complexity of O(N), while the other approach would be O(N²). Even if the other approach used much less computation per variable than PCT (though I can't imagine one that would), PCT would still be the more efficient for large enough N.
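A toy operation count makes the scaling point concrete. The "pairwise" controller below is a stand-in for any scheme that must evaluate every pairwise interaction on every step; it is not meant to represent any particular published architecture.

```python
# Toy per-step operation counts for the scaling argument above.

def pct_ops_per_step(n):
    # One perceptual function, one comparator and one output function per
    # controlled variable: a constant amount of work per loop, so O(N).
    return 3 * n

def pairwise_ops_per_step(n):
    # Per-variable work plus one interaction term for every ordered pair
    # of variables: O(N^2).
    return 3 * n + n * (n - 1)

for n in (10, 100, 1000, 10000):
    print(f"N={n:6d}  independent loops: {pct_ops_per_step(n):9d}  "
          f"pairwise scheme: {pairwise_ops_per_step(n):12d}")
```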
All of that (and much more, if you want) is just a way of saying that unless the question is refined, there's no one answer to the question of whether PCT is computationally efficient. When you say "efficient compared to X in task Y", then it can be answered. I know of no publications that have addressed this issue. Mostly they deal with one variable and talk about the need or non-need to compute joint angles. In the case of PCT, those computations don't happen, so PCT is more efficient in that respect. And so on and so forth…

Lots of words, little help in your search.

Martin


[From Bruce Abbott (2017.10.16.1200 EDT)]

Rick Marken (2017.10.16.0815) –

[Ben hawker 2017.10.16 3:58]

I just remembered the claim is present on Wikipedia:

https://en.wikipedia.org/wiki/Perceptual_control_theory

“Engineering control theory is computationally demanding, but as the preceding section shows, PCT is not.”

RM: The reason PCT is considered computationally efficient compared to other approaches to control is that it does not involve computation of estimated future states of the environment. Most non-PCT approaches involve predictive control, which requires computation of inverse kinematics. This is quite computationally intensive, since it involves continuous inversion of what are often substantially sized matrices. It is not only computationally intensive, whether done digitally or in analog, but also infeasible and unnecessary. This is discussed by Powers in the first chapter of LCS III.

BA: The term "engineering control theory" to which the writer of the Wikipedia article refers is a bit too broad. What the writer probably meant is sometimes called "modern control theory", to distinguish it from the classical approaches, and includes techniques, such as Kalman filtering, that are computationally intensive and therefore unlikely to be carried out by biological nervous systems.
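To illustrate the difference in per-step cost, here is a rough sketch comparing one update of a textbook linear Kalman filter, which requires dense n-by-n matrix products and a matrix inversion, with one update of a bank of n independent PCT loops, which is only scalar work per loop. The matrices are random placeholders chosen to show the shape of the arithmetic, not a model of any particular plant.

```python
# Per-step cost comparison: textbook Kalman filter update (O(n^3) in the
# state dimension) versus a bank of n independent PCT loops (O(n)).
# All matrices here are placeholders; only the shape of the work matters.

import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    # Predict.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update: the innovation covariance S must be inverted.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

def pct_bank_step(p, r, o, gain, dt):
    # n independent loops: one subtraction, one multiplication and one
    # addition per loop -- O(n) scalar work in total.
    return o + gain * (r - p) * dt

n = 50
rng = np.random.default_rng(0)
F = np.eye(n); H = np.eye(n)
Q = 0.01 * np.eye(n); R = 0.1 * np.eye(n)
x = np.zeros(n); P = np.eye(n)
z = rng.normal(size=n)                    # a vector of n "measurements"

x, P = kalman_step(x, P, z, F, H, Q, R)   # dense matrix algebra
o = pct_bank_step(np.zeros(n), np.ones(n), np.zeros(n), 10.0, 0.01)
print(x.shape, o.shape)
```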

Bruce

[From Rupert Young (2017.10.16 18.40)]

I may be splitting hairs, but I'd say PCT is computationally lightweight rather than efficient. You could have two algorithms doing the same computation, multiplying matrices, for example, where one is more efficient than the other.

With PCT, it is lightweight because it is doing things in a different (simpler) way than conventional approaches. With ball catching, contrast Rick's control of optical velocities (http://www.mindreadings.com/ControlDemo/CatchXY.html) with Justin's predictive modelling approach (https://www.youtube.com/watch?v=R6pPwP3s7s4).
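To show what controlling an optical variable looks like in the simplest possible terms, here is a one-dimensional cartoon, with made-up parameters and only the lateral part of the CatchXY idea: the fielder keeps the ball's lateral optical angle near a reference of zero by moving sideways, and nothing computes a landing point, even when the ball's drift changes abruptly partway through the flight.

```python
# One-dimensional cartoon of catching by controlling an optical variable:
# the fielder keeps the ball's lateral optical angle near zero by moving
# sideways. No landing point is ever predicted. Parameters are made up.

import math

dt = 0.02
fielder_x = 0.0                  # fielder's lateral position (m)
ball_x, ball_vx = -8.0, 3.0      # ball drifting sideways (m, m/s)
depth = 20.0                     # viewing distance, held constant here (m)
gain = 60.0                      # loop gain (m/s of movement per radian)

for step in range(300):          # six seconds of flight
    ball_x += ball_vx * dt
    if step == 150:              # a gust: the drift changes abruptly
        ball_vx = -1.0

    # Perception: the ball's lateral optical angle as seen by the fielder.
    angle = math.atan2(ball_x - fielder_x, depth)

    # Output: move in the direction of the perceived angle, which drives
    # the angle back toward the reference of zero (negative feedback).
    fielder_x += gain * angle * dt

print(f"final optical angle: {math.degrees(angle):.1f} degrees")
```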

Regards,

Rupert


[From Rick Marken (2017.10.16.2130)]


RM: Nice point, Rupert. I'll just add that not only is the control-of-input approach to modeling catching more "lightweight", it is also the only one that can work reliably in a disturbance-prone environment. I think Justin the robot might have trouble catching objects that change direction in mid-flight, objects like Frisbees. You can see how the PCT model deals with this in my Frisbee-catching version of the catching simulation: http://www.mindreadings.com/ControlDemo/FrisbeeCatchXY.html

RM: Justin might be able to handle the rather mild disturbances to the trajectories in this simulation, but I'm pretty sure it would have a hard time with some of the very abruptly changing (and thus virtually unpredictable) trajectories that the model handles with ease in the toy helicopter interception study that I did the modeling for: https://www.dropbox.com/s/eymkj4bxuorpyuy/Chasin%27Choppers.pdf?dl=0

Best

Rick


[Ben Hawker 2017.10.17 10:25]

Hi all,

Thanks, everyone. It seems people have come to the same conclusion as me: what is meant is that PCT is computationally lightweight, rather than the more mysterious term "efficient". I'll have a read of the first LCS III chapter and see if I can find anything.

Best,

Ben
