[Hans Blom, 930621]
(Bill Powers (930620.1600 MDT))
The slogan "it's all perception" is much too static. It has a
connotation of having to act in an a priori circumscribed
way given a set of perceptions and a set of top-level (very
slowly changing; innate?) reference levels.
I agree. Perceptions are learned, too. There is a reason for the
slogan, however. It's to remind us that each of us sits inside
one of these gadgets, and that ALL we know either of the world or
of our own actions and inner being consists of perceptions.
Agreed. This is the important point that CSG makes and that I fully
agree with. In places, however, I disagree with some of the details of
your models (you will notice how I stress, each time, the 'optimal' use
of what is perceived), but that does not detract from the importance
of this basic point.
We can tune our responses finer and finer, and reach ever
higher qualities of response and perception.
I'm not sure how you mean this, but it sounds like one of the
concepts we're trying to destroy. "Response" is the conceptual
opposite of "control." It implies a blind reaction to an input,
and carries overtones of jab-and-jump psychology.
I am talking in terms of the response of a control system to, say, a
step change of a reference level or of a disturbance. As you know, the
response might be overdamped or oscillatory. By a high-quality
response I mean one that is approximately critically damped. Maybe you
show a great deal of self-control when your response is critically
damped.
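To make the distinction concrete, here is a minimal sketch (my own illustration, with all parameters made up for demonstration): a second-order loop stepped at two damping ratios, showing that the oscillatory (underdamped) response overshoots the target while the critically damped one settles on it directly.

```python
# Step response of a second-order system
#   x'' + 2*zeta*wn*x' + wn^2*x = wn^2*r
# integrated with a simple Euler scheme. All numbers are illustrative.

def step_response(zeta, wn=1.0, dt=0.001, t_end=20.0, r=1.0):
    x, v, peak = 0.0, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        a = wn * wn * (r - x) - 2.0 * zeta * wn * v
        v += a * dt
        x += v * dt
        peak = max(peak, x)
    return x, peak

final_u, peak_u = step_response(zeta=0.2)  # oscillatory (underdamped)
final_c, peak_c = step_response(zeta=1.0)  # approximately critically damped

# The underdamped loop overshoots; the critically damped one does not.
print(peak_u > 1.1, peak_c < 1.01)  # → True True
```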
Control over control is self-control, perceiving your own
perceptions is self-perception, consciousness.
That sounds nice, but I don't believe it. If you diagram a system
that senses the stability of a control system and adjusts
parameters to control stability, you do not have a system
controlling itself: you have a system controlling something about
a different system. If you perceive your own perceptions, one
subsystem is perceiving the perceptions originating in another
subsystem, and most likely interpreting them in a different way.
The moment you say "I am thinking," you have denied the
statement: the system that is aware of the thinking is not
thinking, it is making a statement about a system that is
thinking. The "I" of which you speak is never the "I" that
speaks.
I do not believe in a monolithic "I", and neither do your models. Just
look at that "society of mind". There is a lot of parallel processing
going on in that brain of ours, and it is quite possible that one part
talks about what another part is thinking. In a mentally healthy
person, I assume that the parts are more or less in mutual contact; in
multiple personality disorder they are not.
The only way to make sense of self-reflexive ideas is to treat a
person as if that person were solid, like a potato: only the
whole person perceives and acts.
Demonstrably false. In split-brain patients, one hand of the patient
may caress his wife while the other hand hits her. In my experience,
many people (all?) show such (psychological?) splits to some extent:
inconsistencies, due to 'conflict'.
Loop gain, in PCT, is not "internal to the device."
In a hifi amp, it is. That was my example.
I suspect that
you haven't yet understood just how the PCT diagram differs from
the standard engineering one.
Yes, I think I do. But sometimes I need to make a principle clear
using the simplest example that I can think of, and I think that this
kind of reductionism is the basis of science. If you cannot clarify a
principle using a simple example, you will certainly be unclear when
things get complex. Anyhow, thanks for the repeat. One comment on this
section:
This separation is always important in detailed control-system
design, but especially so when the effector is coupled to the
controlled quantity loosely or through complex intervening
processes. Then we clearly would expect the effector output to be
changing far more than the controlled quantity is changing.
That would be bad in any control system, especially if the load can
vary. An optimal control system needs to know as exactly as possible
what the effects of its OUTPUTS are. This can be done by SENSING THE
ACTIONS OF THE EFFECTORS (by sensors in or near the effectors) and
OBSERVING THE REACTION OF THE OUTSIDE WORLD to the actions of the
effectors (in any sensory modality). I therefore hypothesize that human
effectors must have built-in sensors. These should sense what the
effectors do, not through a (changing) environment as your model
shows, but as intimately and directly as possible.
Note that this directly solves the adaptation (learning) problem as
well: correlating the effector's action with the world's reaction
'identifies' ('systems identification') the world. This is how
adaptive control systems work, and this is how I suppose a human
works. This is also where control theory meets information theory.
Our effectors do in fact have built-in and/or built-on sensors, as you
know. In my opinion, they serve this additional function beyond the one
that you describe, the comparator function.
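As a sketch of that correlating step (my own toy example, not a physiological model): a controller that senses its own effector action u and the world's reaction y can identify an unknown plant gain online, here with scalar recursive least squares.

```python
import random

# Identify an unknown gain in  y = gain * u  by correlating sensed
# actions with sensed reactions (scalar recursive least squares).
# All numbers are illustrative assumptions.

random.seed(1)
true_gain = 3.7   # property of the world, unknown to the controller
est_gain = 1.0    # initial guess
P = 10.0          # uncertainty of the estimate

for _ in range(200):
    u = random.uniform(-1.0, 1.0)                # action, sensed at the effector
    y = true_gain * u + random.gauss(0.0, 0.05)  # reaction, sensed outside
    k = P * u / (1.0 + P * u * u)                # RLS gain
    est_gain += k * (y - est_gain * u)           # correct by prediction error
    P -= k * u * P                               # shrink the uncertainty

# A couple of hundred action/reaction pairs 'identify' the world.
print(abs(est_gain - true_gain) < 0.1)  # → True
```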
When we see the controlled variable separated from the effector
output, we can much more easily understand that the visible
behavior of an organism is really just its actuator output
Not when the actuator is loosely coupled to its environment (through
some layers of skin tissue, for example). An analogy is a battery with
a large internal resistance. Its 'visible' voltage greatly depends on
its load.
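In numbers (mine, purely for illustration), the battery analogy works out like this:

```python
# Terminal voltage of a battery with internal resistance r_int driving
# a load r_load: the load only sees its share of the EMF (a voltage
# divider). Values are illustrative.

def terminal_voltage(emf, r_int, r_load):
    return emf * r_load / (r_int + r_load)

print(round(terminal_voltage(9.0, 5.0, 100.0), 2))  # light load → 8.57
print(round(terminal_voltage(9.0, 5.0, 5.0), 2))    # heavy load → 4.5
```

With a large internal resistance, the 'visible' voltage is a poor guide to the EMF behind it.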
(Martin Taylor 930621 11:10)
It seems to me that just as a power gain is an essential element of
the outflow side of a control system, so a corresponding power loss
is an essential element of the inflow (perceptual) side. The
perceiving of the state of a CEV should not contribute as a
disturbance any more than it must (Heisenberg showed that it must, to
some extent).
How true. Very familiar, too. In my blood pressure controller, the
arterial pressure must be measured. Imagine what a thick needle in a
thin blood vessel can do to disturb the circulatory system and thence
the pressure measured!
On the other hand, the output power of the control
system wants to have maximum effect on the CEV, or as tight coupling
and as high power gain as is feasible, given the information
limitations on the perceptual side.
Feel that micromanipulator carefully tear a single cell away from its
surrounding tissue as your hand squeezes on the macroscopic counter-
part of the micro-pincer. Power GAIN? No! Very carefully scale the
power down! Almost Virtual Reality...
(Rick Marken (930621.1000))
could you just tell us -- does the sensory input to a control
system contain information regarding how "plant" outputs should
vary in order to control the sensed variable?
Hans Blom (930620) replies --
Sorry, but I decline to be arbiter. As a teacher (my other role) I
have to say that having this discussion is far more fruitful than
knowing who is right.
But just out of curiosity, who is right? I'm sure we'll go on
discussing it anyway, even after we find out. Or is your point that
neither side is right? Or that both are right?
My experience says that servomechanism theory and information theory
are orthogonal. Neither has any use for the other. In the field that
is called 'optimal control theory', however, where you deal with noisy
sensors, noisy actuators and a noisy and/or a changing environment, it
becomes critically important to try to identify the characteristics of
the environment's reactions to your actions as accurately as possible.
Optimal control theory is very much an extension of the combination of
servomechanism theory and information theory. And therefore not very
attractive for people who refuse to think in terms of stochastic
differential equations and their ilk, I might say.
It shows, once again, that PCT is trying to make just a simple point:
that behavior is the process of controlling INPUT perceptual
variables. That is a simple point, but it is basic.
Yes, but not only. See above.
(Tom Bourbon (930621.1301))
Identifying our flockness turned it into a controlled variable [CSG-
L], for each of us. From something that had never been a controlled
variable, we created one.
Nicely put! Yet, no single individual can 'control for' CSG-L. But
where two or three are gathered in the name of CSG-L...
(Tom Bourbon (930621.1323))
Take one example of
control, as it is recreated or predicted by PCT models, and show me,
in the results of simulations, how making more "information"
available "within the control system" improves the recreations and
predictions from the model.
Let me give you a practical example from my blood pressure controller.
The arterial blood pressure decrease delta_p due to an infusion flow
rate i can be modelled as
delta_p = sensitivity * i
where sensitivity is a constant that describes an individual's
sensitivity to the drug used. It can vary by a factor of 80. That is
just the static response, which is obtained once a stable pressure is
established. The dynamics of the process can be modelled by a delay
time of about one minute and a dominant time constant of about one
minute as well, in case you're interested. I will leave out other
gruesome details, such as an often (but not always!) very pronounced
non-linearity of the response and frequent sensor malfunction due to
drawing blood through the same line that the pressure is measured
with.
In this controller it was crucial to estimate the individual's
sensitivity (not necessarily very accurately; within a factor of two
was good enough) to obtain a step response that did not oscillate yet
was fast enough to satisfy the anesthesiologist. Note that the
sensitivity was not known and could not be established beforehand; it
had to be established while controlling. I dare you to design a
servomechanism that can control under these circumstances. I tried but
failed miserably.
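A toy version of what 'establishing the sensitivity while controlling' can look like (my own simplified sketch with made-up numbers, not the actual clinical controller): the plant follows delta_p = sensitivity * i through a one-minute delay and a one-minute lag, and the controller identifies the unknown sensitivity from its own infusion history while it controls.

```python
# Toy adaptive controller: plant  delta_p = sensitivity * i  with a
# one-minute delay and a one-minute dominant time constant; the
# sensitivity is unknown and estimated while controlling.
# All numbers are illustrative assumptions.

DT = 0.1                  # step size, minutes
DELAY = int(1.0 / DT)     # one-minute transport delay, in steps
TAU = 1.0                 # dominant time constant, minutes

true_sens = 8.0           # the patient's actual sensitivity (unknown)
est_sens = 1.0            # deliberately poor initial guess
target = 20.0             # desired pressure decrease

pipeline = [0.0] * DELAY  # infusion still 'in transit' (the delay line)
drop = 0.0                # simulated pressure decrease
phi = 0.0                 # delayed infusion passed through the same lag

for _ in range(600):      # 60 simulated minutes
    i = target / est_sens             # control with the current estimate
    pipeline.append(i)
    i_del = pipeline.pop(0)
    drop += DT / TAU * (true_sens * i_del - drop)  # plant response
    phi += DT / TAU * (i_del - phi)                # matched regressor
    # drop equals true_sens * phi, so the prediction error isolates
    # the sensitivity (systems identification while controlling)
    err = drop - est_sens * phi
    est_sens += 0.1 * err * phi / (1.0 + phi * phi)

print(round(est_sens, 2))  # → 8.0
print(round(drop, 1))      # → 20.0
```

The estimate starts a factor of eight off, yet control and identification proceed together; the real controller had to cope with much worse (nonlinearity, sensor dropouts), which this sketch omits.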
Then show me that those results generalize, with no further tinkering
with the model, to new conditions, with unpredictably different
disturbances and targets. That is not much to ask. Just improve on
the performance of a single-level, single-loop PCT model.
We did. With slight modifications (different choices for sensitivity
range, delay time, time constant), we got the system going with
another drug. That too controlled blood pressure. Now we are working
on a very similar 'robust' control system for muscle relaxation, with
excellent prospects.
(Tom Bourbon (930621.1348))
In a reply, Bill addressed your idea that loop gain is internal to
the device. (It is not.)
In a hifi power amp, it is. See above.
But analogue is what living
systems are all about. Neural currents and hormonal fluxes are
analogue, through and through.
No, they are not. They are MODELLED as such. Neural currents come in
units called action potentials, and hormones come in units called
molecules. Using 'fluxes' in a MODEL is fine, as long as you do not
forget that a model is necessarily a simplification of reality.
Greetings,
Hans