Influence; Levels; Welcome James Nord

[From Bill Powers (960622.1045 MDT)]

Chris Cherpas (960621.0946 PT) --

     While the 11 levels are often intuitively appealing, I keep looking
     for variables that would constrain what the next level to develop
     in a developing organism would be (or better, what it _must_ be if
     that's possible).

You might try just looking at your own experiences, which is what I did.
All perceptions that I would classify as configurations, for example,
seem to require the presence of, and to be composed of, sensations which
are not themselves configurations. And every sensation seems to be
composed of a collection of different intensities. Every transition is a
transition from one set of configurations, intensities, or sensations to
another set of them. To this you can add that in order to control a
higher perception, you must be able to affect and change the lower
perceptions of which it is made. So all that remains is to examine every
perception of every possible kind, and trace its lineage to other
perceptions which, presumably, are of lower order. The old Gestalt
psychologists did some of this sort of subjective analysis, although
they didn't include the idea of varying one type of perception as the
means of controlling a higher type.

This sort of analysis provides basic observational data which any
mathematical model of perception has to fit. The problem with many
models of perception is that they begin with the mathematics, which then
generates its own kind of structure without regard to the structure that
actually exists. There's a strong tendency among mathematical modelers
to look for a single unit of perceptual organization which can then just
be repeated for any number of levels. This sort of model has a certain
esthetic appeal, and of course it is much easier to produce than a model
that has to explain something different at every level. Basically you
solve one problem, and then just say "etc". However, I doubt very much
that the same operations are involved at every level of real perception,
or that nature simply says "etc."

     While not sufficient to explain the development of a higher level,
     controlling perceptions involving bigger bandwidth seems to be the
     (at least typical) result.

Martin Taylor's reply to this idea (960621 15:30), while still
contaminated :=) by information theory, is a good one. The actual
bandwidth of higher-level systems is less than that of lower systems --
that is, the speed with which changes in the relevant variables can be
perceived or opposed is less than at lower levels. This is largely
because higher-level perceptions, by their very nature, can't exist over
short periods of time (consider "sequence"). However, because higher
systems in general use many lower systems in parallel, the mechanics of
their behavior may seem to involve higher bandwidths -- quicker
reactions to disturbances.

     By positing the "system concept" level as the highest level, PCT
     seems to be saying that the (cyclic or other) interactions of
     functional structures represent the limit to what kinds of
     perceptions we can control. But what about the process through
     which these functional structures change (e.g., evolution,
     learning, developing new levels in the hierarchy)?

System concepts are only the highest level of perception I have been
able to catch myself using (and infer that others use). The fact that I
can notice them implies that there must be at least one higher level,
but since I have no place to stand from which to see it, I can't
identify it. Perhaps someone else with more levels could.

The process of change in an individual organization is subsumed under
the concept of reorganization (which see in B:CP). I don't conceive of
this as the "highest" system, because it must be fully functioning from
birth or before in order to account for the growth of the hierarchy of
control. And I don't see it as intelligent, because it must work before
the brain has any understanding of the external world or itself at any
level. I see it as a separate system, as it were off to one side of the
growing hierarchy. It is concerned only with controlling intrinsic
variables related to life support and perhaps other genetically
transmissible considerations.

···

-----------------------------------------------------------------------
Martin Taylor 960621 14:30 --

RE: influence.

     The "disturbing influence" in this case is then dist(t) = integral
     (disturbance torques) dt,

So you see the velocity as containing two distinct components, one due
to the disturbing variables and one to the output variable. This is more
or less what I thought.

This decomposition of the velocity, however, requires perceiving at
least the output force as well as the controlled velocity, and it also
requires knowing that the velocity changes in proportion to the output
force (in a linear system). These perceptual capabilities and this
knowledge of the nature of the feedback function are not part of the
basic ECU in PCT, although in Hans Blom's model they do exist even if
they're not explicitly spelled out (they're built into the simulation
program).

So even though an external analyst who can perceive these things can
compute the missing component of the velocity, the control system, the
simple ECU, cannot. This means that the ECU itself can't distinguish the
component of the velocity _perception_ that is due to the external
disturbing variables from the component due to the output variable.

When we describe the basic principle of operation of the basic ECU, or
build one, therefore, we can't make its operation depend on being able
to separate the components of the perceptual signal. The ECU works only
with the ENTIRE perceptual signal, the unseparated sum of its conceptual
components. It subtracts the ENTIRE perceptual signal from the reference
signal; it does not subtract just the component due to independent
disturbing variables. The effects of its own ongoing actions are just as
much part of the perceptual signal as the effects of independent
disturbing variables are: those effects get subtracted, too.

In speaking of how the ECU works, we often use a simplification that
makes the description shorter. We say that the output of the system
mirrors the disturbing variables. However, this is not strictly true.
The feedback function and the disturbance function may be quite
different from each other, and nonlinear to boot. But even more
important, this mirroring must contain an offset sufficient to bring the
perception to the same level as the reference signal. In other words,
the sum of output effects and disturbance effects is not zero, but is a
magnitude of the perceptual signal that matches the (generally nonzero)
magnitude of the reference signal, as translated through the input
function. Since the reference signal is given from above independently
of the operation of the control system, we can't say that the output and
disturbance effects sum to any particular net effect. They sum to
whatever net effect is currently specified by the reference signal (with
suitable approximations noted).
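A minimal numerical sketch of the loop just described may help. Everything here is an illustrative assumption (the gain, time step, reference level, and disturbance waveform are invented); the point is only that the comparator subtracts the ENTIRE perceptual signal from the reference, yet the output still ends up mirroring the disturbance, offset so that the perception matches the nonzero reference.

```python
# Illustrative single-loop ECU; all numbers are assumed, not from the text.
import math

def run_ecu(reference, disturbance, gain=50.0, dt=0.01, steps=2000):
    """Integrating ECU: the comparator sees only the whole perceptual
    signal p = o + d; no signal for d alone exists inside the loop."""
    o = 0.0
    history = []
    for k in range(steps):
        t = k * dt
        d = disturbance(t)
        p = o + d                  # entire perceptual signal, components fused
        e = reference - p          # the ENTIRE p is subtracted from r
        o += gain * e * dt         # integrating output function
        history.append((t, p, o, d))
    return history

hist = run_ecu(reference=3.0, disturbance=lambda t: math.sin(t))
t, p, o, d = hist[-1]
# Once settled, p stays near r = 3.0 while o stays near r - d: the output
# mirrors the disturbance plus the offset needed for a nonzero reference.
print(f"p = {p:.3f}, o = {o:.3f}, r - d = {3.0 - d:.3f}")
```

With the reference set to zero the output becomes a pure mirror image of the disturbance; any other reference setting adds the offset described above.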

So it turns out that to deduce the component of the velocity that is due
to external disturbing forces, we must know not only the form of the
feedback function and the magnitude of the output variable, but the
current setting of the reference signal as well. Obviously, the simple
ECU has none of these pieces of knowledge. And since IT can't use these
pieces of knowledge in generating its control behavior, we can't use the
same pieces of knowledge in explaining how it works. We can't say that
the fact that there is a component of the perceptual signal that is due
to an external disturbance has any role in the process of control.

This is not to say that given knowledge of the various functions and
signals, an omniscient external observer could not deduce at least the
sum of the influences (in your sense) caused by (otherwise unknown)
external disturbances. However, this does not mean that the control
system does the same thing.
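As a sketch of that deduction (illustrative values throughout; the identity feedback function f(o) = o is an assumption made for simplicity), the following shows an external observer who watches both the controlled variable and the output, and who knows the feedback function, recovering the disturbing influence: a quantity that exists nowhere inside the ECU as a separate signal.

```python
# Illustrative ECU plus an external observer's reconstruction of d(t).
import math

def simulate(reference=3.0, gain=50.0, dt=0.01, steps=2000):
    """Simple integrating ECU; f(o) = o is an assumed identity
    feedback function, chosen only for illustration."""
    o = 0.0
    q_log, o_log, d_log = [], [], []
    for k in range(steps):
        d = math.sin(k * dt)       # disturbing influence (no separate
                                   # signal for it exists inside the loop)
        q = o + d                  # controlled variable: f(o) + d
        q_log.append(q)
        o_log.append(o)            # the output value that produced this q
        d_log.append(d)
        o += gain * (reference - q) * dt
    return q_log, o_log, d_log

q_log, o_log, d_log = simulate()

# The analyst's deduction, using knowledge the ECU does not have
# (the form of the feedback function) plus observation of q and o:
d_hat = [q - o for q, o in zip(q_log, o_log)]
err = max(abs(dh - d) for dh, d in zip(d_hat, d_log))
```

The reconstruction is exact here, but only because the analyst reads q, o, and f directly; the ECU itself has access to none of these as separate quantities.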

One of the beauties of the basic control process is that it can work
WITHOUT this kind of elaborate analysis and decomposition of the input.
I don't doubt that the external analyst could design a control system
that does use the knowledge that the analyst has, computing the
disturbing influence, calculating the output needed to oppose that
influence, and producing the output required to make the input match a
reference value. That is essentially how Hans Blom's version of a
control system works. But this is not the elementary control unit of
PCT: it is a MUCH more elaborate kind of design which, to be built,
would require many sensors of a kind that the simple ECU does not need,
and which is capable of extremely complex mathematical operations, such
as taking the inverse of a transfer function, which are not necessary in
the PCT model. Perhaps such elaborate designs do exist somewhere in the
brain, but they are completely unnecessary for most control processes.

I understand what you're saying: that the analyst can compute the
component of the perceptual signal that represents an influence of some
external independent disturbing variables. If some way were known of
computing the information flows in a closed-loop system (and as of
today, I have seen no such computation), it would probably be possible
to state how much of the technically-defined information in the
perceptual signal, in bits or bits per second, is due to the external
disturbances at least in the aggregate. But this is only a computational
exercise if the system itself is not separating that information out and
using it as the basis for constructing its output. The analyst has a
receiver for that kind of information, but if the control system does
not, then that information is significant only to the analyst.
-----------------------------------------------------------------------
James R. Nord (960622) --

Welcome aboard, James. I don't recall your letter of 23 years ago -- I
hope you'll forgive me. But I am very glad to know of your many parallel
experiences and ideas, and to welcome you to this discussion. It seems
that you have more of a running start than most people coming to PCT
have.

I have met all the cybernetics people you mention except Wiener, Ashby,
and Grey Walter. In fact, throughout the development of PCT both before and
after my book, I tried many times to get the model across, attending
meetings of the ASC and often receiving what seemed to be strong
support. But in fact, the CSG was formed at an annual ASC meeting in
Philadelphia -- largely in response to the total lack of interest shown
in our presentations. PCT was seen both as competition and as an
implication that there was something the gods of cybernetics might not
have understood -- neither of which was welcome.

The basic problem was always that cyberneticists operated several levels
of philosophical abstraction above the level where PCT exists. Very few
of them had anything but a layman's notion of what a control system is
and how it works. The general attitude I encountered was "Oh, we know
all about that servomechanism stuff; nobody does that any more." But the
truth was that hardly anybody in cybernetics after (perhaps) Wiener ever
DID do that stuff. If there were ever any technical people in the ASC,
they quickly dropped out (or submerged themselves) and the dilettantes
and nuts took over. Have you ever tried to conduct a poster session
with a computer demo of control theory while a troop of people
tap-danced in the middle of the hall? I have. By the time I started
seriously trying to introduce PCT to cybernetics, I found that I spent
most of my time defending my "behaviorist" position and trying to
convince people that "hierarchy" and "control" were not propagandistic
terms from a Fascist agenda. At one Gordon Conference, I had to listen
to Heinz von Foerster sneering at hierarchical control theory by
chanting "the PUUUURpose of the PUUUURpose of the PUUUURpose." I have
not had uniformly good experiences with the ASC and its luminaries. It's
a shame, really, because in the background on both sides there seemed to
be a willingness to reach some common ground. But too much happened to
undermine the process.

I'll let others try to answer your questions. You will find some
interest in them. And I reciprocate your appreciation -- with your
background you'll slip right into our group without a ripple. Of course
if you want to make some ripples, that's fine, too.
-----------------------------------------------------------------------
Best to all,

Bill P.

[Martin Taylor 960624 16:00]

Bill Powers (960622.1045 MDT)

I'm going to quote initially no more than is necessary for you to identify
the message to which I am responding:

     Martin Taylor 960621 14:30 --

     RE: influence.

          The "disturbing influence" in this case is then dist(t) = integral
          (disturbance torques) dt,

     So you see the velocity as containing two distinct components, one due
     to the disturbing variables and one to the output variable. This is more
     or less what I thought.

     This decomposition of the velocity, however, requires perceiving at
     least the output force as well as the controlled velocity, and it also
     requires knowing that the velocity changes in proportion to the output
     force (in a linear system). These perceptual capabilities and this
     knowledge of the nature of the feedback function are not part of the
     basic ECU in PCT,...

It may surprise you to know that I am with you 100%. Which doesn't make the
notion that the disturbance influences the CEV by a quantifiable amount any
less valid. And I agree completely with Rick's posting (except
insofar as he imputes a different understanding to me) in:

+Rick Marken (960622.1040)

+The altitude control system (in you) is just one of many control systems
+controlling many variables -- some simple (like altitude, force, etc) and
+some complex (like the state of relationships, such as "covarying",
+"independent", "informative", between other variables). The variable
+that a control system controls does not inform that system about anything;
+it is just _controlled_ by that system.

I think I'm becoming clearer about the way in which we have for so long
been talking past one another. The misapprehension about input-output
relationships is only part of it. It never occurred to me that I might
have put that one to rest by pointing out that the disturbance waveform
carries information about the perceptual signal, just as much as the
reverse. And it is quite obvious that the control system never partitions
the fluctuations in the CEV into components due to its output and to the
disturbing variable. But, just because it was so obvious to me, I could
not see that you guys persisted in thinking that the statement "the
perceptual signal carries information about the disturbing influence"
could (and perhaps should) be translated into: "the control system
partitions the fluctuations of the perceptual signal into components
due to the disturbing variable, which it opposes, and components due to
its own output, which it ignores."

I know that you have essentially said that I believed this translation
to be appropriate, many times. And many times I have said that it isn't.
I have metaphorically _howled_ that the whole business only works because
of control--because failures of control can be perceived and corrected;
because the output "mirroring" the disturbing influence is a side-effect
of control. You two never believe I mean it. It doesn't sink in that there
_has to be_ a way in which what I say can be reconciled with what I mean.

Likewise, I have never truly believed you thought I was so stupid as to
think that the single-valued perceptual signal could carry two values at
once, even though you have actually said so, many times.

But the import of that has never before sunk in. It is putting together
the two different misapprehensions that has led me to think that _maybe_,
just maybe, there is a reconciliation in sight.

To try to pave the way to a reconciliation, can I try a few propositions
out on you?

1: The control system has no need to try in any way to partition the
causes of the fluctuations in the CEV.

2: The perceptual signal carries information about the output signal
(and vice-versa).

3: No physical control system can control against disturbances that
fluctuate erratically faster than some upper limiting rate.

I guess those will do for now. I anticipate agreement with 1 and 3, and
disagreement or puzzlement over 2. If you do disagree with 2, it would be
healthy if we can probe the underlying assumptions that lead to the
disagreement. If you agree with 2, we can proceed from there.

···

---------------
In relation to 3, consider Bruce Gregory's problem with controlling the
altitude of his plane.

     In fact, the plane did a better job of flying itself than I did in
     "helping" it.

+Yes. You have a control dynamics problem (your altitude control gain is
+too high, probably). Very common when learning to control.

His efforts to control were leading to a positive feedback condition
because the lags involved in the loop were too long. He may have taken
too long to perceive that the plane was descending instead of
ascending, or too long to produce the appropriate change in his own
output. I suspect the former, because it leads to a self-limiting form
of positive feedback, the perception becoming quicker when the changes
are more dramatic. (His description points to positive feedback, the
altitude excursions being smaller when he took his hands off the controls,
and it was obviously self-limiting because he's still alive).
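A minimal numeric sketch of this lag effect (all numbers are invented for illustration, not Bruce Gregory's actual flight dynamics): the same integrating controller that is stable with prompt perception breaks into a growing oscillation when perception of the controlled variable is delayed too long.

```python
# Illustrative sketch of control failing under loop lag; the gain, time
# step, and delay values are assumptions, not from the original discussion.
from collections import deque

def peak_error(delay_steps, gain=5.0, dt=0.01, steps=3000, reference=1.0):
    """Integrating controller whose perception of the controlled
    variable q lags behind it by delay_steps samples."""
    o = 0.0
    buf = deque([0.0] * (delay_steps + 1), maxlen=delay_steps + 1)
    worst = 0.0
    for k in range(steps):
        q = o                      # controlled variable (no disturbance here)
        buf.append(q)              # newest sample in, oldest falls out
        p = buf[0]                 # perception delayed by delay_steps samples
        o += gain * (reference - p) * dt
        if k > steps // 2:         # measure error after the initial transient
            worst = max(worst, abs(reference - q))
    return worst

print(peak_error(0))    # prompt perception: the error has long since decayed
print(peak_error(40))   # 0.4 s lag at gain 5: runaway oscillation
```

For a pure integrating loop, instability sets in roughly when gain times delay exceeds pi/2, which is one way of reading the over-correcting "help" described above.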

+"Informativeness" is a _perception_ that, as you showed in your example,
+can itself be controlled.

Yes, indeed. But that's another topic for discussion, which probably shouldn't
get mixed up with "the information carried by the perceptual signal"
discussion.

Back to Bill Powers:

     When we describe the basic principle of operation of the basic ECU, or
     build one, therefore, we can't make its operation depend on being able
     to separate the components of the perceptual signal.

This is an example of the kind of thing I took for granted was an agreed
position, because there's no way a single scalar perceptual signal could
carry the split components separately without the excess baggage of a
multiplexer-demultiplexer system that is not in the ECU diagram. And I guess
I didn't pay comments like this the attention I should have, other
than to respond: I KNOW that!

     So it turns out that to deduce the component of the velocity that is due
     to external disturbing forces, we must know not only the form of the
     feedback function and the magnitude of the output variable, but the
     current setting of the reference signal as well.

As was the case in our "Magical Mystery Function" that was able to
reconstruct the disturbing influence to the degree that control was
sustained.

     Obviously, the simple ECU has none of these pieces of knowledge.

Agreed, but as I said before, anyone simulating the action of the ECU, and
that means any modeller, must use those pieces of knowledge. It's exactly
the same as saying that you can't use information about the mass, radius,
and modulus of elasticity of a billiard ball in computing how it will
bounce, because the billiard ball itself knows none of its properties.

So, if you demand that I simulate the action of a billiard ball without
using information about its mass, radius, and modulus of elasticity, I
won't be able to do it. That doesn't mean these things are unimportant to
the way the ball behaves--that it does not use "information" about them
in its actions. Let me repeat: Physical things do what they do. Analysts'
perceptions are where the properties are "known."

     One of the beauties of the basic control process is that it can work
     WITHOUT this kind of elaborate analysis and decomposition of the input.

I think that this notion of component segregation as an element of
"information about" is a real barrier to achieving a common understanding.
Let's get rid of it once and for all.

     If some way were known of
     computing the information flows in a closed-loop system (and as of
     today, I have seen no such computation), it would probably be possible
     to state how much of the technically-defined information in the
     perceptual signal, in bits or bits per second, is due to the external
     disturbances at least in the aggregate. But this is only a computational
     exercise if the system itself is not separating that information out and
     using it as the basis for constructing its output. The analyst has a
     receiver for that kind of information, but if the control system does
     not, then that information is significant only to the analyst.

Or the designer, or the person interested in getting a different kind
of insight into the behaviour of a hierarchy in which the nature of
control is (correctly:-) seen as a reduction of the information available
internally from the disturbing influences.

Incidentally, "information flows" is a metaphor with a lot of baggage.
It suggests a one-way, time-directed traffic. But "after" can convey
information about "before," and "before" can equally convey information
about "after." There's no "flow." In a causal system, it may prove
convenient to say that X influences Y and not the reverse, and to
treat that causal influence as an information flow, but it's not necessary
to do so.

In an informational analysis, if A conveys information about B, it is
probable that B also conveys information about A. That doesn't mean either
A or B can be partitioned into components AsubB and BsubA that uniquely
identify one another, and components Asub(notB) and Bsub(notA) that are
independent of each other.
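The symmetry claimed here is easy to check numerically. The joint distribution below is an invented example over two binary variables; mutual information computed in either direction comes out identical, and nothing in the computation partitions either variable into components.

```python
# Tiny numeric illustration that mutual information is symmetric:
# I(A;B) == I(B;A). The joint probability table is an invented example.
import math

joint = {                          # P(a, b) over two binary variables
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.2, (1, 1): 0.3,
}

def mutual_information(table, swap=False):
    """Sum of p(a,b) * log2(p(a,b) / (p(a) p(b))); swap exchanges roles."""
    pairs = {((b, a) if swap else (a, b)): p for (a, b), p in table.items()}
    pa, pb = {}, {}
    for (a, b), p in pairs.items():
        pa[a] = pa.get(a, 0.0) + p
        pb[b] = pb.get(b, 0.0) + p
    return sum(p * math.log2(p / (pa[a] * pb[b]))
               for (a, b), p in pairs.items() if p > 0)

i_ab = mutual_information(joint)              # info A carries about B
i_ba = mutual_information(joint, swap=True)   # info B carries about A
print(i_ab, i_ba)
```

The two values are equal for any joint distribution; as the text says, that symmetry implies nothing about splitting A or B into mutually dependent and independent parts.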

Let me ask you a question: Do you think that the ECU "knows" its gain function
or its loop delay? Would it work differently if those had different values
when checked by an outside analyst? Is it significant only to the analyst
that the values are compatible with the loop being stable, or do you
consider that consequence to be significant for the ECU?

My own answer is that it is how the ECU behaves _as we perceive it to behave_.
And whether that is significant is a matter for the person doing the
perceiving.

Martin