Ashby's Law of Requisite Variety

[From Chad Green (2013.1.3.1504 EST)]

EJ: Thus, the unknown (to the control loop) & unpredicted variation from the
reference signal accumulates in the perceptual signal. What is left in the
whirlpool, so to speak, is the remaining uncertainty deriving from the
disturbance function, & this accumulates in inverse form in the net output
(& by this I mean, the output as amplified or modified by the EFF.) So
there is a secondary reduction in uncertainty, which means a gain in
information, at the point of that net output, as it more & more precisely
counteracts the net effect of any disturbances.

CG: In this context, could we replace "perceptual signal" with "value signal" in accordance with Edelman's theory of neuronal group selection, and while we're at it, rename this whirlpool to "value system"?

Best,
Chad

Chad Green, PMP
Program Analyst
Loudoun County Public Schools
21000 Education Court
Ashburn, VA 20148
Voice: 571-252-1486
Fax: 571-252-1633

"If you want sense, you'll have to make it yourself." - Norton Juster

[From Kent McClelland (2013.01.03.1440 CST)]

Hi Chad,

Could you give a little background on Edelman's theory of neuronal group selection for those of us who are unfamiliar with it, and also explain how you see it connecting with PCT? Thanks.

Kent


[From Bill Powers (2013.01.03.1345 MST)]

EJ: So the question to ask is what uncertainty gets reduced by means of a negative feedback control loop. Here I think the PCT answer is very clear: the value of the perception becomes more & more certain, as it is made to track the value of the reference. To take some liberties with Shannon's communication language, the perceptual signal 'gets the message' about what value the reference signal intended it to take.

BP: I am getting continually less certain about what "uncertain" means. If a perception is uncertain, doesn't that mean that its correspondence to some external situation is fluctuating unpredictably? I don't see how acting to make the perceptual signal match an internal reference has any effect on uncertainty at all. If the perceptual signal itself is a partly-unpredictable representation of some controlled variable, yet it's being kept in a match with the reference signal, wouldn't that imply that the system's action is also varying, and in just the way needed to maintain a nearly steady value of the perceptual signal?

What if the unpredictable variations are just a consequence of unpredictable disturbances, faithfully and reliably reflected in perception? The actions of the system keep the disturbance from affecting the controlled variable and the perception of it, so the perception itself remains predictably in a match to the reference state, yet continual action is needed to prevent disturbances from affecting it very much.
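Bill's point that a steady perception can coexist with continually varying action is easy to see in a minimal simulation. This is only an illustrative sketch; the variable names, gain, and disturbance model below are my own choices, not from any published PCT model.

```python
import random

# Minimal negative-feedback loop: the controlled quantity q is the sum of
# the system's output o and a wandering disturbance d; the perception p
# faithfully reports q; the output integrates the error r - p.
random.seed(1)
r = 10.0             # reference signal
o = 0.0              # output quantity
d = 0.0              # disturbance
gain, dt = 50.0, 0.01
history = []
for step in range(2000):
    d += random.gauss(0, 0.1)   # slowly wandering disturbance
    q = o + d                   # controlled environmental variable
    p = q                       # perception faithfully reflects q
    e = r - p                   # error
    o += gain * e * dt          # integrating output function
    history.append((p, o, d))

# Late in the run, p hovers near r, while o varies continually so that
# o + d (the controlled variable) stays near the reference.
p_final, o_final, d_final = history[-1]
print(round(p_final, 1), round(o_final + d_final, 1))
```

The perception stays nearly constant at the reference even though the output never stops changing; the output is, in effect, a mirror image of the disturbance.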

It seems to me that a perceptual signal is just perceived; it's not evaluated as to its relation to reality. That's assuming that this relation is what is uncertain. If you see a pink elephant in a corner of the living room, then that's what you see -- what's uncertain about that? It seems to me that the relation of a perception to the actual environment is something that an external observer has to evaluate, if possible.

Another possible meaning of uncertain has to do with predicting future values of the perceptual signal, or future relationships of one perceptual signal to other perceptual signals, or any other kind of higher-order future aspect of the role played by the perceptual signal in the context of all others at all levels. But if that is the meaning, then how does the behaving system know that the lack of predictability is due to the way the represented variable is changing, rather than due to fluctuations in the functions that are sensing or computing the perceptual signals? Is it the process of perception that has an unpredictable component, or the thing that supposedly is being perceived? If some external variable is changing randomly, and the perceptual signal representing its state is changing in exact correspondence to those external changes, is there anything uncertain about the perceptual signal?

Best,

Bill P.

···

At 12:56 AM 12/21/2012 -0500, Erling Jorgensen wrote:

[Martin Taylor 2013.01.03.17.05]

I'm on the fifth rewrite of my attempt at a tutorial on information and control, this time concentrating only on "uncertainty" and "information", which I am separating from a subsequent discussion of control, but I can break off from it to answer this one question.

In information theory analysis, "uncertain" has a precise meaning, which overlaps the everyday meaning in much the same way as the meaning of "perception" in PCT overlaps the everyday meaning of that word.

If there is a system X, which could be a variable, a collection, or the like, that can take on distinguishable values x_i, with probabilities p(x_i), the uncertainty of X is defined by two formulae, one for discrete collections or variables

U(X) = - Sum_over_i (p(x_i) log(p(x_i)))

the other for the case in which "i" is a continuous variable. If "i" is continuous, p(x_i) becomes a probability density and the sum becomes an integral

U(X) = - Integral_over_i (p(x_i) log(p(x_i)) di).

That's it. No more, no less. Those two equations define "uncertainty" in the discrete and the continuous case.

The assignment of probability density in the case of a continuous variable depends on the choice of unit for "i", and so does the measure of uncertainty. There are a few other technical issues with the uncertainty of continuous variables, but calculations using uncertainty work out pretty much the same whether the variable is discrete or continuous.

Any philosophical considerations apply to the assignment of probability and probability density values, and how you arrive at the probability values. But there is no uncertainty about what "uncertainty" means.

I think Erling's metaphoric paragraph captures pretty well an intuitive notion of one aspect of what goes on in control.

Now I'll get back to my series of attempts to make all this follow intuitively with as little mathematics as I can get away with, so that when I use it to analyze control it will all seem quite normal. As a problem, this is roughly equivalent to doing the same for Fourier analysis, so bear with me if I find I need a sixth rewrite before posting it. Going through this exercise of making something intuitive to other people that has been pretty intuitive to me for nearly 60 years helps my own understanding.

Martin
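Martin's two defining formulae can be checked numerically. A minimal sketch (the function names are mine, not Martin's; logs are taken to base 2, so U is in bits):

```python
import math

def discrete_uncertainty(probs):
    """U(X) = -Sum_over_i p(x_i) * log2(p(x_i)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Fair coin: two equally likely values -> 1 bit.
print(discrete_uncertainty([0.5, 0.5]))

# Continuous case, approximated numerically for a standard Gaussian density.
# U(X) = -Integral p(x) log2(p(x)) dx; the analytic value is 0.5*log2(2*pi*e).
def gaussian_density(x):
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

dx = 0.001
xs = [i * dx for i in range(-8000, 8001)]
u_cont = -sum(gaussian_density(x) * math.log2(gaussian_density(x)) * dx
              for x in xs)
print(round(u_cont, 3))  # ~2.047, matching 0.5*log2(2*pi*e)
```

The continuous result depends on the unit chosen for x, just as Martin notes: rescaling x shifts the differential entropy by the log of the scale factor.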


[From Bill Powers (2013.01.03.1740 MST)]

Martin Taylor 2013.01.03.17.05 –

BP: I am getting continually
less certain about what “uncertain” means.

MT:I’m on the fifth rewrite of my attempt at a tutorial on information
and control, this time concentrating only on “uncertainty” and
“information”, which I am separating from a subsequent
discussion of control, but I can break off from it to answer this one
question.
In information theory analysis, “uncertain” has a precise
meaning, which overlaps the everyday meaning in much the same way as the
meaning of “perception” in PCT overlaps the everyday meaning of
that word.
If there is a system X, which could be a variable, a collection, or the
like, that can take on distinguishable values x_i, with probabilities
p(x_i), the uncertainty of X is defined by two formulae,
one for discrete collections or variables

U(X) = - Sum_over_i (p(x_i) log(p(x_i)))

the other for the case in which “i” is a continuous variable.
If “i” is continuous, p(x_i) becomes a probability density and
the sum becomes an integral

U(X) = - Integral_over_i (p(x_i) log(p(x_i)) di).

That’s it. No more, no less. Those two equations define
“uncertainty” in the discrete and the continuous
case.

This formal definition doesn’t help me much, I’m afraid. I can see what
the calculation is, but I still don’t understand what the phenomenon is
to which the word “uncertainty” is attached. You’re showing how
to compute how much of it there is, but you’re not saying what
“it” is. What does the “everyday meaning” of
uncertainty have to do with the calculation? Are you saying that the
everyday meaning is obtained when a person has a perceptual input
function that performs one of the above calculations? Does your formula
apply when the system involved, such as a cruise control, does not
contain the circuitry necessary for experiencing uncertainty?

When you say that a variable “can take on” values with certain
probabilities, what is this process of “taking on”? Suppose the
variable is

v = A*sin(wt)

This variable "takes on" values between -A and A. If we measure
over an integral number of cycles, at discrete intervals t0 + i*dt, how
do we calculate the probability p(v[i])?

It would seem that there must be a random variable involved, and repeated
trials must be involved, in order for us to speak of
probabilities.
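One concrete, frequentist-flavoured way to compute p(v[i]) for the sampled sine wave, offered only as an illustration of the issue Bill raises (the binning scheme and parameter values are mine), is to bin the sampled values and treat relative frequencies as probabilities:

```python
import math
from collections import Counter

# Sample v = A*sin(w*t) at N evenly spaced points over one full cycle,
# bin the values, and use relative frequencies as p(v_i).
A, w, N, bins = 1.0, 2 * math.pi, 1000, 20

samples = [A * math.sin(w * (i / N)) for i in range(N)]
binned = Counter(min(bins - 1, int((v + A) / (2 * A) * bins))
                 for v in samples)
probs = [count / N for count in binned.values()]

u = -sum(p * math.log2(p) for p in probs)
print(round(u, 2))  # entropy of the binned amplitude distribution, in bits
```

Note that the answer depends on the bin width and on ignoring the time order of the samples, which is exactly the kind of conditioning question Martin takes up below.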

MT: Any philosophical
considerations apply to the assignment of probability and probability
density values, and how you arrive at the probability values. But there
is no uncertainty about what “uncertainty”
means.

BP: Well, given a data set I can certainly perform the calculations you
describe. But when I have a value for U(X), what is it the value OF,
other than that calculation?

Best,

Bill P.

[From Rick Marken (2013.01.03.1930)]

Bill Powers (2013.01.03.1740 MST)--

BP: I am getting continually less certain about what "uncertain" means.

MT: In information theory analysis, "uncertain" has a precise meaning...

... U(X) = - Sum_over_i (p(x_i) log(p(x_i)))

...U(X) = - Integral_over_i (p(x_i) log(p(x_i)) di).

BP: This formal definition doesn't help me much, I'm afraid. I can see what the
calculation is, but I still don't understand what the phenomenon is to which
the word "uncertainty" is attached.

RM: 2013 looks to be a very informative year;-)

Best

Rick

···

--
Richard S. Marken PhD
rsmarken@gmail.com
www.mindreadings.com

[Martin Taylor 2013.01.03.23.02]

[From Bill Powers (2013.01.03.1740 MST)]

  Martin Taylor 2013.01.03.17.05 --

      BP: I am getting continually less certain about what "uncertain" means.

    MT:...
    If there is a system X, which could be a variable, a collection, or the like, that can take on distinguishable values x_i, with probabilities p(x_i), the uncertainty of X is defined by two formulae, one for discrete collections or variables

    U(X) = - Sum_over_i (p(x_i) log(p(x_i)))

    the other for the case in which "i" is a continuous variable. If "i" is continuous, p(x_i) becomes a probability density and the sum becomes an integral

    U(X) = - Integral_over_i (p(x_i) log(p(x_i)) di).

    That's it. No more, no less. Those two equations define "uncertainty" in the discrete and the continuous case.

  This formal definition doesn't help me much, I'm afraid. I can see what the calculation is, but I still don't understand what the phenomenon is to which the word "uncertainty" is attached.

You often use the example of "energy" to illustrate an abstraction. The calculation of "energy" depends on measuring various other quantities, themselves abstractions. But everyone has a vague intuitive idea about it, something that probably was not true 300 years ago. The same is true of "frequency", or "average", or "power".

  You're showing how to compute how much of it there is, but you're not saying what "it" is.

"It" is what the computation produces. That's why I said "No more, no less". Have you ever seen an "average"? It's what a computation produces from certain data, no more, no less. But as with "uncertainty" or "energy", people have a vague intuitive idea of "a man of average height". You CAN tie in these concepts with some rather ill-defined everyday concept, and sometimes it helps if you do that. But it can be misleading as well, just as using an intuitive concept of "perception" can be misleading when considering the implications of PCT.

  What does the "everyday meaning" of uncertainty have to do with the calculation?

No more than the "everyday meaning" of "perception" has to do with the meaning of "perception" in "Perceptual Control Theory".

  Are you saying that the everyday meaning is obtained when a person has a perceptual input function that performs one of the above calculations?

No perceptual input system I ever heard of could make such a computation. I suppose it would be possible to imagine one, but usually it is a mathematical analyst performing numerical calculations, or more commonly doing algebraic manipulations, that would do the calculations. We do perceive our own uncertainty about things, but I seriously doubt that the formal computation is involved in creating such a perception.

  Does your formula apply when the system involved, such as a cruise control, does not contain the circuitry necessary for experiencing uncertainty?

"Uncertainty" is indeed something one experiences, a lot. But I don't think that has to do with any explicit calculation. I don't know why you bring up "experiencing" uncertainty. No control system that "experiences" (i.e. has a perceptual function that produces a value of) any perception also "experiences" another perception such as that of "uncertainty". A control system for which the perception was simply "uncertainty" would not be experiencing any other perception. The cruise control "experiences" various operating parameters, none of which is "uncertainty" so far as I know. Nor does it experience a spectrum of output variation, despite the fact that a Fourier analysis of any of its output variables has such a spectrum, which can be computed from a time series of its output values.

  When you say that a variable "can take on" values with certain probabilities, what is this process of "taking on"? Suppose the variable is

  v = A*sin(wt)

  This variable "takes on" values between -A and A. If we measure over an integral number of cycles, at discrete intervals t0 + i*dt, how do we calculate the probability p(v[i])?

Now you are getting to the place where the interesting problems exist.

What is meant by "probability"? I could get into an interesting discussion on that, which I hinted at in my message, but to do so would, I think, be a side track. The computations of uncertainty work the same no matter how you choose to think of "probability", provided you wind up with mutually exclusive possibilities whose total probability sums to 1.0. It works the same for frequentists and for subjectivists.

============Topic Change===========

What follows is a side-track into the "what is probability" discussion, which I do not want to pursue while I'm working on the uncertainty, information, and control series of messages. I put my position here, and expect to leave it at that.

  It would seem that there must be a random variable involved, and repeated trials must be involved, in order for us to speak of probabilities.

Here is where I deeply disagree with you, and with a whole school of thinkers. Your statement reflects the "frequentist" or "objectivist" school of thought about probability. I think that is odd, given your strong assertion that all we know of the world is what we perceive, not what is "objectively" out there in the real world. The word "random" also raises a reddish flag, but I'm not going to go into that other than to say that in most cases where that word is used, it is used as a substitute for "I don't know the influences that determine what it will be".

Does the statement "I think the probability is less than .001 that the next bird I see in my garden will have blue wings and a red breast" mean anything to you? If it does, you are not a frequentist, because no random variable is involved and there can be no repeated trials. Nor can there be a repeated trial for a more mundane statement that "I think the probability is close to 0.5 that the very next time I toss a coin it will come up heads." Or "I think the probability is about 0.2 that I will take a tour to South Africa next Spring" (that happens to be a true statement, but if you had asked in October, the probability would have been around 0.8). I have never been to South Africa, so where are the repeated trials?

You might well say that I have tossed a coin many times and have observed that heads have come up on about half the trials, but in truth there were NO repeated trials, because each time I tossed a coin, the surrounding circumstances were different. The individual tosses were different in many, many ways, all of which you might judge to have had no influence on the fall of the coin. But it is only your judgment that allows you to call the different coin tosses "repeated trials". There has never been a repeated trial of anything at all, though people have judged that the differences among the trials don't matter enough to make a difference in the trial result. Failures of such judgments have caused not a few scientific errors.

I say that the reason I can make the probability statement (0.5 heads) about the coin is that I have a mental model of what may or may not influence the coin the next time I toss it, and nothing in that model favours head or tail. If I give the coin to a conjurer, I might change my assessment of the probability of a head on the next toss if I know that the result makes a difference to the coin tosser.

In your question about the sine wave, we have to make certain assumptions. Does the person assessing the probabilities realize that the measures are on a sine wave? Do the individual measures come tagged with the time of their observation? We need to know on what the probability is conditioned. All probabilities are conditional. If the person realizes that the samples are from a sine wave, the probability assessment will be based on that model. If not, the probability assessment will most likely be based on the idea that what has been is what will continue to be, and the probabilities will be based on observed frequencies, just as would be the case for someone in the "frequentist" camp.

The interesting question is not about the probability that a particular value will turn up at some as yet unchosen time in the past or future, but about the probability that a particular value will turn up on the very next sample. The result will be very different if the samples are time-tagged than if they are not.

    MT: Any philosophical considerations apply to the assignment of probability and probability density values, and how you arrive at the probability values. But there is no uncertainty about what "uncertainty" means.

  BP: Well, given a data set I can certainly perform the calculations you describe. But when I have a value for U(X), what is it the value OF, other than that calculation?

That is something I hope to be able to clarify. As I emphasised above, it is, like "energy", or "average", or "perception", or "frequency", only the result of a computation, and is defined by the form of the computation. It becomes a bit more when used in a real life situation, just as do these other concepts.

Your questions are useful, because I had no idea that such issues might cause you problems. I will have to address them more than I had intended. Perhaps doing a parallel intuitive introduction to Fourier analysis might make things clearer. I hesitate to do that, though, because the current draft of an introduction only to Uncertainty and Information is too long, even without addressing the use of the concepts in control analysis.

Martin

[From Rick Marken (2013.01.05.1030)]

Martin Taylor (2013.01.03.23.02)--

BP: I am getting continually less certain about what "uncertain" means.

MT...

U(X) = - Sum_over_i (p(x_i) log(p(x_i)))

...That's it. No more, no less. Those two equations define "uncertainty" in the
discrete and the continuous case.

BP: You're showing how to compute how much of it [uncertainty] there is, but
you're not saying what "it" is.

MT: "It" is what the computation produces. That's why I said "No more, no less".
Have you ever seen an "average"? It's what a computation produces from
certain data, no more, no less.

RM: I think I can do a little better than that, with both "average"
and with the "uncertainty" [U(X)] measure of information. An "average"
(in terms of the mean, defined as [Sum X]/N) is the "center of
gravity" of a set of numbers. If the frequency distribution of the
number set is thought of as sitting on a seesaw, then the average
is the position of the fulcrum that balances the seesaw. So "it"
(average) is the center of gravity of a set of numbers.

"Uncertainly" in the information theory sense, is the minimum number
of yes/no (binary) questions you would have to ask (on average) to
determine which of a set of N possible messages was actually sent.
This assumes that the receiver (guesser) knows the entire set of
possible messages and their probability of being sent. If there are
two possible messages (say "heads" and "tails") and each is equally
probable then you would need only one question, such as "Is it heads",
to determine what the message is. If the answer is "yes" you know it's
heads; if the answer is "no" you know it's tails.

The uncertainty measure for this coin toss message set is 1 bit
(uncertainty is measured in "bits", reflecting the fact that it is a
measure of binary -- yes/no -- questions) since U(X) in this case is
-(.5*log2(.5) + .5*log2(.5)) = 1.0. So the uncertainty measure of
information says that the minimal number of yes/no questions you would
have to ask of this message set to determine which message was sent on
each trial is 1 question. The higher the value of U(X), the greater the
number of _binary_ questions you would have to ask, on average, to
determine what message was sent. Thus, when you get the message it
has reduced the number of questions you would have had to ask to find
out what that message is by U(X); when the coin turns up "heads", for
example, it has reduced the number of questions you would have had to
ask by 1.0. U(X) is thus a measure of the "amount" of information
contained in a message (like the message "heads") in terms of the
average number of binary questions that you _didn't_ have to ask in
order to find out what message was sent.

The uncertainty measure becomes fractional because it is an average of
the number of questions you would have to ask, and also because this
average is weighted by knowledge of the probability of each message in
the set. If, for example, you know that a die is fair so that the
probability of each side coming up is precisely 1/6 then on each roll
the side that comes up has reduced your uncertainty (the average
number of binary questions you would have to ask to find out which
side actually came up) by 15.51 bits.
If you know that the die is unfair so that the probability of, say,
1, is .8 while that of the other sides is .04 then on each roll the
side that comes up has reduced your uncertainty by 23.54 bits. The
unequal probabilities actually increase the binary number of questions
that would have to be asked, on average, to determine what "message"
was sent.

So "it" ( "uncertainty") is simply the average number of binary
questions required to determine which one of a known set of messages,
each with a known probability, was sent. The information in a message
is, thus, measured in terms of the number of questions that would
_not_ have to be asked in order to determine what the message is.

What all this has to do with control theory is quite beyond me so I
will leave that determination to the bears out there with very much
bigger brains.

Best

Ricky the Pooh

···

--
Richard S. Marken PhD
rsmarken@gmail.com
www.mindreadings.com

[From Rick Marken (2013.01.05.1045)]

Rick Marken (2013.01.05.1030)--

Correction!!! I did the uncertainty calculations here incorrectly (I
forgot to multiply probabilities and logs). So what I said here:

RM: If, for example, you know that a die is fair so that the
probability of each side coming up is precisely 1/6 then on each roll
the side that comes up has reduced your uncertainty (the average
number of binary questions you would have to ask to find out which
side actually came up) by 15.51 bits.

If you know that the die is unfair so that the probability of, say,
1, is .8 while that of the other sides is .04 then on each roll the
side that comes up has reduced your uncertainty by 23.54 bits. The
unequal probabilities actually increase the binary number of questions
that would have to be asked, on average, to determine what "message"
was sent.

is wrong. Each roll of a fair die reduces uncertainty by 2.58 bits
(not 15.51). Each roll of the biased die reduces uncertainty by only
1.19 bits (not 23.54). So a biased (unequal probability) message set
actually has a lower level of uncertainty than a fair (equal
probability) set. Which makes sense, since you are less uncertain about
what's going to come up with a biased than an unbiased die.
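The corrected numbers are easy to check directly from the defining formula. A quick sketch (the helper name is mine):

```python
import math

def uncertainty(probs):
    """U(X) = -sum p*log2(p), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

fair_die = [1 / 6] * 6              # six equally likely sides
biased_die = [0.8] + [0.04] * 5     # one side comes up 80% of the time

print(round(uncertainty(fair_die), 2))    # 2.58 bits, log2(6)
print(round(uncertainty(biased_die), 2))  # 1.19 bits
```

The fair die's uncertainty is just log2(6), the maximum for six outcomes; any bias can only lower it.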

Sorry about that but what do you expect from a bear with little brain.

Best

Ricky the Pooh

···

--
Richard S. Marken PhD
rsmarken@gmail.com
www.mindreadings.com

[Martin Taylor 2013.01.05.13.37]

[From Rick Marken (2013.01.05.1030)]

Martin Taylor (2013.01.03.23.02)--

BP: I am getting continually less certain about what "uncertain" means.
MT...
U(X) = - Sum_over_i (p(x_i) log(p(x_i)))
...That's it. No more, no less. Those two equations define "uncertainty" in the
discrete and the continuous case.
BP: You're showing how to compute how much of it [uncertainty] there is, but
you're not saying what "it" is.

MT: "It" is what the computation produces. That's why I said "No more, no less".
Have you ever seen an "average"? It's what a computation produces from
certain data, no more, no less.

RM: I think I can do a little better than that, with both "average"
and with the "uncertainty" [U(X)] measure of information. ...

You provide a pretty good intuitive explanation of "uncertainty about" a message. Since I would argue (and have argued in various forums) that uncertainty (and therefore "information") is always "about" something, we could archive it. But I don't think it quite satisfies Bill's question, at least as I understood the question -- Bill will have to answer for himself.

I was not thinking of "messages" deliberately sent from a transmitter to a receiver. I was thinking of observations, perception, the fact that one's conscious perception, and the current value of a perceptual signal, though precise, does not necessarily precisely represent the current state of the environmental variable one is influencing in control. Imagine the blurry image of something if you have bad eyes and forgot your glasses. Your perception is what it is, with no ambiguity, the environmental variable is what it is, with no ambiguity, but given the perceptual value, the environmental variable could have a range of possible values. The probability distribution over that range allows the computation of an uncertainty value. Or, one perceives clearly and accurately the value of some environmental variable, but then looks away or goes to do something else. Over time, the value of the variable may change, and the probabilities of its possible current values can be computed as uncertainties that increase over time.

Your binary question approach explains why the logarithm in the uncertainty computation is taken to base 2, and why uncertainty and information are usually recorded in "bits". I like it as an introductory explanation.

Martin

[Martin Taylor 2013.01.05.13.56]

[From Rick Marken (2013.01.05.1045)]

Rick Marken (2013.01.05.1030)--

Correction!!! I did the uncertainty calculations here incorrectly (I
forgot to multiply probabilities and logs). So what I said here:
...
Sorry about that but what do you expect from a bear with little brain.

Best

Ricky the Pooh

No problem. Your presentation was what mattered. Who doesn't make that kind of mistake? It could have passed unnoticed, but it's nice that you corrected it.

Martin

[From Bill Powers (2013.01.05.1035 MST)]

Martin Taylor 2013.01.03.23.02 –

BP earlier: This formal
definition doesn’t help me much, I’m afraid. I can see what the
calculation is, but I still don’t understand what the phenomenon is to
which the word “uncertainty” is attached.

MT: You often use the example of “energy” to illustrate an
abstraction. The calculation of “energy” depends on measuring
various other quantities, themselves abstractions. But everyone has a
vague intuitive idea about it, something that probably was not true 300
years ago. The same is true of “frequency”, or
“average”, or “power”.

BP: Yes. I’ve used such terms often, as most of us have. But now I find
myself looking at them and asking what I mean by them. This really
started when I entered college physics and the same questions came up for
the second time (the first time they came up was in high school physics,
but I believed everything I was taught then). When I asked the same
questions in college, the professor got impatient with me – he wasn’t
teaching a course in philosophy.

BP earlier: You’re showing how
to compute how much of it there is, but you’re not saying what
“it” is.

MT: “It” is what the computation produces. That’s why I said
“No more, no less”. Have you ever seen an “average”?
It’s what a computation produces from certain data, no more, no
less.

BP: It’s the “no more” part that is slowing me down. Aren’t all
these computed entities supposed to be proposals about the nature of
reality? I asked my professor in college if we really needed such terms
– wouldn’t it be possible to say everything we knew about the physical
world without using the term “energy?” As I remember it, that
is the one I was asking about then. For example if a certain number of
kilograms of force were used to move a paddle around and around for some
distance through one kilogram of water, that would raise the temperature
of the water by some amount, and we would find that the amount per
kilogram-meter remained constant. Why imagine that something called
energy has been transferred from the stirring paddle to the water?
There’s no way to measure the energy directly – we always have to look
at some specific example and then imagine the energy, don’t we?The
smart-assed kid, needless to say, didn’t get a straight answer about
that, other than “that’s the way it’s done in
physics.”
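Bill’s paddle-wheel reasoning can be put in numbers. The sketch below uses textbook constants for the kilogram-force and for water’s specific heat (neither figure comes from the discussion); it shows that the data reduce to one constant ratio of work to temperature rise, which is all that “energy” names here:

```python
# Sketch of the paddle-wheel observation: however much work we do on
# the water, the temperature rise per unit of work stays constant, so
# that constant relation is all the measurements actually give us.
# 9.80665 N per kgf and 4186 J/(kg.K) are standard textbook values.

G = 9.80665            # newtons per kilogram-force
C_WATER = 4186.0       # joules to warm 1 kg of water by 1 kelvin

def temp_rise(force_kgf, distance_m, water_kg):
    """Temperature rise from stirring `water_kg` of water with a
    paddle pulled by `force_kgf` kilograms-force over `distance_m`."""
    work = force_kgf * G * distance_m      # kilogram-meters -> joules
    return work / (water_kg * C_WATER)

# Double the work, double the rise: the amount per kilogram-meter
# never changes, which is Joule's constant relation.
print(round(temp_rise(1.0, 426.9, 1.0), 2))   # -> 1.0 (Joule's classic figure)
print(round(temp_rise(2.0, 426.9, 1.0), 2))   # -> 2.0
```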

MT: But as with
“uncertainty” or “energy”, people have a vague
intuitive idea of “a man of average height”. You CAN tie in
these concepts with some rather ill-defined everyday concept, and
sometimes it helps if you do that. But it can be misleading as well, just
as using an intuitive concept of “perception” can be misleading
when considering the implications of PCT.

BP: Yes, and I don’t think I ever disputed the usefulness of such
short-cuts. It was only when I had taken psychology courses for a while,
and learned that this sort of thing was done in psychology a lot more
than in physics, that I learned about the word “reification,”
which was used almost as a name for a disorder.

BP: What does the “everyday
meaning” of uncertainty have to do with the
calculation?

MT: No more than the “everyday meaning” of
“perception” has to do with the meaning of
“perception” in “Perceptual Control
Theory”.

BP: I don’t think I agree with that. In PCT, the parts of the model
having to do with perception are intended to amount to an explanation of
what we can see directly: the observed fact of a world in and around us.
If that world is outside our skins or hidden deep inside, how is it that
we can know about it, when no knowledge can get into a brain, as far as
we know today, except in the form of neural signals? What does energy
mean that corresponds directly to something in the world I experience? I
can answer that question for the term “perception.”

BP earlier: Are you saying that
the everyday meaning is obtained when a person has a perceptual input
function that performs one of the above calculations?

MT: No perceptual input system I ever heard of could make such a
computation. I suppose it would be possible to imagine one, but usually
it is a mathematical analyst performing numerical calculations, or more
commonly doing algebraic manipulations, that would do the calculations.
We do perceive our own uncertainty about things, but I seriously doubt
that the formal computation is involved in creating such a
perception.

BP: Wait a minute. How can the analyst do the calculations without being
able to perceive them? Symbols are perceptions, aren’t they? The rules of
mathematics are perceptions, too, and when we do math we are using some
sort of neural network when we use the rules to manipulate the symbols,
or so I think. We first learn how to manipulate the physical symbols by
writing them on paper or a blackboard, moving them around in
configurations we call equations, canceling them on one side of the equal
sign by striking through them and then writing them as appropriate on the
other side in the right place. Then we learn to do all but the most
involved manipulations in imagination – we can just see how a factor
multiplying the whole numerator disappears if the same factor multiplies
the whole denominator. It’s like playing with blocks, but with different
rules.

BP earlier: Does your formula
apply when the system involved, such as a cruise control, does not
contain the circuitry necessary for experiencing
uncertainty?

MT: “Uncertainty” is indeed something one experiences, a lot.
But I don’t think that has to do with any explicit calculation. I don’t
know why you bring up “experiencing”
uncertainty.

BP: Because when you try to say what the word means, it’s that experience
of being uncertain (a perception) that you have to rely on. You look at
the solution of the equation, which says “x = 7.233,” and you
think “OK, something close to that but not exactly and not every
time.” And you wonder just how far x might get from 7.233 if you
took new data and calculated it again. Isn’t that how statistics got
started?

MT: No control system that
“experiences” (i.e. has a perceptual function that produces a
value of) any perception also “experiences” another perception
such as that of “uncertainty”.

BP: But you just said it does! When you write U(X), that perception (or
an associated meaning) is what you’re experiencing as its meaning.
I’m wondering now if you’re going through a stage of understanding of PCT
that is like one I went through. There was a time when I did NOT say
“It’s all perception.” I made exceptions for the familiar
things that made up my world – my thoughts, for example, or just my room
in the house where I lived. Some things were just there, and I had
perceptions about them. In here are my thoughts and interpretations and
goals and all that, and Out There are the things the thoughts,
interpretations and so on are about.

MT: A control system for which
the perception was simply “uncertainty” would not be
experiencing any other perception. The cruise control
“experiences” various driving parameters, none of which is
“uncertainty” so far as I know. Nor does it experience a
spectrum of output variation, despite the fact that a Fourier analysis of
any of its output variables has such a spectrum, which can be computed
from a time series of its output values.
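A spectrum of that kind is something an analyst computes after the fact from the recorded outputs; the controller itself never has it as a perception. A minimal NumPy sketch (the signal, sampling rate, and component frequencies are invented for illustration):

```python
import numpy as np

# A pretend record of a controller's output: two frequencies of
# variation, nothing the system itself ever "experiences" as a spectrum.
fs = 100.0                           # samples per second (assumed)
t = np.arange(0, 10, 1 / fs)         # 10 s of output samples
output = np.sin(2 * np.pi * 3 * t) + 0.5 * np.sin(2 * np.pi * 12 * t)

# The analyst's computation: magnitude spectrum of the time series.
spectrum = np.abs(np.fft.rfft(output)) / len(t)
freqs = np.fft.rfftfreq(len(t), 1 / fs)

# The two strongest components recover the 3 Hz and 12 Hz variation.
top = sorted(float(f) for f in freqs[np.argsort(spectrum)[-2:]])
print(top)                           # -> [3.0, 12.0]
```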

BP: So are you saying that certain things exist, and you know they exist,
which are not perceptions? That’s what I mean by making exceptions. I
just gave you an example where this could happen: I wrote “x =
7.233” and then started talking about being uncertain of getting
that same number from new data. Right at that moment, I was treating the
relationship of equality as actually existing like a piece of data
independent of me (and whether I was aware of it or not, I also treated
the way I treated it as simply existing, not as something I perceived
myself doing). I was aware of being uncertain of the repeatability of
that relationship, and that is what I was describing as the content of my
thinking processes – but again, not as the content of a thinking
process.

MT: What is meant by
“probability”? I could get into an interesting discussion on
that, which I hinted at in my message, but to do so would, I think, be a
side track. The computations of uncertainty work the same no matter how
you choose to think of “probability”, provided you wind up with
mutually exclusive possibilities whose total probability sums to 1.0. It
works the same for frequentists and for subjectivists.
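The computation Martin refers to takes only a few lines. A sketch of the Shannon uncertainty of a set of mutually exclusive probabilities summing to 1.0 (function name and example distributions are mine, not from the discussion):

```python
from math import log2

def uncertainty(probs):
    """Shannon uncertainty H = -sum(p * log2(p)), in bits. The formula
    is indifferent to whether the probabilities came from observed
    frequencies or from subjective judgment, provided the outcomes are
    exclusive and the probabilities sum to 1."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return -sum(p * log2(p) for p in probs if p > 0)

print(uncertainty([0.5, 0.5]))    # fair coin -> 1.0 bit
print(uncertainty([0.25] * 4))    # four equal outcomes -> 2.0 bits
```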

============Topic Change===========

What follows is a side-track into the “what is probability”
discussion, which I do not want to pursue while I’m working on the
uncertainty, information, and control series of messages. I put my
position here, and expect to leave it at that.

BP: Can’t have that – this is a background thought that you’re talking
about, and it explains why you are going to say what you’re going to say,
doesn’t it?

BP earlier: It would seem that
there must be a random variable involved, and repeated trials must be
involved, in order for us to speak of probabilities.

MT: Here is where I deeply disagree with you, and with a whole school of
thinkers. Your statement reflects the “frequentist” or
“objectivist” school of thought about probability. I think that
is odd, given your strong assertion that all we know of the world is what
we perceive, not what is “objectively” out there in the real
world. The word “random” also raises a reddish flag, but I’m
not going to go into that other than to say that in most cases where that
word is used, it is used as a substitute for “I don’t know the
influences that determine what it will be”.

BP: You’re quite right about what I said, but what I mean is what you
said: probability is simply a word we use when we don’t know all the
deterministic influences at work. But what is wrong with seeing this as
if frequencies of occurrence were the primary data?

MT: Does the statement “I
think the probability is less than .001 that the next bird I see in my
garden will have blue wings and a red breast” mean anything to you?
If it does, you are not a frequentist, because no random variable is
involved and there can be no repeated trials.

BP: Yes, it has meaning to me, but only in the sense that I would say we
can’t estimate the “actual” probability without some data about
how often any particular bird visits my garden at this time of the year.
I have no trouble translating that statement into perceptual terms, so I
know I’m really asking how often I could expect to experience – perceive
– something under the current perceived conditions.

What I mean by “random variable” is simply any variable that
varies for reasons I can’t observe, and in ways in which I see no
regularities.

MT: Nor can there be a
repeated trial for a more mundane statement that “I think the
probability is close to 0.5 that the very next time I toss a coin it will
come up heads.”

BP: So how did you find out that tossing coins in fact favors neither
heads nor tails? Either you tested this premise or you imagined a
principle that predicts equal numbers of each on repeated trials. In
fact, a properly-designed coin-tossing machine could alter the 50-50
balance.
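Bill’s experimental stance can be sketched as a simulation: toss many times, watch whether the observed frequency settles, and see whether a deliberately biased “machine” shows up in the data. The `bias` parameter and the fixed seed are illustrative assumptions:

```python
import random

# Don't assume 0.5: toss and look at what the frequencies actually do.
# `bias` stands in for whatever a rigged coin-tossing machine might do.
def observed_frequency(n_tosses, bias=0.5, seed=1):
    rng = random.Random(seed)                        # reproducible tosses
    heads = sum(rng.random() < bias for _ in range(n_tosses))
    return heads / n_tosses

print(observed_frequency(100))                 # small sample: wanders
print(observed_frequency(100_000))             # large sample: near the true value
print(observed_frequency(100_000, bias=0.6))   # a rigged machine shows up
```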

Or “I think the probability
is about 0.2 that I will take a tour to South Africa next Spring”
(that happens to be a true statement, but if you had asked in October,
the probability would have been around 0.8). I have never been to South
Africa, so where are the repeated trials?

BP: Now you’re not talking about calculated probabilities, but only about
imagined outcomes. Of course you can give yourself such numbers, but they
don’t have the same kind of meaning they would have in a formal
experiment. There is no basis other than your own desires and guesses for
picking a number.

MT: You might well say that I
have tossed a coin many times and have observed that heads have come up
on about half the trials, but in truth there were NO repeated trials,
because each time I tossed a coin, the surrounding circumstances were
different.

BP: Different enough to make a difference? If you simply blurt out a
guess about the probability, you can’t answer that question, but neither
can you say that the circumstances were sufficiently different. And you
can’t say what deviations from “fairness” could be expected
under any circumstances. When you say you don’t know what is causing the
variations, you can’t also conclude “… and therefore heads are as
probable as tails.” The only way to find out what the distribution
of the perceived outcomes will be is to experiment – first to see if the
observed frequencies are stable, and then if they are equal.

MT: The individual tosses
were different in many, many, ways, all of which you might judge to have
had no influence on the fall of the coin. But it is only your judgment
that allows you to call the different coin tosses “repeated
trials”. There has never been a repeated trial of anything at all,
though people have judged that the differences among the trials don’t
matter enough to make a difference in the trial result. Failures of such
judgments have caused not a few scientific errors.

BP: True, but there is also a judgement involved in deciding whether any
variations in the causes of changes are large enough to matter. You’re
crossing the line between qualitative judgment (no trial can ever be
“repeated” exactly) and quantitative judgments (… but trials
can be repeated so as to minimize unexpected variations).

Many people make this mistake about PCT. It is sometimes carelessly said
that control systems “correct errors.” That leads people to
question the whole theory, because if errors are in fact corrected, the
first guess would be that there can be no output from the control system.
To understand control properly, you have to convert the qualitative idea
of correcting error to the quantitative form of making errors small
enough.

MT: I say that the reason I can
make the probability statement (0.5 heads) about the coin is that I have
a mental model of what may or may not influence the coin the next time I
toss it, and nothing in that model favours head or tail. If I give the
coin to a conjurer, I might change my assessment of the probability of a
head on the next toss if I know that the result makes a difference to the
coin tosser.

BP: Aren’t you forgetting that when we estimate probabilities from data, we
are allowing influences in the (hypothetical) external world to act
according to whatever properties exist in that world? What you believe to
be the probabilities can be refined by doing experiments, and your
beliefs will then predict perceived outcomes better. A priori
probabilities are untrustworthy, or at least less trustworthy than those
derived from competently done experimentation.
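One standard way to make “refining beliefs by experiment” concrete is a Beta-Binomial update; that is an illustrative choice on my part, not anything proposed in the thread. The uniform Beta(1, 1) prior stands in for an a priori guess about a coin’s bias:

```python
# An a priori guess about P(heads) is pulled toward the data as tosses
# accumulate; with lots of data the prior washes out, which is the
# sense in which experiment makes the belief more trustworthy.
def posterior_mean(heads, tails, prior_a=1.0, prior_b=1.0):
    """Mean of the Beta posterior over P(heads), starting from a
    Beta(prior_a, prior_b) prior (Beta(1, 1) = uniform)."""
    return (prior_a + heads) / (prior_a + prior_b + heads + tails)

print(posterior_mean(0, 0))       # no data: the prior guess, 0.5
print(posterior_mean(7, 3))       # 10 tosses: 8/12, about 0.667
print(posterior_mean(700, 300))   # 1000 tosses: about 0.700
```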

MT: Your questions are useful,
because I had no idea that such issues might cause you problems. I will
have to address them more than I had intended. Perhaps doing a parallel
intuitive introduction to Fourier analysis might make things clearer. I
hesitate to do that, though, because the current draft of an introduction
only to Uncertainty and Information is too long, even without addressing
the use of the concepts in control analysis.

BP: I haven’t had much luck with making things clearer through changing
the example by any large amount. What seems “parallel” to you
may not be, and in my experience is usually not, obvious to anyone
else.

Best,

Bill P.

[From Rick Marken (2013.01.05.1500)]

Martin Taylor (2013.01.05.13.37)--

MT: You provide a pretty good intuitive explanation of "uncertainty about" a
message.

RM: Thanks!

MT: I was not thinking of "messages" deliberately sent from a transmitter to a
receiver. I was thinking of observations, perception, the fact that one's
conscious perception, and the current value of a perceptual signal, though
precise, does not necessarily precisely represent the current state of the
environmental variable one is influencing in control. Imagine the blurry
image of something if you have bad eyes and forgot your glasses.

RM: Yes, I can see that this is where information theory would apply.
Of course, the environmental variable to which the perception
corresponds is defined by the perceptual function. So one has to know
what that function is -- which is equivalent to knowing the definition
of the controlled variable -- before it is possible to measure the
amount of information about the controlled environmental variable that
is transmitted to the perceptual signal through that glass darkly. So,
again, knowing the controlled variable is a sine qua non of research
on the properties (informational or otherwise) of control systems.

MT: Your binary question approach explains why the logarithm in the uncertainty
computation is taken to the base 2, and uncertainty and information are
usually recorded in "bits". I like it as an introductory explanation.

RM: Thanks for pointing this out. I meant to mention it myself.
Shannon's choice of the base 2 logarithm certainly suggests that he
was thinking of measuring information in terms of the minimum number
of binary questions that would be needed to identify a message. A very
clever idea that was very likely based on his days spent playing 20
questions.
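Rick’s “twenty questions” reading can be checked directly: with M equally likely messages, ceil(log2(M)) yes/no questions suffice to single one out, which is why base 2 gives answers in bits. A minimal sketch:

```python
from math import ceil, log2

# Each yes/no answer halves the set of remaining candidate messages,
# so identifying one of n_messages takes ceil(log2(n_messages)) questions.
def questions_needed(n_messages):
    return ceil(log2(n_messages))

print(questions_needed(2))          # -> 1
print(questions_needed(8))          # -> 3
print(questions_needed(1_000_000))  # -> 20, hence "twenty questions"
```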

Best

Rick

···

--
Richard S. Marken PhD
rsmarken@gmail.com
www.mindreadings.com

[From Bill Powers (2013.01.06.1025 MST)]

Rick Marken (2013.01.05.1500) –

Martin Taylor (2013.01.05.13.37)–

MT: You provide a pretty good intuitive explanation of “uncertainty
about” a message.

RM: Of course, the environmental variable to which the perception
corresponds is defined by the perceptual function. So one has to know
what that function is – which is equivalent to knowing the definition
of the controlled variable – before it is possible to measure the
amount of information about the controlled environmental variable that
is transmitted to the perceptual signal through that glass darkly. So,
again, knowing the controlled variable is a sine qua non of research
on the properties (informational or otherwise) of control systems.

BP: I agree, and we should keep insisting on this. Without this facet of
any research project, it isn’t going to have much to do with PCT.

MT: Your binary question approach explains why the logarithm in the
uncertainty computation is taken to the base 2, and uncertainty and
information are usually recorded in “bits”. I like it as an
introductory explanation.

RM: Thanks for pointing this out. I meant to mention it myself.
Shannon’s choice of the base 2 logarithm certainly suggests that he
was thinking of measuring information in terms of the minimum number
of binary questions that would be needed to identify a message. A very
clever idea that was very likely based on his days spent playing 20
questions.

BP: Another requirement that we, or I, need to keep harping on is that
information theory applies most obviously at the lowest level of
perception: the lowest layer of abstraction. Is the message that is
received the same as the message that was sent? That sort of question can
be answered using standard ways of determining channel capacity, signal
to noise ratio, and whatever else applies at that level.
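At that lowest level, the standard computation Bill alludes to is the Shannon-Hartley law, which gives channel capacity from bandwidth and signal-to-noise ratio. The numbers below are the classic telephone-line illustration, not figures from the thread:

```python
from math import log2

# Shannon-Hartley: capacity C = B * log2(1 + S/N), in bits per second,
# for an analog channel of bandwidth B with linear signal-to-noise S/N.
def capacity_bits_per_s(bandwidth_hz, snr_linear):
    return bandwidth_hz * log2(1 + snr_linear)

# A ~3 kHz line at 30 dB SNR (linear ratio 1000): roughly 30 kbit/s.
print(round(capacity_bits_per_s(3000, 1000)))
```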

When we come to higher levels of perception, it’s not so easy to make
such measurements. If “don’t” is left out of a message because
of a brief burst of noise, how do we measure the information content of
“Don’t hurry to reply as I am gone from my office.” Does
it make any sense to hurry if I am gone? Yes, “don’t” is just a
one-bit word, but the meaning of the received message involves much more
than one bit of information. Same goes for garbled sequences. At a low
level, sequences are understood in terms of transition probabilities, but
that is only a way to measure the fidelity of the message, not its
meaning. In one of my old math books, the author talks about how some
functions are sensitive to the order in which computations occur (as in
rotation matrices), and in a footnote he asks the reader to consider the
importance of the order of elements in this sequence:

  1. Buy a collision insurance policy.

  2. Run your car into the car of a struggling young attorney.

Evaluating the information content of that sequence involves examining a
lot more than what is contained in the sequence.
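The mathematical half of that footnote can be demonstrated: rotations, like the insurance steps, do not commute, so the order of a sequence carries information of its own. A small NumPy sketch (the 90-degree angle is an arbitrary choice):

```python
import numpy as np

# Two 3-D rotation matrices; applying them in different orders gives
# different results, i.e. rotation is sensitive to computation order.
def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

a = np.pi / 2
print(np.allclose(rot_x(a) @ rot_z(a), rot_z(a) @ rot_x(a)))  # -> False
```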

Best,

Bill P.

[Martin Taylor 2013.01.06.17.56]

I don't think you should, because it isn't true that information
theory applies better to the lowest level of abstraction. Shannon
made that quite clear in his seminal monograph. The situation hasn’t
changed since then.

However, I do note that you didn’t say that information theory
applies better at the lower levels of abstraction. You said it was
obvious to you that it does. That’s a different statement. You can
harp on that one all you want.

Martin

PS. Here’s a paragraph from a chapter I wrote for a recent NATO
report:

···

[From Bill Powers (2013.01.06.1025 MST)]

BP: Another requirement that we, or I, need to keep harping on is that
information theory applies most obviously at the lowest level of
perception: the lowest layer of abstraction.

It is important to recognize that for Shannon, what constituted a
“message” can be treated at several different levels of perception.
Suppose that the transmitter intended to send a message inviting the
recipient to meet at a certain place and time. This message had to be
translated into words, the words into a letter or sound stream, and the
stream into a pattern of variation in an electric current. At the
receiving end, the receiver might interpret the electrical pattern into
a sound or letter stream. At one level of perception, the message would
have been correctly received if the stream matched the transmitted
stream. However, if that were all that happened at the receiving end,
the message would not have been received. From the stream, the message
still must be converted into words, and the words into an understanding
that the transmitter wanted a meeting, that the meeting should be in a
particular location, and that it should be at a particular time. Suppose
the transmitter had identified the location as “at Jake’s” and the
receiver knew of no place that would be so identified. Would the message
have been correctly received at that level? It would have been correctly
received in that all the words in it had been exactly as transmitted,
but the meaning would not have been conveyed. The message would have
conveyed information about when the meeting was requested, but it would
not have conveyed information about where. Without considering the
meaning of a message to the recipient, the concept of information makes
no sense.

[From Chad Green (2013.01.07.1042 EST)]

BP: Another requirement that we, or I, need to keep harping on is that information theory applies most obviously at the lowest level of perception: the lowest layer of abstraction.

MT: I don't think you should, because it isn't true that information theory applies better to the lowest level of abstraction. Shannon made that quite clear in his seminal monograph. The situation hasn't changed since
then.

CG: Perhaps it doesn't compute from a PCT perspective because it's more a function of the unconscious mind(s) at work:

"According to a textbook on human physiology*, the human sensory system sends the brain about eleven million bits of information each second. However, anyone who has ever taken care of a few children who are all trying to talk to you at once can testify that your conscious mind cannot process anywhere near that amount. The actual amount of information we can handle has been estimated to be somewhere between sixteen and fifty bits per second. So if your conscious mind were left to process all that incoming information, your brain would freeze like an overtaxed computer" (p. 33). - Leonard Mlodinow (Subliminal: How Your Unconscious Mind Rules Your Behavior)

* M. Zimmerman, "The Nervous System in the Context of Information Theory," in Human Physiology, ed. R. F. Schmidt and G. Thews (Berlin: Springer, 1989), 166-73.

Best,
Chad

Chad Green, PMP
Program Analyst
Loudoun County Public Schools
21000 Education Court
Ashburn, VA 20148
Voice: 571-252-1486
Fax: 571-252-1633

"If you want sense, you'll have to make it yourself." - Norton Juster


[Martin Taylor 2013.01.07.10.56]

[From Chad Green (2013.01.07.1042 EST)]

BP: Another requirement that we, or I, need to keep harping on is that information theory applies most obviously at the lowest level of perception: the lowest layer of abstraction.

MT: I don't think you should, because it isn't true that information theory applies better to the lowest level of abstraction. Shannon made that quite clear in his seminal monograph. The situation hasn't changed since
then.

CG: Perhaps it doesn't compute from a PCT perspective because it's more a function of the unconscious mind(s) at work:

That doesn't make sense. Most of PCT is concerned with "the unconscious mind", inasmuch as very few perceptions are ever made conscious among the many we do control or could control. PCT at present has little to say about conscious perception beyond outright speculation.

"According to a textbook on human physiology*, the human sensory system sends the brain about eleven million bits of information each second. However, anyone who has ever taken care of a few children who are all trying to talk to you at once can testify that your conscious mind cannot process anywhere near that amount. The actual amount of information we can handle has been estimated to be somewhere between sixteen and fifty bits per second. So if your conscious mind were left to process all that incoming information, your brain would freeze like an overtaxed computer" (p. 33). - Leonard Mlodinow (Subliminal: How Your Unconscious Mind Rules Your Behavior)

* M. Zimmerman, "The Nervous System in the Context of Information Theory," in Human Physiology, ed. R. F. Schmidt and G. Thews (Berlin: Springer, 1989), 166-73.

In PCT terms, he is (or should be) referring to the bandwidth of control, not to the processing of perceptual input. The limit on output is much lower than the limit on input. One speculation is that what is made conscious is related in some way to the multiplexing of control across different perceptions that might be controlled. The "care of a few children" is a case in point that ties the estimate to multiplexing control over a limited-bandwidth output system, not to the limits of perceptual processing.

Martin

[From Chad Green (2013.01.07.1111 EST)]

Actually, I was hoping that someone from this list would be more familiar with Edelman's work. Here is the source of my question:

Human Adaptability: Future Trends and Lessons from the Past
By Charles E. Oxnard, Leonard P. Freedman
pp. 44-51

In hindsight, perhaps what I meant to ask was: Is there a variant of PCT from a synthetic perspective?

Best,
Chad


"McClelland, Kent" <MCCLEL@GRINNELL.EDU> 1/3/2013 3:39 PM >>>

[From Kent McClelland (2013.01.03.1440 CST)]

Hi Chad,

Could you give a little background on Edelman's theory of neuronal group selection for those of us who are unfamiliar with it, and also explain how you see it connecting with PCT? Thanks.

Kent

···

On Jan 3, 2013, at 2:05 PM, Chad Green wrote:

[From Chad Green (2013.1.3.1504 EST)]

EJ: Thus, the unknown (to the control loop) & unpredicted variation from the
reference signal accumulates in the preceptual signal. What is left in the
whirlpool, so to speak, is the remaining uncertainty deriving from the
disturbance function, & this accumulates in inverse form in the net output
(& by this I mean, the output as amplified or modified by the EFF.) So
there is a secondary reduction in uncertaintay, which means a gain in
information, at the point of that net output, as it more & more precisely
counteracts the net effect of any disturbances.

CG: In this context, could we replace "perceptual signal" with "value signal" in accordance with Edelman's theory of neuronal group selection, and while we're at it, rename this whirlpool to "value system"?

Best,
Chad

Chad Green, PMP
Program Analyst
Loudoun County Public Schools
21000 Education Court
Ashburn, VA 20148
Voice: 571-252-1486
Fax: 571-252-1633

"If you want sense, you'll have to make it yourself." - Norton Juster

[From Chad Green (2013.01.07.1203 EST)]

MT: That doesn't make sense. Most of PCT is concerned with "the
unconscious mind", inasmuch as very few perceptions are ever made
conscious among the many we do control or could control. PCT at present
has little to say about conscious perception beyond outright
speculation.

CG: Yes, I now see that it is the purpose of MOL. As for the conscious
mind, Epictetus' writings on prohairesis may be a good fit
theoretically: http://en.wikipedia.org/wiki/Prohairesis .

MT: In PCT terms, he is (or should be) referring to the bandwidth of
control, not to the processing of perceptual input. The limit on output
is much lower than the limit on input. One speculation is that what is
made conscious is related in some way to the multiplexing of control
across different perceptions that might be controlled. The "care of a
few children" is a case in point that ties the estimate to multiplexing
control over a limited-bandwidth output system, not to the limits of
perceptual processing.

CG: Are you perchance referring to the process of chunking raw data to
extract meaningful structure? Perhaps what MOL is really doing is
enabling patients to chunk seemingly independent episodes from their
lives into higher patterns of connection that bind the lower ones
together:

"Some of our greatest insights can be gleaned from moving up another
level and noticing that certain patterns relate to others, which on
first blush may appear entirely unconnected - spotting patterns of
patterns, say (which is what analogies essentially are)." - Daniel Bor
(The Ravenous Brain)

Best,
Chad

Chad Green, PMP
Program Analyst
Loudoun County Public Schools
21000 Education Court
Ashburn, VA 20148
Voice: 571-252-1486
Fax: 571-252-1633

"If you want sense, you'll have to make it yourself." - Norton Juster

[From Fred Nickols (2012.12.2.1530 AZ)]

As I understand it, W. Ross Ashby’s Law of Requisite Variety asserts that a control system must be capable of a sufficient variety of actions to control that which is to be controlled. If it’s not, control is not possible. Ashby’s law is sometimes stated as “the complexity of a control system must equal or exceed that of the system to be controlled.” I’m wondering how Ashby’s law fits with PCT, if it does. It seems to me that we sometimes bite off more than we can chew so to speak and those are instances wherein our complexity is exceeded by that of the situation/variables we try to control. The “disturbances” overwhelm us. Comments anyone?
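One common entropy reading of Ashby's law (an illustrative formalization on my part, not Fred's wording) is that the outcome's uncertainty can be no less than the disturbance's uncertainty minus the controller's variety of actions: H(outcome) >= H(disturbance) - H(actions). In code:

```python
from math import log2

# Requisite variety in entropy form: a controller can remove at most
# as many bits of uncertainty as it has variety of its own, so what is
# left over in the outcome is bounded below.
def min_outcome_uncertainty(n_disturbances, n_actions):
    """Best-case residual uncertainty (bits) when a controller with
    n_actions distinct actions faces n_disturbances equally likely
    disturbance states."""
    return max(0.0, log2(n_disturbances) - log2(n_actions))

print(min_outcome_uncertainty(8, 8))  # enough variety: 0.0 bits left
print(min_outcome_uncertainty(8, 2))  # overwhelmed: 2.0 bits of disturbance get through
```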

Regards,

Fred Nickols, CPT

Managing Partner

Distance Consulting LLC

The Knowledge Worker’s Tool Room

Blog: Knowledge Worker Tools

www.nickols.us | fred@nickols.us