[From Rick Marken (940207.0900)]
Bill Leach (06 Feb 1994 17:09:28) --
You don't really know p(t)
in an experiment and you definitely don't know what the function h() is.
We don't know p(t), true. But in a tracking task, we are able to
mimic the subject's behavior pretty accurately under the assumption
that p(t) = c(t). We don't know h() exactly either but, again, modelling
shows that we know most of the relevant aspects of h(). If we didn't
have a pretty good idea of what p(t), h() and g() were, we would
not be able to reliably build models whose behavior correlates with
real human behavior at the .99+ level.
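The kind of model being described can be sketched in a few lines. This is my own minimal illustration, not the actual CSG-L tracking model: it assumes p(t) = o(t) + d(t), a pure-integrator output function with gain k, and a fixed reference r; all names and parameter values are invented for the example.

```python
import math

def run_model(disturbance, k=50.0, dt=0.01, r=0.0):
    """Closed-loop control model: p = o + d, with the output o
    integrating the error (r - p). Returns the perceptual signal."""
    o = 0.0
    percepts = []
    for d in disturbance:
        p = o + d              # perception = output effect + disturbance
        o += k * (r - p) * dt  # output function: integrate the error
        percepts.append(p)
    return percepts

# A smooth disturbance with amplitude ~1.0:
disturbance = [math.sin(t * 0.05) for t in range(2000)]
percepts = run_model(disturbance)
# After the initial transient, the perception is held near the
# reference despite the full-amplitude disturbance.
```

Fitting a model of roughly this form to a subject amounts to adjusting k (and any transport lag) until the model's trace matches the subject's; the high correlations mentioned above refer to fits of this general kind.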
Also, I suspect that there are unknowns concerning the control loop that
actually is used by the subject to input the "g(o)", that is the mouse
and its related interactions with the subject.
Yes. We have to guess at the system parameter(s) and there are
uncontrollable aspects of the environment (little changes in g(o)
because of changes in mouse friction, etc.). But these turn out to
be very inconsequential details -- as proved by the success of the
basic model in mimicking human behavior.
It seems to me that information theory can be useful to PCT for a couple
of reasons. Probably the single most important one being its rather
rigorous analysis of information transfer mechanics.
We have been waiting breathlessly for a coherent description of how IT
might contribute to PCT. I have never doubted that it MIGHT contribute --
but so far I have seen nothing from ITers but post hoc, QUALITATIVE
assertions about behavior that are not obviously related to Shannon's
theorems. What we have shown in the "information in perception" debate is
that the IT concept of information is definitely irrelevant to the
OPERATION of a control system; there is nothing "in" the perceptual signal
that "informs" the system about the appropriate output to generate in order
to compensate for the net disturbance to the controlled variable. Now
the ITers are saying that all they meant by "information in perception"
is that you can't solve for d in p = o+d unless you are given o and p.
I have to agree with this astounding revelation -- but I don't see how
that goes much past basic PCT, since PCT accepts the tenets of algebra.
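The algebraic point is easy to make concrete. A minimal sketch (numbers invented for illustration):

```python
# With both p and o in hand, d is determined: d = p - o.
p, o = 3.0, 1.0
d = p - o
assert d == 2.0

# With p alone, nothing is determined: any candidate disturbance is
# consistent with p, given a suitable (unknown) output.
for candidate_d in (-10.0, 0.0, 2.0, 7.5):
    candidate_o = p - candidate_d
    assert candidate_o + candidate_d == p
```

Which is just the point: p = o + d pins down d only for someone who already knows o.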
Nevertheless, IT might be useful to an OBSERVER of a control system.
As you say:
I would also suggest that IT might be useful in analyzing how a deviation
from expected result might be due to supplying information (the signal
generated for perception) beyond what was intended.
If you are talking about a deviation from a result expected by the
OBSERVER of a control system, then I agree -- IT MIGHT be useful in
analyzing why this occurred. A demonstration of how this analysis might
proceed would be very helpful. But we already know that IT has nothing to
contribute to our modelling of how the control system itself deals with a
"deviation from an expected result" if this means "deviation of a perception
from its reference state".
If IT shows that there is information in the signal generated for
perception that is in addition to the intended information then the
nature of such information could be useful in determining the nature of
control system changes (that is algorithmic changes).
A very big IF.
I think we have shown beyond a reasonable doubt that there is
no information for the control system about anything (other than
the state of the perceptual signal itself) in the perceptual
signal. I have no idea what the "intended information" in the
perceptual signal might be.
The bottom line here is this: PCT is about the behavior of variables
in a CLOSED LOOP; information theory is about the behavior of signals
in a communication channel -- an OPEN LOOP. The behavior of variables
in a closed loop is completely different from the behavior of the same
variables when the loop is open. This is why ANY open-loop-based model
is irrelevant to understanding purposeful behavior.
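That difference is easy to exhibit. In this sketch (my own illustration using a toy integrator control model, not anything from the post), the same disturbance is run through the system twice: once with the loop closed, so the output feeds back into the perception, and once with the feedback path cut.

```python
import math

def simulate(disturbance, closed_loop, k=50.0, dt=0.01, r=0.0):
    """p = o + d at each step; in closed-loop mode the output o is
    driven by the error (r - p), in open-loop mode o never changes."""
    o = 0.0
    percepts, outputs = [], []
    for d in disturbance:
        p = o + d
        if closed_loop:
            o += k * (r - p) * dt  # feedback: output opposes the error
        percepts.append(p)
        outputs.append(o)
    return percepts, outputs

disturbance = [math.sin(t * 0.05) for t in range(2000)]
closed_p, closed_o = simulate(disturbance, closed_loop=True)
open_p, _ = simulate(disturbance, closed_loop=False)
# Closed loop: p is held near r while o comes to mirror -d almost
# perfectly. Open loop: p simply swings with the full disturbance.
```

Same variables, same equations: closing the loop changes the behavior of every signal in the system, which is why analyzing the signals as if they sat in an open channel says little about what the closed-loop system is doing.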