[From Bill Powers (921010.0945)]
Greg Williams (921010) --
...the control structure itself at time t1 depends in part on the
environment prior to t1. (1) WHETHER learning/reorganization takes
place depends in part on the environment (I think we agree on
this). (2) WHERE learning/reorganization stops (the resulting new
control structure) depends in part on the environment (you don't
agree with this). (3) Attempts to alter another's current reference
signals "directly" (not involving learning/reorganization) is
likely to produce conflict in the current control structure (I
think we agree on this).
1. Learning-reorganization starts when something causes a critical
error to become large enough. The external cause itself makes no
difference; only the effect on a critical variable and the resulting
error matter in setting reorganization into motion. Many different
environmental situations may be entirely equivalent in their effects
on critical variables (the environment has many more degrees of
freedom than there are critical variables). So learning-reorganization
does depend in part on the environment, but only in a general
existential way. There is no one specific state of the environment on
which critical variables depend.
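To put that in program form -- a minimal sketch only, with an
invented temperature example and an invented rate rule standing in
for whatever the real reorganizing system does:

def critical_error(critical_variable, intrinsic_reference=37.0):
    # e.g., core temperature in deg C against its inherited reference
    return intrinsic_reference - critical_variable

def reorganization_rate(crit_err, threshold=0.5, gain=0.2):
    # Reorganization is idle until the critical error grows large
    # enough, then proceeds at a rate that depends only on the size of
    # the error. The environmental cause never appears here: cold wind,
    # wet clothing, or a failed furnace are all equivalent if they
    # produce the same critical error.
    return gain * abs(crit_err) if abs(crit_err) > threshold else 0.0

Many different environments collapse onto the one number crit_err;
that is all the "dependence on the environment" there is at this
stage.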
2. Learning-reorganization stops when, for any reason, critical error
drops to a small enough value. If the critical error was produced by
some effect of the environment on the body, it may disappear when that
effect is removed. If the reason for the disappearance was the
acquisition (through reorganization) of a learned control system that
directly or indirectly removes the environmental disturbance, the
learned control system will continue to function in the same way under
the same circumstances that formerly resulted in disturbance of a
critical variable. The same environment will then become unable to
cause that disturbance of the critical variable; the learned control
system will prevent the environment from going into the state that
disturbs it.
The criterion for stopping reorganization is zero (or small enough)
critical error. This criterion is met only when critical variables are
at their respective genetically-defined reference levels. There is no
particular state of the environment at which this condition holds. There
is no particular learned control structure that will result in
correction of critical error -- that is, an infinite number of
different control structures could have the same effect of preventing
critical error. So while the organization of the learned system does
"depend on the environment," the dependence is not systematic but only
qualitative. It is ambiguous.
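A sketch of the last two paragraphs together: blind changes in two
parameters of a learned system, stopping when the critical error is
small enough. The two-parameter model, the numbers, and the
random-step rule are only illustrative.

import random

def reorganize_once(seed, disturbance=5.0, steps=10000):
    random.seed(seed)
    g1, g2 = 0.0, 0.0            # two parameters of the learned system
    for _ in range(steps):
        # the learned system's combined opposition to the disturbance
        cv = 37.0 + disturbance - (g1 + g2) * disturbance
        err = 37.0 - cv          # critical error
        if abs(err) < 0.05:      # small enough: reorganization stops
            break
        rate = 0.05 * abs(err)   # rate of change proportional to error
        g1 += random.uniform(-rate, rate)   # blind parameter changes
        g2 += random.uniform(-rate, rate)
    return g1, g2, err

for seed in range(5):
    g1, g2, err = reorganize_once(seed)
    print(f"seed {seed}: g1={g1:+.2f} g2={g2:+.2f} critical error={err:+.3f}")

Every run ends with g1 + g2 close to 1, but the individual values
differ from run to run. The disturbance determines the condition the
learned system must meet, not the organization that meets it.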
3. Attempts to alter another's ACTIONS are unlikely to result in
conflict, because actions normally change as a way of counteracting
disturbances. Changing actions requires changing lower-level reference
signals. These changes are initiated as a way of counteracting a
disturbance at a higher level. The associated perceptual signals
remain near in value to the changing reference signals, so control is
not disrupted at any level. At the disturbed level, neither the
reference signals nor the perceptual signals are significantly
changed; the lower-level changes have that purpose.
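The same point as a small two-level sketch. The gains, the step
disturbance, and the one-variable environment are arbitrary; what
matters is only which signals change and which do not:

def two_level_demo(steps=300, k1=0.5, k2=0.2):
    r2 = 10.0    # higher-level reference signal (never altered)
    r1 = 0.0     # lower-level reference signal, set by the higher system
    o1 = 0.0     # lower-level output (the action)
    for t in range(steps):
        d = 5.0 if t >= 100 else 0.0   # disturbance at the higher level
        p1 = o1                        # lower-level perception
        p2 = o1 + d                    # higher-level perception (disturbed)
        r1 += k2 * (r2 - p2)           # higher output adjusts lower reference
        o1 += k1 * (r1 - p1)           # lower output tracks its reference
    print(f"higher level: r2={r2:.2f}, p2={p2:.2f}  (essentially unchanged)")
    print(f"lower level:  r1={r1:.2f}, p1={p1:.2f}  (changed to oppose d)")

two_level_demo()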
Conflict results when the disturbance is too large or too fast to be
resisted, producing errors that bring other control systems into the
picture. It can also result when the lower-level corrective actions
are driven into forbidden states. This is less likely, because the
higher system being disturbed is not normally organized to produce
those forbidden states and thus will not use them to correct its
errors. Conflict is produced mainly by an external agent that insists
on disturbing a controlled variable and produces as much force as
required to disturb it by a significant amount: in short, by another
control system that wants the same controlled variable to be in a
different state, or that wants something else with an equivalent
effect.
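That situation in program form -- again only a sketch, with arbitrary
gains, reference values, and output limits:

def conflict_demo(steps=200, k=0.3, limit=100.0):
    q = 0.0                  # the shared controlled variable
    o_a, o_b = 0.0, 0.0      # outputs of control systems A and B
    r_a, r_b = 5.0, -5.0     # incompatible reference values for q
    for _ in range(steps):
        q = o_a + o_b        # both outputs act on the same variable
        # each system keeps increasing its output as long as its own
        # error persists, up to the limit of the force it can produce
        o_a = min(limit, max(-limit, o_a + k * (r_a - q)))
        o_b = min(limit, max(-limit, o_b + k * (r_b - q)))
    print(f"q={q:.2f}  o_a={o_a:.1f}  o_b={o_b:.1f}")

conflict_demo()

Both outputs end up at their limits while q sits between the two
reference values, satisfying neither system.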
ยทยทยท
---------------------------------------------------
To say that something depends "in part" on something else is not a
step toward precision, but away from it. You can truthfully say that
human behavior depends "in part" on "the environment." You can also
truthfully say that the climate on Earth depends "in part" on galactic
supernovas. Everything depends "in part" on everything else within the
same event horizon. But this is not the way to arrive at an
understanding of nature. By making more and more general statements,
one can eventually arrive at statements that are universally true.
"Everything affects everything within the same event horizon" is a
universally true statement. But it is also trivial and useless.
It is worse than that, because it implies that all less-general
statements consistent with the most general one are also true, and
this is not at all the case. There is a very large difference between
saying that the form of learned control systems "depends in part" on
"the environment" and saying that the environment can be configured in
a specific way to determine the form of a learned control system. That
is simply not possible; there are too many different ways of
controlling that have the same effect. There are too many ways of
affecting the environment that would serve to correct the same
critical error. The only thing that can be pinned down to any degree
is the state at which changes in organization will cease, and that is
the state in which critical variables match their reference levels.
That is the only predictable outcome of reorganization.
To say that A depends on B is to make a clear statement: given B, one
can predict A. To say that A depends "in part" on B is also to make a
clear statement: given B, one can predict nothing about A. The
qualifier "in part" does not just slightly reduce the amount of
dependence. It eliminates dependence altogether.
------------------------------------------------------------------
Best,
Bill P.