[From Bill Powers (2004.12.19.0730 MST)]
Bruce Gregory (2004.12.19.0836)–
If I were to propose that thermostats were sentient, that they
experience a range of emotions and are frustrated when they do not
achieve their goals, you might be a bit reluctant to accept my story.
More than likely, you would ask what evidence I have that thermostats
are not simple control systems that lack the properties I was
attributing to them.
I would say you are absolutely right. No single control system could
possibly explain a complex multidimensional multiordinate phenomenon like
emotion. Emotions require systems at several levels operating at the same
time, so that one system can perceive the consequences of other systems
in operation (or failing to operate right). This is why we can’t consider
emotion to exist at the same explanatory level as control. Emotion can
arise only as (in your terms) an epiphenomenon, or more properly, emotion
is an emergent phenomenon rather than a primary one.
You might even point out that
there is no need to invoke such anthropomorphic imagery (as Bryan notes)
to explain the behavior of the thermostat.
I think it is quite admissible to use anthropomorphic imagery when
talking about human beings. It’s only when we speak of other species,
whose experiences we do not (to our knowledge) share, or nonliving
systems and processes (like evolution), that anthropomorphism is a
logical mistake. The mistake when speaking of human beings is to
fail to anthropomorphize.
However, anthropomorphizing is not the right objection to attributing
emotions to a thermostat. A thermostat has only one goal. It can’t sense
its own efforts (the heat output of the furnace, the turning of the fan
motors), and it has no higher levels from which to judge its own success
or failure. So it is not sufficiently complex to support the phenomena we
call emotions. That, and not anthropomorphizing, would be the mistake in
attributing emotions to a thermostat.
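To make the thermostat's poverty concrete, here is a minimal sketch (my own illustration, not anything from Powers or B:CP): a bang-bang thermostat with one reference and one perception, which never senses its own output and has no level above it to judge its performance.

```python
def thermostat_step(temperature, reference=20.0, hysteresis=0.5):
    """Return True (furnace on), False (off), or None (no change).

    One goal, one perception, one action. The thermostat never
    perceives the furnace's heat output or its own switching; it
    only closes the loop on room temperature.
    """
    if temperature < reference - hysteresis:
        return True
    if temperature > reference + hysteresis:
        return False
    return None


def simulate(steps=50, outside=5.0, temp=15.0, furnace_on=False):
    """Crude room model: heat leaks toward outside; furnace adds heat."""
    history = []
    for _ in range(steps):
        decision = thermostat_step(temp)
        if decision is not None:
            furnace_on = decision
        temp += 0.1 * (outside - temp) + (2.0 if furnace_on else 0.0)
        history.append(temp)
    return history


print(simulate()[-1])  # hovers near the 20-degree reference
```

Nothing in this loop perceives its own error history or its own efforts, so there is nothing out of which a judgment of success or failure, let alone an emotion, could be made.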
I am making the same point about PCT models and awareness, memory,
imagination, and emotions. The models work perfectly well in the
absence of these embellishments. Awareness, imagination, memories, and
emotions are, as far as I can tell, epiphenomena. They do none of the
lifting, but are simply along for the ride. The work is done by the
hardware described in the block diagrams.
But that isn’t true. Awareness, imagination, memories, and emotions are
observable phenomena that require explanation. They are not
“epi” phenomena – they are real products of the operation of a
real system, real phenomena. A model of human organization can’t be said
to work “perfectly well” if it can’t reproduce these phenomena.
Contrary to your post of a day or so ago, PCT is not a theory of
“behavior.” It’s a theory that includes explanations of
behavior, but it is also intended to explain how the world, including our
own actions and their consequences, appears to us. Its very name,
perceptual control theory, refers to something that is known only
to each of us as a subjective phenomenon (even if we sometimes mistakenly
think it is objective). In fact, one of the earliest findings from this
theory is that “behavior” is never the point of our control
actions; the point is to make perceptions be in the states
we want. Behavior is what other people observe; perception is what we
observe of our own control operations.
This is not a criticism of PCT,
but an attempt to clarify the domain in
which PCT applies. B:CP spells out this domain:
behavior.
But this is an attempt to limit, not clarify, the domain, both
unnecessarily and incorrectly. The domain of PCT is human
experience. That includes both actions and the effects of actions; it
includes both external and internal effects. It includes hierarchies of
effects, in which one effect is produced by whatever action is necessary
at the moment as a means of producing more general effects. It includes
effects that are controlled and effects that are not controlled, as well
as effects we experience that are caused and controlled by other
organisms, or by non-sentient natural processes. You will find all these
elements in even the simplest control tasks such as tracking.
To understand PCT you have to understand HPCT. It is a mistake to draw
conclusions about the whole system from characteristics of one of its
elementary components. You have to consider what phenomena can
emerge from a complex assembly of many such components – phenomena such
as emotion.
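The kind of emergence meant here can be sketched in a few lines (again my own illustration, not a published HPCT model): a higher-level system whose perception is the success or failure of a lower-level control loop, exactly the cross-level perception a lone thermostat lacks.

```python
def lower_loop(perception, reference, gain=5.0):
    """Elementary control unit: output proportional to error."""
    error = reference - perception
    return error, gain * error


def higher_level(recent_errors, tolerance=0.5):
    """Second-level perception: is the lower system succeeding?

    It does not act on the environment itself; it perceives a
    consequence of the lower system's operation (its residual error).
    """
    avg = sum(abs(e) for e in recent_errors) / len(recent_errors)
    return "succeeding" if avg < tolerance else "failing"


# Run the lower loop against a constant disturbance it can mostly cancel.
state, reference, disturbance = 0.0, 10.0, -2.0
errors = []
for _ in range(30):
    error, output = lower_loop(state, reference)
    errors.append(error)
    state += 0.2 * (output + disturbance)  # environment integrates output

print(higher_level(errors[-5:]))  # → succeeding
```

The elementary unit knows nothing of "succeeding" or "failing"; those categories exist only at the level that perceives the lower loop's operation, which is the sense in which phenomena like emotion are emergent rather than primary.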
Best,
Bill P.