Consciousness, control, and automatization

[Peter Cariani (960423.1130 EDST)]

[Bill Powers (960422.1130 MDT)] wrote:

I'm still puzzled about the relationship between consciousness and
control. Amid all the control processes that must be going on all the
time, at all the levels, I experience a sort of
central control process.
This is the main thing I am doing at the time. Right now, I'm typing
this post, with some sort of vague idea in mind that I'm exploring at
the same time that I'm writing about it. This is the main thing I'm
doing right at this moment, as far as conscious activities are
concerned.

My experience of this is that my consciousness
is driving this activity -- if I turned my attention
to doing something else, what I am doing now
would stop happening and some other Main Thing would start happening
instead. If my attention turns toward something else, the present
activity won't keep going all by itself.
A lot of things will keep going
-- I won't fall out of the chair, for example, or forget
to breathe, or forget that I'm trying to communicate something.
But if my attention turns toward trying to grasp the picture
of me consciously writing this, the typing pauses and
when I have more to write, I find that my hands
are off the keys and I have to put them back
(and put down my cigarette, too, probably).

One thing that seems to happen when attention
wanders away from what I'm doing is that the loop gain starts to fall.
The control systems start to sag and relax.
I see this happening even more so in my youngest grandchild.
We have to remember to take the glass of juice out of his
hand when we show him something interesting, because as soon as his
attention goes somewhere else, the hand with the glass in it starts to
relax and tilt. If I say "Juice, Ethan," it will level up again. It's
just as if the control system starts to lose gain as soon as his
attention is diverted, and gets it back again as soon
as the glass is in consciousness again. It doesn't look to me
like a change in the reference signals, although it could be...
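
Bill's juice-glass observation can be caricatured in a few lines of
simulation (a toy of my own devising, not a claim about the actual
model; every name and number in it is invented for illustration): a
proportional control loop whose effective gain is scaled by an
"attention" factor. When attention is withdrawn the gain sags and a
constant disturbance (gravity) tilts the glass; when attention
returns, the tilt is corrected:

    # Toy sketch: loop gain scaled by an "attention" factor; a constant
    # disturbance (gravity) wins whenever the effective gain sags.
    def simulate(attention_schedule, gain=5.0, disturbance=1.0, dt=0.05):
        reference = 0.0    # desired tilt: level
        tilt = 0.0         # perceived tilt (the controlled variable)
        trace = []
        for attention in attention_schedule:
            error = reference - tilt
            output = gain * attention * error   # gain falls with attention
            tilt += (output + disturbance) * dt
            trace.append(tilt)
        return trace

    # attention on for 2 s, diverted for 2 s, recalled ("Juice, Ethan")
    schedule = [1.0] * 40 + [0.1] * 40 + [1.0] * 40
    trace = simulate(schedule)
    print("attending: %.2f  diverted: %.2f  recalled: %.2f"
          % (trace[39], trace[79], trace[119]))

Note that the reference never changes in this toy; only the gain does,
which is consistent with the phenomenology described above.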

There is a line of thinking that consciousness is one aspect of a
particular mode of functional organization, namely a (circular)
self-production ("autopoietic") system. What is being produced are
sets of signals, they are being produced by assemblies
(or populations) of neurons, and the persistence of a particular
set of signals (which constitutes a particular functional
organization at a particular time, i.e. a "mental" state) is due
to the coherence of the set in regenerating itself. (A very close
analogy is one of an autocatalytic reaction network in which the
network maintains stability by regenerating its components. All
biological organisms do this, and it is the basis of biology,
although I think one would be hard-pressed to see the point
explicitly made in a biology textbook.) Mental states thus
correspond to the eigenstates (parameter/signal sets that result in
stable trajectory patterns) of this terribly complicated
dynamical system (von Foerster).
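
For concreteness, here is a minimal numerical gloss (mine, and grossly
simplified; the weights and sizes are arbitrary) on the eigenstate
idea: a small recurrent network, repeatedly applied to its own output,
settles into a set of signals that regenerates itself under the
network's own circular dynamics:

    # Damped iteration of a symmetric recurrent net. The settled state
    # satisfies x* = tanh(W x*): it reproduces itself -- an eigenstate.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 8
    W = rng.normal(0.0, 1.5 / np.sqrt(n), (n, n))
    W = (W + W.T) / 2        # symmetric weights encourage settling

    def settle(x, steps=500, eta=0.3):
        for _ in range(steps):
            x = x + eta * (np.tanh(W @ x) - x)   # circular self-production
        return x

    x_star = settle(rng.normal(size=n))   # arbitrary initial signal set
    residual = np.linalg.norm(x_star - np.tanh(W @ x_star))
    print("self-regeneration residual:", residual)   # ~0: stable eigenstate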

One can think of the recurrent signalling loops in the system
as feedback control loops (I'm not sure whether this is too restrictive,
whether there might be closed loops that are not control loops...).
In any system that has this recurrent, regenerative organization,
there are two kinds of events, those which are determined by the
system itself, and those which are contingent upon outside
influences (i.e. that are not anticipated by the system). We
perceive those events that are contingent relative to our
expectations, those that "surprise" us; events that are completely
expected or anticipated we are generally not conscious of.
Examples are the disappearance of images
that are stabilized on the retina as well as the disappearance of
tones (or odors, or any other stimulus) after they persist
unchanged for any length of time. We -- well, I speak for myself --
I experience "perceptions" qualitatively differently from "thoughts".
Those (sets of) events (and signalling/state sequences)
that are determined internally, I experience as "thoughts".
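
A toy version of this contingency principle (again my own sketch, not
a model of actual neural adaptation; the adaptation rate is arbitrary):
a unit transmits only the difference between its input and a running
expectation, so a sustained stimulus fades from "perception" while a
change breaks through:

    # Only the unpredicted part of the input is passed on ("perceived").
    def perceive(stimulus, adapt_rate=0.2):
        expectation = 0.0
        out = []
        for s in stimulus:
            surprise = s - expectation            # what was not anticipated
            expectation += adapt_rate * surprise  # expectation catches up
            out.append(surprise)
        return out

    tone = [1.0] * 30 + [2.0] * 30   # sustained tone, then a step change
    p = perceive(tone)
    print("onset: %.2f  adapted: %.2f  at change: %.2f"
          % (p[0], p[29], p[30]))    # 1.00, ~0.00, ~1.00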

If each autonomous neural assembly (or control loop) has its own
recurrent, regenerative organization, then it has its own
"percepts" and "thoughts" and "awareness", albeit a very primitive
one, with very few distinctions. What I experience as my "ego"
is the largest of these neural assemblies that gets inputs from
many smaller, more specialized ones, and consequently can
represent more perceptual distinctions and implement a greater
diversity of (longer & longer) signalling sequences. (The major
advantage that we have over other animals is not some special,
unique processing architecture, but that we can, with our larger
cortical surface area, concatenate much longer sequences of
signals.) Our brains are collections of these diverse, semiautonomous
neural (control) assemblies, each of which has an independent
perspective on the world (I believe this is what Leibniz was
getting at in his theory of monads).

Neural assemblies learn, and when a smaller assembly can control
some external aspect of the environment, then that (controlled)
aspect becomes in effect a part of the internal state of the
system, and no longer contingent (and therefore no longer perceived).
The automatization of an action (driving a car, hitting a tennis
ball) by forming neural assemblies to control the environment
(thereby controlling the perception and closing the loop)
changes the contingency of the input so that it is part of an
internal loop (but not the main one), and it drops out of
"our" conscious awareness. Successful control entails a
stable feedback loop that extends from the organism's effectors
out through the environment, back through the organism's
sensory system, through its coordinative apparatus ("cognition"),
and back to the effectors. Once successful control is achieved
then that part of the environment that is controlled in effect
becomes a functional part of the organism [pardon my language,
I'm sure I'm screwing up the jargon in one way or another]. It's
now just another link in the self-production system. [When "we"
attend to some aspect of the world, the Big Loop pre-empts
smaller assemblies from automatically dealing with the inputs
or outputs in question. If the pre-emption is too general,
other loop gains fall, and we drop our glass. Maybe what to
pre-empt and how much is also learned.]
The notion of loops through the environment is what I take
to be the central point of McCulloch's (1945) "Heterarchy of values"
paper and Bill Powers' perceptual control theory.
I think a coherent account of the structure of experience
is possible if we think carefully about the basic
organizational structure of self-production systems
and control loops: what is contingent, what is or comes to be
controlled, and how networks of control loops see the world
(by themselves and through each other). I know many of you on
the CSG list have been thinking through these issues for much of
your lives, and I think a great amount of understanding has been
gained about how these networks might function (Powers's hierarchical
ladder of control elements has been an especially useful image for me).
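
Here is a cartoon of the automatization point (my construction; the
gains and the "awareness threshold" are arbitrary): an inner loop
opposes an environmental disturbance, and the Big Loop sees only the
residual error the inner loop leaves over. A novice inner loop leaves
a large, contingent residual; a skilled one absorbs the disturbance
and the task drops below the threshold of awareness:

    import math

    AWARENESS_THRESHOLD = 0.1   # residuals below this stay out of awareness

    def mean_residual(inner_gain, steps=2000, dt=0.05):
        x, ref, total = 0.0, 0.0, 0.0
        for n in range(steps):
            disturbance = math.sin(0.5 * n * dt)   # slow environmental push
            x += (inner_gain * (ref - x) + disturbance) * dt
            total += abs(ref - x)
        return total / steps

    for gain, stage in [(0.5, "novice"), (20.0, "automatized")]:
        err = mean_residual(gain)
        status = ("contingent, reaches awareness" if err > AWARENESS_THRESHOLD
                  else "anticipated, handled silently")
        print("%-12s mean |error| = %.3f  (%s)" % (stage, err, status))

On this cartoon, what changes with practice is only the inner loop's
gain; nothing about the environment or the reference changes, yet the
input ceases to be contingent for the larger system.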

Such a theory assumes that consciousness is one aspect of
"functional organization", not just nervous activity
in some specialized "consciousness module"
(an idea that is unfortunately in vogue these days).
I believe that this functional organization
involves temporal coherence, an underlying neural "temporal code",
but this is another issue. The philosophical
approach is Aristotle's "hylomorphic" theory (form/organization
embedded in matter), very clearly articulated in Deborah Modrak's
book, Aristotle: The Power of Perception, U Chicago, 1987.

I apologise that I haven't been able to follow the discussion more
closely -- I hope this outburst isn't too irrelevant to the current
strand.

Peter Cariani

Peter Cariani Ph.D.
Eaton Peabody Laboratory
Mass Eye & Ear Infirmary
243 Charles St.
Boston, MA 02114 USA
tel (617) 573-4243
FAX (617) 720-4408
peter@epl.meei.harvard.edu