Ashby's "Requisite variety"

[From Bruce Abbott (2004.05.08.1235 EST)]

[Bill Powers (2004.05.07.0821 MST)]

I’m reminded of W. Ross Ashby’s concept of “variety,” loosely related to
information theory and entropy. Ashby concluded that a control system has
to have an amount of variety equal [at least equal?] to the variety of the
controlled system. This is sort of like saying that a machine has to have
a temperature higher than the temperature of its surroundings to do work
on the surroundings. These generalizations may be true, but they say
nothing about how you distinguish useful from useless variety, nor do they
help you design a system that will exhibit the required variety-matching
while also performing some useful kind of control. It’s possible to
decrease the entropy in a system by feeding organization to it, and then
let the entropy increase again while the system does nothing of any
interest. The difference between that system and another one that writes
symphonies is not in the fluctuations of entropy, but in the kind (not
just the amount) of organization represented in it.

As it happens, Ashby chose the wrong design for a brain, though it could
be shown to obey the “laws of variety” just as well as a negative feedback
control system does. He thought that the most effective design for a
controller was one in which the controller sensed the states of all
disturbances of the controlled quantity and computed just the output that
would cancel all their effects. He pointed out that in principle, this
kind of system could achieve zero error, while a negative feedback control
system always had to allow some error to exist, since its output was
error-driven. “In principle,” of course, is a long way from “in practice,”
and Ashby, being a psychiatrist, was not enough of an engineer to see how
impracticable his design was. I got a good part of my inspiration for what
is now PCT from the parts of Ashby’s book that dealt with the kind of
control system he ultimately rejected.

So maybe you can see why I react to generalizations like the "law of
variety" with something less than unrestrained enthusiasm.
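The contrast drawn above between Ashby's disturbance-sensing design and an error-driven negative feedback loop can be sketched numerically. This toy simulation is not from the original correspondence; the function names, gains, and the steady-state simplification are illustrative assumptions. It shows the compensatory design achieving exactly zero error only when its calibration is perfect, while the feedback loop needs no model of the disturbance but always leaves a small residual error.

```python
# Toy comparison (illustrative only) of Ashby's compensatory controller
# versus an error-driven negative feedback controller.

def feedforward(disturbances, gain_error=0.0):
    """Ashby's scheme: sense each disturbance d and emit a computed
    cancellation -(1 + gain_error) * d. With gain_error = 0 the
    controlled variable (CV) is exactly zero; any miscalibration
    produces an error that grows with the disturbance."""
    return [d + (-(1.0 + gain_error) * d) for d in disturbances]

def feedback(disturbances, gain=50.0):
    """Error-driven loop at steady state: output o = gain * (0 - cv)
    and cv = d + o, so cv settles at d / (1 + gain) -- small but
    never exactly zero, since the output is driven by the error."""
    return [d / (1.0 + gain) for d in disturbances]

d = [10.0, -4.0, 7.0]
print(feedforward(d))                   # [0.0, 0.0, 0.0]: zero error "in principle"
print(feedforward(d, gain_error=0.05))  # errors scale with the disturbance
print(feedback(d))                      # small but nonzero residual error
```

The feedback controller's residual shrinks as the loop gain rises, which is why "in practice" it is the workable design: it requires no inventory of the disturbances, only a measure of its own error.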

In Introduction to Cybernetics, Ashby (1957) refers to this as the
“law of requisite variety.” Imagine that you had a
disturbance with a certain amount of possible variation. To
counteract this disturbance, the variation in output would have to be at
least as great as the variation in the disturbance, in the effect of each
on the controlled variable. The variation in disturbance and in
output can be quantified in bits of information. Because the output
of a control system opposes the disturbance (i.e., its sign is opposite
to that of the disturbance), perfect control would remove all
disturbance-induced variation in the controlled variable, or in other
words, destroy the information (i.e., variation in CV = 0
bits).
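The counting argument above can be made concrete. The following sketch (my own illustration, not from Ashby's text; the entropy helper and the toy disturbance values are assumptions) quantifies variety as Shannon entropy in bits and shows that a perfectly opposing output leaves zero bits of disturbance-induced variation in the controlled variable:

```python
import math
from collections import Counter

def entropy_bits(values):
    """Shannon entropy of a discrete sequence, in bits."""
    counts = Counter(values)
    n = len(values)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

# A disturbance taking four equally likely values: 2 bits of variety.
disturbance = [-2, -1, 1, 2] * 25

# A perfect controller's output exactly opposes each disturbance value...
output = [-d for d in disturbance]

# ...so the controlled variable (CV) never varies at all.
cv = [d + o for d, o in zip(disturbance, output)]

print(entropy_bits(disturbance))  # 2.0 bits of disturbance variety
print(entropy_bits(output))       # 2.0 bits: output variety matches
print(entropy_bits(cv))           # 0.0 bits: the information is destroyed
```

The output's variety is at least that of the disturbance, as the law requires, and the disturbance's information never reaches the CV.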
I read Introduction to Cybernetics before I ever heard about PCT,
still have my paperback copy of it, and would still recommend it for
those who would benefit from a tutorial on cybernetics couched in
discrete mathematics rather than the calculus. Amazingly, it has
recently been reprinted on the Web in pdf (Adobe Acrobat) format.
Point your Web browser to:

http://pespmc1.vub.ac.be/books/IntroCyb.pdf

You can download this to your local drive and read it at your
leisure.

Best wishes,

Bruce A.

Please put Re: in the title to differentiate it from spam. Thanks a bunch.

From: Bruce Abbott
To: CSGNET@listserv.uiuc.edu
Sent: Saturday, May 08, 2004 1:37 PM
Subject: Ashby’s “Requisite variety”




Unfortunately, statistical measures like variety don’t specify that the
action must vary as a specific function of the disturbance – only that a
measure similar to variability must have the same size. So the question
of how the right kind of variability can be achieved is left unanswered
by the law of requisite variety (thanks for bringing that term back to me
– another neuron shot to hell). There are many ways to create variety of
output that will match the variety of the disturbance, yet not stabilize
the controlled variable at all.
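The point that matched variety need not stabilize anything is easy to demonstrate. In this sketch (my own illustration; the seed, sample size, and variable names are assumptions), two outputs have identical variety: one is the right function of the disturbance, the other is the same set of values shuffled so it no longer opposes the concurrent disturbance.

```python
import random
import statistics

random.seed(1)

# A disturbance drawn from a handful of discrete values.
disturbance = [random.choice([-2, -1, 0, 1, 2]) for _ in range(1000)]

# Output A: the right *function* of the disturbance -- exact opposition.
output_a = [-d for d in disturbance]

# Output B: the very same values (identical variety), shuffled so the
# output no longer depends on the concurrent disturbance.
output_b = output_a[:]
random.shuffle(output_b)

cv_a = [d + o for d, o in zip(disturbance, output_a)]
cv_b = [d + o for d, o in zip(disturbance, output_b)]

print(statistics.pvariance(cv_a))  # zero variance: perfect control
print(statistics.pvariance(cv_b))  # larger than the disturbance variance alone
```

Both outputs satisfy the law of requisite variety, yet only the first controls: the shuffled output actually adds its variance to the controlled variable instead of removing it.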

Best,

Bill P.