Scale Invariance and HPCT

[From Bruce Gregory (2003.12.04.0724)]

It struck me that HPCT is scale invariant to the extent that control
operates on the basis of the same mechanism at all levels in the
hierarchy.

Bruce Gregory

"Everything that needs to be said has already been said. But since no
one was listening, everything must be said again."

                                                                                Andre Gide

[From Marc Abrams (2003.12.04.0909)]

[From Bruce Gregory (2003.12.04.0724)]

It struck me that HPCT is scale invariant to the extent that control
operates on the basis of the same mechanism at all levels in the
hierarchy.

I agree and that is one of the reasons I think PCT is so important,
_especially_ in the physiological realm.

Marc

[From Bill Powers (2003.12.04.1248 MST)]

Bruce Gregory (2003.12.04.0724) --

It struck me that HPCT is scale invariant to the extent that control
operates on the basis of the same mechanism at all levels in the
hierarchy.

You forgot to say whether you're asking literally or ironically. The
trouble with joking too much is that you get a reputation for joking, and
then it's hard to get people to take you seriously when you really would
like them to. If you're serious, my impression is that scale invariance
(re: fractals) is a spatial concept. What is invariant is a measure of
shape, and shape is invariant with respect to changes in size. I don't see
the parallel with higher and lower orders of control.

If this is relevant, I also don't think of higher systems as sharing the
same neural circuitry as lower systems. They are physically distinct and
are in different locations. I mention this because there is one concept of
hierarchy in which the actual physical neurons are the same ones at all the
levels; a "level" is then determined by the size of the groups of neurons
you're taking into account. I seem to remember that Howard Pattee had this
sort of idea about hierarchies. This "grouping" idea would be more
appropriate for your analogy, because the size of the sets of neurons would
vary while something about them stayed the same (their "controllingness"?).

Best,

Bill P.

[From Bruce Gregory (2003.12.04.1700)]

Bill Powers (2003.12.04.1248 MST)

Bruce Gregory (2003.12.04.0724) --

It struck me that HPCT is scale invariant to the extent that control
operates on the basis of the same mechanism at all levels in the
hierarchy.

You forgot to say whether you're asking literally or ironically.

The following is to be taken LITERALLY.

The trouble with joking too much is that you get a reputation for joking, and
then it's hard to get people to take you seriously when you really would
like them to. If you're serious, my impression is that scale invariance
(re: fractals) is a spatial concept. What is invariant is a measure of
shape, and shape is invariant with respect to changes in size. I don't see
the parallel with higher and lower orders of control.

The essential idea is that the system looks the same on all scales.
That is, there is no "natural" scale. This is what leads to the power
law associated, for example, with earthquake intensities (which are not
spatial). In any case, the analogy I saw was that control works in
exactly the same way at all levels of the hierarchy, and there is no
"natural" level, whereas there does seem to be one in a neuron-"bundling" model.

Bruce Gregory

"Everything that needs to be said has already been said. But since no
one was listening, everything must be said again."

                                                                                Andre Gide

[From Bill Powers (2003.12.04.1520 MST)]

Bruce Gregory (2003.12.04.1700)--

The essential idea is that the system looks the same on all scales.
That is, there is no "natural" scale. This is what leads to the power
law associated, for example, with earthquake intensities (which are not
spatial).

I thought that the power law of earthquake intensities exists because Herr
Richter decided to use a logarithmic scale, so one point on the Richter
scale is defined as a factor of ten in (estimated) released energy. The
magnitude scale of stellar brightness is another example, where one
magnitude difference implies a brightness ratio of 2.51, which is the 5th
root of 100, so there are 5 magnitudes difference in a ratio of 100:1. I
don't think these scales imply any sort of self-similarities. The claim for
fractals that I have often seen is that they uncannily represent natural
features of the landscape from the microscopic to the largest ones. I've
been disappointed by the examples, because I can easily see that most of
the pictures look quite different on different size scales. Perhaps it is
necessary to take a very friendly view of fractals to see their importance.
I'm left saying, "Wow, that's interesting. So what does it mean?"
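
A minimal Python sketch of the magnitude arithmetic just described (the
function name is illustrative; the only fact used is the definition that
five magnitudes correspond to a 100:1 brightness ratio):

    # One magnitude step is a brightness factor of 100**(1/5) ~= 2.512.
    def brightness_ratio(delta_mag):
        return 100.0 ** (delta_mag / 5.0)

    print(brightness_ratio(1))   # ~2.512, the 2.51 quoted above
    print(brightness_ratio(5))   # 100.0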

In any case, the analogy I saw was that control works in
exactly the same way at all levels of the hierarchy, and there is no
"natural" level, whereas there does seem to be one in a neuron-"bundling" model.

Is that the name of what I was describing, or something I don't know about?

Yes, the basic organization of the hierarchy is the same at every level, in
that negative feedback control works the same way wherever it is found. I'm
not sure whether this is a virtue of PCT or a sign that we haven't looked
closely enough. Actually, I was trying to avoid looking for some organizing
principle that would generate the specific levels (principles such as
size, complexity, speed, or anything else), because I felt that we had to
look in detail at our experiences of perception, and only after that try to
generalize. I still don't feel we're in a position to generalize about the
whole system.

I find it extremely tempting to think of finding simple general principles
from which we could generate the entire hierarchy. Or maybe a principle of
reorganization like what the neural net people dream of, which would
spontaneously create the whole hierarchy, or _a_ whole hierarchy, of
control. Start the program, go to bed, and in the morning the computer would
say, "43. Next question, Bill?" (I refer to the Hitchhiker's Guide to the
Galaxy).

Best,

Bill P.

[From Bruce Gregory (2003.12.04.2125)]

Bill Powers (2003.12.04.1520 MST)

The following post is to be taken LITERALLY.

I thought that the power law of earthquake intensities exists because Herr
Richter decided to use a logarithmic scale, so one point on the Richter
scale is defined as a factor of ten in (estimated) released energy.

The power law applies to the frequency of earthquakes of different
intensities. It is called the Gutenberg-Richter law. During an interval
of time in which there are 10,000 earthquakes of magnitude 4 on the
Richter scale, there are approximately 1000 earthquakes of magnitude 5,
100 earthquakes of magnitude 6, 10 earthquakes of magnitude 7, etc.
(Per Bak, _How Nature Works_).
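
A minimal Python sketch of that count series, in the Gutenberg-Richter
form log10 N = a - b*m with b = 1 (the constant a is chosen only to
reproduce the illustrative numbers above):

    # Expected number of quakes per magnitude class, per the example.
    def expected_count(m, a=8.0, b=1.0):
        return 10.0 ** (a - b * m)

    for m in (4, 5, 6, 7):
        print(m, expected_count(m))   # 10000, 1000, 100, 10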

The magnitude scale of stellar brightness is another example, where one
magnitude difference implies a brightness ratio of 2.51, which is the 5th
root of 100, so there are 5 magnitudes difference in a ratio of 100:1. I
don't think these scales imply any sort of self-similarities. The claim for
fractals that I have often seen is that they uncannily represent natural
features of the landscape from the microscopic to the largest ones. I've
been disappointed by the examples, because I can easily see that most of
the pictures look quite different on different size scales. Perhaps it is
necessary to take a very friendly view of fractals to see their importance.

No, it just takes a willingness to believe statistics.

I'm left saying, "Wow, that's interesting. So what does it mean?"

See the Bak book if you are really interested. Or, _Fractals, Chaos,
Power Laws: Minutes from an Infinite Paradise_ by Manfred Schroeder.

In any case, the analogy I saw was that control works in
exactly the same way at all levels of the hierarchy, and there is no
"natural" level, whereas there does seem to be one in a neuron-"bundling" model.

Is that the name of what I was describing, or something I don't know about?

I was using your description.

I find it extremely tempting to think of finding simple general principles
from which we could generate the entire hierarchy. Or maybe a principle of
reorganization like what the neural net people dream of, which would
spontaneously create the whole hierarchy, or _a_ whole hierarchy, of
control. Start the program, go to bed, and in the morning the computer would
say, "43. Next question, Bill?" (I refer to the Hitchhiker's Guide to the
Galaxy).

I suspect that the general principle is natural selection. But as you
say, that remains to be demonstrated.

Bruce Gregory

"Everything that needs to be said has already been said. But since no
one was listening, everything must be said again."

                                                                                Andre Gide

[From Bill Powers (2003.12.05.0750 MST)]

Bruce Gregory (2003.12.04.2125) --

I thought that the power law of earthquake intensities exists because
Herr Richter decided to use a logarithmic scale, so one point on the Richter
scale is defined as a factor of ten in (estimated) released energy.

The power law applies to the frequency of earthquakes of different
intensities. It is called the Gutenberg-Richter law. During an interval
of time in which there are 10,000 earthquakes of magnitude 4 on the
Richter scale, there are approximately 1000 earthquakes of magnitude 5,
100 earthquakes of magnitude 6, 10 earthquakes of magnitude 7, etc.
(Per Bak, _How Nature Works_)

How approximately?

I suppose some physical sense can be made (and probably has been made by
someone) of this power law, which I didn't know about. As you describe it,
it says that about the same amount of energy is dissipated per unit time in
all sizes of earthquakes. I guess the underlying assumption would be that
stress accumulates at a constant rate as plates move past each other. The
length of time it takes to accumulate enough stress for an earthquake at
the plane of contact would be proportional to the breaking strength of the
rock, and so would be the amount of energy released when the break occurs.
So this law is equivalent to saying that the breaking strength of rocks
along faults is evenly distributed from low to high. Is that more or less
how seismologists explain this law?
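
The energy bookkeeping in that reasoning is easy to check on Bill's stated
premise that one Richter point means a factor of ten in released energy (a
premise of these posts, not a claim about seismological practice); a
minimal Python sketch:

    # Counts fall tenfold per magnitude while, on the factor-of-ten
    # premise, energy per event rises tenfold, so every magnitude class
    # releases the same total energy per unit time.
    for m, count in [(4, 10_000), (5, 1_000), (6, 100), (7, 10)]:
        energy_per_event = 10.0 ** m          # arbitrary units
        print(m, count * energy_per_event)    # 1e8 for every class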

In astronomy, it was found that the proportion of single to multiple
(double or higher) stars increased as stars grew fainter.
Later researchers realized that it's harder to detect multiple stars when
they are fainter, partly because they're harder to see, but largely
because, on the average, they're farther away. In fact there is no
magnitude effect at all.

Apparently mysterious effects often turn out to have simple explanations
when the underlying principles are known.

I suspect that the general principle [behind the hierarchy] is natural
selection. But as you say, that remains to be demonstrated.

Natural selection has some decided drawbacks as an explanation of
evolution. Most genetic algorithms I have seen introduce a rather lenient
teacher who gives partial credit for progress toward survival, rather than
requiring that the move be completed. Try setting up an E. coli demo in the
manner of natural selection. Let tumbles take place at some uniform
background rate. After each tumble, if the bacterium is not going up the
gradient, you kill it. Otherwise you let it reproduce and the offspring try
again. The number of bacteria that actually make significant progress
toward the source of the nutrient would be pretty close to zero.

What makes the E. coli principle different is that there is a sensor in the
bacterium that detects gradients and thus tells whether it is heading
toward or away from the goal. Tumbles are timed according to this
information. Now progress toward the goal can be extremely efficient even
though the tumbles are still completely random.
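
The contrast between the two schemes shows up even in a toy simulation. A
minimal Python sketch, with Bill's kill-and-reproduce population collapsed
into a single agent on a one-dimensional gradient (the gradient, tumble
rate, and step count are all illustrative assumptions, not from the posts):

    import random

    def final_position(sensing, steps=10_000, tumble_rate=0.1, seed=1):
        # Nutrient increases with x. sensing=False: tumble at a fixed
        # background rate, blind to progress. sensing=True: tumble only
        # after a step that went down-gradient -- the E. coli rule.
        rng = random.Random(seed)
        x, heading = 0.0, rng.choice((-1.0, 1.0))
        for _ in range(steps):
            x += heading
            went_up = heading > 0
            if sensing and not went_up:
                heading = rng.choice((-1.0, 1.0))   # tumble when losing ground
            elif not sensing and rng.random() < tumble_rate:
                heading = rng.choice((-1.0, 1.0))   # tumble regardless
        return x

    print("blind tumbling: ", final_position(sensing=False))  # wanders near 0
    print("gradient-timed: ", final_position(sensing=True))   # climbs ~ +steps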

So yes, I'd say that the principle of natural selection remains to be
demonstrated as a believable explanation of evolution.

Best,

Bill P.

[Martin Taylor 2003.12.04.1718]

I'm in two minds as to whether to contribute to this discussion,
because to my mind the appropriate approach is a bit more complex to
explain than is normal practice here. But, being no angel, I'll rush
in--and probably out again just as quickly :-)

[From Bruce Gregory (2003.12.04.1700)]
"Everything that needs to be said has already been said. But since no
one was listening, everything must be said again."
Andre Gide

Well, that about covers it :-)

[From Bill Powers (2003.12.05.0750 MST)]

Apparently mysterious effects often turn out to have simple explanations
when the underlying principles are known.

However, to be less cryptic, I have to back off a long way, to the
REASONS why you get fractal effects in nature. Fractal technically
does imply some kind of self-similarity at all scales from infinitely
small to infinitely large, so, clearly, no physical system can be a
fractal. Usually, things are called fractal when self-similarity
applies over a reasonable range of scales.

There has to be a mechanism that accounts for any shape. Things get
square because they have the appropriate crystal structure or
cleavage planes, or because some purposeful entity (read "human") has
decided that square is the desired shape. Fractals (in the colloquial
sense) are harder to make purposefully, but easier to make naturally.

The classic example of a fractal system is the "sandpile avalanche."
To create a classic sandpile avalanche, drop grains of sand one at a
time from a fixed dispenser onto a flat table. The first few grains
bounce around on landing, but after enough of them have been dropped,
a more or less cone-shaped pile begins to form, as each falling
sand-grain finds a "nest" of earlier grains in which to sit. The nest
might be at the very top of the pile, but if it is, then that nest
won't be there for the next grain, which has to fall down the slope
until it finds its own nest, partway down or on the table at the
bottom of the hill.

Each falling grain has momentum and kinetic energy. Each static grain
not already on the table has potential energy with respect to the
level of the table. A grain that forms part of a nest may be
precariously balanced, and the newly falling grain may have enough
kinetic energy to dislodge it. Now we have two falling grains, each
of which has kinetic energy, and each of which must either find a new
nest or must fall down the slope to the table surface.

If many grains find nests on the slope, the cone steepens, and it
becomes more probable that the next falling grain will dislodge
another, and then the two will dislodge yet others, in a cascading
avalanche. The avalanche will stop when all the moving grains have
found nests stable enough to absorb their kinetic energy without
breaking the nest. Some avalanches are big, some small. When the cone
gets very large, the avalanche distribution follows the usual fractal
laws. It is a self-similar system. There are lots of little
avalanches, and very few big ones, according to the power law Bruce
described.
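
That distribution can be reproduced with the standard Bak-Tang-Wiesenfeld
abstraction of the sandpile. A minimal Python sketch (the grid size,
toppling threshold of 4, and grain count are conventional illustrative
choices, not anything from Martin's post):

    import random
    from collections import Counter

    def btw_sandpile(n=30, grains=20_000, seed=1):
        # Drop grains on an n x n grid; a cell holding 4 or more grains
        # topples, sending one grain to each neighbour (grains falling
        # off the edge are lost, like sand leaving the table). Avalanche
        # size = number of topplings triggered by one dropped grain.
        rng = random.Random(seed)
        z = [[0] * n for _ in range(n)]
        sizes = Counter()
        for _ in range(grains):
            i, j = rng.randrange(n), rng.randrange(n)
            z[i][j] += 1
            unstable = [(i, j)]
            topplings = 0
            while unstable:
                a, b = unstable.pop()
                if z[a][b] < 4:
                    continue
                z[a][b] -= 4
                topplings += 1
                for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    p, q = a + da, b + db
                    if 0 <= p < n and 0 <= q < n:
                        z[p][q] += 1
                        unstable.append((p, q))
            sizes[topplings] += 1
        return sizes

    # Many small avalanches, very few large ones -- roughly a power law.
    for s, count in sorted(btw_sandpile().items())[:10]:
        print(s, count)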

One more thing to note--if the table is lightly vibrated, the
quasi-stable slope of the cone will be flatter than if the table is
stationary. The reason is left as an exercise for the reader (but
don't think too trivially about it. Think energy distributions and
nest stability).

This sandpile system is much more general than one might at first
think. The same phenomena show up in all sorts of places, for the
same reason: there is a potential for change in the interacting
elements of some static system; something causes one element to
change, which may or may not cause a neighbour to change ... As the
system "energy" builds, the likelihood of one change causing another
increases. Sometimes that results in a lot of changes happening more
or less all at once, but much more often only a few elements change.

When the system is a reorganizing hierarchy, I used (ten years ago)
to call this effect "The Bomb in the Machine." And I suspect, without
having analyzed it properly, that it is why Bill's huge reorganizing
system runs away when he stops reorganizing. The system is still in a
very high energy state, like a too-steep sandpile. I also suspect
that given enough reorganization time, the big system will
quasi-stabilize, so that only small portions of it run away most of
the time, and occasionally much or all of it will.

One of the key factors in this effect is what one might generically
call the "coupling constant", the degree to which effects in one
element influence the behaviour of others, or the likelihood that one
will affect others. Often, in such systems, there is a phase change
as one increases the coupling constant, from a situation in which
only little "avalanches" ever happen to one in which everything
topples at once. Right on the edge is where the classic sandpile
lives. A bit steeper, and everything would fall; a bit flatter, and
there would be no big avalanches. That's the rationale behind the
generation of the self-supporting loops described on the Web pages
that were being discussed a few days ago.
<http://www.mmtaylor.net/PCT/Mutuality/many-control.html>

I think normal reorganization in a growing hierarchy also tends to
that edge between too much stability and none. Growth in an
unchanging environment makes the hierarchy very unstable against
small changes in the environment (as a tree is weak if it is always
protected from being blown by the wind). A sandpile grown on a
stationary table soon falls apart if the table is slightly joggled.
Likewise a hierarchy that is coddled too much early on can easily
blow up when confronted with novelty--but on the other hand, a
hierarchy stressed too much while growing may never be able to
reorganize to be effective.

Martin

[From Bruce Nevin (2003.12.05 12:59 EST)]

Bill Powers (2003.12.05.0750 MST) --

Bruce Gregory (2003.12.04.2125) --

I guess the underlying assumption
[of the Gutenberg-Richter law] would be that
stress accumulates at a constant rate as plates move past each other. The
length of time it takes to accumulate enough stress for an earthquake at
the plane of contact would be proportional to the breaking strength of the
rock, and so would be the amount of energy released when the break occurs.
So this law is equivalent to saying that the breaking strength of rocks
along faults is evenly distributed from low to high. Is that more or less
how seismologists explain this law?

So if the opposing faces have a high proportion of weaker rock on either or both sides and a low proportion of stronger rock, the likelihood/frequency of slippage is higher but the magnitude of each event is lower; conversely, when strong rock faces strong rock over more of the fault, slippage is resisted, but when it does occur the event is of higher magnitude.

There is a striking parallel to the erosion of coastlines as discussed earlier.

Natural selection has some decided drawbacks as an explanation of
evolution. [...] Try setting up an E. coli demo in the
manner of natural selection.

One thing missing in the demo is variation in the population. The agents here are all organized alike.

It was the proposal by Malthus on population and famine that gave rise to the idea of selection (for Wallace as well as Darwin, I read in a recent book review). Population expands beyond the food supply, some starve and die, others survive. Rather, some procreate and others fail to -- e.g. perhaps failing to defend their young from being eaten. Procreation is missing from the E. coli analog.

Insofar as procreation is a cooperative endeavor, selection favors mutually supportive arrangements among individuals. This kind of variation is also absent from the E. coli demo, though perhaps not from the E. coli analog. Are there among 'free-ranging' microorganisms precursors of the slime molds and multicellular creatures? There's just a tremendous amount of research out there now on cell-cell communication.

         /Bruce Nevin
