[Martin Taylor 960314 11:10]
Bill Powers (960314.0100 MST) (and Peter Cariani 960313 1100, below)
Martin Taylor 960312 0950 --
Yes, provided you substitute "representation of an algorithm" for
"algorithm."

No, that's a substitution I don't wish to make. An algorithm, broadly
speaking, is a mathematical form. Nature's behavior can be idealized or
approximated by mathematical forms, but it isn't represented by them.
So, as I said, any conflict between us is simply in the usage of terms.
You insist on sustaining a conflict situation, whereas I was trying to
eliminate it by pointing out that when you define "algorithm" as something
in a mind, then it is something in a mind. When the word is used to mean
a process that delivers consistent results from consistent data, then it
is something in a world that may be perceived by a mind.
I don't care which way you want to use the word, so long as you don't insist
that Dennett uses it that way and then castigate him for obvious foolishness
in doing so.
As I, and I think Dennett, use the term, an "algorithmic process"
means a process that delivers consistent results if it starts
from the same state and is given the same data over and over.

I believe that this view of natural processes has been superseded since
the chaotic nature of the world was recognized, and here I refer only to
the weak form of chaos, the Butterfly Effect.
A "chaotic" process is different from a noisy process. A chaotic process
will always give the same results, when provided with the same starting
state and initial conditions. In that sense, a chaotic process is algorithmic.
One feature of the kinds of chaotic process usually observed in the world
is that they often display a sort of near-periodic behaviour. The Lorenz
attractor, for example, looks a bit like a pair of spectacle frames joined
at the bridge. The state of the system circles around one of the "eyes" for
a long time, looking as if it were an almost periodic oscillator, and then
"without warning" unless you are measuring very carefully, it shifts into
circling around the other "eye," looking like a very different almost
periodic oscillator. The Lorenz system is defined by a quite simple
mathematical expression, and is clearly algorithmic, even in the sense that
you want to give the word.
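To make that concrete, here is a minimal sketch (in Python, using a crude
Euler integration with made-up step sizes; it is only an illustration, not a
claim about how anything in a head works). Run twice from the same initial
state, the Lorenz system reproduces exactly the same trajectory, even though
it hops between the two "eyes" in an irregular-looking way:

# Sketch only: the Lorenz system is deterministic -- the same initial
# state always yields the same trajectory -- even though it wanders
# "without warning" between the two lobes of the attractor.

def lorenz_trajectory(x, y, z, steps=50000, dt=0.001,
                      sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Crude Euler integration of the Lorenz equations."""
    lobes = []
    for _ in range(steps):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dx * dt, y + dy * dt, z + dz * dt
        lobes.append(1 if x > 0 else 0)   # which "eye" we are circling
    return (x, y, z), lobes

final1, lobes1 = lorenz_trajectory(1.0, 1.0, 1.0)
final2, lobes2 = lorenz_trajectory(1.0, 1.0, 1.0)
print(final1 == final2)   # True: same start, same result
print(lobes1[::5000])     # typically shows irregular switching between lobes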
"Chaos" has nothing to do with the problem. I argued against Penrose's
thesis on quite different grounds. An algorithmic system is defined in a
world that is closed to outside influences as soon as it has been supplied
with its initial data. One can extend this to a feedthrough system such as
a filter, because subsequent data have no influence on the downstream effects
of earlier data. A feedthrough filter can be seen as an algorithmic system
even when it is subjected to influences (a data stream) from outside itself.
A feedback system cannot be treated as an algorithmic system unless it is
isolated from the outer world. In a feedback system subjected to external
disturbances, the effects of old influences depend not only on the process
but also on newer external influences. That's why I said that the parts of
a control system can be treated as algorithmic processes (using my sense of
the term because I know of no other word to describe what I said it meant),
whereas the control system as a whole cannot be treated as algorithmic when
there are external disturbances.
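A toy illustration of the distinction (my own sketch, with arbitrary
coefficients, not a model of anything in particular): measure the downstream
effect of one early input by differencing a run that includes it against a
run that omits it, with the later inputs held the same, and then change the
later inputs. For the feedthrough filter the effect trace is unchanged; for a
feedback loop containing a saturating output element it is not.

# Sketch only.  "Effect of the early input" is measured by differencing
# runs with and without it, holding later inputs fixed, then repeating
# with different later inputs.

def fir(inputs, taps=(0.5, 0.3, 0.2)):
    """Feedthrough filter: each input's downstream effect is fixed."""
    hist = [0.0] * len(taps)
    out = []
    for u in inputs:
        hist = [u] + hist[:-1]
        out.append(sum(t * h for t, h in zip(taps, hist)))
    return out

def loop(disturbances, gain=5.0, limit=1.0):
    """Feedback loop with a saturating output; reference is zero."""
    out, trace = 0.0, []
    for d in disturbances:
        p = out + d                          # controlled perception
        out += 0.1 * gain * (0.0 - p)        # integrating output function
        out = max(-limit, min(limit, out))   # saturation nonlinearity
        trace.append(p)
    return trace

def effect_of_early_pulse(system, later):
    with_pulse = system([1.0] + later)
    without = system([0.0] + later)
    return [round(a - b, 4) for a, b in zip(with_pulse, without)]

print(effect_of_early_pulse(fir,  [0.0, 0.0, 0.0]))
print(effect_of_early_pulse(fir,  [3.0, -3.0, 3.0]))  # same effect trace
print(effect_of_early_pulse(loop, [0.0, 0.0, 0.0]))
print(effect_of_early_pulse(loop, [3.0, -3.0, 3.0]))  # effect now differs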
----------------
I take it you also don't like the word "represented" for the relation
between the world and a perception of the world. Would you care to suggest
a word that suits you? Or would agreement on such a word deprive you of
the opportunity for further "red herring" conflicts?
----------------
Peter Cariani, 960313, 1100
You seem to suggest that a process that produces as its result the
continuous sum of two continuous variables is _not_ an algorithmic process.
You don't mean that, do you?

I do mean that, but one needs to think hard about what it means to
have an "effective procedure" for computing the value of the
continuous sum of two continuous variables.
We seem to have a related problem here. You, like Bill, deal with "algorithm"
only in the sense of a simulation of what is going on in the outer world.
As I said to Bill, I don't object to anyone using their own definitions of
words, provided everyone knows what is meant. I said what I meant, and now
you have done the same.
When you are dealing in representations, you have to be concerned with
fidelity, and obviously within the limits of any measuring system, there
can be a discrete system that is so faithful to the continuous "real?" system
that no difference can be measured or detected.
There's a conceptual difference, I think, between defining discrete procedures
that identify a labelled real number in terms of other labelled real numbers
("integers"), and defining simulations of real-world processes. The first
relates to writing down the digits of "pi" or "e", the second to simulating
the behaviour of something that sums (as might be the case for the acceleration
of an object pushed from two opposite sides).
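Here is a small illustration of the two senses (a sketch of my own, with
arbitrary numbers; the Leibniz series is just one convenient procedure among
many). The first piece is a finite, discrete procedure that identifies a
labelled real number to as many digits as you care to compute; the second
merely stands in, to finite precision, for a physical process that sums.

from decimal import Decimal, getcontext

# 1. Identifying "pi" in a discrete notation: any finite run of the
#    procedure yields only an approximation, never "pi" itself.
getcontext().prec = 50

def leibniz_pi(terms):
    """Crude series for pi; converges slowly, but it is a genuine
    finite, discrete procedure for successive approximations."""
    s = Decimal(0)
    for k in range(terms):
        s += Decimal((-1) ** k) / (2 * k + 1)
    return 4 * s

# 2. Simulating something that sums (e.g. the acceleration of an object
#    pushed from two opposite sides), represented only to machine precision.
def simulated_sum(a, b):
    return a + b          # a discrete stand-in for the continuous sum

print(leibniz_pi(1000))           # approximates pi, never reaches it
print(simulated_sum(0.1, 0.2))    # 0.30000000000000004 -- finite fidelity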
If you can't express it in a finite, discrete notation, then
the entity in question is, operationally speaking, not uniquely
distinguishable and exhaustively defined (i.e. it is ill-defined).
Therefore every real number except for a set of zero measure is ill-defined.
It is a pity that technically correct statements must be couched in language
that has pejorative connotations. Yes, it is ill-defined within the finite
discrete notation. But remember, "pi" is well defined in one discrete
notation, as "the ratio between the circumference of a circle on a plane
and its radius." That is an exact definition of "pi". But "pi" is ill-defined
in a different discrete notation, as 3.14159.....
Let's be more concrete and say you have a device that has an
analog sensor A that produces a continuous voltage a and another
one B that produces b, and you have an element C that (you think)
sums them together to produce voltage c.
One (postulates that one) can describe the device by the equation
a + b = c, where a, b, and c are quantities that can (in one's mind) take on
a continuum of values.
OK, here's where we can make more precise the need for a word. On the
assumption that the device produces that same value c for given values
a and b (an assumption not testable by measurement), I term it an "algorithmic
process." You don't like the term to be used in this way, but instead you
want it to be used for the simulation of that device in a system that you
assert must be discrete (I'm not convinced the mind is such a device, but
let that pass).
I think one can say then that the device's behavior can be
approximated using a numerical algorithm (in the same way that a
differential equation is approximated by a numerical procedure), but not
that the device itself is performing an algorithm.
I'd agree that the device is not "performing an algorithm." I'd say that
it is "executing an algorithmic process." Or even that it is an "algorithmic
device."
I think we are bogging down in the semantics of "algorithm",
Agreed.
so I'll end
soon. It may be another semantics impasse, and I'm probably the only person
on earth who cares about these distinctions.
Apparently not.
If you can't express it in a finite, discrete notation,...
I'll argue that there is a reason for this, and it has to do with communication.
But insofar as the "finite, discrete notation" has to do with a real
world that may or may not be continuous, this "If you can't" statement is
an admission of failure in our ways of representing the world.
If "a" "b" and "c" are all allowed to represent real numbers, one can
_assert_ c = a + b in a discrete notation. One can't _evaluate_ "c" in a
discrete notation except with zero probability (when it happens to be a
rational fraction, a fractional polynomial, a named transcendental, or
some such).
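A sketch of the assert-versus-evaluate distinction (my illustration, done here
with sympy, which is only one convenient way to hold the assertion exactly):
one can assert c = a + b over symbols standing for real numbers, but any
evaluation in a finite discrete notation yields only a finite prefix of digits.

import sympy

a = sympy.sqrt(2)          # a "labelled" real number
b = sympy.pi               # another one
c = a + b                  # the assertion c = a + b, held exactly

print(c)                   # sqrt(2) + pi  -- exact, in a discrete notation
print(sympy.Eq(c, a + b))  # True: the assertion itself is decidable
print(c.evalf(40))         # 4.5558...    -- evaluation is only ever a prefix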
---------------------
[I should say that I also have great problems with the
Dedekind cut and the usual means by which the real numbers
are constructed. A continuum is not an infinite series of
individuated entities, it is lack of individuation.]
Yes, exactly. And this hearkens back to my problem with Bill's rejoinder.
Some time ago, a group of us were concerned with this issue as it
related to cognition (before my acquaintance with PCT). We were concerned
with the possible evolutions of a complex isolated system. We identified
six different kinds of evolution, which had six quite different sorts of
consequence, and only one of which permitted logical thought and was at
the same time realizable in a physical system. I hadn't intended to go
into this on CSGnet, ever, but why not?
All of the following deal with the relationship between the state of a system
(defined as the continuous values of N variables, where N is indefinitely
large) at time t0 and at some time t1, t1 - t0 = deltat seconds later.
1. Random evolution: For all values of deltat, the state at t1 is unpredictable
from the state at t0. No physical examples can exist.
2. Chaos type 1. No attractors, all states being equally achievable from all
other states. The deviation of the state at t1 from the state at t0 increases
with deltat until there is no remaining predictability. (I guess this is a
system with all Lyapunov exponents positive, but I'm not 100% sure
of this). Example: an ideal gas in a box with ideally reflecting surfaces.
3. Chaos type 2. Strange attractors. From any initial state at t0, the state
approaches a strange attractor more closely as t1 - t0 increases.
Predictability of _where_ on the strange attractor the state may be decreases
over time, but predictability of _how close to_ the strange attractor it is
increases over time. Example: the Lorenz system (which gave rise to the
idea of the Butterfly Effect).
4. "Semi-chaos". A term we invented, that has no mathematical validity
that I know of. The notion is that the state space of the system is divided
into subregions, each of which corresponds to a, possibly unique, attractor.
Any state within one of these regions at time t0 will be found closer to
the attractor for that region as deltat increases. There's no claim as to
whether the attractor is strange (i.e., whether the dynamics within any
individual region is chaotic or not). However, there is strong sensitivity
to initial conditions near the boundaries between the subregions of the
state space, states on one side of the boundary moving to one attractor,
and states on the other moving to the other attractor. Example: threshold
discriminations, instantaneous category judgments.
5. Logic. This is the most interesting, in that it involves a state space
subdivision like (4), but one in which the boundaries shift over time. It is
identical to (4) when we consider the evolution of an isolated system, but
not if the system is subjected to outside influences. Alone among the six
kinds, the Logic system is not algorithmic (in my sense). The same initial
state does _not_ always lead to the same attractor. In a Logic system,
if the state is influenced from outside to move from one subregion to another,
the boundary between the pre- and post-influence regions shifts, so that
to reverse the switch the state has to be influenced back beyond the point
at which the original switch occurred. Either way, in the absence of outside influence, the
system moves toward the attractor corresponding to its current subregion.
Example: a flip-flop, consonantal category judgment, most categorical
perception considered over time.
6. Death. Like 5, except that the attractors are fixed in the state space
so that if deltat is large enough, the state at t2 (>t1) is almost identical
to the state at t1 (in the absence of outside influence).
It may seem paradoxical that only the Logic system is non-algorithmic, but
I think it is necessary if logical (i.e. discrete) algorithms are to be
executed in a physical system, be it living or silicon-based. All physical
systems are subject to outside influences, so any system state near a boundary
could readily be pushed across it. Hysteresis prevents this problem,
allowing logical systems to operate even in an environment at 300K. If one
operated in a type 4 system, one would never be assured that the results
obtained from a logical operation could be duplicated if the same logical
operation were to be repeated. The cost is that logical representation
of the real world is of limited fidelity, the limit being intrinsic.
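A minimal sketch of the kind-5 ("Logic") idea, again my own illustration with
arbitrary numbers rather than anyone's model: a one-dimensional state with two
attractors and a switching boundary that moves after each switch, as in a
flip-flop or Schmitt trigger. Because of the hysteresis, the same small
outside jiggle does not always lead to the same attractor (history matters,
so the system is not algorithmic in my sense), yet noise near an old boundary
cannot flip the state back and forth.

class LogicElement:
    def __init__(self, threshold=0.4):
        self.attractor = -1.0          # current attractor (-1 or +1)
        self.threshold = threshold     # how far past centre a push must go

    def step(self, external_push):
        state = self.attractor + external_push
        # The switching boundary depends on which attractor we are in now:
        if self.attractor < 0 and state > +self.threshold:
            self.attractor = +1.0
        elif self.attractor > 0 and state < -self.threshold:
            self.attractor = -1.0
        return self.attractor

flip = LogicElement()
print(flip.step(0.3))     # -1.0: small push, state stays in the old basin
print(flip.step(1.5))     # +1.0: pushed well past the boundary, switches
print(flip.step(-0.3))    # +1.0: the same small jiggle no longer matters
print(flip.step(-0.3))    # +1.0: an analogue of noise at 300K -- still stable
print(flip.step(-1.5))    # -1.0: only a large push switches it back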
Martin