[Martin Taylor 950111 15:20]

Bill Powers (950110.1115 MST)

Sorry to send two replies to the same message, but the themes are quite
different.

For all practical purposes, the perceptual signal at time t can be
treated as if it is a cause of the perceptual signal at time t.

Change "all" practical purposes to "many" practical purposes, and I'm with
you. But sometimes the difference is critical, and it is always critical
when you are treating the question of the bandwidth of disturbance over
which good control can be maintained. It is because I am interested in
that aspect of control that I try to make sure that the existence of
the loop delay is kept in mind.
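The bandwidth point can be made concrete with a toy simulation (my own sketch, not anything from the original exchange, and all parameter values are arbitrary): an integrating controller that acts on a delayed perception controls well against a slow sinusoidal disturbance, but a disturbance well above the loop's bandwidth passes through almost uncorrected.

```python
import math

def rms_error_ratio(freq_hz, delay_steps=10, gain=0.05, dt=0.01, steps=5000):
    """Simulate p[t] = d[t] + o[t], where the integrating output o is
    driven by the perception from delay_steps ago (reference = 0).
    Returns RMS(p)/RMS(d) over the second half of the run: near 0 means
    good control, near 1 means the disturbance is essentially uncorrected."""
    o = 0.0
    p_hist = [0.0] * (delay_steps + 1)   # buffer of past perceptions
    err_sq = d_sq = 0.0
    for t in range(steps):
        d = math.sin(2 * math.pi * freq_hz * t * dt)
        p = d + o
        p_hist.append(p)
        o += gain * (0.0 - p_hist[-1 - delay_steps])  # act on old perception
        if t >= steps // 2:                            # skip the transient
            err_sq += p * p
            d_sq += d * d
    return math.sqrt(err_sq / d_sq)

slow = rms_error_ratio(0.2)   # well inside the control bandwidth
fast = rms_error_ratio(5.0)   # well outside it
```

With these (illustrative) numbers the slow disturbance is attenuated severalfold while the fast one gets through nearly at full amplitude; shrinking the delay or raising the gain moves the boundary between the two regimes.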

While what you say is technically correct, it is a strategic error to
emphasize this aspect of a control system. One of the biggest problems we
have in dealing with people who have tried to think about closed-loop
effects without using control theory is that they want to try to deal
with input, output, and feedback processes sequentially, as if only one
of them were occurring at any given instant.

Understood. It may often be useful to simplify drastically in order to
make a point, especially when the point in question is known to be often
misunderstood.

I think it is better to ignore the lags and use simple algebra or
differential equations to show how a control system works. This gives
the correct picture of inputs following reference signals and outputs
opposing disturbances.

Yes, I do it myself, in the initial stages of introducing people to PCT.
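That simple algebra can be spelled out in a few lines (my own illustration, with arbitrary numbers). Solving the two static loop equations p = d + o and o = G(r - p) simultaneously gives p = (d + Gr)/(1 + G), so with high gain the input p tracks the reference r while the output o cancels the disturbance's effect:

```python
def loop_equilibrium(d, r, G):
    """Static closed-loop solution of p = d + o and o = G*(r - p)."""
    p = (d + G * r) / (1 + G)   # input (perceptual signal)
    o = G * (r - p)             # output
    return p, o

p, o = loop_equilibrium(d=3.0, r=1.0, G=1000.0)
# With G large, p is close to r (= 1.0) and o is close to r - d (= -2.0):
# the input follows the reference and the output opposes the disturbance.
```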

When you are dealing with the entropy of a physical system, it can be
treated in terms of the information obtainable about THAT system (not
another), whether the system be open or closed.

I had understood that the system that does the obtaining of the
information supposedly has its own entropy (relative to the source).

I'm not sure what you mean by "relative to the source," but there is a
sense in which you would be correct. One part of the entropy of a system
that contains some kind of model (in Hans Blom's sense) of another system
can be attributed to that model. When the uncertainty associated with that
model is reduced, then that component of the entropy of the modelling system
is reduced. But normally that would be accompanied by a more than compensating
increase in entropy due to the irreversible processes involved in the
physical events. One can't say for sure one way or the other whether
this would happen in any particular case, without analyzing the situation.

My comment concerned only the popular notion that when a transmitter
sends information to a receiver (defined appropriately to the receiver),
the result is a decrease in entropy in the receiver, and that this has
something to do with decreasing physical entropy in the receiving system.

I didn't know this was a popular notion.

My point about the relation of physical entropy and information came from
a quite different tack, which is that the physical entropy can be treated
as being closely related to the amount of information obtainable about
a system under defined conditions of observation. It was unrelated to
notions about what happens to physical entropy when a message goes from
one place to another under undefined conditions. The point I tried to make
was that you shouldn't treat the information about one thing as if it
might be related to the entropy of something quite different.
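To give "the amount of information obtainable about a system" a quantitative face (my own sketch, not part of the original exchange): the Shannon entropy of a probability distribution over a system's states is the average information, in bits, that one observation can yield, and Boltzmann's S = k ln W is just k ln 2 times the Shannon entropy of a uniform distribution over W microstates.

```python
import math

def shannon_entropy(probs):
    """Average information (in bits) obtainable per observation of a
    system whose states occur with the given probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

uniform = shannon_entropy([0.25] * 4)            # 4 equally likely states: 2 bits
skewed = shannon_entropy([0.7, 0.1, 0.1, 0.1])   # more predictable: fewer bits
```

The "defined conditions of observation" matter because they fix which distribution over states the observer is entitled to use, and hence how much information an observation can deliver.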

My example was intended to show
that the direction of information transmission is independent of the
direction of energy transmission.

Yes, we've agreed that this is the case, several times. It is true. And
if you think that there are people reading CSG-L who think the contrary,
perhaps it is worth repeating from time to time.


I'm not sure about the term "inheritance" hierarchy. If you have a
system with n levels, and add level n+1 to it as a physically distinct
system, what "inherits" what?

What gets inherited in object-oriented programming is a set of properties
and processes that treat those properties. A low-level object may have
only a few attributes (e.g. a "person" object might be defined to have
properties such as "height" "weight" and "bank account") and methods of
affecting those properties or of reporting them to the world outside the
object. A next-level object has those properties (although they may be
redefined) and processes (possibly reprogrammed), as well as others (e.g.
a "social person" object might have properties "sex" "generosity" etc.,
and the "bank account" object that is one of the "person" attributes
might also have descendants "chequing account" and "savings account" that
have added properties, all of which are accessible by the "social person"
class of object, but differently from the way the "person" object would
access the "bank account" attribute). A long bracket, but I hope it conveys
the idea.
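The long bracket can be rendered as a code sketch (Python purely for illustration; the class and attribute names are the hypothetical ones from the example above, not any real program):

```python
class BankAccount:
    """A low-level object: a property plus methods that treat it."""
    def __init__(self, balance=0.0):
        self.balance = balance

    def deposit(self, amount):
        self.balance += amount

    def report(self):
        return f"balance={self.balance:.2f}"

class ChequingAccount(BankAccount):
    """Descendant: inherits balance and deposit, adds a property."""
    def __init__(self, balance=0.0, overdraft_limit=100.0):
        super().__init__(balance)
        self.overdraft_limit = overdraft_limit

class SavingsAccount(BankAccount):
    """Descendant: adds a property and redefines (reprograms) report."""
    def __init__(self, balance=0.0, rate=0.03):
        super().__init__(balance)
        self.rate = rate

    def report(self):
        return f"balance={self.balance:.2f} at {self.rate:.0%}"

class Person:
    """Has the "height", "weight", and "bank account" attributes."""
    def __init__(self, height, weight, account):
        self.height = height
        self.weight = weight
        self.account = account

class SocialPerson(Person):
    """Inherits Person's properties and methods, adds "sex" and
    "generosity", plus its own route to the inherited account."""
    def __init__(self, height, weight, account, sex, generosity):
        super().__init__(height, weight, account)
        self.sex = sex
        self.generosity = generosity

    def donate(self, amount):
        # Accesses the inherited "bank account" attribute in a way
        # the plain Person object does not.
        self.account.deposit(-amount * self.generosity)
```

Any BankAccount descendant can be handed to either class of person; the SocialPerson reaches the inherited account through donate, a route the Person class does not have, which is the sense in which the two classes access the same attribute differently.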

The object-oriented "inheritance" hierarchy was not brought up as a suggestion
that the control hierarchy is the same thing. I was only suggesting that
different kinds of notions of hierarchy can co-exist and be useful, even
in reference to the same set of objects.