[Martin Taylor 2009.08.21.21.57]
(Gavin Ritz 2009.08.22.13.46NZT)
[Martin Taylor 2009.08.21.17.37]
Rick Marken (2009.08.20.1020)
Martin Taylor (2009.08.20.15.01)
Rick Marken (2009.08.20.0900)
I don’t fully get it, Martin, but anything living that has some degree of control also has limits (within itself and its environment) and is therefore entropy reducing, both exporting entropy (by limiting other life forms, through whatever means) and reducing it within itself.
Humans discovered this centuries ago when, to order society (themselves), they started creating limits (e.g. laws, customs, rights), all of which is entropy reducing. That is control, so maybe control and entropy are the same thing in different forms.
Regards
Gavin
I’m not clear what relation you see between limits and entropy. When
you mentioned limits as entropy reducing in an earlier message, I
thought you were referring to limit cycles, and convergence to an
attractor does indeed reduce Boltzmann entropy. Now you seem to be
talking about limits on the distributions of the values of different
perceptions. Certainly cutting the tails of uncertainty distributions
will reduce entropy, but only marginally – unless the limits are
severe. Control, on the other hand, reduces the breadth of the
uncertainty distribution of the controlled variable, whether there are
or are not limits on the permitted values of the variable. That means
significant and continuing entropy export from the controlled variable,
to balance that imported from the disturbance and the reference signal.
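To put rough numbers on that contrast, here is a small sketch of my own (not anything from your post), assuming the variable of interest is roughly Gaussian. It compares the entropy lost by imposing hard limits at three standard deviations with the entropy lost by narrowing the distribution tenfold, which is the sort of thing good control does:

  # Sketch: entropy reduction from hard limits versus from control.
  # Assumes a unit-variance Gaussian for the uncontrolled variable;
  # the +/- 3 sd limits and the tenfold narrowing are illustrative numbers.
  from scipy.stats import norm, truncnorm

  h_free       = float(norm(scale=1.0).entropy())   # uncontrolled, sd = 1
  h_limited    = float(truncnorm(-3, 3).entropy())  # same variable, tails cut at +/- 3 sd
  h_controlled = float(norm(scale=0.1).entropy())   # sd reduced tenfold by control

  print(f"free:       {h_free:.3f} nats")
  print(f"limited:    {h_limited:.3f} nats  (reduction of about 0.02 nats)")
  print(f"controlled: {h_controlled:.3f} nats  (reduction of about 2.3 nats)")

The truncation removes only about 0.02 nats, whereas narrowing the distribution by a factor of ten removes ln 10, about 2.3 nats, and the controller has to keep exporting entropy at that sort of rate for as long as the disturbance keeps injecting it.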
Maybe you can explain what you mean by the relation between entropy and
limits. It would be easier for me to understand if you could use the
Boltzmann conceptual background I sketched in my repost of a 1993-6
message [Martin Taylor 2009.08.18.14.41]. What are the macrostates and
microstates, and how does the imposition of limits affect them?
Martin