[Martin Taylor 950622 12:00]
Bill Powers (950621.1415 MDT) to Hans Blom (950621a)
When there would be no noise in the system, all errors would be
zero if there are no conflicts.

This isn't true in a negative feedback control system. If there are
systematic disturbances, there must be error to drive the output that
systematically opposes the disturbances. The error signal itself can be
small in comparison with the reference signal, but it must be amplified
so the result is sufficient output to handle the range of disturbances.
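
To put numbers on that fact before going further, here is a little simulation
sketch (the code, names, and parameter values are my own, purely for
illustration, not anything from Bill's or Hans's posts). A proportional loop
with gain G, a lagged output, and a constant disturbance D settles with a
small but non-zero error; that residue is exactly what keeps the opposing
output flowing.

# Illustrative only: a proportional negative-feedback loop with loop gain G,
# a lagged output, and a constant disturbance D added to the controlled
# variable.  All numbers are arbitrary choices of mine.

G = 100.0              # loop gain
D = 10.0               # constant disturbance
r = 50.0               # reference signal
dt, tau = 0.001, 1.0   # step size and output lag, chosen only for stability

p = o = e = 0.0
for _ in range(20_000):          # 20 seconds of simulated time
    e = r - p                    # error signal
    o += (G * e - o) * dt / tau  # output driven by the amplified error
    p = o + D                    # perception = output plus disturbance

print(f"residual error        : {e:.4f}")
print(f"predicted (r-D)/(1+G) : {(r - D) / (1 + G):.4f}")

The error settles near (r - D)/(1+G), about 0.4 against a reference of 50:
small relative to the reference, as Bill says, but amplified by G into an
output of about 40 that holds the perception near the reference in spite of
the disturbance. The disturbance's own share of the residue is the D/(1+G)
I quote below.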
This straightforward fact highlighted an issue that has been on the edge
of my consciousness for quite a long time: it concerns jitter.
Many control systems, including human ones, show jitter, which is a
non-systematic high-frequency deviation about the mean value of whatever
signal one examines. Various uses for jitter have been proposed, some
of which I heard as an undergraduate, some on CSG-L. But I have not heard
the following, and I am wondering whether those with practical experience
with either human or artificial control systems may know whether it is true.
First, what I hope are facts, then the speculation:
Without jitter, and in the absence of friction or dead zones, the perceptual
signal settles at a level that differs from the reference by an amount
D/(1+G). That is the "small" error signal to which Bill P. refers. With
jitter, the error signal deviates around this mean value and the deviation
is, as Bill has elsewhere noted, essentially uncorrelated with the disturbance
(as it must be if the jitter and the disturbance occur in different
frequency bands).
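
To check that statement numerically (again, everything here is invented for
illustration): let the disturbance be a slow sine wave, inject broadband
jitter directly into the error signal, and split the resulting error into a
slow band and the jitter-band remainder. The slow band mirrors the
disturbance; the jitter-band deviation about the mean should be essentially
uncorrelated with it.

import numpy as np

# Illustrative only: same loop as before, but with a slow sinusoidal
# disturbance and broadband jitter injected into the error signal.
rng = np.random.default_rng(0)

G, r = 100.0, 50.0
dt, tau = 0.001, 1.0
n = 400_000                                   # 400 seconds of simulated time
t = np.arange(n) * dt
d = 20.0 * np.sin(2 * np.pi * 0.02 * t)       # slow disturbance, 0.02 Hz
jitter = rng.standard_normal(n)               # broadband jitter, unit amplitude
tau_split = 0.5                               # split filter, roughly 0.3 Hz cutoff

p = o = e_low = 0.0
e_rec = np.empty(n)
e_low_rec = np.empty(n)
for i in range(n):
    e = r - p + jitter[i]                     # jittered error signal
    e_low += (e - e_low) * dt / tau_split     # slow ("controllable") band
    o += (G * e - o) * dt / tau               # output, as before
    p = o + d[i]
    e_rec[i], e_low_rec[i] = e, e_low

e_jit = e_rec - e_low_rec                     # jitter-band deviation about the mean
half = n // 2                                 # discard the start-up transient
print("corr(slow-band error, disturbance)  :",
      np.corrcoef(e_low_rec[half:], d[half:])[0, 1])
print("corr(jitter-band error, disturbance):",
      np.corrcoef(e_jit[half:], d[half:])[0, 1])

The first correlation should come out very close to -1 (the slow-band error
is just the disturbance, inverted and attenuated by 1+G); the second should
come out near zero, as expected when the jitter and the disturbance occupy
different frequency bands.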
Is it possible that the control system might in some way use the jitter
to affect its output so that the jittered error averages zero regardless
of the DC value of the disturbance?
Commentary:
If the speculation is correct, it would imply that the jitter occurs outside
the control bandwidth of the control loop, and does not affect the output
moment-by-moment. The error signal would have to be filtered at the output
stage into, say, "controllable" and "uncontrollable" frequency bands. The
"controllable" band is what is normally considered when dealing with the
operations of the control loop.
Suppose now that the "uncontrollable" jitter-based signal is low-pass
filtered or in some way averaged. Any deviation of it from zero is
independent of the disturbance, and could affect the output additively
until that jitter deviation approaches zero. The result would be that
the mean error of the perceptual signal would come to zero, or at least
much closer to zero than without the jitter.
I can see that even the jitter deviation average would still need to have
some small remanent error, but it seems to me that the incremental output
based on it could potentially increase the apparent system gain very greatly.
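
Here is one crude way the mechanism might be realized, to make the
speculation concrete (again all parameters are invented, and I have taken two
liberties: the averaging is applied to the whole jittered error rather than
to an explicitly separated jitter band, and the 1-second output lag is left
to do the job of keeping the jitter out of the moment-by-moment output). The
slowly averaged error, so long as it deviates from zero, keeps adjusting an
extra additive output term; when the average reaches zero, the extra term
stops changing.

import numpy as np

# Illustrative only: the fast loop of the earlier sketches, plus a slow path
# that averages the jittered error and accumulates that average into an
# additive output correction.
rng = np.random.default_rng(1)

G, D, r = 100.0, 10.0, 50.0       # loop gain, constant disturbance, reference
dt, tau = 0.001, 1.0              # step size, output lag
tau_avg = 5.0                     # slow averaging, well outside the loop bandwidth
k_slow = 2.0                      # rate of the additive correction
n = 600_000                       # 600 seconds of simulated time
jitter = rng.standard_normal(n)

p = o = e_avg = extra = 0.0
errs = np.empty(n)
for i in range(n):
    e = r - p + jitter[i]                  # jittered error signal
    o += (G * e - o) * dt / tau            # ordinary fast-loop output
    e_avg += (e - e_avg) * dt / tau_avg    # slow average of the jittered error
    extra += k_slow * e_avg * dt           # additive term; keeps changing until
                                           #   the averaged deviation nears zero
    p = o + extra + D
    errs[i] = e

print(f"mean error, fast loop alone : {(r - D) / (1 + G):.4f}")
print(f"mean error, final 100 s     : {errs[-100_000:].mean():.4f}")

With these numbers the mean error should drop from about 0.4 to the order of
a few thousandths: the apparent steady-state gain rises by roughly two orders
of magnitude without any change to the fast loop. Because the correction here
is a plain accumulation, the residue can in principle vanish entirely; a
leakier mechanism would retain the small remanent error mentioned above. In
effect the slow path acts as an averaged, integral-like correction operating
outside the control bandwidth; whether anything like this goes on in real
jittery control systems is exactly my question.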
Is this idea something quite stupid, quite well known, or untried?
Martin