Sums of variables

[Martin Taylor 970429 13:45]

Rick Marken (970429.0800 PDT)

Martin Taylor (970429 10:30) --

Me:

> If p = f(o+d) then there is no information about d in p whether d is
> thought of as the disturbance variable (d) itself or the influence of
> that variable (what you now call the disturbance signal (ds)) .

Ye:

>That's wrong.

More teasing? I thought we already agreed that the only sense in which
there is information about d (or ds) in p is as follows:

> Given p, f() and o, an outside observer can solve for d; but the control
> system itself, which knows only p, cannot.

No teasing. And "we" didn't agree what you say. It is the other way around.
If you _can_ reconstitute d, given p and f() (not o), then p has to convey
information about d. Nothing is said about the case when you can't do the
reconstruction. You seem still to be thinking that if you gain _any_
information about variable X, you can reconstruct X exactly ("solve for X").
That's not so.
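
To make the distinction concrete, here is a minimal Python sketch, assuming
for illustration an invertible f (f(x) = 2x) and particular values of o and d;
none of these particulars come from the original exchange. The outside
observer, knowing p, f() and o, recovers d exactly; a system knowing only p
cannot.

    def f(x):              # assumed output-to-perception function, invertible for the sketch
        return 2.0 * x

    def f_inv(y):
        return y / 2.0

    o, d = 3.0, -1.2       # illustrative values of output and disturbance
    p = f(o + d)           # the perceptual signal

    print(f_inv(p) - o)    # outside observer solves for d exactly: -1.2
    # Knowing only p, any pair (o', d') with o' + d' = f_inv(p) is equally
    # consistent, so d cannot be solved for; yet p can still carry partial
    # information about d, which is the point at issue.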

Because you liked Richard Kennaway's postings so well, I refer you to one:
+Richard Kennaway (970421.1610 BST):

+The square of the correlation is the proportion of variation in R
+"explained" (I really can't bear to drop the scare quotes from that word in
+the context of analysis of variance) by P. So if P = R + N (where N =
+noise), and R and N are independent, then Var P = Var R + Var N

+
+If P, R, and N are normally distributed (maybe if they aren't, I haven't
+checked), and c is the correlation between P and R, then c-squared is (I'm
+hoping this is true) Var R/Var P.

If X = Y + Z, and Y and Z are uncorrelated, then VarX = VarY + VarZ.

According to Richard, the correlation between X and Y is sqrt(VarY/VarX),
i.e. sqrt(VarY/(VarY + VarZ)), and the mutual information between X and Y
is 0.5*log2(1 + VarY/VarZ).
(Always assuming the algebraic half--or less--of my brain is in working order).
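
As a check on those relations, here is a short Monte Carlo sketch (Python
with numpy; the variances 4 and 1 are arbitrary choices for the illustration):
the sample variance of X comes out near VarY + VarZ, the X-Y correlation near
sqrt(VarY/VarX), and the mutual information near 0.5*log2(1 + VarY/VarZ) bits.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 1_000_000
    var_y, var_z = 4.0, 1.0                      # arbitrary variances for the sketch

    y = rng.normal(0.0, np.sqrt(var_y), n)
    z = rng.normal(0.0, np.sqrt(var_z), n)
    x = y + z

    print(x.var(), var_y + var_z)                # Var X = Var Y + Var Z
    c = np.corrcoef(x, y)[0, 1]
    print(c, np.sqrt(var_y / (var_y + var_z)))   # correlation = sqrt(Var Y / Var X)
    print(-0.5 * np.log2(1 - c**2),              # Gaussian mutual information from c...
          0.5 * np.log2(1 + var_y / var_z))      # ...equals 0.5*log2(1 + VarY/VarZ) bits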

Of course, in the closed-loop situation, o and d are not uncorrelated, but
all I'm trying to show here is that it is wrong to say that if one variable
is the sum of two others, then measurement of the sum provides no information
about the value of either component.

> Are you just saying that I'm wrong about there _not_ being information
> about d in p because it is possible for an outside observer to solve for
> d given p and f() and o?

No, as you see. In my example, it is impossible to solve for Y or Z given
only X, but nevertheless, information from both is "in" X.
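
A sketch of that claim under the same Gaussian assumptions (Python with
numpy; the values are again arbitrary): knowing X does not determine Y, but
it narrows the spread of Y from VarY to roughly VarY*VarZ/(VarY + VarZ),
which is what it means for information about Y to be "in" X.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 2_000_000
    y = rng.normal(0.0, 2.0, n)            # Var Y = 4
    z = rng.normal(0.0, 1.0, n)            # Var Z = 1
    x = y + z

    near = np.abs(x - 3.0) < 0.05          # samples where X falls near one observed value
    print(y.var())                         # uncertainty about Y before seeing X: about 4
    print(y[near].var())                   # after seeing X: about 4*1/(4+1) = 0.8
    # Y is not determined by X (its spread is still nonzero), but the spread
    # shrinks, so X carries information about Y without allowing "solve for Y".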

H(X) = H(X:Y) + Hy(X) (where the small y should be a subscript "Y").

H(X:Y) is the information in X about Y (or vice-versa). Hy(X) is the
uncertainty remaining in X after you measure Y, weighted by the
probability (or probability density) of getting each particular value
of Y. This is a general statement, having nothing to do with assumptions
about Gaussian distributions and the like.
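
To illustrate that generality, here is a small discrete check (Python with
numpy; the joint distribution is an arbitrary one made up for the sketch):
computing the mutual information and the conditional entropy directly from a
joint p(x,y) table reproduces H(X) = H(X:Y) + Hy(X) with no distributional
assumptions.

    import numpy as np

    # Arbitrary joint distribution p(x, y) for the sketch (rows: x, columns: y).
    pxy = np.array([[0.10, 0.25, 0.05],
                    [0.20, 0.05, 0.35]])
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)

    def H(p):                                             # entropy in bits
        p = p[p > 0]
        return -(p * np.log2(p)).sum()

    I_XY = (pxy * np.log2(pxy / np.outer(px, py))).sum()  # H(X:Y), the mutual information
    Hy_X = H(pxy.ravel()) - H(py)                         # Hy(X) = H(X,Y) - H(Y)

    print(H(px), I_XY + Hy_X)                             # the two agree: H(X) = H(X:Y) + Hy(X)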

Martin