Perception and PCT

Rick,
Take a look at:
http://en.wikipedia.org/wiki/Shannon_entropy

David

···

----- Original Message ----- From: "Richard Marken" <rsmarken@GMAIL.COM>
To: <CSGNET@LISTSERV.UIUC.EDU>
Sent: Friday, April 11, 2008 12:35 PM
Subject: Re: Perception and PCT

[From Rick Marken (2008.04.11.0940)]

Bill Powers (2008.04.11.0759 MDT)

>Rick Marken (2008.04.10.2310) --

> It seems like whatever I find, it makes Martin's point.

I wish you would stop making snide remarks about Martin.

I don't see what's snide about that remark. But I'll wish for me to
stop making them, too. Though I think it's part of my charm.

> "The perceptual signal is perfectly correlated with qi when the noise
> amplitude is zero. The quality of control, measured by S, is nearly
> perfect (S = ~1.0). When noise is added the correlation between p and
> qi goes down in proportion to the amplitude of the noise. However,
> adding noise that brings the correlation between p and qi down to
> nearly zero hardly affects the quality of control at all (S ~ .98).

I find that very hard to understand. Without the noise, p = r and is
constant. So control looks perfect. But when you add internal high-frequency
noise to p without directly affecting qi, p begins to vary right along with
the (internal) disturbance... Control of qi relative to the same internal
disturbance would be far better... I guess I don't understand how you set
up the situation.

I think you understand the setup. What was unclear, I think, was what
I meant by control of p and qi. When the high frequency (broad band)
noise was added to p, the variance of p was, indeed, greater than the
variance of qi (which is 0.0). However, control of p was measured as

S = 1 - sqrt(var(p)/[var(d)+var(o)])

where d is the narrow band disturbance to qi and o is the output
effect on qi. The value of S is nearly perfect (1.0 is perfect and I
get S values of .98 or more) when low amplitude broad band noise is
added to p, even though the correlation between qi and p goes down to
near 0.0.
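
For anyone who wants to play with this, here is a minimal sketch in
Python of the kind of loop I'm describing. The gain, disturbance and
noise values are illustrative choices of mine, not the constants from
my original simulation, so the exact numbers will differ; the point is
the qualitative pattern: S stays high for both p and qi while the
correlation between them drops.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
gain = 0.01        # integrating output: per-step gain (illustrative)
noise_amp = 0.07   # broad-band noise added to p only (illustrative)

t = np.arange(n)
d = np.sin(2 * np.pi * t / 40_000)   # slow, narrow-band disturbance

qi = np.zeros(n)   # input quantity (controlled variable)
p = np.zeros(n)    # perceptual signal
o = np.zeros(n)    # output quantity
for i in range(1, n):
    qi[i] = o[i - 1] + d[i]                  # qi = output effect + disturbance
    p[i] = qi[i] + noise_amp * rng.normal()  # perception = qi + internal noise
    o[i] = o[i - 1] - gain * p[i]            # reference r = 0, so error = -p

denom = d.var() + o.var()
S_p = 1 - np.sqrt(p.var() / denom)
S_qi = 1 - np.sqrt(qi.var() / denom)
r = np.corrcoef(p, qi)[0, 1]
print(f"S(p) = {S_p:.3f}  S(qi) = {S_qi:.3f}  corr(p, qi) = {r:.3f}")
```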

Of course, the value of S for qi -- S = 1 -
sqrt(var(qi)/[var(d)+var(o)]) -- is higher than for p (1.0 compared to
~.98) but the point of the demo (from my perspective) was to show that
control (measured in terms of stability of qi or p) can be quite good
even when there is no correlation between qi and p. It seems to me
that this demo proves that accurate information about qi (which would
be a high correlation between qi and p) is not needed for control.

"Information" is a technical term in Martin's argument, and the only way to
draw conclusions is to calculate the information content according to the
definitions. We should wait for Martin to do that.

I was going to suggest, in my original description of the simulation,
that Martin calculate the information transmitted about qi when the
correlation between qi and p is 0. I can't remember the exact formula
for transmitted information (which Shannon called H) but I think it
was something like:

H = Sum [log2 (Pr(p|qi))]

That is, information is, technically, a log (base 2) function of the
conditional probability of getting particular values of p given
particular values of qi summed over the range of possible qi (the
messages). I can't see how H can come out to be anything other than 0
when the correlation between qi and p is 0. But I agree that we should
wait for Martin to do that. In the meantime I'll try to find the
information transmission measure myself and see how much information
is actually transmitted in this simulation. If anyone out there
happens to know offhand Shannon's formula for H, please let me know
and I'll compute it in my simulation.

Best regards

Rick
--
Richard S. Marken PhD
rsmarken@gmail.com

[From Rick Marken (2008.04.11.2200)]

Rick,
Take a look at:
http://en.wikipedia.org/wiki/Shannon_entropy
David

Thanks, David. I actually found it after I posted my formula from memory.

I didn’t do too badly. I wrote:

H = Sum [log2 (Pr(p|qi))]

The real formula for the information content (entropy) of a source is

H(X) = E(I(X)) = - Sum [ p(x_i) * log2 p(x_i) ]

So I was off by a minus sign and a multiplication.
The problem is that I can’t find the formula for measuring information transmission (which is what I think this is about; Martin is saying that information about qi is transmitted to the control system in (or as) p even when the correlation between qi and p is 0.0). And I don’t know how to use the continuous variables, qi and p, in the information formula anyway. So I’m afraid that Wikipedia won’t help us this time; we really need Martin to explain it.
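
In the meantime, I believe the relevant quantity is what Shannon called
the rate of transmission, nowadays usually called mutual information:
I(qi;p) = H(p) - H(p|qi). The standard trick for continuous variables
is to discretize them into bins first. Here is a sketch of my own (not
Martin's analysis): the bin count is arbitrary, and binned estimates are
biased upward with finite samples, so a small positive value shouldn't
be over-interpreted.

```python
import numpy as np

def mutual_information(x, y, bins=30):
    """Estimate I(X;Y) in bits by discretizing two continuous signals."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                        # joint probability table
    px = pxy.sum(axis=1, keepdims=True)     # marginal of x (column)
    py = pxy.sum(axis=0, keepdims=True)     # marginal of y (row)
    nz = pxy > 0                            # skip empty cells: 0*log(0) = 0
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

# e.g., with the qi and p arrays from the simulation sketch earlier:
# print(mutual_information(qi, p))
```
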
Best

Rick

···

On Fri, Apr 11, 2008 at 9:38 PM, davidmg davidmg@verizon.net wrote:

Richard S. Marken PhD
rsmarken@gmail.com

On Fri, Apr 11, 2008 at 10:38 PM, Gavin Ritz garritz@xtra.co.nz wrote:

Try John Pierce’s “An Introduction to Information Theory: Symbols,
Signals and Noise”.

Number of bits per second, C:

C = W log2(1 + P/N)


[From Rick Marken (2008.04.12.0900)]

···

On Fri, Apr 11, 2008 at 10:38 PM, Gavin Ritz garritz@xtra.co.nz wrote:

Try John Pierce’s “An Introduction to Information Theory: Symbols,
Signals and Noise”.

Number of bits per second, C:

C = W log2(1 + P/N)

Is C a measure of information transmitted? I’m interested in measuring the amount of information in the environmental variable, qi, that is transmitted to the perceptual variable, p, when the correlation between qi and p is 0.0. If C is a measure of information transmission (as I presume it is), then what are the variables W, P and N, and how do they correspond to qi and p? Again, I think this is what we need Martin for.

Actually, I looked it up; W is bandwidth, P is signal power and N is noise power. In my naiveté, I would be inclined to say that P/N = 0 since, with a correlation of 0.0 between qi and p, the signal-to-noise ratio (with qi as the signal) must be 0.0. So whatever W is, log2(1+0) = 0 and C = 0.
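
Just to make that arithmetic concrete, here is a quick sketch with the
log taken base 2; the bandwidth and the P/N values are illustrative
numbers of mine, nothing measured from the simulation:

```python
import numpy as np

def capacity(W, P, N):
    """Shannon-Hartley capacity in bits per second: C = W*log2(1 + P/N)."""
    return W * np.log2(1 + P / N)

W = 100.0  # bandwidth in Hz (illustrative)
for ratio in (10.0, 1.0, 0.1, 0.0):          # P/N ratios
    print(f"P/N = {ratio:4}: C = {capacity(W, ratio, 1.0):7.1f} bits/s")
```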

So I would conclude that control can occur when 0 bits of information/sec are being transmitted from qi to p. In other words, control does not require that the system have any information about qi. I don’t think this analysis would be acceptable to an information theorist, suggesting, as it does, the irrelevance of information theory to control. So I’ll wait until Martin provides the proper analysis.

Best

Rick

Richard S. Marken PhD
rsmarken@gmail.com

[From Bill Powers (2008.04.12.1900 MDT)]

Rick Marken (2008.04.12.0900) --

Actually, I looked it up; W is bandwidth, P is signal power and N is noise power. In my naiveté, I would be inclined to say that P/N = 0 since, with a correlation of 0.0 between qi and p, the signal-to-noise ratio (with qi as the signal) must be 0.0. So whatever W is, log2(1+0) = 0 and C = 0.

If you're adding an internal random noise N to the variations in the input quantity with a perceptual input function of unity, the signal to noise ratio would be the amplitude of the variations in qi divided by the mean amplitude of the noise, wouldn't it? There are tricky calculations I'm leaving out for converting amplitude variations to power variations, but that's not critical.

You'd have to add an infinite amount of noise to get the information transmission, and also the correlation, down to zero, I think.
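
One way to see this, treating p = qi + n with the noise n independent of qi (and ignoring, for the moment, the fact that in the closed loop qi itself becomes partly noise-driven):

corr(p, qi) = cov(qi + n, qi) / [sd(p) * sd(qi)] = sd(qi) / sqrt(var(qi) + var(n)) = 1 / sqrt(1 + var(n)/var(qi))

which reaches zero only in the limit as var(n) goes to infinity. For any finite noise amplitude the correlation is small but not exactly zero.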

I think, also, that it's necessary to solve the control-system equations and work out the behavior of each of the variables before one can even start an information-theory analysis. So information theory is simply not fundamental to the way a control system works. If it were, you could use information theory to deduce the values of all the variables in the loop, and I've never seen that done.

Some people obviously love information theory, and I wouldn't want to deprive them of it. But I don't have any plans to use it in the foreseeable future. I haven't seen a need for it yet. As far as I know, control-system engineers don't use it either, except maybe when someone asks them how many bits per second a control system can detect, or produce, or whatever. Then most of them, I would guess, would have to go find an information theorist to do the calculations.

I will happily leave this subject to Martin when he gets back.

Best,

Bill P.

Gavin Ritz (2008.04.13.14.14)

[From Bill Powers (2008.04.12.1900 MDT)]

Rick Marken (2008.04.12.0900) --

I agree with Bill: I can't see how one can use information theory in any
meaningful way, and I've looked at it from many angles. If one is designing
telecommunication systems, that's a different story.


[From Rick Marken (2008.04.12.1940)]

Gavin Ritz (2008.04.13.14.14)

Bill Powers (2008.04.12.1900 MDT)

>Rick Marken (2008.04.12.0900) --

I agree with Bill: I can't see how one can use information theory in any
meaningful way, and I've looked at it from many angles. If one is designing
telecommunication systems, that's a different story.

So the three of us agree. I shouldn't have even tried to apply the
information transmission formula that you sent. We should leave the
information theory analysis to Martin Taylor when he gets back.

I'm not as sanguine about information theory as Bill is, by the way. I
don't think information theory is simply useless, at least in terms of
using control theory as the basis of understanding human nature. While
I think it's obviously possible to understand the basics of PCT even
if one loves information theory (as is the case with Martin Taylor), I
think the input-output concepts of information theory can get in the
way of understanding PCT in some areas. But I agree that if someone
loves information theory, I'm not going to try to convince them that
it's a worthless hussy. ;-)

Best

Rick

···

--
Richard S. Marken PhD
rsmarken@gmail.com

(Gavin Ritz 2008.04.13.15.57NZT)

[From Rick Marken (2008.04.12.1940)]


I don't think it's a bad theory; it's just that I have never been able to
reconcile it with what I'm interested in, and that's organisational behaviour.