A paper on causal analysis in the presence of control systems

CSGnet: see last couple of paragraphs.
Here is the link to Richard's article:
http://www2.cmp.uea.ac.uk/~jrk/temp/RK-20120430-causnoncorr.pdf

Hello, Richard --
Once again I am staggered by your ability to make yourself
understood. By rights I should not have understood a word of what you
wrote to your mathematical peers, but your meanings always shimmered
visibly just beneath the surface, and at times were crystal clear.
How do you do that?

A couple of random comments which may or may not correlate with anything.

First, to a control system there is no such thing as noise. There are
just variables that vary, sometimes too fast to keep up with and
sometimes in ways that allow maintaining good control. The control
system doesn't know or care about waveforms so either everything
looks random or nothing does. Noise is just a classification by an
observer of waveforms that he can't explain or predict in detail.

Second, causation is a meaningless term unless you simply mean that B
is a particular known function of A. As most people use the term,
causation is magic: B varies simply because A varied, and not because
of any intervening connection between them. The wizard waves his wand
(A) and causes the castle to disappear (B). The way statistics is
used in the life sciences mostly guarantees that causation is
magical: there is no attempt to analyze the situation and work out
what mechanisms are behind apparent causation. The whole point of
statistics is to find causation where you have no idea of the
mechanisms. If you knew the mechanisms, you wouldn't need the
statistics. As you show in your writings here, if you just know the
statistics, you don't know anything about the mechanisms, either.
Drop the idea of causation and those problems disappear, don't they?

By the way, Allie and I would both sort of like to know if anyone
other than the two who have sent their conference fees already is
coming to the July CSG meeting. Before long we are going to have to
make some payments to reserve space and time, and the plan was to use
the conference fees for that. We have received five descriptions of
what some people wish to present, which is already more than the
number registering to attend, so I am having trouble seeing how that
is going to work. And even with five presenters and five days to do
the presenting, the schedule is not going to be very crowded and we
won't need a big conference room.

I'm blind-copying this to CSGnet to catch other people who might
come. Various useful documents are attached. I am going to start nagging.

Best,

Bill P.

CallForPapers2012A1.doc (26.5 KB)

CSGRegistration2012FEB2.doc (74.5 KB)

[From Rick Marken (2012.04.30.1820)]

CSGnet: see last couple of paragraphs.
Here is the link to Richard's article:
http://www2.cmp.uea.ac.uk/~jrk/temp/RK-20120430-causnoncorr.pdf

Hello, Richard --
Once again I am staggered by your ability to make yourself understood. By
rights I should not have understood a word of what you wrote to your
mathematical peers, but your meanings always shimmered visibly just beneath
the surface, and at times were crystal clear. How do you do that?

He's just a very smart guy, that's how!

I do wish he would have given a reference to a recent paper of mine:
Marken, R. S. and Horth, B. (2011) When Causality Does Not Imply
Correlation: More Spadework at the Foundations of Scientific
Psychology, Psychological Reports, 108, 1-12

It's certainly not up to Richard's mathematical level but it would
have been fun to be mentioned in a high class article like that;-)

By the way, Allie and I would both sort of like to know if anyone other than
the two who have sent their conference fees already is coming to the July
CSG meeting.

I'm coming. I haven't sent you the fee because I was planning to
deposit it directly into the treasury. But I will send it if you need
it to keep track of who's coming.

Best

Rick

···

On Mon, Apr 30, 2012 at 3:21 PM, Bill Powers <powers_w@frontier.net> wrote:
--
Richard S. Marken PhD
rsmarken@gmail.com

[From Richard Kennaway (2012.05.01:1358)]

[From Rick Marken (2012.04.30.1820)]
I do wish he would have given a reference to a recent paper of mine:
Marken, R. S. and Horth, B. (2011) When Causality Does Not Imply
Correlation: More Spadework at the Foundations of Scientific
Psychology, Psychological Reports, 108, 1-12

Yes, it would be worth mentioning. Maybe Bill's "Spadework" paper as well, although I don't want to try to hit too many birds with this one stone.

Can you send me a copy of the published version? I only have a pre-pub one with various marginal corrections, and my university doesn't subscribe to the journal.

···

--
Richard Kennaway, jrk@cmp.uea.ac.uk
School of Computing Sciences,
University of East Anglia, Norwich NR4 7TJ, U.K.

[Martin Taylor 2012.05.16.23.20]

Richard,

I am unfamiliar with the theories of causality that you mention, but since even Wikipedia points out that you can have causality without correlation, the target must already be pretty wounded.

It occurs to me that you weaken your argument a little by first pointing out that the derivative (and hence the integral) of a variable is uncorrelated with the variable itself, and then using a circuit that includes an integral function as an independent example. You make the point less mathematically but more directly, without the potential red herring of the integral in the loop, when you say near the end:
-------quote------
...control systems display a systematic tendency to violate Faithfulness.... Low correlations can be found where there are direct causal effects, and high correlations between variables that are only indirectly causally connected, by paths in which every step shows low correlation. This follows from the basic nature of what a control system does: vary its output to keep its perception equal to its reference. The output automatically takes whatever value it needs to, to prevent the disturbances from affecting the perception. Its very function is to actively destroy the data that current techniques of causal analysis work from.

What every controller does is hold P close to R. For constant R, variations in P measure the imperfection of the controller's performance, the degree to which it is not doing what it is supposed to be doing. This may be useful information if one already knows what it is supposed to be doing, as will typically be the case when studying an artificial control system, of a known design and made for a known purpose. However, when studying a biological or social system which might contain control systems that one is not yet aware of, correlations between perception and action (in other terminology, input and output, or stimulus and response) must fail to yield any knowledge about how it works. These methods can only proceed by making assumptions to rule out the problematic cases.
--------end quote------

This is a powerful statement, one that does not depend on explicitly introducing into the supposedly independent case an instance of the first case you adduce (non-correlation between a variable and its derivative or integral). All that is necessary is to show the loop, and to point out that if P is to be kept close to an invariant R, then O must correlate highly with D and P must not. No reference to the precise form of the control circuit is necessary. I think if you said this first, and then went into your simulations and numeric results, the independent nature of this second case would be more evident to the reader.
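The O-correlates-with-D, P-does-not pattern is easy to check numerically. A minimal sketch (the loop structure, gain, and disturbance model here are illustrative assumptions, not taken from Richard's paper): a simple integrating controller holds P near a constant R against a slowly varying disturbance D, and the resulting correlations come out as Martin describes.

```python
import random
import statistics

def correlation(xs, ys):
    """Pearson correlation of two equal-length sequences."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(1)

# Slowly varying disturbance D: a lightly damped random walk.
d, disturbance = 0.0, []
for _ in range(20000):
    d += random.gauss(0, 0.05)
    d *= 0.999
    disturbance.append(d)

# Integrating controller: perception P = O + D; output O integrates
# the error (R - P) so as to keep P near the constant reference R.
R = 0.0
GAIN = 0.5
o = 0.0
P, O = [], []
for d in disturbance:
    p = o + d             # perception = output + disturbance
    o += GAIN * (R - p)   # output accumulates the error
    P.append(p)
    O.append(o)

print("corr(O, D) =", round(correlation(O, disturbance), 3))  # near -1
print("corr(P, D) =", round(correlation(P, disturbance), 3))  # near 0
```

The output mirrors the disturbance almost perfectly (corr(O, D) near -1) even though nothing "causes" O except the loop as a whole, while P, the variable O directly feeds, is nearly uncorrelated with D, which is the pattern of correlations that defeats the causal-inference methods discussed above.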

Martin