Superstition: breaking the loop

[From Rupert Young (2014.03.14 21:15)]

I was giving some thought to how superstitious beliefs fit into the PCT scheme of things, as I am currently cycling around India (which is why I haven't been contributing to recent studies and discussions) and there's a lot of it here.

A normal closed loop works because your actions affect the perception that is your goal. If your child is sick and you want her, obviously, to be healthy, your action would be to take her to the doctor. However, if you believe that offering rice and beans to an effigy of Ganesh is going to help, you're going to be out of luck, as your actions have no effect on the goal.

So, could this be said to be a case of open-loop behaviour, which is clearly ineffective due to the broken loop?

···

--

Regards,
Rupert

[From Bruce Abbott (2014.03.13.1320 EDT)]

Rupert Young (2014.03.14 21:15) --


BA: Not necessarily. Actions intended to bring about some result do not
always work as intended; indeed, in some cases the intended result occurs
only probabilistically. Superstition takes hold when actions do sometimes
appear to bring about the hoped-for outcome (even though in reality they do
not) and we learn to expect success only occasionally. From the perceiver's
point of view, the loop is closed but with an unreliable environmental
feedback function. The occasional apparent success prevents reorganization
from dismantling the (perceived) control system. Unfortunately, this can
prevent an individual from seeking a more effective method of control, such
as administering an antibiotic.

Of course, there are cases in which actions do bring about intended results,
but only probabilistically. A pigeon working for access to grain on a
variable-interval schedule or a person calling into a radio show hoping to
be the fifth caller and thus win some free tickets are two examples.
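BA's "closed loop with an unreliable environmental feedback function" can be made concrete with a toy simulation. The sketch below is my own illustration, not a model from the PCT literature; all constants (the gain, the disturbance range, the "near goal" band) are arbitrary. It compares a proportional control loop whose output always reaches the controlled variable with one whose output gets through only probabilistically, or never:

```python
import random

def run_loop(p_effective, steps=10000, seed=0):
    """Simulate a proportional control loop whose action reaches the
    controlled variable only with probability p_effective."""
    rng = random.Random(seed)
    reference = 10.0   # the goal state for the perception
    cv = 0.0           # controlled variable (the perception)
    near_goal = 0
    for _ in range(steps):
        error = reference - cv
        action = 0.5 * error               # proportional output
        if rng.random() < p_effective:     # unreliable feedback path
            cv += action
        cv += rng.uniform(-1.0, 1.0)       # environmental disturbance
        if abs(reference - cv) < 1.0:
            near_goal += 1
    return near_goal / steps               # fraction of time near the goal

reliable = run_loop(1.0)       # action always affects the perception
occasional = run_loop(0.3)     # unreliable feedback function
broken = run_loop(0.0)         # action never affects the perception
```

With these (arbitrary) parameters the fully connected loop should keep the perception near the reference most of the time, while the broken loop only drifts through the goal region by chance; that occasional drift is all the "success" a purely superstitious action can deliver.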

There's a nice discussion lurking here on belief as a form of perception.
If I perceive my actions to be effective in controlling a variable of
importance to me, I will employ those actions to control that variable, even
though the actions seem to bring about the intended result only some of the
time. When those "successes" arise only coincidentally, I can be seduced
into believing that I have some degree of control when actually, I don't.

Bruce

[From Fred Nickols (2014.03.13.1445 EDT)]

Good points, Bruce. In addition to the behavioral illusion, there also seems
to be a control illusion.

Fred Nickols


BA: Not necessarily.

Ok, so you agree that there can be open (broken) loop cases? Those are the situations I am interested in.

Some superstitious beliefs may never bring results, even probabilistically, such as going to church every Sunday so you get into heaven, eating beetroot and garlic to cure HIV, praying to Shiva to win the lottery, or doing stimulus-response research to understand how living systems work.

So here the feedback path is not just unreliable but a complete failure, though people continue because the belief (reference?) is so persistent.

We could say then this is a case of a failing control system, akin to a mental disorder.

It is the consequences of such cases that are of interest, which I see as complacency: people act according to their beliefs rather than according to what gets results.

It is particularly striking in India, where I am at the moment, which looks to me like a society, in part, mired in superstition. So rather than spending time and money on health care, education and science, which would achieve something, it is spent on building churches, temples and mosques, on praying, on going on pilgrimages and on holding religious festivals (something they do a great deal here), which achieve nothing in terms of progressive goals.

I certainly agree with you that there is a very interesting area of discussion and research here (for you psychologists), and would’ve thought that the concept of belief as a form of perception was fundamental to PCT.

Rupert


Hi Rupert, speaking from a clinical perspective, all beliefs, however irrational they may look, have a function in terms of closed-loop goals, but these are often implicit rather than explicit. For example, religion functions to serve social connectedness and to get help from others when in need, and certain practices and rituals, such as prayer and meditation, may even get people into mental modes that benefit reorganisation. It’s still turtles all the way down!

Warren


[Martin Taylor 2014.04.07.09.37]

BA: Not necessarily.

Ok, so you agree that there can be open (broken) loop cases? Those are the situations I am interested in.

Some superstitious beliefs may never bring results, even probabilistically, such as going to church every Sunday so you get into heaven, eating beetroot and garlic to cure HIV, praying to Shiva to win the lottery, or doing stimulus-response research to understand how living systems work.

So here the feedback path is not just unreliable but a complete failure, though people continue because the belief (reference?) is so persistent.

There's a continuum between perfect control and no control, and no control doesn't necessarily mean a broken loop. Consider voting. If you are in a group of five people, the decision is appreciably more likely to go your way if you vote than if you do not. If you are in an electorate of a few thousand people, whether you vote or not is highly unlikely to affect the actual decision, but that isn't the same as having no effect. The loop isn't broken, but your actions have almost no influence on your perceptions.
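The voting example can be put in numbers. Under the simplifying assumption that every other voter flips a fair coin (a sketch assumption, not a claim about any real electorate), the chance that your single vote decides the outcome by breaking an exact tie falls off roughly as one over the square root of the electorate size:

```python
from math import comb

def p_pivotal(n_others):
    """Probability your vote breaks an exact tie when each of
    n_others other voters independently flips a fair coin.
    (n_others must be even for a tie to be possible.)"""
    return comb(n_others, n_others // 2) / 2 ** n_others

small = p_pivotal(4)       # a group of five: 6/16 = 0.375
large = p_pivotal(10000)   # thousands of voters: about 0.008
```

In the large electorate the influence is roughly fifty times smaller than in the group of five, but it is not zero: the loop is attenuated, not broken.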

In the real world, particularly in social situations, the degree to which your actions influence your perceptions, and even the direction of that influence, is often quite changeable. The influence might also be quite delayed and distributed over time. It's not like a laboratory situation, in which the effect of an action is clear and usually fairly precisely delimited in time, and this changeability makes reorganization rather difficult and slow.

Reorganization is a statistical process. In a stationary environment, it's fine to have a rapid reorganization rate when error is sustained or persistently increasing, because statistically it is clear whether control is getting better or worse. In a fluctuating environment with randomly delayed effects of actions on perceptions, rapid reorganization would be likely to result, quite often, in no control at all. However, as Warren has pointed out, to behave in ways that lead others to act so as to allow you to control your perceptions through their actions is one way reorganization can operate reliably. That convoluted sentence means "If I believe what they say they believe, they will see me as one of them, and will be likely to help me if I need it."
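The point about reorganization rate can be illustrated with a toy E. coli-style reorganization of a single parameter. Everything here is invented for the illustration (the "gain", the target value, the noise level, the step sizes); it is only meant to show that judging "better or worse" from a single noisy sample makes the random-step strategy unreliable, while averaging more samples before each judgement lets it work:

```python
import random

def reorganize(samples_per_test, n_tests=500, seed=1):
    """E. coli-style reorganization of one parameter: keep stepping in
    the current direction while measured error improves, pick a new
    random direction ("tumble") when it looks worse. Error is observed
    with noise, averaged over samples_per_test samples per judgement."""
    rng = random.Random(seed)
    target = 2.0                 # unknown best value of the parameter
    gain = 0.0
    step = 0.1

    def measure(g):
        # noisy observation of how badly the current parameter performs
        return sum((g - target) ** 2 + rng.gauss(0.0, 1.0)
                   for _ in range(samples_per_test)) / samples_per_test

    prev = measure(gain)
    for _ in range(n_tests):
        gain += step
        now = measure(gain)
        if now > prev:           # looks worse: tumble to a new direction
            step = rng.uniform(-0.1, 0.1)
        prev = now
    return abs(gain - target)    # final distance from the best value

hasty = reorganize(1)      # judge each change on one noisy sample
patient = reorganize(100)  # average the error before judging
```

In runs of this sketch the patient version tends to settle near the target while the hasty one wanders, though the exact numbers depend on the seed. The qualitative point is the one above: reorganizing on every fluctuation destroys the statistics that reorganization needs.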

We see "superstition" because we see no physical way in which certain acts such as burning incense can cause the effects they are intended to have. But there may be ways through the physical-social environment in which they do have the desired effects, and reorganization is likely to find those ways, even if the conscious analysis of what is happening postulates non-physical environmental feedback paths such as divine intervention.

We could say then this is a case of a failing control system, akin to a mental disorder.

It is the consequences of such cases that are of interest, which I see as complacency: people act according to their beliefs rather than according to what gets results.

Do they really? I mean is "rather than what gets results" a fair assessment?

It is particularly striking in India, where I am at the moment, which looks to me like a society, in part, mired in superstition. So rather than spending time and money on health care, education and science, which would achieve something, it is spent on building churches, temples and mosques, on praying, on going on pilgrimages and on holding religious festivals (something they do a great deal here), which achieve nothing in terms of progressive goals.

And yet there is a big software industry in India. My point is really that what works depends on the social ocean in which you swim. Your analysis considers only the physical environment in which an individualistic hero can triumph over physical complexities. Socially, shared superstition can be useful -- or rather, the observable actions by which you show to others that you share the superstition can be useful.

Besides, some of those Indian religious festivals look rather fun, no matter what you believe. My atheist daughter, for example, just arrived in Nepal for a few months teaching English, and got caught up in what they call "Holi", which seems to involve throwing coloured powders and water at anybody and everybody, and streaking strangers' faces with the results. Lots of fun, and quite emotional, according to her.

I certainly agree with you that there is a very interesting area of discussion and research here (for you psychologists), and would've thought that the concept of belief as a form of perception was fundamental to PCT.

Rupert

I totally agree. It's not only in the religious sense in which you couch your question, but also in the question of why scientists cling to beliefs that are refutable by logic and observation.

We had a long and unresolved period of discussion of this many years ago. Maybe it is time to revisit the issue.

Martin

···


[From Bruce Abbott (2014.04.07.1035 EDT)]

···


BA: Not necessarily.

RY:

Ok, so you agree that there can be open (broken) loop cases? Those are the situations I am interested in.

Some superstitious beliefs may never bring results, even probabilistically, such as going to church every Sunday so you get into heaven, eating beetroot and garlic to cure HIV, praying to Shiva to win the lottery, or doing stimulus-response research to understand how living systems work.

So here the feedback path is not just unreliable but a complete failure, though people continue because the belief (reference?) is so persistent.

BA: In the cases you mention, it is a belief that there exists a causal relationship between certain other perceptions and one’s actions, or in other words, a belief in one’s ability to control those other perceptions. But beliefs can be about other things. I might believe that the moon is made of green cheese, for example, or that the universe is nearly 14 billion years old. How do we acquire these beliefs, and why do some beliefs persist in the face of strong evidence to the contrary? I’m not sure how to answer those questions within the PCT framework, other than to say that reorganization must be involved.

RY: We could say then this is a case of a failing control system, akin to a mental disorder.

BA: It may not be failing from the point of view of the believer. The believer may see it as a means of control that is not always successful, but seems to be successful often enough to persist in using it, unfortunately to the exclusion of more reliable and effective methods.

RY: It is the consequences of such cases that are of interest, which I see as complacency: people act according to their beliefs rather than according to what gets results.

BA: Careful here – complacency is an observer’s description of a person’s behavior (persistence in an effort at control that the observer views as misguided), and is not an explanation for that behavior.

RY: It is particularly striking in India, where I am at the moment, which looks to me like a society, in part, mired in superstition. So rather than spending time and money on health care, education and science, which would achieve something, it is spent on building churches, temples and mosques, on praying, on going on pilgrimages and on holding religious festivals (something they do a great deal here), which achieve nothing in terms of progressive goals.

BA: So, how does one go about changing such beliefs? What possible solutions might be derived from a PCT-based analysis?

RY: I certainly agree with you that there is a very interesting area of discussion and research here (for you psychologists), and would’ve thought that the concept of belief as a form of perception was fundamental to PCT.

BA: Perhaps this issue has been dealt with before and I’m just not aware of it. I’d be interested in hearing Rick Marken’s take on it.

Bruce

[From Rick Marken (2014.04.07.1140)]

···

Bruce Abbott (2014.04.07.1035 EDT) --

BA: So, how does one go about changing such beliefs? What possible solutions might be derived from a PCT-based analysis?

RY: I certainly agree with you that there is a very interesting area of discussion and research here (for you psychologists), and would’ve thought that the concept of belief as a form of perception was fundamental to PCT.

BA: Perhaps this issue has been dealt with before and I’m just not aware of it. I’d be interested in hearing Rick Marken’s take on it.

RM: Wow, I’m honored! I pretty much agree with everything that has been said about this so far. But since you asked ;-)

RM: First, I would say that the word “belief” corresponds to an imagined perception in PCT; but a particular type of imagined perception, one that is imagined to correspond to reality. So imagining without part of that imagination being that it corresponds to reality is not “believing”; it’s just imagining. The imagination of a god, for example, becomes a belief only when it includes the idea that there really is that god. So there are people who can imagine the Greek gods without believing in them while at the same time being able to imagine the Judeo/Christian god and believing in it.

RM: So belief, to me, is imagination that includes the idea that what is imagined corresponds to what is actually true. I don’t think there is anything inherently wrong with belief. It’s how we deal with belief that determines whether or not there will be problems. A scientific theory, for example, is a useful belief. What makes it useful is that it’s tentative: a scientific theory is a belief about what is actually true “out there” and about what we should perceive if the belief is true. It makes predictions about what will be perceived if we take certain actions (run certain experiments). So it’s a tentatively held, testable belief, and as long as the tests pan out – we perceive what is predicted – the belief is maintained. Scientific theories are, therefore, not only not problematic beliefs; they are very useful beliefs.

RM: So tentatively held, testable beliefs (scientific theories) are extremely useful; non-tentatively held, untestable beliefs (religious beliefs) not so much.

RM: I think what we would want, as PCTers, is not for people to give up belief but to recognize the difference between tentatively held, testable beliefs and non-tentatively held, untestable beliefs, and to recognize the problems associated with the latter kind. But I don’t think there is any “cure” for believing; it’s clearly something people do. Beliefs of the non-tentatively held, untestable type have been an enormous problem for humanity (as Rupert notes). I have no idea what to do about it; just try to get out of the way when such believers start trying to prove their beliefs by killing the people who don’t believe as they do, I suppose. Or just say you believe it too and try to make them happy.

RM: One last point. I think so-called “superstitious behaviors”, like doing rituals to cure disease or gambling to make money, are puzzling indeed. The fact that people keep doing these things even though they don’t regularly produce the results (a cured person, lots of money) that are ostensibly being controlled for suggests that reorganization isn’t going on. So my guess is that these superstitious behaviors persist because people are getting from them what they want; they are controlling successfully for some perception, it’s just not the one they say they are controlling for. The ritual performer gets some satisfaction from the ritual itself; the gambler gets some satisfaction from the gambling itself.

RM: The problem is that doing these things has unfortunate side effects (as Rupert also notes). The ritual performer never gets into the lab to try to figure out what might actually cure people; the gambler loses all his money and ruins his family. So it would be nice if we could figure out a way to get people not to participate in these behaviors to the extent that they have these destructive side effects. But sometimes these behaviors don’t have such side effects. I know of Native American scientists, for example, who still participate in their native rituals, and I know gamblers who are quite skilled and disciplined and can leave the table when they reach their cutoff for winnings or losses. So at an individual level you have to determine whether it’s actually a problem or not.

RM: Anyway, thanks for asking.

Best

Rick


Richard S. Marken PhD
www.mindreadings.com
It is difficult to get a man to understand something, when his salary depends upon his not understanding it. – Upton Sinclair

Hi Warren, I don’t doubt that there may be psychosocial benefits to religion (though I think the same benefits can be attained in a non-religious context) but that is a separate issue.

I am saying this is a case of a failing control system that is not being reorganised, due to the persistence of the belief – perhaps precisely because of the above-mentioned benefits.

Regards

Rupert


Hi Rupert, I agree with you entirely except for the word ‘failing’. To me, the degree of success or failure of any control system is relative, subjective, and continuous. Would my control system for spotting a green woodpecker in a distant tree fail at making sure I have long lasting relationships, or stopping me from falling off a log? Probably… Don’t all control systems fail at controlling the things they have not learned to control? Whether that ‘matters’ is partly evolutionary and partly socially subjective. This is certainly the case for ‘mental disorders’, which, maybe unfortunately, have all kinds of advantages to different people for different reasons.

I think that religion would only be a problem for someone when it stopped them from doing something they really want to do. This can be common (e.g. teaching Darwin, being gay) and this is a real problem for many people, but I am not sure it has much to do with the irrationality of the religious beliefs themselves? It’s the conflict, no?

Warren

···

On 8 Apr 2014, at 11:54, Rupert Young rupert@moonsit.co.uk wrote:

Hi Warren, I don’t doubt that there may be psychosocial benefits to religion (though I think the same benefits can be attained in a non-religious context) but that is a separate issue.

I am saying this is a case of a failing control system that is not being reorganised, due to the persistence of the belief. Perhaps precisely because of the above mentioned benefits.

Regards

Rupert

On 7 April 2014 12:24:49 GMT+05:30, Warren Mansell wmansell@gmail.com wrote:

Hi Rupert, speaking from a clinical perspective, all beliefs, however irrational they may look, have a function in terms of closed loop goals, but these are often implicit rather than explicit. So for example, religion functions to serve social connectedness, get help from others when in need and certain practices and rituals such as prayer and meditation may even get people into mental modes that benefit reorganisation. It’s still all turtles all the way down!

Warren

Sent from my iPhone

On 6 Apr 2014, at 14:32, Rupert Young rupert@moonsit.co.uk wrote:

BA: Not necessarily.

Ok, so you agree that there can be open (broken) loop cases? Those are the situations I am interested in.

Some superstitious beliefs may never bring results, even probabilistically, such as going to church every Sunday so you get into heaven, eating beetroot and garlic to cure HIV, praying to Shiva to win the lottery or doing stimulus-response research to understand how living systems work.

So here the feedback path is not just unreliable but a complete failure, though people continue because the belief (reference?) is so persistent.

We could say then this is a case of a failing control system, akin to a mental disorder.

It is the consequences of such cases that interest me, which I see as complacency: people act according to their beliefs rather than according to what gets results.

It is particularly striking in India, where I am at the moment, which looks to me like a society, in part, mired in superstitions. So rather than time and money being spent on health care, education and science, which would achieve something, it is spent on building churches, temples and mosques, and on praying, going on pilgrimages and holding religious festivals (something they do a great deal here), which achieve nothing in terms of progressive goals.

I certainly agree with you that there is a very interesting area of discussion and research here (for you psychologists), and would’ve thought that the concept of belief as a form of perception was fundamental to PCT.

Rupert


WM: I agree with you entirely except for the word ‘failing’. To me, the degree of success or failure of any control system is relative, subjective, and continuous. Would my control system for spotting a green woodpecker in a distant tree fail at making sure I have long lasting relationships, or stopping me from falling off a log? Probably… Don’t all control systems fail at controlling the things they have not learned to control?

Well, of course, and who would suggest otherwise?

If the output of your woodpecker-spotting system were to immerse your head in a bucket of custard, wouldn’t you consider that a failing system, seeing as the output has no effect whatsoever on the goal perception?

There are many control systems involved in religion, some of which may have psychosocial benefits and could be said not to be ‘failing’. But with some, I suggest, specifically those involving superstitious beliefs, there is failure in the above sense. However, reorganisation doesn’t take place, due to constraints from the other systems involved in religion. Perhaps we can look at the intrinsic error from the whole set of religion systems as what drives reorganisation, and within this set it is insufficient.

Just some thoughts.

Rupert


[Martin Taylor 2014.04.10.13.47]

Rupert, I suspect that Warren was not really talking about the green woodpecker control system causing those other strange effects, but about the fact that if the green woodpecker system affected those other perceptions at all, it is only by way of side-effects. Any one elementary control unit controls only its own perception, and green woodpecker perceptual control doesn’t fail if you fall off a log (except that the environmental feedback path possibilities, the affordances, have then changed because of your change of position). The failure of the “perceive myself on a log” system might produce a side-effect that changed the ability of the “green woodpecker” system to control, but that is not in itself a failure of the green woodpecker system.

If the global system had reorganized so that the output of the green woodpecker system resulted in your head being immersed in a bucket of custard, I suspect that further reorganization would happen fairly soon…

As for superstitious beliefs and religion: if your beliefs lead you to behave in a way that leads others to perceive you as a “good guy” (which means they have reference levels for perceptions of your wellbeing at some high level), then you are probably better able to control your own perceptions through their actions than if they saw you as a “bad guy” (which means their reference levels for perceiving aspects of your well-being would be at rather low levels).
Martin


If the global system had reorganized so that the output of the green
woodpecker system resulted in your head being immersed in a bucket of
custard, I suspect that further reorganization would happen fairly
soon....

But not in the case of (some) superstitious beliefs. For example, when someone continues to control for their prosperity (see green woodpecker) by offering rice and beans (head in custard) to Shiva, despite no change to the perception: continual error (still poor), yet no reorganisation.

I do have an idea of how this might happen (error without reorganisation) but would be interested in suggestions from others, to check whether we are even on the same page.

Regards
Rupert


[Martin Taylor 2014.04.11.12.03]

If the global system had reorganized so that the output of the green
woodpecker system resulted in your head being immersed in a bucket of
custard, I suspect that further reorganization would happen fairly
soon....

But not in the case of (some) superstitious belief. For example, when someone continues to control for their prosperity (see green woodpecker) by offering rice and beans (head in custard) to Shiva, despite no change to the perception. Continual error (still poor) no reorganisation.

The question I see is not the question you see. You take it for granted that offering rice and beans to Shiva results in no reduction in the error of whatever perceptions are being controlled by that action. I don't see on what you base your assertion. For example, using "I" for the worshipper: if "I" am controlling for perceiving myself to be a good Hindu, failure to make the offering would increase the error in that perception. The same would be true if I didn't believe the offering would induce Shiva to send me money to feed my kids, but nevertheless controlled for perceiving that other people with whom I interact would see me as a good Hindu (and perhaps therefore be inclined to lend me money if I couldn't otherwise feed my kids). In that case, feeding Shiva does directly reduce error in a perception of "kids (not) being hungry".

To my understanding (no longer being a hypothetical Hindu), you are assuming that some particular perception is being controlled by a mechanism that cannot influence that perception, and are asking how this particular set of connections can be "reorganizationally stable" (to borrow a term from evolutionary theory). I am looking at it from the opposite side, seeing a behaviour that appears to be pretty stable and asking what kind of perception it might control so that the loop can be reorganizationally stable.

I do have an idea of how this might happen (error no reorganisation) but would be interested in suggestions from others to check if we are even on the same page.

Regards
Rupert

Do we ever know that there's no reorganization going on? There could be lots of internal reorganization that does not appear in ways obvious to an outside observer. The lower levels probably don't reorganize much after early youth, and their stability could sometimes mask big changes as reorganization happens much higher in the hierarchy. Remember that one of the PCT mantras is "same results by different means", so as the specific local environment changes, as it does by the second, the hour, the season, and over the long term, the observable behaviours will change in the absence of reorganization. Those changes could easily be confused with changes due to reorganization.

Failure to change behaviour could be the other side of that coin; in an unstable environment, maintenance of a stable behaviour that at least does not cause problems for the reorganization of other control systems could be quite useful. If I remember correctly, religious strictness and religious attendance usually increase in times of turmoil.

Martin


[Rupert Young (2014.04.13 21.00)]

The question I see is not the question you see. You take it for granted
that offering rice and beans to Shiva results in no reduction in the
error of whatever perceptions are being controlled by that action.

It’s not a matter of taking it for granted; it is how I am defining the scenario.

For example, using “I” as teh
worshipper, if “I” am controlling for perceiving myself to be a good
Hindu, failure to make the offering would increase the error in that
perception

Yes, though that is in a different control system within you, as a Hindu, and this is probably key to the “reorganizational stability”.

as would also be the case if I didn’t believe the offering
would induce Shiva to send me money to feed my kids, but nevertheless
controlled for perceiving that other people with whom I interact would
see me as being a good Hindu (and perhaps therefore being inclined to
lend me money if I couldn’t otherwise feed my kids). In that case,
feeding Shiva does directly reduce error in a perception of “kids (not)
being hungry”.

That’s a different scenario and changes the subject to a different discussion.

To my understanding (no longer being a hypothetical Hindu), you are
assuming that some particular perception is being controlled by a
mechanism that cannot influence that perception, and are asking how this
particular set of connections can be “reorganizationally stable”

yes, that is the question I am looking at.

I am looking at it from the
opposite side, seeing a behaviour that appears to be pretty stable and
asking what kind of perception it might control so that the loop can be
reorganizationally stable.

Ok, but that is a different question.

But perhaps you are saying that my scenario is physically impossible, in which case I’d be interested in your reasons for that.

Do we ever know that there’s no reorganization going on?

I am not actually saying that reorganization isn’t going on, just that it is not succeeding in reorganizing the system.

Sorry for brevity, internet cafe closing.

Regards

Rupert

···

On 12 April 2014 at 17:20 Martin Taylor mmt-csg@mmtaylor.net wrote:

I thought I was sending this to the list, but I wasn’t. Rupert, I’m sorry you get it twice.

···

Martin

  -------- Original Message --------

[Martin Taylor 2014.04.13.12.35]
You assert, without evidence, that offering rice and beans to Shiva results in no reduction in the error of whatever perceptions are being controlled by that action. OK, we can take that as a hypothetical situation, though not a situation that necessarily corresponds to the facts on the ground.

Sorry, I was not dealing with your hypothetical situation, in which this does not happen.

No, I’m not saying that your hypothetical situation is physically impossible. I’m saying that if maturation does involve reorganization, whether it’s Bill’s e-coli method or any other, a behaviour that takes time and effort to perform but that does not influence any controlled perception is unlikely to stick around very long when it might conflict with other behaviours that do influence controlled perceptions.

In your hypothetical system, that is true by definition. Statistically, it could occur fleetingly in a real system. But is it likely to be observed as a pervasive and long-lasting phenomenon in the real world?
Martin
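Bill's e-coli method of reorganization, mentioned above, can be sketched in a few lines (a minimal illustration; the one-parameter "environment" and all the constants are arbitrary stand-ins, not anything from PCT models in the literature):

```python
import random

def ecoli_reorganize(steps=2000, seed=1):
    """Sketch of the e-coli style of reorganization: a parameter drifts
    steadily in one direction, and whenever error grows the system
    'tumbles' to a fresh random direction. No gradient information is
    needed, only whether things got better or worse.

    The environment here is an arbitrary stand-in: perception = 5 * gain,
    with a reference of 10, so reorganization should settle near gain = 2.
    """
    rng = random.Random(seed)
    reference = 10.0
    gain = 0.0                          # the parameter being reorganized
    direction = rng.choice([-1.0, 1.0])
    prev_error = abs(reference - 5.0 * gain)
    for _ in range(steps):
        gain += 0.01 * direction        # keep drifting the same way
        error = abs(reference - 5.0 * gain)
        if error > prev_error:          # things got worse: tumble
            direction = rng.choice([-1.0, 1.0])
        prev_error = error
    return gain
```

The point of the sketch is that reorganization needs no model of the environment: random change plus "keep whatever reduces error" is enough to find and hold a workable connection.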


[Rupert Young 2014.04.14.19.00]

(Martin Taylor 2014.04.13.12.35)

MT: You assert, without evidence, that offering rice and beans to Shiva results in no reduction in the error of whatever perceptions are being controlled by that action. OK, we can take that as a hypothetical situation, though not a situation that necessarily corresponds to the facts on the ground.

RY: Well, if we’re going to need evidence for every discussion here we may as well shut down csgnet and go home :-)

Besides, this is a hypothetical discussion, a thought experiment, so we should be able to move forward on the grounds that the premises are, seemingly, physically possible and reasonable. You have agreed that it is physically possible, and it seems quite reasonable to me, looking at real-world beliefs and behaviour, that: one, people do things that are ineffectual; two, the relevant perceptions do not change as they expect (and people are aware they do not change); and three, people continue to act in this ineffectual manner.

Your objection appears to be that reorganisation will reorganise this behaviour out of existence. A quite reasonable position consistent with PCT.
If I can overcome this objection, and still be consistent with PCT, perhaps you will be more amenable to my scenario?

So this is my suggestion (hypothesis). This particular superstitious belief (control) system (SBS) exists only within the context of a whole gamut of other religious beliefs and is dependent upon them, in that they are higher-level (dominant?) systems, and is defined by them. This SBS is subservient in that it could not exist in isolation; it would get reorganised.

However, the SBS also affects these other systems. So reorganisation of the SBS would result in error in all these other (dominant) systems, which would mean that they would reorganise back to where they were, and back to where the SBS was, maintaining the status quo of reorganisational stability despite there being continual error in one of the systems. You said it yourself: "if I am controlling for perceiving myself to be a good Hindu, failure to make the offering would increase the error in that perception".

Perhaps there is a reorganisational principle here: that reorganisation of high-level systems overrides that of lower-level, dependent systems.

That's it. No evidence, just an idea.

Rupert
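Rupert's contrast between a working loop and the broken, superstitious one can be put in a few lines. The sketch below is illustrative only: the gain, step size, and one-dimensional environment are assumptions, not anything from the thread. A single control unit acts on a perception whose environmental feedback gain is either nonzero (the doctor) or zero (the offering).

```python
def simulate(env_gain, steps=200, dt=0.1, gain=5.0, ref=1.0):
    """One proportional control unit with an integrating output.

    env_gain is the environmental feedback gain: how strongly the
    action influences the perceived variable. env_gain = 0.0 models
    the superstitious case, where the action has no effect at all
    on the perception it is meant to control.
    """
    perception, output = 0.0, 0.0
    for _ in range(steps):
        error = ref - perception        # reference minus perception
        output += gain * error * dt     # action accumulates against error
        perception = env_gain * output  # environment closes (or breaks) the loop
    return ref - perception             # residual error

print(simulate(env_gain=1.0))  # closed loop: error driven essentially to zero
print(simulate(env_gain=0.0))  # broken loop: error never moves, effort is wasted
```

In the broken case the output keeps growing without ever touching the perception, which is the "clearly ineffective" behaviour Rupert describes.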

···

Martin

  -------- Original Message --------

[Martin Taylor 2014.04.13.12.35]

You assert, without evidence, that offering rice and beans to Shiva results in no reduction in the error of whatever perceptions are being controlled by that action. OK, we can take that as a hypothetical situation, though not a situation that necessarily corresponds to the facts on the ground.

Sorry, I was not dealing with your hypothetical situation, in which this does not happen.

No, I'm not saying that your hypothetical situation is physically impossible. I'm saying that if maturation does involve reorganization, whether it's Bill's e-coli method or any other, a behaviour that takes time and effort to perform but that does not influence any controlled perception is unlikely to stick around very long when it might conflict with other behaviours that do influence controlled perceptions.

In your hypothetical system, that is true by definition. Statistically, it could occur fleetingly in a real system. But is it likely to be observed as a pervasive and long-lasting phenomenon in the real world?

Martin


[Martin Taylor 2014.04.14.15.18]

Like continuing to vote in elections. For me, I do it for much the same reason I suggested for the Hindu offering: I am controlling a self-image perception of myself as a good citizen. Voting is a necessary component of the "good citizen" perception, but I have never known my vote to influence the result of an election.

I don't think it's a question of overcoming an objection, so much as devising a simulation that would demonstrate your point.

Context? Or contribution to a higher-level perception?

I think there may well be a reorganizational principle here. One sees it in the resilience of many networks, ecological networks for example. When there's a disturbance, the effect might die away, or there might be a catastrophic change that propagates through the network. In the PCT hierarchy, I called that effect "The Bomb in the Machine".

The Bomb isn't the same as what you are suggesting, but I think it is in the same family of effects, and I think it would be very interesting to flesh it out, and perhaps then to test it in simulation. I have a suspicion that you might find a similar avalanche effect here, and you might also be pointing toward a solution to the question of why recent converts are often more religious than the long-time faithful.

Sorry for having initially misinterpreted what you were saying. Ideas are what we need, as well as fine-grained pursuit of the ideas already out there.
Martin
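The "e-coli method" Martin refers to is Powers' random-walk reorganization scheme: keep changing parameters in the same random direction while total error is falling, and pick a new random direction (a "tumble") when it rises. A minimal sketch, using an arbitrary stand-in error function rather than a PCT model; step size and iteration count are assumptions:

```python
import random

def ecoli_reorganize(error_fn, params, step=0.05, iters=2000, rng=None):
    """E-coli-style reorganization of a parameter vector."""
    rng = rng or random.Random(0)
    direction = [rng.uniform(-1, 1) for _ in params]
    best = error_fn(params)
    for _ in range(iters):
        trial = [p + step * d for p, d in zip(params, direction)]
        e = error_fn(trial)
        if e < best:                  # error falling: keep "swimming" this way
            params, best = trial, e
        else:                         # error rising: tumble to a new direction
            direction = [rng.uniform(-1, 1) for _ in params]
    return params, best

# Example: reorganize two gains toward the point (3, -2) of zero "error".
target = (3.0, -2.0)
err = lambda p: sum((a - b) ** 2 for a, b in zip(p, target))
final, residual = ecoli_reorganize(err, [0.0, 0.0])
```

The point of the method is that it needs no gradient information, only a scalar measure of total error, which is why Powers proposed it as a biologically plausible learning mechanism.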


Hi Rupert, yep that is what Method of Levels is based on. Because the higher level systems govern the references for the lower level systems, you need to bring awareness to those higher levels to reorganise for long-lasting change…

Warren
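Warren's point, that higher-level systems act by setting the references of lower-level systems, is the standard HPCT cascade. A toy two-level sketch; the gains, the trivial environment, and the perceptual weighting are all illustrative assumptions:

```python
def two_level(steps=400, dt=0.05, hi_ref=2.0, k_hi=2.0, k_lo=8.0):
    """Two-level cascade: the higher output is the lower reference."""
    hi_p = lo_p = 0.0
    lo_ref = lo_out = 0.0
    for _ in range(steps):
        # Higher level: its output *is* the lower level's reference.
        lo_ref += k_hi * (hi_ref - hi_p) * dt
        # Lower level: controls its own perception toward that reference.
        lo_out += k_lo * (lo_ref - lo_p) * dt
        lo_p = lo_out               # toy environment: perception tracks output
        hi_p = 0.5 * lo_p           # higher perception built from the lower one
    return hi_p

# hi_p settles near hi_ref (2.0), with lo_ref driven to about 4.0 to achieve it.
```

Nothing at the lower level "knows" the higher goal; it only ever serves whatever reference it is handed, which is why reorganisation aimed at lasting change has to reach the level that sets the references.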


[Martin Taylor 2014.04.15.13.22]

Warren, we have interpreted Rupert's hypothetical situation differently. I saw it as suggesting the possibility that there might be some resilience in the network structure of the control hierarchy, whereas you saw it as noting that reorganization depends on localising attention at a level above where a problem seems to be manifest.

Whichever interpretation is the one Rupert intended, his question started with a hypothetical situation in which an action intended to control some perception (call it "perception X") cannot actually influence it. As I understand it, the question is whether such a situation will be reorganized so that the perception does become controlled, and if not, why not.

Again as I understand it, Rupert proposed that the interactions among the various control systems supporting a higher-level system, one of which is the hypothetical perception X system, can serve to stabilize the control structure so as to revert a reorganization that would turn the "non-control" system X into an actual control system X. Two kinds of reorganization are possible: a change in the perceptual function and a change in the output organization, either above or below the "perception X" control unit.

I think it's an interesting question. At this level of analysis and of my limited understanding, I don't think there's an intuitively obvious answer. After all, every one of the control systems supporting the high-level complex is non-linear. All or most of them probably support other high-level control systems, which means that the action that fails to influence perception X may be essential in controlling other high-level perceptions. Quite possibly, substituting another action that would influence perception X might set up hard-to-resolve conflicts in controlling other perceptions, setting up a "reorganization barrier".

As it stands, as Rupert said, it's "just an idea". I hope I have properly expressed his idea. If so, there's a long stretch between proposing it and finding a way to test it. Perhaps a little more discussion might point the way to an approach.

Whether such speculations will lead anywhere is anyone's guess. I hope they do. And whether or not I'm interpreting Rupert as he would wish, I'm glad he brought it up.

(Incidentally, it's exactly the kind of question for which ECACS, the Exploration of Complex Adaptive Control Systems forum, was designed. It's about a complex control structure, and the proposal might hang in the air for years before someone comes up with a fresh take on it and continues the discussion from where it left off. Easy to do in a forum, almost impossible in a mailing list.)
Martin
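Martin's "reorganization barrier" can be given toy numbers: if the ineffectual action also feeds a higher-level perception, the total error across the hierarchy is lower with the action kept than with it reorganized away. The weights and error functions below are pure assumptions, meant only to show the shape of the argument:

```python
def total_error(ritual_effort):
    """Toy total error across two systems sharing one action.

    Perception X (what the offering is supposed to control) carries a
    fixed error of 1.0 no matter what: the action cannot influence it.
    The higher-level "good Hindu" perception, by contrast, really is
    fed by performing the ritual.
    """
    x_error = 1.0                        # broken loop: error is constant
    good_hindu = min(ritual_effort, 1.0)
    hi_error = (1.0 - good_hindu) ** 2   # drops as the ritual is performed
    return x_error + 3.0 * hi_error      # weight 3.0: the higher system dominates

print(total_error(1.0))  # keep the ritual: total error 1.0
print(total_error(0.0))  # reorganize it away: total error 4.0
```

On this sketch, any reorganization that abolishes the ritual raises total error, so error-driven reorganization would revert it, which is exactly the stability Rupert's hypothesis predicts despite the persistent error in system X.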
