Belief and Ideology

[From Rick Marken (940819.1445)]

Bill Powers (940819.0845 MDT) --

It may turn out that the real resolutions of problems people have with
each other will come from a direction that nobody expects and that has
nothing to do with the arguments that people hurl at each other. Maybe
the answer to the nuclear-power controversy, and the difficulties we
have in finding economical alternatives, is to get smart and stop using
so goddamn much power that no matter what we do the cost in pollution
and money is unacceptable. Maybe the answer to the controversy over whether
we should spend our money at home or give it away to poor people elsewhere
in the world is to stop having so goddamn many babies that in the end we won't
even be able to support ourselves. Maybe the answer to terrorism is to
stop fighting it for a moment and ask what the hell it is that these
people want. Sometime, somehow, we have to grow up a little and look on
these problems as something to be solved rather than as conflicts to be
won. When you see children squabbling over their toys, do you hunker
down beside one of them and help to beat up the other one?

Wonderful point. I have a not-too-well-articulated idea that "beliefs" are at
the heart of the problem -- in particular, beliefs that one is not willing to
change -- and I would like to hear your ideas about where belief fits into
PCT.

My own current "mental model" of this is that the word "belief" refers to a
reference for a perception that is experienced almost exclusively as an
imagination (self-generated perceptual signal). This is the way I think the
term "belief" is used in phrases like "I believe it's going to rain today" or
"I believe in god". When the reference component of such beliefs is
unalterable (so that every day you believe it is going to rain, no matter
what) then I think we call this an "ideology".
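This "imagination mode" picture can be put in toy code. What follows is only a sketch of one reading of the model, not anything from actual PCT software; the function name and signal values are invented for illustration. The point it shows: when the perceptual signal is a self-generated copy of the reference, the comparator sees zero error no matter what the environment is doing, so the belief confirms itself.

```python
# Toy sketch (illustrative only): one pass of a control loop that can
# run in "imagination mode". In imagination mode the perception is a
# self-generated copy of the reference, so error is always zero and
# the environment never disturbs the belief.

def imagination_loop(reference, environment, imagining):
    """Return (perception, error) for one pass of a toy control loop."""
    # Normal mode: perceive the environment.
    # Imagination mode: perceive a copy of the reference itself.
    perception = reference if imagining else environment
    error = reference - perception
    return perception, error

# "I believe it is going to rain" as an imagined perception:
rain_reference = 1.0   # reference: rain is perceived
actual_weather = 0.0   # environment: no rain at all

p, e = imagination_loop(rain_reference, actual_weather, imagining=True)
print(p, e)   # -> 1.0 0.0 : perception matches the reference; no error
p, e = imagination_loop(rain_reference, actual_weather, imagining=False)
print(p, e)   # -> 0.0 1.0 : perceiving the environment produces an error
```

The sketch is deliberately minimal; the only feature it captures is that an imagined perception is insulated from the environment.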

Given this model, I would say that an ideology is an internal conflict
phenomenon (two systems using the same reference in incompatible ways) unless
the reference component of the belief is at the highest level of the control
hierarchy (where there are no higher level systems to change it).
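The conflict picture sketched above can also be simulated. The following is a toy model of my own construction (names and numbers invented), in the standard PCT style of two elementary control systems acting on one shared environmental variable with incompatible references. Neither system can satisfy its reference; both carry persistent error while their opposed outputs escalate without limit.

```python
# Toy sketch (illustrative only): two integrating control systems whose
# outputs sum into a single shared variable, with incompatible
# references. The variable settles between the references, so BOTH
# systems keep a persistent error, and their outputs grow without
# bound as each tries harder -- control is effectively lost.

def run_conflict(ref_a, ref_b, gain=0.1, steps=2000):
    """Run two conflicting integrating controllers; return final state."""
    out_a = out_b = 0.0
    for _ in range(steps):
        v = out_a + out_b              # the shared controlled variable
        out_a += gain * (ref_a - v)    # system A pushes v toward ref_a
        out_b += gain * (ref_b - v)    # system B pushes v toward ref_b
    v = out_a + out_b
    return v, ref_a - v, ref_b - v, out_a, out_b

v, err_a, err_b, out_a, out_b = run_conflict(10.0, -10.0)
print(v, err_a, err_b)             # variable stuck at 0.0; errors 10.0, -10.0
print(out_a > 100, out_b < -100)   # opposed outputs have escalated
```

Note the shape of the outcome: the shared variable sits frozen midway between the two references, which is the toy-model analogue of conflict "taking away the ability to act in certain ways".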

Given this model, it is easy to see why you would say:

Ideology really does get in the way of clear thinking, whether it is "for"
or "against."

An ideology should cripple thinking in much the same way as an "ordinary"
conflict cripples "regular" perceptual control: it takes away the ability to
think (or act) in certain ways.

It should be possible to study this kind of mental conflict experimentally.
In fact, I think there is some stuff in the problem solving literature that
may be relevant. For example, I seem to recall a study (by H. Simon?) in which
people solved the same problems described in two different ways. Depending on
how a problem was described (maybe as hares and foxes rather than as
missionaries and cannibals??), people would make assumptions about
constraints that were or were not explicitly stated in the problem; people
developed little "ideologies" about what could and could not be done to solve
the problem. Does anyone have any idea what I might be thinking of?

Best

Rick

[Bill Leach 940819.21:32 EST(EDT)]

[Rick Marken (940819.1445)]

My own current "mental model" of this is that the word "belief" refers
to a reference for a perception that is experienced almost exclusively
as an imagination (self-generated perceptual signal). This is the way
I think the term "belief" is used in phrases like "I believe it's going
to rain today" or "I believe in god". When the reference component of
such beliefs is unalterable (so that every day you believe it is going
to rain, no matter what) then I think we call this an "ideology".

In the "rain" example, the term is used as I often use it myself. The
'belief' is founded upon conjecture based upon experience. The person
expressing the belief is (hopefully) aware that there is a large
possibility of error and little causal evidence supporting the assertion.

I have noted several instances of what I would term "assaults" on belief.
I am aware that I could well be "reading into" the actual statements more
than was intended. However, ALL of us have "beliefs" and are controlling
based upon these beliefs.

The very principles upon which such a scientific endeavor as PCT research
relies are beliefs -- unprovable axioms. That these axioms have
demonstrated remarkable consistency and reliability (dare I add
refreshing?) does not change the fact that they are unprovable. Of
course, neither does the fact that they are not provable mean that they
are useless or silly, etc.

Given this model, I would say that an ideology is an internal conflict
phenomenon (two systems using the same reference in incompatible ways)
unless the reference component of the belief is at the highest level of
the control hierarchy (where there are no higher level systems to change
it).

An ideology is an internal conflict only when there is a conflict. An
ideology that does not conflict with how someone "feels" or with
perceptions experienced in the environment would not result in any
conflict. I say this with a personal belief that the vast majority of
what any of us would define as ideology probably does result in either
conflict with experience or conflict with desire, but still I don't
necessarily feel that such must be the case.

Ideology really does get in the way of clear thinking, whether it is
"for" or "against."

Again, not necessarily. If your belief includes a recognition that not
all is known, and that what is believed is subject to a "reality check",
then the "dangers" are not too great.

I think that the way in which you are using the term is too narrow, though
as Bill used it, I felt it was clear that he meant a belief system that
denies reality and "has all the answers", that sort of thing.

Whatever any of us believes is subject to error, and recognizing that
single idea is probably the most important step.

I am quite happy to "defend my beliefs" with intelligent people because I
will either discover new support and strengths for what I do believe or I
will learn of errors in my thinking (or a little of both).

Someone else mentioned that "we don't want to be wrong" and, I believe,
even tied the idea to basic survival concepts (I am not sure where,
however). I don't know that the statement is true, though I do recognize
that at times I personally feel really "bad" if I find that I was "wrong"
about something. In general, though, I don't think that being "found to
be in error" is a particularly distressing experience for me.

I have noticed that some other people I know can almost never admit to
being in error about anything (which results in some interesting
observations at times). Obviously such behaviour is control of
perception, and it seems that the reference must be "never be in error".

One such person I know (who also happens to be a rather sharp person
and a good friend) will argue with such "logical absurdities" as to be
almost laughable. My impression, however, is that on such occasions his
perceptual reference is "to not be found in error by someone else". Many
times, days after a discussion that degenerated into an argument in which
he was literally "foolishly wrong", he can be heard supporting the very
position that he argued against only a few days prior.

-bill