Controlling for ignor(e)ance?

[From Gregg Wierzbicki (970730.1100EST)]

From time to time I wonder how to determine the extent to which we
(individually, in groups, or in general) use language to maintain our state
of ignor(e)ance (a.k.a. bliss?), or, if you prefer, control for not-learning
something which someone else might like us to learn. How might I test whether
I am controlling to be open to another's verbal/written/non-verbal input; OR
resisting (if not ignoring) input from them? How might I test the same for
someone else? Any interest/ideas?

Thanks,
Gregg

[From Bruce Gregory (970730.1200 EDT)]

Gregg Wierzbicki (970730.1100EST)

From time to time I wonder how to determine the extent to which we
(individually, in groups, or in general) use language to maintain our state
of ignor(e)ance (a.k.a. bliss?), or, if you prefer, control for not-learning
something which someone else might like us to learn. How might I test whether
I am controlling to be open to another's verbal/written/non-verbal input; OR
resisting (if not ignoring) input from them? How might I test the same for
someone else? Any interest/ideas?

One approach is to adopt the other person's point of view and
see if you can "make it work" in a variety of examples. Make
your intended outcome a successful defense of the other person's
point of view. This is the only way I know to truly learn the
strengths and limitations of a viewpoint. It does, however,
require time and effort. (Not to mention intellectual
honesty...)

Bruce

[From Bill Powers (970730.1344 MDT)]

Gregg Wierzbicki (970730.1100EST)--

From time to time I wonder how to determine the extent to which we
(individually, in groups, or in general) use language to maintain our
state of ignor(e)ance (a.k.a. bliss?), or, if you prefer, control for
not-learning something which someone else might like us to learn. How
might I test whether I am controlling to be open to another's
verbal/written/non-verbal input; OR resisting (if not ignoring) input
from them? How might I test the same for someone else? Any
interest/ideas?

A lot of interest, I should think -- it's a problem that PCTers run into
frequently (and are sometimes accused of having themselves).

I know that I often have to ignore things I might want to learn simply
because there are so many fields in which PCT would have applications. I've
tried learning something about psychology, linguistics, neurology,
biochemistry, sociology, physiology, management, information theory,
ethology, psychotherapy, economics, education, evolution, and control
theory (classical and "modern"), just to have intelligent conversations
with people who already are open to PCT. But it's really impossible; only a
person who has been immersed in such fields for years and has concentrated
on knowing their history and content could decide just how PCT would apply,
if at all. So for the most part I have to be content with superficial
knowledge, and remain ignorant of most of the details. And I have to resist
consciously the urge to dive into the journals. It helps to pause and run
through the list of journals (and textbooks) I would have to read to
achieve even a beginner's expertise in each field. It also helps to live 13
miles outside of Durango, Colorado, with only an undergraduate library
available at the one local college. I should probably add that it also
helps, in resisting this urge, actually to read some of those journals from
cover to cover.

You're really asking two questions: _how_ do we avoid learning things we
don't want to learn, and _why_ do we do it? The "how" is an interesting
question in itself. One obvious way is to read or listen only when the
material agrees with your existing beliefs. Another way is to read or
listen while continuously reinterpreting the words, so they mean things
that agree with your beliefs. And another way is the well-known "Freudian
slip" -- you misread or mis-hear, so what you understand from the words
does not contradict your beliefs. My favorite examples are the citations of
my book that call it "Perception: the control of behavior." A related one
is the mis-rememberings of the title of Wiener's book as being
"Cybernetics: communication and control in the animal and the machine." It
is, of course, "...control and communication ...". You can tell what a
person's beliefs and interests are by such rearrangements, I think.

The why is more conjectural, because there are so many possible reasons.
Obviously, to read or hear something that contradicts a belief, a goal for
how things should be perceived to be, is to create an error if you accept
it. If you have a shaky position on some subject, yet this position is very
important to you for other reasons, you might well find it advisable to
avoid finding out what others have to say about it because of the risk of
having to conclude that your idea is wrong. If you're defending an
important belief, and feel that the skeptic is smarter than you or has
logical arguments you can't refute, the best strategy is to avoid learning
what the skeptic has to say. Of course this implies that the belief is only
a means to a more important end -- that the actual truth of the belief is
less important than what the belief implies about you or something else.
For example: I believe I don't have cancer, and I don't want to hear what
my doctor has to say about that.

I suppose you could argue that the deliberate seeking of ignorance is just
a personality trait, but I don't think that idea is either useful or
likely. I think we seek ignorance when we fear that knowledge will hurt us,
even if the knowledge we fear is simply the confirmation that we're too
dumb to understand it. Any change in a world-view is upsetting,
particularly if we have no confidence in being able to pick up the pieces
afterward and still have control of our lives. We avoid the chance of
disturbances because of what we _imagine_ will happen as a result. In
imagination, everything that could conceivably go wrong _does_ go wrong, so
we discourage ourselves from even simple undertakings, like learning
something new.

Best,

Bill P.


[From Bruce Gregory (970730.1725 EDT)]

Bill Powers (970730.1344 MDT)

I suppose you could argue that the deliberate seeking of ignorance is just
a personality trait, but I don't think that idea is either useful or
likely. I think we seek ignorance when we fear that knowledge will hurt us,
even if the knowledge we fear is simply the confirmation that we're too
dumb to understand it. Any change in a world-view is upsetting,
particularly if we have no confidence in being able to pick up the pieces
afterward and still have control of our lives. We avoid the chance of
disturbances because of what we _imagine_ will happen as a result. In
imagination, everything that could conceivably go wrong _does_ go wrong, so
we discourage ourselves from even simple undertakings, like learning
something new.

I suspect that the reason we avoid learning new things is the
fear that they might threaten our ability to exercise control.
If we believe that new knowledge will enhance our ability to
exercise control, we pursue it. Many of us seem to adopt
limiting belief structures because they promise to allow us to
exercise control. Religious beliefs enable us to confront the
ultimate loss of control by assuring us that, in the end, we
will still be able to control our perceptions.

Bruce

[From Gregg Wierzbicki (970804.1000 EST)]

Bill Powers (970730.1344 MDT)

Gregg Wierzbicki (970730.1100EST)--

From time to time I wonder how to determine the extent to which we
(individually, in groups, or in general) use language to maintain our
state of ignor(e)ance (a.k.a. bliss?), or, if you prefer, control for
not-learning something which someone else might like us to learn. How
might I test whether I am controlling to be open to another's
verbal/written/non-verbal input; OR resisting (if not ignoring) input
from them? How might I test the same for someone else? Any
interest/ideas?

A lot of interest, I should think -- it's a problem that PCTers run into
frequently (and are sometimes accused of having themselves).
I know that I often have to ignore things I might want to learn simply
because there are so many fields in which PCT would have applications. I've
tried learning something about psychology, linguistics, neurology,
biochemistry, sociology, physiology, management, information theory,
ethology, psychotherapy, economics, education, evolution, and control
theory (classical and "modern"), just to have intelligent conversations
with people who already are open to PCT.

OK. But, I suspect you are in a minority of minuscule proportions. Rather, I
suspect that extremely few of us, in the general population, would likely
make such an effort. So, I'm trying to focus attention on the development
of a test to determine whether/when individuals, and/or groups in general,
build systems & institutions to hide themselves comfortably from, if not the
truth, then each other's approach to it.

But it's really impossible; only a
person who has been immersed in such fields for years and has concentrated
on knowing their history and content could decide just how PCT would apply,
if at all.

How so? Maybe those of us who act aggressively to "know" the history and
content of our fields are merely becoming more rigid (institutionalized,
socially allied) in the manner(s) in which we attempt to remain ignorant of
the possibility that we are wrong! And, isn't it possible that we have
neatly developed refereed journals, peer review, tenure, drug
certifications, manufacturer licensing agreements etc. to afford some
security in our position in any given community?

So for the most part I have to be content with superficial
knowledge, and remain ignorant of most of the details.

OK. But, if you would allow that whatever knowledge exists at all, only
exists in the heads of living people, and allow that few individuals take
(have?) the time to insert themselves deeply into a broad range of abstract
and obscure fields of inquiry, and further allow for the possibility that
some of our actions in coming to 'know' a limited number of fields of inquiry
might be for the purpose described above (to remain ignorant of our
ignorance), then isn't it possible (likely?) that we are all just
superficially knowledgeable...at best?

Now, my concern about this state of affairs derives from my suspicion that,
if this is so, then I think PCT might suggest that we have created a society
which controls to maintain superficially knowledgeable citizens. Of course,
in so saying, I have not said much that hasn't been said by many others
before. However, explicit here is the suggestion that we are acting in order
to control for this (superficiality of knowledge) as outcome. What needs
more work, I think, is the manner in which we test this suggestion.

And I have to resist consciously the urge to dive into the journals....I
should probably add that it also helps, in resisting this urge, actually to
read some of those journals from cover to cover.

And, through such resistance, what are you attempting to control? How might
someone test for the control variable here?

I suppose you could argue that the deliberate seeking of ignorance is just
a personality trait, but I don't think that idea is either useful or
likely. I think we seek ignorance when we fear that knowledge will hurt us,
even if the knowledge we fear is simply the confirmation that we're too
dumb to understand it. Any change in a world-view is upsetting,
particularly if we have no confidence in being able to pick up the pieces
afterward and still have control of our lives. We avoid the chance of
disturbances because of what we _imagine_ will happen as a result. In
imagination, everything that could conceivably go wrong _does_ go wrong, so
we discourage ourselves from even simple undertakings, like learning
something new.

OK. But, how might we test these assertions? That's where I think I'm
stumped. Or, perhaps that's where I've allowed myself to become stumped in
order not to challenge my own level of ignorance....

Thanks.

Gregg

[From Bill Powers (970804.0917 MDT)]

Gregg Wierzbicki (970804.1000 EST)--

Maybe those of us who act aggressively to "know" the history and
content of our fields are merely becoming more rigid (institutionalized,
socially allied) in the manner(s) in which we attempt to remain ignorant of
the possibility that we are wrong! And, isn't it possible that we have
neatly developed refereed journals, peer review, tenure, drug
certifications, manufacturer licensing agreements etc. to afford some
security in our position in any given community?

Sure, I suppose that this is a definite motivation for some people. Real
science is hard to do, and if you're not good at it you have to have some
way of staying afloat and competing. A lot of bluffing goes on in
run-of-the-mill science, and there's a certain amount of "If you don't
point out my warts I won't point out yours."

But, how might we test these assertions? That's where I think I'm
stumped. Or, perhaps that's where I've allowed myself to become stumped in
order not to challenge my own level of ignorance....

That's a good start. Whenever I feel like pointing the finger, I always see
a finger pointed right back at me. Once that's out of the way, however, you
can consider the real part of the question. How could we test to see if _a
person_ actively controls for ignorance? That, of course, is what we have
to learn to do first, before we can say anything about the proposal that
"people" control for ignorance.

This suggests the sorts of experiments that psychologists sometimes do. You
set up an artificial situation in which a person is given some sort of
knowledge about a problem to be solved, and gets practice in using it. So
the person acquires a degree of problem-solving ability, using this
artificial knowledge. Then you present data showing that this knowledge is
incomplete or wrong, and see if the person makes any use of the data. Or
something like that.

Behind this design is my thought that for most people, knowledge is only a
means to an end. If the end is achieved, they don't really care if the
knowledge is "right" or not; all that matters is that using it in a
prescribed way will yield the results they want. If it does, or seems to,
"work," they will defend the knowledge against any criticism that seems to
invalidate it, and will have very little interest in being skeptical about
it. I think this is behind most scientific closed-mindedness. All the other
stuff, the clubbiness, the insider privileges, and so on, are just ways of
defending ideas that seem to work well enough, and preventing attempts to
raise doubts where the only "benefit" would seem to be to prevent getting
the results you want (like getting your yearly paper published). Heck, just
tell a carpenter that he doesn't really have to spit on a nail before
hitting it. If spitting on the nail works for him, he's not about to listen
to any doubts about what it accomplishes.

Best,

Bill P.