The Human Mind and Information

(Gavin Ritz, 2010.11.16.12.00NZT)

I am reading Lord Robert Winston’s book The Human Mind. What really strikes me is that the concept of information is used continually as a means to explain incoming signals. It is pervasive throughout the book.

This is a very interesting concept which I have been noticing for many months now while reading articles on the mind. It seems to be an accepted proposition that we are an information-processing organism, and that information comes in from the environment and we process it.

This, we all know, is just not correct in any sense. PCT clearly shows that we are an energy-transducing organism. There is no incoming information, so why is the concept of information so pervasive?

Regards

Gavin

[From Fred Nickols (2010.11.15.1633 MST)]

Hmm. I didn’t know that PCT supposed or postulated that we don’t process information. If true, I’ll have to think about that. On my part, I do believe we process information, which is to say I believe we attach meaning to what we sense. I “read” your post, Gavin, and I attach meaning to it or, said otherwise, “I take your meaning to be…” or “I interpret your words to mean…” and so on. What I don’t believe, which might be close to what you’re trying to convey, is that our sensory inputs (aural, visual, etc.) come with meaning already attached. I’ll have to check to see what the official PCT position is regarding “meaning.”

While I’m at it, I think I’ll rant a bit. I don’t have much problem with the position taken by many competent PCTers that the behaviorists and the cognitivists have it wrong (with “it” being an accounting for or explanation of human behavior). I’m less willing to go along with the accompanying position of “pitch it all” (with “it” being everything the behaviorists and cognitivists have tried to contribute to our understanding of human behavior) – after all, there might be a baby or two in there with all that bath water. And I am flat-out opposed to discarding the hard-won, practical, empirically based knowledge we have accumulated about human behavior and performance in the workplace. There are far too many babies in that bath water.

As for Winston’s book about the mind, I haven’t read it but I would hazard a guess that it has a lot in common with other such books (e.g., Pink’s); namely, it’s highly speculative, based on some spotty evidence (some or much of which is suspect) and rooted in a view of humans as “soft” computers.

And so I treat just about everything I read as a task in sorting wheat from chaff. That includes PCT stuff.

Regards,

Fred Nickols

Managing Partner

Distance Consulting LLC

1558 Coshocton Avenue - Suite 303

Mount Vernon, OH 43050-5416

www.nickols.us | fred@nickols.us

“Assistance at a Distance”

···

From: Control Systems Group Network (CSGnet) [mailto:CSGNET@LISTSERV.ILLINOIS.EDU] On Behalf Of Gavin Ritz
Sent: Monday, November 15, 2010 4:10 PM
To: CSGNET@LISTSERV.ILLINOIS.EDU
Subject: The Human Mind and Information


( Gavin Ritz 2010.11.16.13.03NZT)

[From Fred Nickols (2010.11.15.1633 MST)]

Hmm. I didn’t know that PCT supposed or postulated that we don’t process information. If true, I’ll have to think about that.

PCT is pretty clear on this: we transduce incoming energy signals.

On my part, I do believe we process information, which is to say I believe we attach meaning to what we sense.

I’m not talking about processing, but I don’t think we process information either. We process an incoming energy signal (electromagnetic) into an electrical signal (the neural system).

I “read” your post, Gavin, and I attach meaning to it or, said otherwise, “I take your meaning to be…” or “I interpret your words to mean…” and so on. What I don’t believe, which might be close to what you’re trying to convey, is that our sensory inputs (aural, visual, etc.) come with meaning already attached.

No, I definitely don’t mean this. All that’s incoming is an energy signal (electromagnetic, sound waves, pressure, temperature, chemical). What we do with it is another issue.

I’ll have to check to see what the official PCT position is regarding “meaning.”

I was not talking about meaning. All I am saying is that the idea of information as a quantity (and quality) carried by an incoming signal seems to be pervasive.


···


( Gavin Ritz 2010.11.16.13.25NZT)

Fred, I have trawled through a few of my books on this issue. Miller’s book Living Systems is very clear on this issue of information and how it moves from the outside to the inside, through markers. This is really totally nonsensical.

Dawkins is also clear on this issue of information and DNA.

I’m beginning to suspect we have a serious problem in understanding the human mind if we can’t get such a basic assumption right.

I might add that PCT is the only model that says information doesn’t exist this way.

Regards

Gavin

···

[Martin Taylor 2010.11.16.00.31]

(Gavin Ritz 2010.11.16.13.25NZT)

I might add PCT is the only model that says information doesn’t exist this way.

"PCT" doesn't. Some PCT practitioners do.

Martin

( Gavin Ritz 2010.11.16.22.54NZT)

[Martin Taylor 2010.11.16.00.31]

(Gavin Ritz 2010.11.16.13.25NZT)

I might add PCT is the only model that says information doesn’t exist this way.

“PCT” doesn’t. Some PCT practitioners do.

It does: page 35 of B:CP.

I was driving to a meeting today thinking about information and PCT. Something really mysterious is going on. It’s almost as if everyone has accepted that information is something that enters the organism from the outside, without actually looking at it critically.

Let’s say that there is such a thing as information “or its markers”. Then where is it? What is its measure? And what specifically is information?

Gavin

[From Richard Kennaway (2010.11.16.1059 BST)]

With respect, I don't think that vague discussions of "information" based on popular science writers can contribute much. "Information" is a collection of concepts that inexorably drives discussions into the fog unless one defines precisely which of many concepts one means by it. It has probably generated more confusion than any other word in mathematics and physics. FWIW, there are close connections between information, technically defined, and energy, via statistical mechanics.

For some of the real, technical story of how actual scientists, rightly or wrongly, are applying concepts of information to the study of brain function, one might study Karl Friston's large body of work (http://www.fil.ion.ucl.ac.uk/~karl/). I've been doing that lately, trying to interpret his concepts and analysis by replacing the brain by a control system and seeing if what he is saying still makes sense.

But it's a long slog and I have no conclusions yet.

···

--
Richard Kennaway, jrk@cmp.uea.ac.uk, http://www.cmp.uea.ac.uk/~jrk/
School of Computing Sciences,
University of East Anglia, Norwich NR4 7TJ, U.K.

[From Bill Powers (2010.11.16.0955 MDT)]

Richard Kennaway (2010.11.16.1059 BST) --

JRK: FWIW, there are close connections between information, technically defined, and energy, via statistical mechanics.

BP: I can send a telegraph message to someone by intermittently connecting a light bulb across two wires that are powered by a battery at the receiving end. The receiver measures the current drain from the battery and thus reads the message. The flow of energy is from receiver to transmitter, while information travels in the opposite direction. The same arrangement could be accomplished hydraulically, with the transmission consisting of opening and closing a valve at the transmitting end of a long pipe that drains water from a tank at the receiving end, or in many other similar ways. Does this cause any problems for the idea that information is connected to energy and entropy? Does the sign of the energy transfer or the entropy change matter?

Does the fact that two processes can be described by the same mathematical equation mean that they have anything to do with each other? An unstable electronic amplifier and an oscillating pendulum can both be described by a second-order differential equation. Do they have anything else in common?
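For concreteness, both behaviors come from the one equation x'' + 2ζωx' + ω²x = 0; only the sign of the damping ratio ζ decides whether the trajectory dies away like a pendulum or runs away like an unstable amplifier. A minimal numerical sketch (the frequency, step size, and duration are arbitrary choices for illustration):

```python
import math

# Both systems obey x'' + 2*zeta*w*x' + w*w*x = 0; only the sign of the
# damping ratio zeta differs: zeta > 0 (pendulum losing energy to friction),
# zeta < 0 (amplifier pumping energy in).  Forward-Euler integration.
def simulate(zeta, w=2 * math.pi, dt=1e-3, steps=3000):
    x, v = 1.0, 0.0                      # start displaced, at rest
    for _ in range(steps):
        a = -2 * zeta * w * v - w * w * x
        x, v = x + v * dt, v + a * dt
    return x

pendulum_x = simulate(zeta=0.1)    # oscillation decays toward zero
amplifier_x = simulate(zeta=-0.1)  # oscillation grows without bound
print(abs(pendulum_x) < 1.0, abs(amplifier_x) > 1.0)  # prints: True True
```

Same mathematics, opposite fates: the equation fixes the form of the motion, not what the system physically is.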

Best,

Bill P.

[From Richard Kennaway (2010.11.16.1805 BST)]

[From Bill Powers (2010.11.16.0955 MDT)]

Richard Kennaway (2010.11.16.1059 BST) --

JRK: FWIW, there are close connections between information,
technically defined, and energy, via statistical mechanics.

BP: I can send a telegraph message to someone by intermittently
connecting a light bulb across two wires that are powered by a
battery at the receiving end. The receiver measures the current drain
from the battery and thus reads the message. The flow of energy is
from receiver to transmitter, while information travels in the
opposite direction. The same arrangement could be accomplished
hydraulically, with the transmission consisting of opening and
closing a valve at the transmitting end of a long pipe that drains
water from a tank at the receiving end, or in many other similar
ways. Does this cause any problems for the idea that information is
connected to energy and entropy? Does the sign of the energy transfer
or the entropy change matter?

No, the connection isn't anything like information being energy and energy being information. It takes energy to destroy a bit of information, but the energy is utterly tiny, somewhere around 10^-20 joules (kT where k is Boltzmann's constant and T is the temperature). Computers must theoretically generate a certain amount of heat per information-losing operation, but the actual heat they produce is still far in excess of the theoretical minimum.

The connection is mainly of interest in mathematical physics, where temperature can be given an interpretation in terms of the number of possible states of the system, and the information-theoretic entropy (the log of the number of states available to the system) is proportional to the thermodynamic entropy.
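The numbers are easy to check with nothing but the two constants; a minimal sketch, assuming room temperature:

```python
import math

# Boltzmann's constant (J/K) and room temperature (K).
k = 1.380649e-23
T = 300.0

# Landauer's bound: erasing one bit dissipates at least k*T*ln(2) joules.
landauer_joules_per_bit = k * T * math.log(2)
print(f"Minimum energy to erase one bit at {T} K: {landauer_joules_per_bit:.2e} J")
# ~2.87e-21 J, the "somewhere around 10^-20 joules" order of magnitude

# The statistical-mechanics link: S = k*ln(W) for W equally likely microstates,
# so thermodynamic entropy is proportional to log2(W), the entropy in bits.
W = 2 ** 10                  # a system with 1024 equally likely states
bits = math.log2(W)          # 10 bits of information-theoretic entropy
S_thermo = k * math.log(W)   # Boltzmann entropy in J/K
print(f"{bits:.0f} bits corresponds to {S_thermo:.2e} J/K")
```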

Does the fact that two processes can be described by the same
mathematical equation mean that they have anything to do with each
other? An unstable electronic amplifier and an oscillating pendulum
can both be described by a second-order differential equation. Do
they have anything else in common?

Not necessarily. My current interest in Friston's work is motivated at the moment by wanting to see exactly what there is there, by replacing "the brain" throughout by some known control system and seeing what it says. Friston is a very big gun indeed in neuroscience.

···

--
Richard Kennaway, jrk@cmp.uea.ac.uk, http://www.cmp.uea.ac.uk/~jrk/
School of Computing Sciences,
University of East Anglia, Norwich NR4 7TJ, U.K.

[Martin Taylor 2010.11.16.11.56]

(Gavin Ritz 2010.11.16.22.54NZT)

[Martin Taylor 2010.11.16.00.31]

(Gavin Ritz 2010.11.16.13.25NZT)

I might add PCT is the only model that says information doesn’t exist this way.

"PCT" doesn't. Some PCT practitioners do.

It does: page 35 of B:CP.

Which edition? I can't find any reference to information anywhere near p. 35 in the 2005 edition. Nor do I find it in the index.

Whether your comment is relevant depends rather on whether you consider PCT to be a religion of which B:CP is the Bible, and Bill Powers the infallible Pope who dictated it by listening to the word of God, or consider PCT to be a fairly new science, in which B:CP is a brilliant founding document written by a scientist who has great insight, but is as fallible as any human.

I take Bill to be a practitioner of PCT, not its infallible Pope, and PCT to be a science, which means that it is subject to revision and extension as data and theoretical explorations may indicate.

The other reason I made my comment was a niggle: I didn't think "PCT" was an animate body that can "say" anything. Bill has been taking me to task for the same reason in another thread, because I like the word "affordance", which he sees as being something only an animate body can provide, so I'm a bit sensitive to it.

Martin
···


( Gavin Ritz 2010.11.17.11.19NZT)

[Martin Taylor 2010.11.16.11.56]

Which edition? I can’t find any reference to information anywhere near p. 35 in the 2005 edition. Nor do I find it in the index.

Martin, it doesn’t mention information because information has nothing to do with PCT.

Whether your comment is relevant depends rather on whether you consider PCT to be a religion of which B:CP is the Bible,

Goodness me, Martin, I don’t know why you say things like this.

and Bill Powers the infallible Pope who dictated it by listening to the word of God,

I’m the last person who thinks Bill is infallible, but these are not fair or kind comments, Martin. They are personal comments best left out of the dialogue.

or consider PCT to be a fairly new science, in which B:CP is a brilliant founding document written by a scientist who has great insight, but is as fallible as any human.

No one is saying that it’s infallible or any such thing; we are discussing the model so that we can all get a better view of it.

I take Bill to be a practitioner of PCT, not its infallible Pope, and PCT to be a science, which means that it is subject to revision and extension as data and theoretical explorations may indicate.

The other reason I made my comment was a niggle: I didn’t think “PCT” was an animate body that can “say” anything. Bill has been taking me to task for the same reason in another thread, because I like the word “affordance”, which he sees as being something only an animate body can provide, so I’m a bit sensitive to it.

Martin, try to have some fun rather than take this so personally.

Regards

Gavin

(Gavin Ritz 2010.11.17.11.43NZT)

[From Richard Kennaway (2010.11.16.1059 BST)]

FWIW, there are close connections between information, technically defined, and energy, via statistical mechanics.

Absolutely, I agree, and the relationship between information and energy is pretty easy to define too: to convey one bit of information costs at least 0.693kT joules per bit, where k is the Boltzmann constant and T is the temperature in kelvins. Getting this from basic principles and E=hf is pretty simple.

However, this has nothing to do with energy transduction in PCT.

For some of the real, technical story of how actual scientists, rightly or wrongly, are applying concepts of information to the study of brain function, one might study Karl Friston’s large body of work (http://www.fil.ion.ucl.ac.uk/~karl/).

Pretty wrongly, really. I had a quick read of the information-theory paper on a particular brain function; it looks like they didn’t get anywhere. Not surprising, though. Information theory has nothing to do with the brain and the neural system.

What we need to be studying is the energy transductions at the points of contact, the brain imaging, and the loop back to the same points of contact, or internal loops.

I’ve been doing that lately, trying to interpret his concepts and analysis by replacing the brain by a control system and seeing if what he is saying still makes sense.

No sense at all in terms of information theory. It looks like plenty of people hold the view that information theory and organisms are unrelated: Maturana and Varela for one, Bill for two.

I am of the opinion that they have nothing to do with each other, by simple observation of an organism. Even without PCT, information theory makes no sense to use in this context.

Using information theory in its appropriate context makes total sense.

Regards

Gavin

···

(Gavin Ritz 2010.11.17.12.22NZT)

[From Richard Kennaway (2010.11.16.1805 BST)]

Not necessarily. My current interest in Friston’s work is motivated at the moment by wanting to see exactly what there is there, by replacing “the brain” throughout by some known control system and seeing what it says. Friston is a very big gun indeed in neuroscience.

Pretty scary, really, because he’s based his free-energy formula for the brain around information-theoretic quantities.

“Free Energy and the Brain”, Friston & Stephan, 2007. Published paper.

He’s going down a blind alley. The basic premise of his work, I think, is problematic. Free energy is a thermodynamic quantity; he’s making massive assumptions and drawing correlations where there are none.

Regards

Gavin

···

Hi Richard,

What are you using to interpret Friston's findings? Personally I would use Vygotsky's model of mediated action (Figure 1), otherwise known as activity theory, within a human-computer interaction context:
http://www.helsinki.fi/cradle/chat.htm
Activity theory - Wikipedia (e.g., see Nardi, 1996 under the citations)

For example, according to Vygotsky, psychological tools (e.g., language) are social, not organic or individual, and are directed toward mastery or control over our behavioral processes. In other words, they alter the entire flow and structure of our mental functions. (BTW, I would assert that artificial intelligence is the highest potential for such tools, but that's another thread.) The ramification is that if you want to have control over your life as an individual, you absolutely need to gain mastery over these tools (e.g., language).

That being said, the first publication on Friston's website that caught my attention was:

Hierarchical Models in the Brain

I applied the basic structure of activity theory, in the form of a simplexity thinking device, to Figure 10, to interpret its significance. The result is this metaphor: Imagine the interactions forming a container for a liquid. That liquid is your cognitive complexity. Now imagine a conch shell as the container, with this liquid at the apex. Now how do I tease the liquid out of the aperture? Naturally I would need to rotate the shell counter-clockwise. I'm not certain how many rotations it takes, but I will get you out of your shell eventually. :-)

Best,
Chad

Chad Green, PMP
Program Analyst
Loudoun County Public Schools
21000 Education Court
Ashburn, VA 20148
Voice: 571-252-1486
Fax: 571-252-1633



···

( Gavin Ritz 2010.11.17.13.12NZT)

Richard

I’ve had a quick look at a number of papers now. Even Friston seems to question this relationship; last year he wrote a paper, “The free-energy principle: a rough guide to the brain?”

He’s using it to try to explain the structure and function of the brain. He wants to go down the energy/entropy route, which I believe is correct, but the path he’s taking is rather stunning.

I think free energy is a rough guide to the brain, but not based on information-theory constructs. I think it’s based on Gibbs free energy, which has been derived by others.

Anyway, that’s my opinion.

Regards

Gavin

···

[From Richard Kennaway (2010.11.17.0827 BST)]

Chad Green writes:

What are you using to interpret Friston's findings?

My brain.

···

--
Richard Kennaway, jrk@cmp.uea.ac.uk, http://www.cmp.uea.ac.uk/~jrk/
School of Computing Sciences,
University of East Anglia, Norwich NR4 7TJ, U.K.

[From Richard Kennaway (2010.11.17.0829)]

(Gavin Ritz 2010.11.17.13.12NZT)

I think free energy is a rough guide to the brain, but not based on information-theory constructs. I think it's based on Gibbs free energy, which has been derived by others.

It seems to me that it is based entirely on information-theoretic concepts. The quantity he calls free energy is defined in terms of information, and is called "energy" only by analogy to thermodynamic concepts. It is not measured in joules, and converting it by multiplying by kT would not be relevant to what he is doing.
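As a concrete illustration of that definition (a toy discrete example with made-up probabilities, not taken from Friston's papers): the variational free energy F = Σ_s q(s)[log q(s) − log p(s,o)] decomposes exactly into KL(q ‖ p(s|o)) minus the log-evidence, so it is an information quantity measured in nats, and an upper bound on surprise:

```python
import math

# Toy generative model: hidden state s in {0, 1}, one fixed observation o.
p_s = [0.5, 0.5]           # prior over hidden states
p_o_given_s = [0.9, 0.2]   # likelihood of the observed o under each state
q = [0.7, 0.3]             # variational (approximate posterior) density

# Variational free energy: F = sum_s q(s) * (log q(s) - log p(s, o)).
F = sum(q[s] * (math.log(q[s]) - math.log(p_s[s] * p_o_given_s[s]))
        for s in range(2))

# Exact evidence and posterior, to verify F = KL(q || p(s|o)) - log p(o).
p_o = sum(p_s[s] * p_o_given_s[s] for s in range(2))
posterior = [p_s[s] * p_o_given_s[s] / p_o for s in range(2)]
kl = sum(q[s] * math.log(q[s] / posterior[s]) for s in range(2))

assert abs(F - (kl - math.log(p_o))) < 1e-12   # decomposition holds
assert F >= -math.log(p_o)                     # F bounds surprise, since KL >= 0
print(f"F = {F:.4f} nats, surprise = {-math.log(p_o):.4f} nats, KL = {kl:.4f}")
```

Minimising F over q tightens the bound on surprise; nothing in the quantity is measured in joules.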

···

--
Richard Kennaway, jrk@cmp.uea.ac.uk, http://www.cmp.uea.ac.uk/~jrk/
School of Computing Sciences,
University of East Anglia, Norwich NR4 7TJ, U.K.

[From Bill Powers (2010.11.17.0500 MDT)]

Richard Kennaway (2010.11.16.1805 BST) --

No, the connection isn't anything like information being energy and energy being information. It takes energy to destroy a bit of information, but the energy is utterly tiny, somewhere around 10^-20 joules (kT where k is Boltzmann's constant and T is the temperature). Computers must theoretically generate a certain amount of heat per information-losing operation, but the actual heat they produce is still far in excess of the theoretical minimum.

The connection is mainly of interest in mathematical physics, where temperature can be given an interpretation in terms of the number of possible states of the system, and the information-theoretic entropy (the log of the number of states available to the system) is proportional to the thermodynamic entropy.

Now you have me completely bamboozled. When you say "the number of states available to the system," does this include measurements at all the levels of abstraction or just the lowest, intensity? For instance, does the order in which these states are established matter? Is that another measure of state?

I was thinking of things like the number of bits of information in a message being transmitted in a given alphabet. Is that just a popularization of some much deeper concept? Surely it takes more than kT joules to send one bit of that kind. And as to destroying bits, I don't have any idea of what that means. Receiving them? Using them up somehow? Filling in the holes in the zeros with crumpled-up ones?
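The "bits in a message" notion is exactly Shannon's alphabet-level measure: on average a symbol carries H = −Σ p_i log2 p_i bits, and for N equally likely symbols that is just log2 N, the log of the number of states. A minimal sketch (the sample message is an arbitrary choice):

```python
import math
from collections import Counter

# Average information per symbol from an alphabet with probabilities p_i.
def entropy_bits(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform 26-letter alphabet: log2(26), about 4.70 bits per letter.
uniform = [1 / 26] * 26
print(f"Uniform alphabet: {entropy_bits(uniform):.2f} bits/letter")

# Empirical estimate from an actual message: skewed letter frequencies
# mean fewer bits per letter than the uniform maximum.
message = "the flow of energy is from receiver to transmitter"
counts = Counter(c for c in message if c.isalpha())
total = sum(counts.values())
H = entropy_bits([n / total for n in counts.values()])
print(f"Message estimate: {H:.2f} bits/letter, {H * total:.0f} bits total")
```

This counts only statistical surprise in the symbol stream; it says nothing about the physical energy spent sending a symbol, which is set by the channel hardware rather than by the bit count.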

Best,

Bill

Hi Richard,

Let me be a bit more specific. What conceptual framework, logic (dynamic?), lens, filter, theory, principles, worldview, paradigm, ontology, epistemology, etc., do you use, as experts supposedly do according to the NRC's publication How People Learn: Brain, Mind, Experience, and School (Chapter 2), to make sense out of Friston's findings at a high level, and to perhaps out-think and challenge them?

The brain, without such an organizing or integrating framework, is quite noisy and incoherent.

On a related note, I'm currently engaged in an interesting conversation with Jay Forrester on the K-12 System Dynamics listserv. His concern is with the lack of innovation in public education. So how can Friston's findings be applied to this situation in a way that resonates with educators? The metaphor that I provided earlier is one example.

Source: How People Learn: Brain, Mind, Experience, and School | The National Academies Press

Cheers,
Chad

Chad Green, PMP
Program Analyst
Loudoun County Public Schools
21000 Education Court
Ashburn, VA 20148
Voice: 571-252-1486
Fax: 571-252-1633

"Kennaway Richard Dr (CMP)" <R.Kennaway@UEA.AC.UK> 11/17/2010 3:27 AM >>>

[From Richard Kennaway (2010.11.17.0827 BST)]

Chad Green writes:

What are you using to interpret Friston's findings?

My brain.

···

--
Richard Kennaway, jrk@cmp.uea.ac.uk, Richard Kennaway
School of Computing Sciences,
University of East Anglia, Norwich NR4 7TJ, U.K.