Why we are _not_ conscious

[Bruce Nevin (2016.01.01.08:53 ET)]

Rupert Young (2015.12.31 15.15) –

An excellent beginning!

I agree that this should be preceded by an introduction to the rudiments of HPCT and reorganization. It could be a chapter or section of a larger work.

In such context, it can be presented from the outset as directed reorganization (your term). The posited reorganization system already is meta- to the hierarchy. It does the heavy lifting. Much of your line of argument is subsumed under an account of how a reorganization system emerges in an evolving population. Your specific topic is the further emergence of attention, awareness, and consciousness. Are these distinct?

"Conscious attention can be directed to a specific [skill] to be learned, such as juggling." Attention can also be directed to systems where there is no error. Directing attention to a well-functioning system that is in the course of executing a practiced skill can interfere with performance, but does not necessarily do so. You can watch your fingers forming chord sequences on a guitar neck. Attention is not an uncontrolled consequence of error; it is subject to control. It is not the case that when reorganization ceases, so does awareness. What system is directing attention?

A popularized form of vipaśyanā or ‘mindfulness’ meditation directs attention to intensities and sensations in successive areas of the body. These perceptions sometimes are associated with error in systems that are usually ignored (e.g. a slight itch, tinnitus, pressure on a foot), but not always. There seems to be a gradation as to how significant error is. A difference of gain? Resources for perceptual input are finite: the eyes can be directed here or there but not both at once. Resources for affecting the environment are finite. So higher-level systems pick your battles for you, and there is chronic conflict inherent to this ignoring of error that does not matter, or does not matter enough.

Awareness is more passive. We become aware of a perception. That is more consistent with the proposal that error determines the locus of awareness.

‘Consciousness’ seems to refer to both attention and awareness generically as an attribute of experience, without specifying any given perception.

But yes, attention can also be ‘distracted’ from its intended focus. If there is a system directing attention, then does it redirect attention upon becoming aware of error?

An undead robot can make no distinction between unconscious, automatic control (as in maintenance of posture) and whatever we mean here. Giving a robot ability to practice and sharpen a skill is one aspect. Is it the whole of it?

···

On Thu, Dec 31, 2015 at 2:46 PM, Warren Mansell wmansell@gmail.com wrote:

Hi Rupert, I think this is a vital topic to be working on and I am sure that this essay is on the right track! I think that the last section is particularly intriguing as I would like to know how we might model a quality control system that manifests itself as qualia.
I do think there are some challenges to the style of this piece. First, what is the intended audience? PCT savvy or PCT naive? At the moment there is little definition of terminology such as reorganisation. Second, I think it is important to go through and cite what has already been explained about awareness within PCT, such as within the 1960 papers, B:CP and writings on MOL, so that it is really clear what is original here. I know that some of it is, probably the important bits, but I think it should be explicit. I am aware of Bill stating on many occasions how awareness helps to focus reorganisation on the regions of a control hierarchy that will reduce intrinsic error. The 1960 papers are quite specific about it.

Hope this helps!

Warren

On Thursday, December 31, 2015, Ted Cloak tcloak@unm.edu wrote:

[From Ted Cloak (2015.12.31 0927 MST)]

Rupert Young, I think you have a brilliant and intriguing insight here. Please put your name and contact information on it at once, so people can engage you about it.


Send it to Dan Dennett (Daniel.Dennett@tufts.edu).


For lay readers, I suggest you insert a definition of “zombie” as used by people who talk about consciousness. And some references to PCT.


Reading the Guardian article reminds me of a serious problem in the study of consciousness: From a scientific point of view, we know what causes consciousness (brain activity), but we don’t know what it is. Therefore, we can’t develop tests to determine whether it is present (except in ourselves, which doesn’t count in science).


(In the Guardian article, Burkeman makes one rookie mistake: “Above all, critics point out, if this non-physical mental stuff did exist, how could it cause physical things to happen – as when the feeling of pain causes me to jerk my fingers away from the saucepan’s edge?” William James showed long ago that the fingers begin to jerk away *before* their owner experiences pain. Ergo, the pain is not an intervening causal variable; rather, it is merely epiphenomenal, a side-effect.)


Best

Ted


[From Rupert Young (2015.12.31 15.15)]

I’ve been considering consciousness in the context of PCT, and have put together a draft of some thoughts in the attached. Comments and feedback welcome.

Here’s a nice summary of the issue,
http://www.theguardian.com/science/2015/jan/21/-sp-why-cant-worlds-greatest-minds-solve-mystery-consciousness

Regards,
Rupert


Dr Warren Mansell
Reader in Clinical Psychology
School of Psychological Sciences
2nd Floor Zochonis Building
University of Manchester
Oxford Road
Manchester M13 9PL
Email: warren.mansell@manchester.ac.uk
Tel: +44 (0) 161 275 8589
Website: http://www.psych-sci.manchester.ac.uk/staff/131406
Advanced notice of a new transdiagnostic therapy manual, authored by Carey, Mansell & Tai - Principles-Based Counselling and Psychotherapy: A Method of Levels Approach

Available Now

Check www.pctweb.org for further information on Perceptual Control Theory

Defers the problem: If consciousness is the workspace in which we monitor salient variables, or make them salient, or whatever, then who (or what) is this “we” of whom you speak, which appears to be conscious of these variables?

···

On Sat, Jan 2, 2016 at 10:25 AM, Warren Mansell wmansell@gmail.com wrote:

Great plan Rupert.

How about this…

Could consciousness be a ‘workspace’ in which we monitor the (many) perceptual variables that we are currently controlling and select, in the moment, which variables we wish to continue controlling, which we wish to start controlling, and which we wish to stop controlling? This decision will be based partly on the trajectory of errors for each of these variables, partly on how we weight the importance of those variables, and partly on the capabilities and opportunities in the current environment.

Thus, even if we have certain system concepts we wish to maintain over longer periods of time (e.g. having the role of a ‘good father’; being a ‘good husband’; being a ‘good work colleague’), at any one moment we may not be working towards any of these, and rarely all of them at once. So there needs to be a way of switching efficiently between CVs and navigating our control hierarchies on the fly. Indeed, we can have the most noble ideals, but in practice they need to be balanced alongside one another in the moment to be seen through ultimately…
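The selection rule proposed here can be put in toy form. This is only an illustrative sketch: the class, the scoring formula (error trajectory × importance × opportunity), and all the names are assumptions invented for the example, not part of PCT.

```python
# Toy sketch of the proposed 'workspace' selection among control systems.
# Each candidate is scored by its recent error trajectory, a subjective
# importance weight, and whether the environment affords acting on it now.

from dataclasses import dataclass, field

@dataclass
class ControlSystem:
    name: str
    importance: float   # how much we weight this variable
    opportunity: float  # 0..1: can the environment be acted on right now?
    errors: list = field(default_factory=list)  # recent error magnitudes

    def error_trend(self) -> float:
        # crude trajectory measure: mean of recent errors
        if not self.errors:
            return 0.0
        return sum(self.errors) / len(self.errors)

def select_active(systems, budget=2):
    """Pick which systems to keep controlling 'in the moment'."""
    scored = sorted(
        systems,
        key=lambda s: s.error_trend() * s.importance * s.opportunity,
        reverse=True,
    )
    return [s.name for s in scored[:budget]]

# 'Good father' matters most but affords no action right now,
# so the workspace favours the controllable, erring variable.
father = ControlSystem("good father", importance=0.9, opportunity=0.1, errors=[0.2, 0.3])
colleague = ControlSystem("good colleague", importance=0.6, opportunity=1.0, errors=[0.5, 0.7])
husband = ControlSystem("good husband", importance=0.8, opportunity=0.2, errors=[0.1])

print(select_active([father, colleague, husband]))
# → ['good colleague', 'good father']
```

The `budget` parameter stands in for the limited number of variables we can attend to at once; switching between CVs then falls out of re-running the selection as errors and opportunities change.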

What do people think?

Warren

On 2 Jan 2016, at 14:45, Rupert Young rupert@perceptualrobots.com wrote:

[From Rupert Young (2016.01.02 14.45)]

On 31/12/2015 16:49, Ted Cloak wrote:


On 31/12/2015 19:46, Warren Mansell wrote:


Thanks. At the moment it is an informal draft aimed at CSG, for discussion. I may develop it further into a paper for a philosophy journal, and would, of course, go into much more detail of terminology, previous work, sources and description of PCT.

Rupert

[From Rupert Young (2016.01.03 17.30)]

In such context, it can be presented from the outset as directed reorganization (your term). The posited reorganization system already is meta- to the hierarchy. It does the heavy lifting. Much of your line of argument is subsumed under an account of how a reorganization system emerges in an evolving population. Your specific topic is the further emergence of attention, awareness, and consciousness. Are these distinct?

I'd say awareness is having conscious experience of things; consciousness is a general term for beings who have awareness; and attention is an external observation that a being is focussing on specific things (of which they may or may not be aware).

"Conscious attention can be directed to a specific [skill] to be learned, such as juggling." Attention can also be directed to systems where there is no error.

It does seem like that, but is that at the same level? Something can be the focus of attention from different levels. It may be the case that higher levels are consistently in persistent error, or, to put it another way, resolving the quality of control.

Directing attention to a well-functioning system that is in the course of executing a practiced skill can interfere with performance, but does not necessarily do so. You can watch your fingers forming chord sequences on a guitar neck. Attention is not an uncontrolled consequence of error, it is subject to control.

But why would you be doing this? This sounds like it is at a higher level than the goal of forming chord sequences. So this may be for the purpose of resolving the quality of control at that higher level.

It is not the case that when reorganization ceases, so does awareness.

When you walk are you still aware of all the low level perceptions involved in limb movements, balance etc?

What system is directing attention?

I'm not sure at the moment. I wonder if what we usually refer to as high level systems are actually the quality control systems (the source of reorganisation).

A popularized form of vipaśyanā or 'mindfulness' meditation directs attention to intensities and sensations in successive areas of the body. These perceptions sometimes are associated with error in systems that are usually ignored (e.g. a slight itch, tinnitus, pressure on a foot), but not always. There seems to be a gradation as to how significant error is. A difference of gain? Resources for perceptual input are finite: the eyes can be directed here or there but not both at once. Resources for affecting the environment are finite. So higher-level systems pick your battles for you, and there is chronic conflict inherent to this ignoring of error that does not matter, or does not matter enough.

Yes, this sounds like a higher level system "directing" awareness, but could this be due to error at that higher level? Attention could also be grabbed due to specific lower-level error (e.g. an itch). Are these the same, but just at different levels?

But yes, attention can also be 'distracted' from its intended focus. If there is a system directing attention, then does it redirect attention upon becoming aware of error?

Or could we say that awareness is 'directed/distracted' by error/quality, but that that error could be due to a perception changing (e.g. loud noise) or due to a reference changing (as in your meditation example)? I.e. awareness could be 'directed' top-down or bottom-up.

An undead robot can make no distinction between unconscious, automatic control (as in maintenance of posture) and whatever we mean here. Giving a robot ability to practice and sharpen a skill is one aspect. Is it the whole of it?

Are we not all undead robots?

Awareness is more passive. We become aware of a perception. That is more consistent with the proposal that error determines the locus of awareness.

But what does it mean to become aware of a perception? I think we need something more to explain what is going on, as awareness is about qualia not just perceptual signals, which is the heart of the hard problem. This is what I am trying to address with the concept of quality control (though it may be a linguistic sleight of hand to equate qualia and quality).

Rupert

···

On 01/01/2016 13:54, Bruce Nevin wrote:

[From Bruce Nevin (2016.01.05 18:19 ET)]

Rupert Young (2016.01.03 17.30) --

RY: I'd say awareness is having conscious experience about things, consciousness is a general term for beings who have awareness and attention is an external observation that a being is focussing on specific things (of which they may or may not be aware).

BN: We're in agreement about consciousness as a generic term; I don't think you meant to say it is a term for the beings who 'have' it. But to say that awareness is "having conscious experience" makes it dependent on the undefined generic term, consciousness. You make the intriguing suggestion that attention can only be attributed by an external observer. It is true that the direction of certain sensory organs is observable, but there are shifts of sensory attention that are imperceptible to an external observer, especially e.g. in the aural, tactile, and kinesthetic modalities, and a far more commonplace shifting of attention among higher-level perceptions with no change of sensory input. I find a clear subjective distinction between attention, which can be directed, and awareness, which may passively occur, not always as a consequence of one's having directed one's input functions and/or having adjusted input gains appropriately (shifting attention). The passive, receptive character of awareness broaches into a kind of foggy estuary delta of questions of the sort you raise below, e.g. when you relax from attending inwardly to remembered or imagined perceptions and notice what is before your eyes--did some system in the hierarchy direct your fovea there? Supposing yes, if that control is out of your awareness, you wouldn't be able to distinguish it from a consequence of eye movement related to remembering or imagining. Here, too, is the vast and fascinating realm of hypnotic phenomena. (I've almost finished reading Erickson's Collected Papers.)

BN-: "Conscious attention can be directed to a specific [skill] to be learned, such as juggling" [as you said, but] Attention can also be directed to systems where there is no error.

RY-: It does seem like that but is that the same level? Something can be the focus of attention from different levels. It may be the case that higher levels are consistently in persistent error, or, to put it another way, resolving the quality of control.

BN: My initial reaction is: of course not, a system that directs attention to particular perceptions must be at a higher level than those perceptions. But maybe it's not so clear. Consider an animal startled by a noise. The "startle" behavior includes not only preparation for fight or flight, but also observable actions that increase sensory input: eyes and pupils widen, ears prick up, nostrils flare. It is plausible that gain of some perceptual signals within the hierarchy is increased. I understand that all this is driven from 'primitive' parts of the brain with no involvement of higher levels of the hierarchy.
BN: Yes, this reeks of a primitive system attending to error. So, is the hypothesis that error at a high level 'triggers' that same primitive mechanism? Comes a contradiction to a cherished opinion/belief, mocking of a favored athletic team, mention of our name or a work of ours, and our ears prick up.
BN: And yet attention is not always autonomic.

BN-: Directing attention to a well-functioning system that is in the course of executing a practiced skill can interfere with performance, but does not necessarily do so. You can watch your fingers forming chord sequences on a guitar neck. Attention is not an uncontrolled consequence of error, it is subject to control.

RY-: But why would you be doing this? This sounds like at a higher level than the goal of forming chord sequences. So this may be for the purpose of resolving the quality of control at that higher level.

BN: Yes, it could be for the purpose of improving quality of control; but it could be 'just noticing'. This seems to be in that foggy estuary where distinctions are difficult to test. Fretting chords is a poor example, because we (or I, anyway) often look to make sure of placement on the correct strings behind the correct frets. Going for a walk, there are many things that we 'just notice'. Scanning for error? So in addition to the autonomic 'startle' loop, there's an ongoing monitoring that can have the subjective effect of voluntary 'noticing'? Your proposed quality-control system might do a broad scan when not employed monitoring the quality of some particular control loop. Is that a locus of a sense of well-being, when all is well?
BN: Reorganization, as postulated, has no intelligence, no input other than error, and no output other than 'change something'. It doesn't act immediately, problem-solving systems get a crack first. And not just any error. Our best model of how to address reorganization to the right system is: Reorganization follows attention; attention goes to error. This is handwaving until we are clear what attention is and how it works.
BN: You proposed that attention can be observed. I assume this refers to observable use of means of perception, e.g. visual focus. Autonomic processes all carry on at once. Evolution seems to have arranged their requisite input and output requirements to permit that. But when 'voluntary' control of a perception uses the same finite input and output capacities (e.g. eyes, hands) as control of another perception uses, they cannot both be accomplished at once. (I have not seen this mentioned in any discussion of conflict.) What restrictions are there on 'internal' shifts of attention among imagined or remembered perceptions within the hierarchy? Such shifts of attention do not engage sensory input functions with physically limited capacity. Why is our internal attention limited in the same way?
BN: The answer seems to be awareness. Awareness is limited. Awareness focuses, concentrates on a perception. Which of course is the hand-waving answer that is the topic of our conversation. But when we're not focused and concentrating, we notice things.
BN: Reorganization seems to be deferred until in some sense it is a disturbance to well-being which persists uncorrected. Is there a system scanning for 'well-being'? That notices and appreciates aspects of a harmonious environment without disturbances?
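The reorganization rule characterised here (no intelligence, no input other than error, no output other than 'change something') can be sketched in a few lines. This is a toy illustration only: the threshold, the step size, and the 'largest persistent error wins' attention rule are assumptions made for the sketch, not Powers' actual proposal.

```python
# Minimal sketch of blind reorganization: when a system's error persists
# above some threshold, randomly perturb its parameters. Attention is
# modelled here (an assumption) as simply the system with the largest error.

import random

def reorganize_step(gains, errors, threshold=0.5, step=0.2, rng=random):
    """One blind reorganization step over a set of control systems.

    gains:  dict mapping system name -> current loop gain
    errors: dict mapping system name -> persistent error magnitude
    """
    target = max(errors, key=errors.get)           # attention goes to error
    if errors[target] > threshold:                 # error that matters enough
        gains[target] += rng.uniform(-step, step)  # blind 'change something'
    return target
```

Run repeatedly, such a rule does a random walk in parameter space that only settles when error stops persisting, which is one way of picturing how reorganization "reaches into the random" without any problem-solving intelligence of its own.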

BN-: It is not the case that when reorganization ceases, so does awareness.

RY-: When you walk are you still aware of all the low level perceptions involved in limb movements, balance etc?

BN: Of course not, but were we talking about reorganization by which infants and toddlers learn to use their bodies? I was thinking of the kind of reorganization that leads an MOL client to a personally fitting resolution of psychological distress due to conflict.

BN-: What system is directing attention?

RY-: I'm not sure at the moment. I wonder if what we usually refer to as high level systems are actually the quality control systems (the source of reorganisation).

BN-: A popularized form of vipasyana or 'mindfulness' meditation directs attention to intensities and sensations in successive areas of the body. These perceptions sometimes are associated with error in systems that are usually ignored (e.g. a slight itch, tinnitis, pressure on a foot), but not always. There seems to be a gradation as to how significant error is. A difference of gain? Resources for perceptual input are finite: the eyes can be directed here or there but not both at once. Resources for affecting the environment are finite. So higher-level systems pick your battles for you, and there is chronic conflict inherent to this ignoring of error that does not matter, or does not matter enough.

RY-: Yes, this sounds like a higher level system "directing" awareness, but could this be due to error at that higher level? That attention could also be grabbed due to specific lower level error (e.g. itch). Are these the same but just at different levels?

BN-: But yes, attention can also be 'distracted' from its intended focus. If there is a system directing attention, then does it redirect attention upon becoming aware of error?

RY-: Or could we say that awareness is 'directed/distracted' by error/quality, but that that error could be due to a perception changing (e.g. loud noise) or due to a reference changing (as in your meditation example)? I.e. awareness could be 'directed' top-down or bottom-up.

BN-: An undead robot can make no distinction between unconscious, automatic control (as in maintenance of posture) and whatever we mean here. Giving a robot ability to practice and sharpen a skill is one aspect. Is it the whole of it?

RY-: Are we not all undead robots?

BN: Tim Leary was overheard saying to his wife on the phone "The robot is tired." Is this a question about determinism? If we have freedom and dignity, pace Professor Burrhus Skinner and others, it is exercised when we make a choice. Every choice is a conflict (thank you, Tim and Warren, for that observation). In most cases, a choice is made, a problem solved, a conflict resolved, by pre-existing or learned problem-solving systems in the hierarchy. To that extent, they are determinate. When existing systems find no resolution, and it matters and is an occasion of distress, the reorganization system reaches into the random. As Bateson pointed out, that is the essence of creativity. Being stuck at the level of the CV of contention is the nature of his Double Bind, where you can't jump and you can't not jump. Reaching into the random at a higher level controlling the CV of contention is not determinate.
BN: That may say something about attention, but it says nothing about awareness-consciousness, and attention is (I propose) a circumscription of awareness by the limitations of sensors and/or by something else within the hierarchy. Concentrating and ignoring are complementary sides of the same coin.

BN-: Awareness is more passive. We become aware of a perception. That is more consistent with the proposal that error determines the locus of awareness.

BN: ... and focuses awareness as attention.

RY-: But what does it mean to become aware of a perception? I think we need something more to explain what is going on, as awareness is about qualia not just perceptual signals, which is the heart of the hard problem. This is what I am trying to address with the concept of quality control (though it may be a linguistic sleight of hand to equate qualia and quality).

BN: Not the same as quality of control. Here:

QUALIA: What it is like to have an experience
Qualia include the ways things look, sound and smell, the way it feels to have a pain, and more generally, what it's like to have experiential mental states. (‘Qualia’ is the plural of ‘quale’.) Qualia are experiential properties of sensations, feelings, perceptions and, more controversially, thoughts and desires as well. But, so defined, who could deny that qualia exist?

- Ned Block

Author: "NED BLOCK (Ph.D., Harvard), Silver Professor of Philosophy, Psychology and Neural Science, came to NYU in 1996 from MIT where he was Chair of the Philosophy Program. He works in philosophy of perception and foundations of neuroscience and cognitive science and is currently writing a book on attention."

BN: We've quoted Bill recently on the difference between perceptions as we experience them and perceptions defined as rates of firing in neurons. That difference is in the qualia. I know I have subjective experience that is not accounted for by rates of firing in my neurons nor by electrical and other activities in your robot, however sophisticated. And that, my friend, is the jungle into which we have parachuted.
I will be busy in DC from tomorrow until I return on the 16th, and for a while after, my input and output functions will probably be occupied with serving other control loops in my hierarchy. :-)

···

/Bruce

[JK 2016 01 02]

I’m most curious --

Ah, Mr. Picasso; not bad, not bad at all, but could do better. There’s clearly a problem with colour tone. Try again.

That’s a cute little poem Mr Frost, but why don’t you pay more attention to syntax? Good beginning though.

ee cummings comes to mind: those who pay attention to the detail of things will never wholly kiss you.

In RM and TC’s recent collection of essays there’s an elegant and respectful presentation of PCT’s inherent paradox regarding human nature. Fortunately, it’s not littered with pages of citations and footnotes. Thank you two, too.

Like many others I too enjoyed a string of correspondence with Bill. He was invisibly gracious, exercising a pedagogical style which promoted self-accountability. He didn’t mimic any omnipotent editor; he was a bloke offering a hand-up; take it or leave it, your choice.

And a personal note: I’m off to Ankara later this month to be on deck to teach a couple of courses (Dev Psych and Attachment). Well, it’s more like continuing to develop and apply my interpretation of “teach”, which is not “Show and tell”. Every course I offer is a quasi-test for generating and recovering errors. As always, my aim is to allow students to “do learning to learn doing”. I’m sure you can detect the PCT influence there as well as the long provenance of others. It’s a hard job holding a lit candle to them; cruel winds of the open-loop academic industrial machine blow incessantly.

On another matter entirely, here’s a handful of home-grown PCT-inspired New Year quips. There’s an invitation to anybody to chip in some more. After all, many hands…

Resolutions are sought goals seldom attained

PCT bridges troubled waters

PCT’s closed-loop gestalt is simultaneously simple and complex.

A map is a carpet under which many errors are brushed.

PCT offers hope (perhaps, the “thrill of hope for a weary world”) for continuing to grasp at reach. (hat-tilt to Robert Browning)

Tell it like it is and nobody takes a jot of notice; let them discover and they’ll embrace it forever.

May you be blessed with another troubled year.

···

On Sat, Jan 2, 2016 at 2:54 AM, Bruce Nevin bnhpct@gmail.com wrote:

[Bruce Nevin (2016.01.01.08:53 ET)]

Rupert Young (2015.12.31 15.15) –

An excellent beginning!

I agree that this should be preceded by an introduction to the rudiments of HPCT and reorganization. It could be a chapter or section of a larger work.

In such context, it can be presented from the outset as directed reorganization (your term). The posited reorganization system already is meta- to the hierarchy. It does the heavy lifting. Much of your line of argument is subsumed under an account of how a reorganization system emerges in an evolving population. Your specific topic is the further emergence of attention, awareness, and consciousness. Are these distinct?Â

"Conscious attention can be directed to a specific [skill] to be learned, such as juggling."Â Attention can also be directed to systems where there is no error. Directing attention to a well-functioning system that is in the course of executing a practiced skill can interfere with performance, but does not necessarily do so. You can watch your fingers forming chord sequences on a guitar neck. Attention is not an uncontrolled consequence of error, it is subject to control. It is not the case that when reorganization ceases, so does awareness. What system is directing attention?

A popularized form of vipaÅ›yanÄ? or ‘mindfulness’ meditation directs attention to intensities and sensations in successive areas of the body. These perceptions sometimes are associated with error in systems that are usually ignored (e.g. a slight itch, tinnitis, pressure on a foot), but not always. There seems to be a gradation as to how significant error is. A difference of gain? Resources for perceptual input are finite: the eyes can be directed here or there but not both at once. Resources for affecting the environment are finite. So higher-level systems pick your battles for you, and there is chronic conflict inherent to this ignoring of error that does not matter, or does not matter enough.

Awareness is more passive. We become aware of a perception. That is more consistent with the proposal that error determines the locus of awareness.

‘Consciousness’ seems to refer to both attention and awareness generically as an attribute of experience, without specifying any given perception.Â

But yes, attention can also be ‘distracted’ from its intended focus. If there is a system directing attention, then does it redirect attention upon becoming aware of error?

An undead robot can make no distinction between unconscious, automatic control (as in maintenance of posture) and whatever we mean here. Giving a robot ability to practice and sharpen a skill is one aspect. Is it the whole of it?

/Bruce

On Thu, Dec 31, 2015 at 2:46 PM, Warren Mansell wmansell@gmail.com wrote:

Hi Rupert, I think this is a vital topic to be working on and I am sure that this essay is on the right track! I think that the last section is particularly intriguing as I wouldxlike to know how we might model a quality control system that manifests itself as qualia.Â
I do think there are some challenges to the style of this piece. First, what is the intended audience? PCT savvy or PCT naive? At the moment there is little definition of terminology such as reorganisation. Second, I think it is important to go through and cite what has already been explained about awareness within PCT, such as within the 1960 papers, B:CP and writings on MOL, so that it is really clear what is original hear. I know that some of it is, probably the important bits, but I think it should be explicit. I am aware of Bill stating on many occasions how awareness helps to focus reorganisation on the regions of a control hierachy that will reduce intrinsic error. The 1960 pspeed are quite specific about it.

Hope this helps!

Warren

On Thursday, December 31, 2015, Ted Cloak tcloak@unm.edu wrote:

[From Ted Cloak (2015.12.31 0927 MST)]

Rupert Young, I think you have a brilliant and intriguing insight here. Please put your name and contact information on it at once, so people can engage you about it.

Send it to Dan Dennett (Daniel.Dennett@tufts.edu).

For lay readers, I suggest you insert a definition of “zombie” as used by people who talk about consciousness. And some references to PCT.

Reading the Guardian article reminds me of a serious problem in the study of consciousness: From a scientific point of view, we know what causes consciousness (brain activity), but we don’t know what it is. Therefore, we can’t develop tests to determine whether it is present (except in ourselves, which doesn’t count in science).

(In the Guardian article, Burkeman makes one rookie mistake: “Above all, critics point out, if this non-physical mental stuff did exist, how could it cause physical things to happen – as when the feeling of pain causes me to jerk my fingers away from the saucepan’s edge?” William James showed long ago that the fingers begin to jerk away *before* their owner experiences pain. Ergo, the pain is not an intervening causal variable; rather, it is merely epiphenomenal, a side-effect.)

Best

Ted

[From Rupert Young (2015.12.31 15.15)]

I’ve been considering consciousness in the context of PCT, and have put together a draft of some thoughts in the attached. Comments and feedback welcome.

Here’s a nice summary of the issue,
http://www.theguardian.com/science/2015/jan/21/-sp-why-cant-worlds-greatest-minds-solve-mystery-consciousness

Regards,
Rupert


Dr Warren Mansell
Reader in Clinical Psychology
School of Psychological Sciences
2nd Floor Zochonis Building
University of Manchester
Oxford Road
Manchester M13 9PL
Email: warren.mansell@manchester.ac.uk
Tel: +44 (0) 161 275 8589
Website: http://www.psych-sci.manchester.ac.uk/staff/131406
Advanced notice of a new transdiagnostic therapy manual, authored by Carey, Mansell & Tai - Principles-Based Counselling and Psychotherapy: A Method of Levels Approach

Available Now

Check www.pctweb.org for further information on Perceptual Control Theory

[From Rupert Young (2016.01.02 14.45)]

Thanks. At the moment it is an informal draft aimed at CSG, for discussion. I may develop it further into a paper for a philosophy journal, and would, of course, go into much more detail of terminology, previous work, sources and description of PCT.

Rupert


Great plan Rupert.

How about this…

Could consciousness be a ‘workspace’ in which we monitor the (many) perceptual variables that we are currently controlling and select, in the moment, which variables we wish to continue controlling, which we wish to start controlling, and which we wish to stop controlling? This decision will be based partly on the trajectory of errors for each of these variables, partly on how we weight the importance of those variables, and partly on the capabilities and opportunities in the current environment.

Thus, even if we have certain system concepts we wish to maintain over longer periods of time (e.g. having the role of a ‘good father’; being a ‘good husband’; being a ‘good work colleague’), at any one moment we may not be working towards any of these, and rarely all of them at once. So there needs to be a way of switching efficiently between CVs and navigating our control hierarchies on the fly. Indeed, we can have the most noble ideals, but in practice they need to be balanced alongside one another in the moment to be seen through ultimately…

What do people think?

Warren
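Warren’s ‘workspace’ proposal is concrete enough to sketch in code. The sketch below is purely illustrative: the scoring rule (importance weight times current error plus its trajectory), the `Candidate` fields, and the capacity limit are all assumptions of mine, not anything specified by PCT.

```python
from dataclasses import dataclass, field

@dataclass
class Candidate:
    """One perceptual variable we might control (fields are illustrative)."""
    name: str
    weight: float                                # how much this variable matters
    errors: list = field(default_factory=list)   # recent error history

    def trajectory(self) -> float:
        """Positive when error has been growing, negative when shrinking."""
        if len(self.errors) < 2:
            return 0.0
        return self.errors[-1] - self.errors[0]

def select_for_control(candidates, capacity=2):
    """Pick the 'capacity' most pressing variables to control right now,
    scored by weighted current error plus its trajectory (an assumed rule)."""
    scored = sorted(
        candidates,
        key=lambda c: c.weight * (abs(c.errors[-1]) + c.trajectory()),
        reverse=True,
    )
    return [c.name for c in scored[:capacity]]

# Invented example: three system-concept-level variables competing for control.
good_father = Candidate("good father", weight=3.0, errors=[0.1, 0.2, 0.6])
colleague   = Candidate("good colleague", weight=1.0, errors=[0.5, 0.4, 0.3])
husband     = Candidate("good husband", weight=2.0, errors=[0.1, 0.1, 0.1])

print(select_for_control([good_father, colleague, husband]))
```

The capacity limit stands in for the finite degrees of freedom for action that the thread keeps returning to; the scoring rule is only one of many that would fit Warren’s description.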


[From Rick Marken (2016.01.02.1200)]

···

[JK 2016 01 02]

JK: From my perspective, Rupert’s presented a rather tidy, informative essay. Thank you.

RM: From my perspective as well. Great job, Rupert! I’ll try to say more about it when I work myself out from under all the stuff I should have been doing during the holidays but instead engaged in long philosophical discussions with my 2.25 year old granddaughter.

JK: In RM and TC’s recent collection of essays there’s an elegant and respectful presentation of PCT’s inherent paradox regarding human nature. Fortunately, it’s not littered with pages of citations and footnotes. Thank you two, too.

RM: Thanks John. And in “Controlling People” Tim and I do touch on the same ideas that Rupert discusses. But, as you note, not in the far more scholarly (footnote laden;-) way that Rupert does.

JK: Like many others I too enjoyed a string of correspondence with Bill. He was invisibly gracious, exercising a pedagogical style which promoted self-accountability. He didn’t mimic any omnipotent editor, he was a bloke offering a hand-up; take it or leave it, your choice.

RM: Well put. And when I go back and read his writings, particularly the papers in LCS I and II, I am actually startled by how lucidly brilliant he was and how short a distance we have gone beyond him.

JK: On another matter entirely, here’s a handful of home-grown PCT-inspired New Year quips. There’s an invitation to anybody to chip in some more. After all, many hands…

Resolutions are sought goals seldom attained

PCT bridges troubled waters

PCT’s closed-loop gestalt is simultaneously simple and complex.

A map is a carpet under which many errors are brushed.

PCT offers hope (perhaps, the “thrill of hope for a weary world”) for continuing to grasp at reach. (hat-tilt to Robert Browning)

Tell it like it is and nobody takes a jot of notice; let them discover and they’ll embrace it forever.

May you be blessed with another troubled year.

RM: Thank you, particularly for that last one! I couldn’t possibly add anything to that. Well, perhaps:

RM: The only way to avoid trouble is to not want anything. And the only way to not want anything is to not be alive.

Best

Rick

Richard S. Marken

Author, with Timothy A. Carey, of Controlling People: The Paradoxical Nature of Being Human.

[Martin Taylor 2016.01.04.14.38]

Since Rupert's original essay [From Rupert Young (2015.12.31 15.15)] sounded rather familiar, I just went back to a few months of the CSGnet archives that I happen to have stored locally, and searched for the word "consciousness". I extracted a few chunks out of some messages, and a couple of complete messages. I expect that if I had searched for "Attention" or "Awareness" and similar words, I would have found a bit more, but these give a flavour of what was happening on CSGnet in the (more or less randomly chosen) years 1997-99 (plus one fragment from the now defunct ECACS (Explorations of Complex Adaptive Control Systems) bulletin board).

Martin

-------------- message quotes follow ------

Extracts from CSGnet messages in which the word "consciousness" occurs. If the whole message is
quoted, the signature is included at the end. Otherwise only a snippet of a longer message is
included.

···

------
[From Bill Powers (970613.1101 MDT)]

Martin Taylor 970613 0935--

MT: >Presumably "mastery" control is an aspect of the reorganizing
>system, targeted toward the reorganization of a sub-part of the
>perceptual hierarchy that could be involved with the perception
>whose control is to be "mastered."

BP: I guess what I'm trying to say is that I don't think the actual achievement of mastery is under
the control of learned systems, which is where consciousness resides. The reorganizing system is not
part of them; it works automatically, at all levels, in the same way every time. It's an _unlearned_
control system, working outside the learned hierarchy and outside consciousness.

------------------ [From Rick Marken (971010.1000)]

I think consciousness moves up and down the hierarchy "at will", going to the places where error is
currently largest. If you cut your finger consciousness moves down to the intensity/sensation level
(pain). If you are having problems at work consciousness moves up to levels having to do with
relationships, rules and values.
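Rick’s suggestion here has an almost one-line rendering: if awareness simply goes to wherever error is currently largest, its locus is an argmax over the error signals in the hierarchy. A toy sketch, with invented levels and error magnitudes:

```python
# Hypothetical snapshot of error signals at several levels of the hierarchy.
errors = {
    "intensity/sensation": 0.9,   # e.g. a cut finger
    "configuration": 0.1,
    "relationship": 0.3,          # e.g. problems at work
    "principle": 0.2,
}

# If awareness tracks the largest current error, its locus is just the argmax.
locus_of_awareness = max(errors, key=errors.get)
print(locus_of_awareness)
```

Bruce Nevin’s point in the head of this thread (attention can go to well-functioning systems) is exactly a counterexample to this argmax picture, which is what makes the toy version worth stating.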

------------------------- [From Rick Marken (971024.1000)]

i.kurtzer (971023) --

> i would suggest that one of the clearest implications of a
> hierarchy of _experience_, !!!!not to be confused with a
> concatenation of control structures!!! is that there is no
> real "i" . that it is an arbitrary identification and
> subsequently a reification.

I basically agree with what you are saying here. What I have been calling the "conscious me" does
seem to experience the world (the hierarchy of perception) from the point of view of the particular
level in the hierarchy from which it is currently aware. So there are, in a sense, as many conscious
me's as there are levels in the hierarchy.

But the nature of the conscious aspect of me that becomes aware of the world in these different ways
nevertheless seems always the same to me; I think Bill P. once said something like: the nature of
consciousness seems to be the same whether we are conscious of our fingertips or our faults. I think
I have something like the same experience.

I do think it is reasonable to distinguish a conscious, observing, volitional aspect of ourselves
from the automatically controlling aspect. I think this distinction is already embodied in HPCT as
the distinction between the control hierarchy and the reorganization system. The latter is (I think)
the conscious aspect of us that is always "tinkering" with the control hierarchy; it tinkers more
with systems that are experiencing a lot of error than it does with systems that are working fine;
that's probably why the conscious aspect of ourselves seems to spend most of its time tinkering with
higher level perceptions -- of relationships, categories, programs, principles and even system
concepts.

-------------------

[From Bill Powers (971103.1524 MST)]

>i.kurtzer (971103.0915)
> >> i would suggest that one of the clearest implications of a hierarchy of
>>> _experience_, !!!!not to be confused with a concatenation of control
>>> structures!!! is that there is no real "i" . that it is an arbitrary
>>> identification and subsequently a reification.

One of my I's agrees with you. There may be no real i's, but there are certainly apparent ones. They
are very convincing, too. As you say, however, they are probably control structures: system
concepts, principles, programs. Of course control systems at that level can do a lot of complex
things, like think and plan; they can be jealous or loving, they can lay blame and accept
responsibility -- all those things that we usually say "I" do. They're not imaginary.

But the real essence of the "I" feeling is none of those things. Whatever you observe about your own
"I", it is an object of observation, not the observer. If you think "I'm too fat" what you mean is
that your body is too fat. The point of view from which you observe your body is not too fat (thin,
angry, fearful, etc.).

There is an Observer which remains the same no matter what is being observed. When you shift
attention from one aspect of yourself to another, the content of consciousness changes but the
Observer does not. This is why all the I's to which you attend are unsatisfactory as a definition of
yourself; they are simply aspects of your experience. However, unless they are being Observed, they
are not part of conscious experience. All those things about you have changed since you were an
infant -- but the Observer is the same.

Not so?

Best,

Bill P.
----------------------------[From Bill Powers (971109.0639 MST)]

...So what IS "conscious experience?" Is it merely the existence of signals in perceptual pathways?
If so, it seems entirely unnecessary. And this interpretation doesn't explain why experience seems
confined, at one moment, to a small subset of all the neural signals that exist in the brain at a
given time, or why it should be confined to _afferent_ neural signals (as it is). And most
important, it doesn't explain how the field of experience can change from one time to another, so
sometimes we attend to one part of it, and sometimes to other parts at the expense of what was in
experience before. The variable content of consciousness is a big problem for any proposal that
would leave out Observation as a separate phenomenon.

Of all the aspects of the variable content of consciousness, the most telling one is the fact that
we can attend at different levels of perception. We can attend selectively to intensity information
or to system concepts or anything between. But if the hierarchical control model is correct, in
order for any higher-level perception to be controlled, perceptions at all lower levels must also be
under control. This means that when we are attending, say, to the route we will be riding on our
bicycle, we are still controlling perceptions of balance, effort, and so forth. Those perceptual
signals MUST STILL BE PRESENT and they must be under active control -- yet they are not part of
experience at that time. If this were not so, the higher levels of control could not be working. If
the lower perceptual signals did not go right on existing as usual while we attend to higher ones,
we couldn't be riding the bicycle while we tried to remember whether to turn right or left at the
next corner.

What this demonstrates is that the existence of a perceptual signal is independent of the experience
of the perceptual signal. When I try to think of a model that would have that characteristic, all I
can come up with is some sort of receiver that can be connected selectively and variably to various
afferent signals in the hierarchy. When this receiver is receiving information from some set of
perceptual signals, those signals are consciously experienced. When it is receiving from elsewhere,
those same signals, even though they continue to exist, are not experienced.

It is very convenient, therefore, to have some evidence of the existence of this selective receiver:
it's me. _I_ can attend selectively to different subsets of the perceptual signals that exist in my
body. This is not the "I" that is characterized by my physical or mental attributes, because I can
easily attend to something other than those -- the form of the Orion Nebula in an eyepiece, for
example, while forgetting that my fingers are freezing. This "I" that is not any of those other
"I's" is the receiver, tuned into this or that aspect of perception, at high or low levels.

----------------- [From Rick Marken (980402.1330)]

-- the reorganization system (which is basically what we call "consciousness")

------------------ [From Bill Powers (980404.0024 MST)]

The way we know about reference signals for the highest conscious level of control is that some
perceptions seem right while others seem wrong. In the control mode, there is no separate
consciousness of the reference signal against which the perception is being compared; the sense of
right and wrong is like a consciousness of the error signal. But I suspect it is more like
consciousness of the effects of actions driven by the error signal, and not of the error signal
itself. At least this is consistent both with experience and with the general postulate that
perception is associated only with the afferent systems.

-------------------- [From Bruce Nevin (980407.1128)]

Bruce Gregory (980407.1015 EDT) --

>Rick is stepping well beyond the bounds
>of HPCT when he identifies consciousness with the reorganization
>mechanism. Clearly reorganization can and does take place in the
>total absence of consciousness. And, as Bill points out, HPCT works
>in exactly the same way with or without consciousness.

Doesn't follow. Consciousness can not be aware of itself. If you identify consciousness with
reorganization, it follows that we are not conscious of reorganization itself any more than we are
conscious of consciousness itself.

Bruce Nevin ---------------------------- [From Rick Marken (980407.2120)]

The perceptual signals are always there; what changes is which of these perceptual signals
becomes the object of awareness (you suddenly become aware of the Rachmaninoff playing in the
background) and which disappear from awareness (the pressure on the butt disappears from
consciousness -- oops, it popped back when I said that;-)).

--------------------------
[From Bill Powers (980408.0248 MST)]

Bruce Gregory (980407.1628 EDT)-- Bruce Nevin (980407.1519 EST)--

>> What is happening when I observe something without control? "Observing" is
>> itself a controlled perception. ...
> Exactly.

Is that true? It doesn't seem so to me. If I realize "Ah, I'm observing the apple," I am no longer
doing that. I can observe various things, but how do I choose them in order to observe them? To say
I control which of them I am observing is to say that I know of them before I start observing them,
which is self-contradictory (that is, I know of something without, at first, observing it). If I am
observing an apple, there is an apple in my consciousness. But if I am observing the act of
observing, my attention is no longer on the apple, but on the observing.

The Method of Levels is based on the hypothesis that the Observer is never aware of itself.

Best,

Bill P.
---------------------- [From Bill Powers (980408.0255 MST)]

I think of awareness as a sort of tunable receiver; tuning it brings various activities in the
perceptual hierarchy into view of the Observer. The perceptual signals that are tuned in appear in
awareness, so we can say that those signals are the content of consciousness. But perceptual signals
that are not tuned in are still present, like all the radio stations that are broadcasting signals
outside the range of the receiver's tuning.

...

The "perception of perceiving" is always a higher level perception involving a different type of
perception. That is the basis of the Method of Levels. If you observe what you are perceiving at one
level, you become aware of perceptions _about_ the first perceptions, but not of the same type as
the first perceptions.
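Bill’s ‘tunable receiver’ metaphor in the two extracts above can be sketched directly: all perceptual signals go on existing, and only the subset the receiver is currently tuned to constitutes the content of consciousness. The signal names and values below are invented for illustration:

```python
# All perceptual signals present in the hierarchy right now (hypothetical).
# Each continues to exist, and may be under active control, whether or not
# the receiver is tuned to it -- like radio stations outside the tuned band.
perceptual_signals = {
    "balance": 0.95,
    "pedal effort": 0.7,
    "route ahead": 0.4,
    "fingers freezing": 0.8,
}

def tune(receiver_selection, signals):
    """Return the content of consciousness: only the tuned-in signals."""
    return {name: signals[name] for name in receiver_selection}

# Attending to the route while riding: balance and effort are still being
# controlled, but are not part of experience at this moment.
conscious_content = tune({"route ahead"}, perceptual_signals)
print(conscious_content)
```

Note what the sketch deliberately leaves out: it says nothing about what does the tuning, which is exactly the open question Bill raises about volition versus attention.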

------------------------ [From Bill Powers (980408.0255 MST)]

Bruce Nevin (980407.2208 EDT)--

>... paying attention is coincident with control. Whatever
>attention is, the statement is that we see an instance of it when a control
>system controls a perception.

But what about control systems that are controlling perceptions of which you are not, at the moment,
aware? Right now (unless you're in bed) you are balancing, and you have been doing so for some time.
But you were not, I'll bet, aware of controlling the perceptual variables of balance. There are many
perceptual signals at levels both higher and lower than the level at which you're habitually aware
that are under control, but they are not in the field of consciousness. The higher systems are
determining the reference signals that set the highest goals of which you're aware, and your
conscious control actions result in varying controlled perceptions at lower levels, also outside
awareness. You can become aware of many of these signals, but most of the time you are not aware of
them even though they are present and being controlled.

...

I don't think attention has anything to do (directly) with control. Attention is the reception of
information from perceptual channels by something outside those channels. If any influence on the
process of control is to occur, it would have to be carried out through a different path, going
_from_ the outside something _to_ the control system. The label for that kind of channel would be
more like "volition" than "attention" (or "awareness").

So attention may be a necessary ingredient for affecting control systems, but the direction of
effects is wrong; there must be some other process that generates effects ON the control systems, in
order to affect how they operate.

This is the sort of thing that led me to conjecture about a connection between awareness/volition
and the reorganizing system.

--------------------
[From Bruce Nevin (980413.1456)]

Rupert Young (980413.1800 BST)--

>What do you mean by attention ? One way of looking at it is as the
>terminology that is given to the _observed_ behaviour that a living system
>goes through when it carrying out some task relative to specific aspects of
>the environment (and is said to be 'attending' only to those aspects), or in
>other words, when it is controlling perceptions to reduce error. This is
>already explained within PCT and at best 'attention' is synonomous with 'the
>behaviour of reducing perceptual error' and as such has no place in the
>terminology and theory of PCT. For example, if you are looking for a blue
>object the higher levels set non-zero reference values for blue and zero
>reference values for red resulting in behaviour relative only to blue objects,
>ie, 'attending to blue'.

What about when there is no observable behavior but the person (e.g. yourself) reports shifting
attention from that blue object on the screen to that humming sound to the smell of smoke to a
memory of a camping experience to the imagined experience of going camping this weekend, now there's
a good idea, whoops where's that smoke coming from? And at this point you see some observable action
for the first time.

Could still be controlling perceptual inputs of a higher-level PIF to reduce error at that level?
What do you think?

>> BO
> >I'll send you some soap ?

Gee, and all this time I thought it was short for "best offer", instead of "best."

Be well,

BN
--------------------------- [From Rick Marken (980413.1250)]

According to PCT, reorganization is always going on (to some extent) since there is always some
small level of error in some control systems. Consciousness seems to be involved in this process;
maybe it's trying to direct the reorganization process to the part of the hierarchy experiencing the
most error.

------------------------- [From Rick Marken (980413.2230)]

Speculations about the relationship between consciousness and the HPCT model can be found in B:CP
(p. 197 - 201). On p. 201 Powers says "If there is anything on which most psychotherapists would
agree, I think it would be the principle that change demands consciousness from the _point of view_
that needs changing". When I said "gain a perspective" I meant the same thing Bill meant when he
discussed "point of view". Another nice discussion of this "point of view" notion is found in the
chapter on "An Experiment With Levels" in LCS II (p. 41).

------------------- [From Rick Marken (980414.1550)]

Because I see no way for the hierarchy itself to solve conflicts that exist within itself, and
because I see people (including myself) solve conflicts all the time -- sometimes with great ease --
I believe that there is something about people that lets them do this. My own experience with
solving conflicts leads me to believe that consciousness is involved in this process. Again, I don't
know how consciousness "works"; all I know is that the HPCT model can't solve its own conflicts.
Reorganization, which we _can_ model, can solve some conflicts. But I can solve most of my little
conflicts "instantly" when I can see them from a new conscious perspective; this may be
"reorganization" but it is not the kind of reorganization we have modeled; it seems to involve a
change of conscious perspective on the problem; that's why I talk about going "up a level" as a way
to solve conflicts.

-------------------
[From Bill Powers (990311.1053 MST)]

Bruce Gregory (990310.1650 EST)--
>
>It seems to me that the process of learning and then performing a
>tracking task both demand attention. In the learning stage
>reorganization may be going on, but there seems to be no evidence that
>it is occurring after "good" control is achieved. Have I missed
>something? If not, what does this simple example tell us about attention?

All good relevant questions.

It's hard to imagine learning any control task unconsciously, but there are those who claim it can
be done. I'm sure we're never consciously aware of all the details we have learned, but it seems to
me that some degree of consciousness accompanies all learning.

I wonder why it is that when our attention wanders from the variable we're most saliently
controlling, that control begins to deteriorate (I think). I always think of a grandchild excitedly
reporting on something that went on at school, while the hand holding the glass of milk droops
closer and closer to spilling the milk on my computer. Does loss of awareness lower the gain of the
control system? Or does it allow the reference signal or the perception of "upright" to drift? Some
very specific experimentation is needed to see what is actually changing. Anyone who does the needed
experiments will be the first in the world to do it right.

I also wonder about control systems that have been very well learned -- so well that they can
apparently work quite well with no conscious awareness at all. Is it true that this really happens?
And what is different between such "automatized" control systems, if they exist, and others that
seem to require some attention to be paid to them?

Lots of room for good research here.

Best,

Bill P.
------------------------

And one snippet from the (now defunct) ECACS bulletin board, 2004

Martin Taylor
> > A long time ago I made a proposal for what perceptions are in consciousness at any moment. My
proposal is no better than any of the many others that have been offered, but it ties in with the
foregoing and suggests a possible approach to the memory selection problem, so I'll offer it as
though there were some evidence to support it, even though there is actually none.
> >
> > My proposal is that we are conscious of those perceptions for which control is imperfect or
difficult, or that are candidates for being switched in or out of control. This latter set must
exist, since the degrees of freedom for control are orders of magnitude fewer than the number of
perceptions that are potentially controllable. Most perceptions vary freely at any moment, while a
few are being actively controlled. Now I am suggesting that the candidate perceptions for
associative storage are the perceptions of which we are conscious. One testable consequence of this
suggestion is that we would ordinarily not remember anything of which we were not conscious at the
time. This cannot be true absolutely, as the cases of the mnemonicist and the eidetiker woman
demonstrate, but it could be largely true of most people.
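Martin’s proposal reads naturally as a filter over the current set of perceptions: a perception is conscious when its control is imperfect or difficult, or when it is a candidate for being switched in or out of control. A minimal sketch, with an assumed error threshold and invented examples:

```python
def conscious_set(perceptions, error_threshold=0.3):
    """Filter implementing Martin's proposal. The threshold and the
    dictionary fields are assumptions made for this sketch."""
    conscious = set()
    for name, p in perceptions.items():
        if p["controlled"] and abs(p["error"]) > error_threshold:
            conscious.add(name)        # control is imperfect or difficult
        if p["switch_candidate"]:
            conscious.add(name)        # may be switched in or out of control
    return conscious

# Invented examples: well-controlled posture stays unconscious; a poorly
# controlled itch and a switch candidate (Bruce Gregory's trees?) do not.
perceptions = {
    "posture":   {"controlled": True,  "error": 0.05, "switch_candidate": False},
    "itchy arm": {"controlled": True,  "error": 0.6,  "switch_candidate": False},
    "trees":     {"controlled": False, "error": 0.0,  "switch_candidate": True},
}
print(conscious_set(perceptions))
```

Bruce Gregory’s BBC-news objection below is the hard case for this filter: much of what we are conscious of seems to fall into neither category.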

Bruce Gregory:
As I sit looking out at the trees behind our house I wonder to what extent they represent a
perception for which control is imperfect or difficult or are candidates for being switched in or
out of control. I agree that we seem to become conscious of perceptions that fall into these
categories, but they seem to constitute a minority of the perceptions of which I am conscious. What
about situations such as watching the nightly BBC news? Watching the program involves control, but
does the content that interests me?

[from Kent McClelland (2016.01.04.14.25)]

Thanks, Martin, for sending along your compilation. Lots of interesting observations about awareness there.

Here’s a speculation to add to the mix. In the message extracted above [From Bill Powers (990311.1053 MST)], Bill P. talks about “the variable we’re most saliently controlling” at a given time. “Saliently controlling,” as I read it, means to keep the reference signal for that particular perception at that level of the hierarchy fixed or stabilized at a given value. My speculation is that the focal point of awareness is precisely “the variable we’re most saliently controlling” at the moment, the one fixed point in our kaleidoscope of shifting perceptions.

KM: Rick Marken once made the comment, I don’t remember when but it stuck with me, to the effect that to control one perception in a hierarchy at a fixed reference value, the control systems above and below it must be free to vary. Presumably he meant that the lower-level perceptions attached to the fixed perception must remain flexible (have references that are free to vary) because the perception with the fixed reference will be sending out error signals in response to disturbances, which will require changes lower down in the hierarchy in order to keep that fixed perception in control.

As to the higher-level perceptions that connect to the fixed (most salient) perception, it would seem to me that they need to be relatively inactive, that is, not sending out error signals themselves (because of reduced gain?), in order to not to impose shifting reference signals on the focal perception, thus allowing it to remain fixed.

In the extract, Bill gives the example of a grandchild talking excitedly who almost spills milk on his computer, and asks, "Does loss of awareness lower the gain of the control system?" The kid is no longer saliently controlling the position of the glass, because the focal point of salient control has become the exciting story about what happened at school.

It seems to me that the connection he draws between the gain of the control system and the amount of attention given to a controlled perception makes a lot of sense. We know that awareness shifts to control systems that are experiencing uncorrected errors, and the focusing of attention feels a lot like dialing up the gain of the problematic systems to increase the control efforts and (hopefully) to experience tighter control. If the increase of gain doesn’t do the trick, the next step, it seems to me, is for the reorganization system to get more active.

In sum, I would suggest that to give attention to a perceptual variable requires both fixing its reference at a constant value (temporarily) and (temporarily) increasing the gain of that control system. Of course, for the perceptual hierarchy to work this way, gain would have to be a manipulable variable that shifts from control system to control system "at will," and I suppose this speculation just pushes the problem back a step, forcing the question, "What controls the gain variable?"
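Kent's speculation, attention as a shiftable gain resource plus a temporarily fixed reference, can be caricatured in a few lines. This sketch is editorial (not Kent's, and not a published PCT model); the gain budget, split ratio, and task numbers are all invented for illustration.

```python
# Hypothetical sketch of gain-as-attention: two control loops share a fixed
# gain budget, and shifting "attention" reallocates it. Invented numbers.

def settle(gain, reference, disturbance, steps=500, slowing=0.02):
    """Settle a slowed proportional loop; return the final perception."""
    output = 0.0
    for _ in range(steps):
        error = reference - (output + disturbance)
        output += slowing * (gain * error - output)
    return output + disturbance

GAIN_BUDGET = 60.0  # total gain available across the two tasks

def two_tasks(attending_to_story):
    """Return (story error, glass tilt) given where attention sits."""
    story_gain = 0.9 * GAIN_BUDGET if attending_to_story else 0.1 * GAIN_BUDGET
    glass_gain = GAIN_BUDGET - story_gain
    story_err = abs(10.0 - settle(story_gain, reference=10.0, disturbance=0.0))
    glass_tilt = settle(glass_gain, reference=0.0, disturbance=5.0)
    return story_err, glass_tilt

print(two_tasks(True))   # attending to the story: the glass tilt grows
print(two_tasks(False))  # attending to the glass: the story suffers instead
```

Note that the sketch dodges Kent's closing question: here a hand-coded flag "controls the gain variable," which is exactly the step the speculation leaves unexplained.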

Kent

Dopamine?

···

On 4 Jan 2016, at 21:28, McClelland, Kent <MCCLEL@Grinnell.EDU> wrote:

[from Kent McClelland (2016.01.04.14.25)]

Thanks, Martin, for sending along your compilation. Lots of interesting observations about awareness there.

Here’s a speculation to add to the mix. In the message extracted below Bill P. talks about "the variable we're most saliently controlling" at a given time. "Saliently controlling," as I read it, means to keep the reference signal for that particular perception at that level of the hierarchy fixed or stabilized at a given value. My speculation is that the focal point of awareness is precisely "the variable we're most saliently controlling" at the moment, the one fixed point in our kaleidoscope of shifting perceptions.

On Jan 4, 2016, at 1:50 PM, Martin Taylor <mmt-csg@mmtaylor.net> wrote:

-------------------
[From Bill Powers (990311.1053 MST)]

Bruce Gregory (990310.1650 EST)--

It seems to me that the process of learning and then performing a
tracking task both demand attention. In the learning stage
reorganization may be going on, but there seems to be no evidence that
it is occurring after "good" control is achieved. Have I missed
something? If not, what does this simple example tell us about attention?

All good relevant questions.

It's hard to imagine learning any control task unconsciously, but there are those who claim it can
be done. I'm sure we're never consciously aware of all the details we have learned, but it seems to
me that some degree of consciousness accompanies all learning.

I wonder why it is that when our attention wanders from the variable we're most saliently
controlling, that control begins to deteriorate (I think). I always think of a grandchild excitedly
reporting on something that went on at school, while the hand holding the glass of milk droops
closer and closer to spilling the milk on my computer. Does loss of awareness lower the gain of the
control system? Or does it allow the reference signal or the perception of "upright" to drift? Some
very specific experimentation is needed to see what is actually changing. Anyone who does the needed
experiments will be the first in the world to do it right.

I also wonder about control systems that have been very well learned -- so well that they can
apparently work quite well with no conscious awareness at all. Is it true that this really happens?
And what is different between such "automatized" control systems, if they exist, and others that
seem to require some attention to be paid to them?

Lots of room for good research here.

Best,

Bill P.

KM: Rick Marken once made the comment, I don’t remember when but it stuck with me, to the effect that to control one perception in a hierarchy at a fixed reference value, the control systems above and below it must be free to vary. Presumably he meant that the lower-level perceptions attached to the fixed perception must remain flexible (have references that are free to vary) because the perception with the fixed reference will be sending out error signals in response to disturbances, which will require changes lower down in the hierarchy in order to keep that fixed perception in control.

As to the higher-level perceptions that connect to the fixed (most salient) perception, it would seem to me that they need to be relatively inactive, that is, not sending out error signals themselves (because of reduced gain?), in order to not to impose shifting reference signals on the focal perception, thus allowing it to remain fixed.

In the extract, Bill gives the example of a grandchild talking excitedly who almost spills milk on his computer, and asks, "Does loss of awareness lower the gain of the control system?" The kid is no longer saliently controlling the position of the glass, because the focal point of salient control has become the exciting story about what happened at school.

It seems to me that the connection he draws between the gain of the control system and the amount of attention given to a controlled perception makes a lot of sense. We know that awareness shifts to control systems that are experiencing uncorrected errors, and the focusing of attention feels a lot like dialing up the gain of the problematic systems to increase the control efforts and (hopefully) to experience tighter control. If the increase of gain doesn’t do the trick, the next step, it seems to me, is for the reorganization system to get more active.

In sum, I would suggest that to give attention to a perceptual variable requires both fixing its reference at a constant value (temporarily) and (temporarily) increasing the gain of that control system. Of course, for the perceptual hierarchy to work this way, gain would have to be a manipulable variable that shifts from control system to control system "at will," and I suppose this speculation just pushes the problem back a step, forcing the question, "What controls the gain variable?"

Kent

[Martin Taylor 2016.01.05.23.10]

Yes, for a message posted a year ago in response to another year-old message (Fn 1), this is a remarkably prescient observation of something that has been missing in the thread so far – that a shift of conscious awareness is dissociated from changes of sensory input. There are moments when I am consciously aware of the fact that Canada now has a new government, but most of the time I am not conscious of it. The same is true of being consciously aware that tomorrow is forecast to be warmer than today, or that I once had a particularly amusing experience when on holiday, etc. etc. These “apparitions” in consciousness can be adventitious, or they can be sought. None of these involve changing the sensory input that generates the perceptions I control, largely unconsciously, when composing this message.

This sort of internal consciousness can be (for me) at almost any level of the hierarchy. I can be conscious of (perceive) “redness” not associated with any object, or a red velvet cushion on a chair, or the majesty of the Governor-General sitting on such a chair reading the Speech from the Throne at the opening of Parliament (I perceive the G-G, too, but the perception I was talking about was the majesty). I can hear the opening bars of Beethoven’s Fifth Symphony or other music if I want, but mostly I am not conscious of hearing music at all (I’m talking about times when there is none in the auditory environment). None of this depends on changing sensory input, or on error in control systems.

I intend this just as a supplementary observation, and offer no PCT-theoretic interpretation.

Martin

(Fn 1: I made the same typo myself when I first typed the header to this message.)

···

[From Bruce Nevin (2015.01.05 18:19 ET)]

Rupert Young (2015.01.03 17.30) –

RY: I'd say awareness is having conscious experience about things, consciousness is a general term for beings who have awareness and attention is an external observation that a being is focussing on specific things (of which they may or may not be aware).

BN: We're in agreement about consciousness as a generic term; I don’t think you meant to say it is a term for the beings who ‘have’ it. But to say that awareness is “having conscious experience” makes it dependent on the undefined generic term, consciousness. You make the intriguing suggestion that attention can only be attributed by an external observer. It is true that the direction of certain sensory organs is observable, but there are shifts of sensory attention that are imperceptible to an external observer, especially e.g. in the aural, tactile, and kinesthetic modalities, and a far more commonplace shifting of attention among higher-level perceptions with no change of sensory input.

[From Rupert Young (2016.01.15 21.15)]

Some nice stuff here. Here are some of particular interest.

I'm not sure I agree with this. Although the experience is clearly had by a single human organism, we could look at the different control systems as different perspectives (different personalities almost). So, as the experience shifts levels so does the observer, as well as the content. Perhaps the experience of the overall Self is something of an illusion.

My emphasis.

I would have thought this was awareness rather than attention. Attention being the observation of focused awareness.

Sounds about right. What a guy!

This isn’t quite right as there could be other control systems going on (that are reducing error) but are not the focus of attention.

I like this point. It does seem to be the case that we remember things to which our attention is drawn, such as unusual events on a car journey (or the death of JFK). But is it not also the case that these represent cases of (significant) error?

Rupert

···

On 04/01/2016 19:50, Martin Taylor wrote:

[Martin Taylor 2016.01.04.14.38]

Since Rupert's original essay [From Rupert Young (2015.12.31 15.15)] sounded rather familiar, I just went back to a few months of the CSGnet archives that I happen to have stored locally, and searched for the word “consciousness”. I extracted a few chunks out of some messages, and a couple of complete messages. I expect that if I had searched for “Attention” or “Awareness” and similar words, I would have found a bit more, but these give a flavour of what was happening on CSGnet in the (more or less randomly chosen) years 1997-99 (plus one fragment from the now defunct ECACS (Explorations of Complex Adaptive Control Systems) bulletin board).

------------------ [From Rick Marken (971010.1000)]

I think consciousness moves up and down the hierarchy "at will", going to the places where error is currently largest. If you cut your finger consciousness moves down to the intensity/sensation level (pain). If you are having problems at work consciousness moves up to levels having to do with relationships, rules and values.

------------------------- [From Rick Marken (971024.1000)]

... So there are, in a sense, as many conscious me’s as there are levels in the hierarchy.

-------------------




[From Bill Powers (971103.1524 MST)]

There is an Observer which remains the same no matter what is being observed. When you shift attention from one aspect of yourself to another, the content of consciousness changes but the Observer does not.

---------------------------- [From Bill Powers (971109.0639 MST)]

This means that when we are attending, say, to the route we will be riding on our bicycle, we are still controlling perceptions of balance, effort, and so forth. Those perceptual signals MUST STILL BE PRESENT and they must be under active control – yet they are not part of experience at that time. If this were not so, the higher levels of control could not be working.

What this demonstrates is that the existence of a perceptual signal is independent of the experience of the perceptual signal. …

------------------ [From Bill Powers (980404.0024 MST)]

The way we know about reference signals for the highest conscious level of control is that **some** perceptions seem right while others seem wrong. In the control mode, there is no separate consciousness of the reference signal against which the perception is being compared; the sense of right and wrong is like a consciousness of the error signal. But I suspect it is **more like** **consciousness of the effects of actions driven by the error signal, and not of the error signal** itself. At least this is consistent both with experience and with the general postulate that perception is associated only with the afferent systems.
[From Rick Marken (980407.2120)]

The perceptual signals are always there; what changes is which of these perceptual signals becomes the object of awareness (you suddenly become aware of the Rachmaninoff playing in the background) and which disappear from awareness (the pressure on the butt disappears from consciousness -- oops, it popped back when I said that;-)).
----------------------
[From Bill Powers (980408.0255 MST)]

I think of awareness as a sort of tunable receiver; tuning it brings various activities in the perceptual hierarchy into view of the Observer. The perceptual signals that are tuned in appear in awareness, so we can say that those signals are the content of consciousness. But perceptual signals that are not tuned in are still present, like all the radio stations that are broadcasting signals outside the range of the receiver's tuning.
------------------------
[From Bill Powers (980408.0255 MST)]

Bruce Nevin (980407.2208 EDT)--

>... paying attention is coincident with control. Whatever
>attention is, the statement is that we see an instance of it when a control
>system controls a perception.

But what about control systems that are controlling perceptions of which you are not, at the moment, aware?
...

I don't think attention has anything to do (directly) with control. Attention is the reception of information from perceptual channels by something outside those channels.

--------------------


[From Bruce Nevin (980413.1456)]

Rupert Young (980413.1800 BST)--

>What do you mean by attention? One way of looking at it is as the
>terminology that is given to the _observed_ behaviour that a living system
>goes through when it is carrying out some task relative to specific aspects of
>the environment (and is said to be 'attending' only to those aspects),
>or in other words, when it is controlling perceptions to reduce error. This is
>already explained within PCT and at best 'attention' is synonymous with 'the
>behaviour of reducing perceptual error' ...
---------------------------
[From Rick Marken (980413.1250)]

According to PCT, reorganization is always going on (to some extent) since there is **always some** small level of error in some control systems. Consciousness seems to be involved in this process; maybe it's trying to direct the reorganization process to the part of the hierarchy experiencing the most error.
-------------------
[From Rick Marken (980414.1550)]

But I can solve most of my little conflicts "instantly" when I can see them from a new conscious perspective; this may be "reorganization" but it is not the kind of reorganization we have modeled; it seems to involve a change of conscious perspective on the problem; that's why I talk about going “up a level” as a way to solve conflicts.

-------------------


[From Bill Powers (990311.1053 MST)]

I also wonder about control systems that have been very well learned – so well that they can apparently work quite well with no conscious awareness at all.
And one snippet from the (now defunct) ECACS bulletin board, 2004:

Martin Taylor

>> My proposal is that we are conscious of those perceptions for which control is imperfect or difficult, ...

Now I am suggesting that the candidate perceptions for associative storage are the perceptions of which we are conscious.

[From Rupert Young (2016.01.16 13.00)]

I think there can also be self-observed attention, such as "I was paying attention to the speaker", meaning that the object of my awareness was the speaker. So, internal observation, but “external” to the object.

Well, I would say that it is awareness that is being directed, and we call that attention.

Yep, these sound like disturbances, which result in error, to higher-level perceptual goals. Until there was error they were not a focus of our attention (we were unaware).
I agree that it does seem as if we are sometimes ‘just noticing’, but I think we need to think about what ‘just noticing’ means, and why we are ‘just noticing’. If it is the case that shifting awareness is a result of quality control then awareness may focus on something like this and we experience it as ‘just noticing’. So, I think we need to distinguish between our experience of consciousness and what is happening underneath; and maybe our experience of volitional direction, or of being in control of our conscious experience (now I am ‘just noticing’), is an illusion.

Yep, could be. Though aren’t these two the same process; the autonomic ‘startle’ loop is ongoing monitoring.

Rather than calling it a scan I’d suggest that there is always ongoing monitoring, at many different levels simultaneously, and attention can shift between these according to error. If the overall system has no error then there is no conscious experience!

Yep, sounds about right; though not sure what you mean by “problem-solving systems get a crack first”.

Well yes, but that’s not quite what I mean. I’m just saying that ‘attention’ is an informal term an observer uses to describe focus of awareness.

Not sure I understand this.

I’d say so; a desire to be happy. Do you mean without error? Then perhaps not, a (true) harmonious environment is one without error, and without conscious experience.

Yep, I am suggesting that qualia may be the experience of quality of control.

Rupert

···

On 05/01/2016 23:21, Bruce Nevin wrote:

[From Bruce Nevin (2015.01.05 18:19 ET)]

Rupert Young (2015.01.03 17.30) –

RY: I'd say awareness is having conscious experience about things, consciousness is a general term for beings who have awareness and attention is an external observation that a being is focussing on specific things (of which they may or may not be aware).

BN: We're in agreement about consciousness as a generic term; I don’t think you meant to say it is a term for the beings who ‘have’ it. But to say that awareness is “having conscious experience” makes it dependent on the undefined generic term, consciousness. You make the intriguing suggestion that attention can only be attributed by an external observer. It is true that the direction of certain sensory organs is observable, but there are shifts of sensory attention that are imperceptible to an external observer, especially e.g. in the aural, tactile, and kinesthetic modalities, and a far more commonplace shifting of attention among higher-level perceptions with no change of sensory input.

I find a clear subjective distinction between attention, which can be directed, and awareness, which may passively occur, not always as a consequence of one’s having directed one’s input functions and/or having adjusted input gains appropriately (shifting attention).

BN-: "Conscious attention can be directed to a specific [skill] to be learned, such as juggling" [as you said, but] Attention can also be directed to systems where there is no error.

RY-: It does seem like that but is that the same level? Something can be the focus of attention from different levels. It may be the case that higher levels are consistently in persistent error, or, to put it another way, resolving the quality of control.

BN: My initial reaction is: of course not, a system that directs attention to particular perceptions must be at a higher level than those perceptions. But maybe it’s not so clear. Consider an animal startled by a noise. The “startle” behavior includes not only preparation for fight or flight, but also observable actions that increase sensory input: eyes and pupils widen, ears prick up, nostrils flare. It is plausible that gain of some perceptual signals within the hierarchy is increased. I understand that all this is driven from ‘primitive’ parts of the brain with no involvement of higher levels of the hierarchy.

BN: Yes, this reeks of a primitive system attending to error. So, is the hypothesis that error at a high level ‘triggers’ that same primitive mechanism? Comes a contradiction to a cherished opinion/belief, mocking of a favored athletic team, mention of our name or a work of ours, and our ears prick up.

BN-: Directing attention to a well-functioning system that is in the course of executing a practiced skill can interfere with performance, but does not necessarily do so. You can watch your fingers forming chord sequences on a guitar neck. Attention is not an uncontrolled consequence of error, it is subject to control.

RY-: But why would you be doing this? This sounds like at a higher level than the goal of forming chord sequences. So this may be for the purpose of resolving the quality of control at that higher level.

BN: Yes, it could be for the purpose of improving quality of control; but it could be ‘just noticing’.

This seems to be in that foggy estuary where distinctions are difficult to test. Fretting chords is a poor example, because we (or I, anyway) often look to make sure of placement on the correct strings behind the correct frets. Going for a walk, there are many things that we ‘just notice’. Scanning for error? So in addition to the autonomic ‘startle’ loop, there’s an ongoing monitoring that can have the subjective effect of voluntary ‘noticing’?

Your proposed quality-control system might do a broad scan when not employed monitoring the quality of some particular control loop. Is that a locus of a sense of well-being, when all is well?

BN: Reorganization, as postulated, has no intelligence, no input other than error, and no output other than ‘change something’. It doesn’t act immediately; problem-solving systems get a crack first. And not just any error. Our best model of how to address reorganization to the right system is: Reorganization follows attention; attention goes to error. This is handwaving until we are clear what attention is and how it works.

BN: You proposed that attention can be observed.

I assume this refers to observable use of means of perception, e.g. visual focus. Autonomic processes all carry on at once. Evolution seems to have arranged their requisite input and output requirements to permit that. But when ‘voluntary’ control of a perception uses the same finite input and output capacities (e.g. eyes, hands) as control of another perception uses, they cannot both be accomplished at once. (I have not seen this mentioned in any discussion of conflict.) What restrictions are there on ‘internal’ shifts of attention among imagined or remembered perceptions within the hierarchy? Such shifts of attention do not engage sensory input functions with physically limited capacity. Why is our internal attention limited in the same way?

BN: The answer seems to be awareness. Awareness is limited. Awareness focuses, concentrates on a perception. Which of course is the hand-waving answer that is the topic of our conversation. But when we’re not focused and concentrating, we notice things.

BN: Reorganization seems to be deferred until in some sense it is a disturbance to well-being which persists uncorrected. Is there a system scanning for ‘well-being’? That notices and appreciates aspects of a harmonious environment without disturbances?

RY-: But what does it mean to become aware of a perception? I think we need something more to explain what is going on, as awareness is about qualia not just perceptual signals, which is the heart of the hard problem. This is what I am trying to address with the concept of quality control (though it may be a linguistic sleight of hand to equate qualia and quality).

BN: Not the same as quality of control. Here:

QUALIA: What it is like to have an experience

Qualia include the ways things look, sound and smell, the way it feels to have a pain, and more generally, what it’s like to have experiential mental states. (‘Qualia’ is the plural of ‘quale’.) Qualia are experiential properties of sensations, feelings, perceptions and, more controversially, thoughts and desires as well. But, so defined, who could deny that qualia exist?

http://www.nyu.edu/gsas/dept/philo/faculty/block/papers/qualiagregory.pdf

Author: "NED BLOCK (Ph.D., Harvard), Silver Professor of Philosophy, Psychology and Neural Science, came to NYU in 1996 from MIT where he was Chair of the Philosophy Program. He works in philosophy of perception and foundations of neuroscience and cognitive science and is currently writing a book on attention."

BN: We've quoted Bill recently on the difference between perceptions as we experience them and perceptions defined as rates of firing in neurons. That difference is in the qualia. I know I have subjective experience that is not accounted for by rates of firing in my neurons nor by electrical and other activities in your robot, however sophisticated. And that, my friend, is the jungle into which we have parachuted.

Perhaps relevant to this discussion is a half-century-old experiment on the division of attention (described in Forbes, Taylor and Lindsey, Percept. Motor Skills, 1967, 23, 113-120; analyzed in Taylor, Lindsay and Forbes, 1967, Acta Psychologica, 27 (Attention and Performance I), 223-229; and in more detail in Lindsay, Taylor and Forbes, 1968, Perception and Psychophysics, 4, 113-117).

We asked whether the ability to discriminate basic sensory variables (the pitch or intensity of a tone burst, or the lateral and vertical position of a dot on a screen) was affected by whether all four variables were presented simultaneously, as opposed to only one or two of them, and by whether, when two or four were presented, it made a difference whether the subject knew beforehand or only after the presentation which one (and only one) they would have to identify (high or low pitch, etc.). The results were quite clear. Discrimination was best if only one of the variables was presented and asked about; slightly worse if all were presented but, 2 or 4 seconds before the presentation, the subjects were told which they had to discriminate; and substantially worse when they were told which to discriminate only after the presentation, though it did not matter how long after, from 100 msec to 4 seconds.
I had earlier demonstrated and

···


http://download.springer.com/static/pdf/44/art%253A10.3758%252FBF03209520.pdf?originUrl=http%3A%2F%2Flink.springer.com%2Farticle%2F10.3758%2FBF03209520&token2=exp=1452962285~acl=%2Fstatic%2Fpdf%2F44%2Fart%25253A10.3758%25252FBF03209520.pdf%3ForiginUrl%3Dhttp%253A%252F%252Flink.springer.com%252Farticle%252F10.3758%252FBF03209520*~hmac=5ae51d723cdc61901e65c9da04113233f7cd42f488140e7aceb2fc6a52747317

http://www.mmtaylor.net/Academic/Bayesian_Seminar_1.PDF
http://www.mmtaylor.net/Academic/Bayesian_Seminar_2.PDF


[Bruce Nevin (2016.01.22 08:00 ET)]

Rupert Young (2015.12.31 15.15)–

Returning to your provocatively titled essay “Why we are not conscious”.

If consciousness is an epiphenomenon, a product of neural activity in the brain, then why does so much of the brain’s neural activity fail to have a conscious aspect? Why is it limited to only some neural activity?

You propose that we are conscious of those perceptions whose control is problematic, and therefore conscious awareness is associated with learning. Learning has two general forms, problem-solving and reorganization. Problem-solving is the adjustment of reference signals by a higher-level problem-solving control loop. Reorganization is the adjustment of gain and connectivity when problem-solving fails. (Is there more involved in reorganization?)

You suggest that consciousness is a quality control system, mobile within the hierarchy, controlling the parameters of whatever control loops are of poor quality. Its inputs are derived from the error signals of ordinary control loops. Taking error to be a measure of the quality of control, this is a restatement of the earlier proposal that reorganization starts tweaking the parameters of control loops that persist in generating error signals and continues tweaking until error diminishes. In that earlier proposal, attention goes to the locus of error and reorganization follows. In your new proposal, the reorganizing system is consciousness.
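The earlier proposal restated here, reorganization as parameter-tweaking that continues while error persists and stops when control improves, is often illustrated with Powers' "E. coli" method: keep changing a parameter in the same random direction while error falls, and pick a new random direction when it rises. Below is a minimal editorial sketch of that idea; the loop model, the gain bounds, and the error threshold are my own assumptions, not anything from the posts quoted above.

```python
# Illustrative "E. coli" reorganization sketch: randomly tweak the gain of a
# poorly controlling loop, keeping the current direction of change while
# error decreases and "tumbling" to a new direction when it increases.
# The loop model and all numbers are assumptions, not from the original posts.
import random

def mean_abs_error(gain, reference=0.0, disturbance=5.0, steps=300, slowing=0.02):
    """Run a slowed proportional loop; return its mean absolute error."""
    output, total = 0.0, 0.0
    for _ in range(steps):
        error = reference - (output + disturbance)
        output += slowing * (gain * error - output)
        total += abs(error)
    return total / steps

gain = 0.5                            # start as a poor controller
err = mean_abs_error(gain)
direction = random.uniform(-1.0, 1.0)
for _ in range(200):
    if err < 0.2:                     # control is good: reorganization goes quiet
        break
    candidate = min(90.0, max(0.1, gain + direction))  # keep the loop stable
    new_err = mean_abs_error(candidate)
    if new_err < err:                 # error fell: keep drifting this way
        gain, err = candidate, new_err
    else:                             # error rose: tumble to a new direction
        direction = random.uniform(-1.0, 1.0)

print(gain, err)
```

Note that this mechanism has no intelligence, no input other than error, and no output other than "change something", yet it reliably improves control for as long as error persists, which is why its inputs can be derived entirely from the error signals of ordinary control loops.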

You say that this proposal is accessible to empirical test because it should be possible to identify the neural structures associated with the quality control systems. You are proposing that the quality control system “may involve a neural structure that extends across vast areas of the brain encompassing the basic control systems in different domains, and at different levels. In this way the focus could shift to specific systems according to the current goals of the hierarchy.” Since consciousness can go to perceptions at any level of the hierarchy in every sensory modality (viz. vipassyana meditation experience), and since it appears likely that any system in the hierarchy may become subject to reorganization, that is a broad reach indeed. Your proposal suggests a shadow hierarchy twinning every loop in the ordinary control hierarchy.

It will not be possible to locate separate neural structures associated with Bill’s proposed reorganization system, because reorganization is a property inherent in the very same neural structures that implement ordinary control loops. This is a much more parsimonious proposal, with some open questions. One concerns the trigger mechanism. I’ve suggested that in the neurons involved in frustrated control (or more specifically those transmitting error signals), or maybe in their extracellular environment, or both, neurochemicals accumulate that cause neurons to start making and breaking connections. That’s amenable to empirical test. More generally, what is going on in and around the cells that constitute a control loop that is controlling well and what changes when the loop is not controlling well?

A second open question is, what is awareness and what does it have to do with this zombie mechanism? I’ve suggested that, outwardly, for an observer, attention can be observed behaviorally as the directing of the limited input capacity of sensory organs and the limited output capacity of effectors to those perceptions that are problematic to control. Inwardly, within the hierarchy, attention is the directing of the limited input and output bandwidth of general-purpose problem-solving control systems to those perceptions that are problematic to control. It is there that I localize awareness. Note that reorganization kicks in when those problem-solving systems are unable to control well and reduce their error.

General-purpose problem-solving systems must be able to control inputs deriving ultimately from any place in the hierarchy (that interpretation of the Pope’s ruling is too strict; that passage for brass ensemble is not balanced; that light is too dim). The canonical view is that they take category perceptions as inputs; this pushes the ubiquity problem off for the Category level to solve. In recent writings I’ve suggested that categories are just extra complex relationship perceptions. Nomenclature questions are not critical, functional relationships are.

There is a confusion toward the end that we’ve discussed a bit already. Quality meaning quality of control, that is, amount and location of error, is not the same as qualia, the suchness of experiencing a given perception. Why a rate of firing in a nerve or nerve bundle should be experienced as the intensity of pressure, or the color orange, or the vowel o, etc. remains an inexplicable gap, as does the question whether your experience of a given variable that we perceive as being in our shared environment and the object of our discussion is the same as my experience of it. There, in the margins of our maps, be the dragons of qualia.

···

On Thu, Dec 31, 2015 at 10:11 AM, Rupert Young rupert@perceptualrobots.com wrote:


Regards,
Rupert

[From Rupert Young (2015.12.31 15.15)]

I've been considering consciousness in the context of PCT, and have put together a draft of some thoughts in the attached. Comments and feedback welcome.

Here's a nice summary of the issue:

http://www.theguardian.com/science/2015/jan/21/-sp-why-cant-worlds-greatest-minds-solve-mystery-consciousness

BEN: Problem-solving is the adjustment of reference signals by a higher-level problem-solving control loop. Reorganization is the adjustment of gain and connectivity when problem-solving fails.

PY: This is not an acceptable description. The difference between problem-solving and reorganization is like the difference between bending and twisting, and cutting and pasting (Einstein’s relativity vs. Kaku’s string theory). To describe what the brain is doing in terms of electrical currents through wires is fine. But you have to pretend you’re making a world class piece of electronic equipment. You can’t just say, “throw a loop in there and a higher level problem-solving loop will control it; if problem-solving fails, just adjust the gain and connectivity”. I wouldn’t pay you $5 for that.

It doesn’t take much to notice that PCT is no longer at the stage where ‘casual discussion’ can advance the science. The language needs to be modified (restricted) enough to describe the mathematical physics properly. This means that the way PCT is being described is wrong, but the error is not automatically detectable.

To append Bill’s theory to the end of a string of advancements in behavioral science is not the way to go. Not enough emphasis is placed upon Norbert Wiener’s work. And with that exclusion, you have lost your best mathematician/player. Don’t forget: Just as behavior IS control, mathematics IS control!! The trick is to demonstrate that all of mathematics follows from PCT. Of course, you would need to familiarize yourself with all of mathematics. But then you’ll at least be in the ballpark.

BEN: Why does so much of the brain’s neural activity fail to have a conscious aspect? Why is it limited to only some neural activity?

PY: Probably because you’re not engaged in mathematical activities. If you do the math, cognition is forced to go as slow as possible…and voila - transient perceptual signals now exist long enough to be recorded and observed.

[Bruce Nevin (2016.01.22.15:50 ET)]

Perhaps you are misled by the words ‘problem solving’ and ‘reorganization’. They do not have their broad, generic meanings here. They refer to the two kinds of learning postulated in PCT.

One kind is purposeful, that is, it is a product of negative feedback control. Among the perceptions that are controlled in ‘problem solving’ are the operations of logic and mathematics.

The other kind of learning is not purposeful, it is random, although it appears to be teleological in hindsight, for exactly the same reason that evolution appears in hindsight to be teleological.

I have no useful reply to the rest of what you have said here. I look forward to seeing an actual contribution from you.

···

On Fri, Jan 22, 2016 at 11:14 AM, PHILIP JERAIR YERANOSIAN pyeranos@ucla.edu wrote:

[From Rupert Young (2016.01.30 13.00)]

[Bruce Nevin (2016.01.22 08:00 ET)]

You say that this proposal is accessible to empirical test because it should be possible to identify the neural structures associated with the quality control systems. You are proposing that the quality control system "may involve a neural structure that extends across vast areas of the brain encompassing the basic control systems in different domains, and at different levels. In this way the focus could shift to specific systems according to the current goals of the hierarchy." Since consciousness can go to perceptions at any level of the hierarchy in every sensory modality (cf. vipassanā meditation experience), and since it appears likely that any system in the hierarchy may become subject to reorganization, that is a broad reach indeed. Your proposal suggests a shadow hierarchy twinning every loop in the ordinary control hierarchy.

It will not be possible to locate separate neural structures associated with Bill's proposed reorganization system, because reorganization is a property inherent in the very same neural structures that implement ordinary control loops.

I'd understood that Bill left it open: it might be inherent, or it might be a separate system. Even if it is the same system, it should be possible to identify the changes that are going on, or the neural activity that correlates with consciousness; unless Descartes was right.

However, what I am suggesting are just other perceptual systems, that perceive quality of control, so they are not "separate" neural structures.

A second open question is, what is awareness and what does it have to do with this zombie mechanism? I've suggested that, outwardly, for an observer, attention can be observed behaviorally as the directing of the limited input capacity of sensory organs and the limited output capacity of effectors to those perceptions that are problematic to control. Inwardly, within the hierarchy, attention is the directing of the limited input and output bandwidth of general-purpose problem-solving control systems to those perceptions that are problematic to control. It is there that I localize awareness. Note that reorganization kicks in when those problem-solving systems are unable to control well and reduce their error.

There are no "general-purpose problem-solving" control systems; there are just perceptual control systems, though, as an observer, you may want to call them general-purpose problem-solving control systems.

General-purpose problem-solving systems must be able to control inputs deriving ultimately from any place in the hierarchy .....

That sounds like what I was saying, though I was calling them perception-of-quality control systems.

There is a confusion toward the end that we've discussed a bit already. Quality meaning quality of control, that is, amount and location of error, is not the same as qualia, the suchness of experiencing a given perception. Why a rate of firing in a nerve or nerve bundle should be experienced as the intensity of pressure, or the color orange, or the vowel o, etc. remains an inexplicable gap, as does the question whether your experience of a given variable that we perceive as being in our shared environment and the object of our discussion is the same as my experience of it. There, in the margins of our maps, be the dragons of qualia.

Well, how do we know that perception of quality of control is not experienced as qualia? Though I do think there is some other dimension missing. Sticking with the control-of-quality tack, a question, for me, is: what is it about qualia that allows an organism to distinguish between systems of differing quality? The reasons may be lost in the mists of evolutionary time, but perhaps there is a reason why, for example, the subjective experience of red rather than green helps to identify and focus upon a system that has low quality of control.

Rupert

[Bruce Nevin (2016.02.02.20:20 ET)]

  1. How is the reorganization system organized?

The simplest mechanism that I can think of is just like the mechanism of memory. From a paper in preparation:

Memory is a recreated neural signal. Memories are constructed locally at each synapse (Powers 2005: 207–230, Kandel 2006, Rudy 2008). Short-term memory is due to neurochemicals at a synapse sustaining the neural signal. To establish a long-term memory, repeated firing of the synapse alters gene expression in the cell nucleus of the neuron, affecting protein synthesis. The growth and maintenance of new branches and synaptic terminals, which is the general means for amplifying a neural signal, is also how memory is further strengthened and made more persistent (Kandel 2006). Ascending perceptual input signals and descending output/reference signals evoke and strengthen memory at synapses all along the neural pathways in which the remembered perceptual signal originated. Interruption of this process of ‘memory consolidation’ may prevent long-term memory formation. Conversely, reconsolidation can change an established memory (Loftus 1998, 1999).

Efforts have been made to localize certain kinds of memory in certain parts of the brain. Given the distributed, synapse-local character of memory, this appears to reflect the location of systems for recognizing and controlling the given kind of perception, or of important contributors to it. Establishing and strengthening memories appears also to involve associative links between different parts of the brain. For example, a memory that includes strong emotion involves signals to and from the amygdala. These links are the basis of Pavlovian conditioning (Mirolli et al. 2009).

What happens at the synapses that construct a comparator when the error output remains high? A simple mechanism for reorganization could depend upon neurochemical signals there and at distal synapses to which the connected neurons project, signals that stimulate random changes in connections and (more slowly) branching. These changes might disturb control at other connected comparators, spreading the scope of reorganization, and something must oppose this and keep it in check; and obviously there are many problems of detail. But it is a simple scheme, it answers how reorganization is targeted yet global (can happen anywhere needed), and it rests on a fundamental question that must be addressed regarding the nervous system and indeed regarding any multicelled organism: what’s in it for the cell?
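For what it's worth, the trigger I have in mind can be caricatured in a few lines. The "chemical", its decay rate, and the threshold are all invented numbers; the point is only that accumulation of persistent error can gate random make-and-break of connections locally:

```python
import random

# Hypothetical sketch of the synapse-local trigger described above:
# persistent error accumulates a reorganizing chemical at a comparator's
# synapses; past a threshold, a connection changes at random.

class Comparator:
    def __init__(self, weights):
        self.weights = list(weights)   # input connection strengths
        self.chemical = 0.0            # accumulated local signal

    def update(self, error, threshold=5.0, decay=0.9):
        self.chemical = self.chemical * decay + abs(error)
        if self.chemical > threshold:
            # random make-and-break of connections, kept in check here
            # by resetting the chemical once a change has been made
            i = random.randrange(len(self.weights))
            self.weights[i] += random.gauss(0.0, 0.5)
            self.chemical = 0.0
            return True                # a reorganizing event occurred
        return False
```

A loop controlling well feeds in near-zero error, the chemical decays away, and nothing changes; only chronic error triggers events, which answers how reorganization can be targeted yet able to happen anywhere.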

Reorganization is the mechanism of development in embryology. We can’t posit a pre-existing reorganization system for that. Unless we go to Sheldrake’s morphogenetic fields, maybe. IMO, if such fields are effective, they are effective as constraints on a random process as above.

  2. General-purpose problem-solving systems.

What I mean is Program-level and perhaps higher control systems (or control loops, or elementary control systems, if you like) within the hierarchy. They are general-purpose systems because of the variety of perceptual inputs that can be recognized as a complex relationship perception that we think of as a category/symbol perception. These serve as the ‘argument’ symbols of a program perception. They are general-purpose control systems because the same sequence or program logic can apply to a variety of specific problems. A lot of human practical logic relies on analogy and pattern recognition, not what computer programs are best at. It could be that a Principle-level perception, maybe something like “try it and see if it works”, controls a Sequence or Program whose initial inputs are matched, and moves on to another if that one fails. It’s only when the problem-solving routines fail and error persists that reorganization kicks in.
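In caricature, that Principle might amount to no more than this (the routines, the tolerance, and the idea of returning None to hand off to reorganization are my placeholders, not actual Program-level machinery):

```python
# Sketch (placeholder names): cycle through stored problem-solving
# routines; each returns the residual error it achieves on the problem.
# Only when every routine leaves error high does reorganization take over.

def try_and_see(routines, problem, tolerance=0.01):
    for routine in routines:
        residual = routine(problem)
        if residual < tolerance:
            return routine, residual   # this routine controlled well enough
    return None, None                  # all failed: reorganization kicks in
```

The ordering of the routines would itself be learned; matching on initial inputs, as above, is just selection by analogy and pattern, not deduction.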

Two questions beckon here, one about attention, the other about what happens in MoL. In MoL, there are two instigators of an internal conflict, and it appears to me that the process brings both of them into a single relationship perception. I found an interesting clue about attention in Henry Yin’s beautiful paper about the basal ganglia, in the first paragraph of subsection 8.1, of course in context of the rest of the paper. I don’t see any criterion as to what constitutes salience. One would guess salience = error: e.g., in an animal, error in systems controlling for a quiet and familiar environment, or, if hungry, in systems controlling for food or prey.

  3. Qualia.

It’s all well and good to say that the taste of lemon, the color teal, the opening melody of the Franck violin sonata in A, are all no more than rates of firing in my nerves, or voltages in your robot. But I know I do not experience rates of firing, and there is no way to convince me that your robot experiences lemon, teal, or Franck’s melody. This is the “But I’m not a solipsist, really!” problem that stumped Bill; he asked more than once if anyone knew a way out of it.

I accept that you do experience lemon, teal, and Franck’s melody because I perceive you (imagine you) to be a human like me, and at Principle and System Concept levels I control perceptions of commonality of my experience with that of fellow humans. Those perceptions are repeatedly tested in the course of interactions every day. Doubtless, there’s a lot of imagination involved, but I am able to control perceptions that matter to me and get along with living, and that’s the general criterion as to the veridicality of the whole universe of constructed perceptions, right down to the intensity at a taste bud for sourness or at the cochlear hair cells that resonate most strongly at 440 Hz.

The rates of firing of nerves at all levels support a unitary experience of sitting in a comfortable teal-colored chair sipping lemonade listening to the Franck violin sonata in A. The experience arises from the multitude of neural firings, but is none of them. (This is a completely imagined perception, by the way; I’m not on vacation in some warm place.)

···

On Sat, Jan 30, 2016 at 7:54 AM, Rupert Young rupert@perceptualrobots.com wrote:
