How are new input functions created?

OK thanks Rick, let me work on this a while!

Hi Bruce, I like your idea of the program level as hacking. But I was thinking that this process is not level specific, and that, for example, we could learn to identify a new configuration, transition, relationship, category, etc., automatically once we have had enough experience with it, and that this would involve the development, within reorganisation and in awareness, of a new input function (not a new type of input function) too? Imagine when Westerners visit isolated tribes with their various tools and gadgets…
What do you think?
Warren
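
For readers who want something concrete here, below is a minimal Python sketch of the kind of mechanism Warren is gesturing at: PCT-style reorganization, in which the weights of a candidate input function are varied by random trial and error and changes are kept only while control error keeps falling. The signals, targets, step sizes and function names are illustrative assumptions of mine, not anything from the thread or from a published model.

```python
import random

def reorganize(samples, targets, steps=2000, step_size=0.05):
    """Random-walk ('E. coli' style) reorganization of the weights of one
    candidate input function.

    samples : list of lower-level perceptual signal vectors
    targets : what the new input function 'should' report for each sample
    """
    n = len(samples[0])
    weights = [0.0] * n
    direction = [random.uniform(-1, 1) for _ in range(n)]

    def total_error(w):
        # Squared mismatch between what the candidate input function
        # reports and the target perception for each sample.
        return sum((sum(wi * xi for wi, xi in zip(w, x)) - t) ** 2
                   for x, t in zip(samples, targets))

    err = total_error(weights)
    for _ in range(steps):
        trial = [w + step_size * d for w, d in zip(weights, direction)]
        new_err = total_error(trial)
        if new_err < err:
            weights, err = trial, new_err      # error falling: keep going
        else:
            direction = [random.uniform(-1, 1) for _ in range(n)]  # 'tumble'
    return weights

# Toy example: the 'new perception' to be learned is the difference between
# two lower-level signals; the learned weights end up near [1, -1].
samples = [[1, 0], [0, 1], [1, 1], [0.5, 0.2]]
targets = [1, -1, 0, 0.3]
print(reorganize(samples, targets))
```

On this sketch a “new input function” is just whatever weighting of lower-level signals happens to reduce error, with nothing level-specific assumed, which is Warren’s point.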

You have forgotten your concession that your proposed demo of program control does not take account of what you believe are confounding variables. Just because the program code for the demo distinguishes sequences and if/then contingencies it does not follow that users of the demo are controlling program perceptions.

There is an alternative explanation of what users are doing and why their response time is slower, so response time does not clinch the matter either.

Until you change the demo so as to eliminate that alternative—confounding variables, as you prefer—it remains unconvincing.

We look at those new perceptions ‘analytically’ at first, as a sequence of perceptions, even the parts of a configuration sequentially, and then ‘teach’ the lower level by controlling them in imagination. That’s the proposal. Check it out subjectively and see how it works.

Hi Bruce

BN: You have forgotten your concession that your proposed demo of program control does not take account of what you believe are confounding variables. Just because the program code for the demo distinguishes sequences and if/then contingencies it does not follow that users of the demo are controlling program perceptions.

RM: It turns out that one can even be wrong about conceding that one was wrong. I thought there were confounding variables because I made the mistake of thinking that it was possible to conceive of program control the way you did: as control of the subsequences that make up the program. But, in fact, there is no control of subsequences – or any sequence – when one is controlling the occurrence of the program. The sequence of shapes and colors that occurs while maintaining (controlling) the program is always different.
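
As a concrete illustration of that difference, here is a minimal Python sketch, with toy shapes, colours and a made-up if/then rule of my own rather than anything from the actual demo: only one ordering of events satisfies a sequence-type perceptual function, whereas many different orderings satisfy a program-type perceptual function, so long as they follow the contingency.

```python
def sequence_perception(events, target=("red", "green", "blue")):
    """Perceive whether one fixed ordering of events is occurring."""
    return tuple(events) == target

def program_perception(events):
    """Perceive whether an if/then contingency is being followed, e.g.
    'if the shape is a square the next colour is red, otherwise blue'."""
    pairs = zip(events[::2], events[1::2])   # successive (shape, colour) pairs
    return all(
        (colour == "red") if shape == "square" else (colour == "blue")
        for shape, colour in pairs
    )

# Two different runs of events: both satisfy the program perception, yet
# neither is the one ordering the sequence perception looks for.
run1 = ["square", "red", "circle", "blue", "square", "red"]
run2 = ["circle", "blue", "circle", "blue", "square", "red"]
print(program_perception(run1), program_perception(run2))    # True True
print(sequence_perception(run1), sequence_perception(run2))  # False False
```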

BN: There is an alternative explanation of what users are doing and why their response time is slower, so response time does not clinch the matter either.

RM: It turns out that that is not the case. There is no alternate explanation of what users are doing when they are controlling the program. So the relative timing of control of a program versus control of a sequence clearly demonstrates that program control occurs at a higher level than sequence control. And the measure of control presented at the end of the demo shows that you are controlling two different variables when you are controlling the program versus the sequence.

BN: Until you change the demo so as to eliminate that alternative—confounding variables, as you prefer—it remains unconvincing.

RM: I’m pretty sure that it would be impossible to convince you that the program control demo is a demonstration of control of a program perception. You seem to have a deep intellectual investment in the idea that there is no such thing as program control (or that what we call program control is really just control of a sequence of sequences). And as PCT (and the embroidery on the wall at Bill’s aunt’s house) says “a man convinced against his will is of the same opinion still.”

Best

Rick

Oh crikey, sorry Bruce, I think you’re onto it. This actually fits really well with the idea of exchange between propositional and implicational subsystems in Barnard’s theory…

I’m coming to the conclusion that rather than the program level being an upper level within the perceptual hierarchy, it’s a separate hierarchical system that organises information in a non-subjective manner; it’s the system of language, knowledge, facts and semantics that we form about the world that other theories are so obsessed with (beliefs). They do exist! But they are a privileged addition from the human evolutionary heritage rather than a fundamental of our nervous system architecture - that’s still the perceptual hierarchy, and the perceptual hierarchy is still necessary for generating new perceptions. But Phil B explains the advantage of ‘decoupling’ this propositional knowledge from the (motivated; egocentric) perceptual hierarchy. It gives us logic, language, spatial reasoning, etc.
See what you think…

Warren,

I agree warmly with the view you are suggesting. One important support for it, for me, is in Bruce’s message [Bruce Nevin 2018-12-10_18:49:50 UTC], which I have now managed to find again. Even very simple inferences, where a perception is detached from its original subjective meaning (into a concept) and connected to a different such perception, require conscious use of logic, and that happens with the help of language. And this seems to be generally possible only for normally educated human beings above six years of age.

I still think that that kind of “strange” combination of perceptions, even if it is in the form of a program – if it is simple enough – could be hardwired by practice into the perceptual hierarchy so that it could also be controlled unconsciously.

Thanks Eetu, makes sense!

So this is my current ongoing definition of consciousness:

Consciousness is the ongoing simulation of the self in the world necessary for the development & maintenance of a biologically grounded universal perceptual control system that uses object manipulation, logic & communication to achieve individual & collective control

Warren, Eetu,

This is what my chapter in the Handbook (LCS IV) aims to delineate. Language is collectively controlled perceptions comprising a system that is in part collectively controlled and in part self-organizing. Language perceptions extend at least as high as configurations, transitions and sequences, but probably no higher, pace Chomsky and the algorithmists. They form a branch parallel to the behavioral and somatic branches of the hierarchy. Associations between language perceptions and non-language perceptions are certainly in associative memory; relationship perceptions may be involved. The parallel branch and its association with nonverbal perceptions gives a standpoint (in the nonverbal perceptions) for awareness ‘off to the side’ of and evaluating what is directly experienced and controlled in the behavioral and somatic branches.

Perfect!

Hi Warren

WM: I’m coming to the conclusion that rather than the program level being an upper level within the perceptual hierarchy, it’s a separate hierarchical system that organises information in a non-subjective manner…

RM: What is the evidence that led you to that conclusion?

Best
Rick

See Bruce’s 2020 chapter? In simple terms, language is decomposed into each of the perceptual levels below the program level. A symbolic description of our perception is not necessarily at a higher level. It differs in its attempt to be objective of the self rather than subjective. You’re right though - some empirical evidence would be handy!

Hi Warren

WM: See Bruce’s 2020 chapter?

RM: Yes.

WM: In simple terms, language is decomposed into each of the perceptual levels below the program level.

RM: Not simple enough for me. I have no idea what this means. I presume you are talking about the model, not about language itself. In my understanding of the model, language isn’t decomposed into perceptual levels below the program level; it is composed of perceptual variables from all levels of the control hierarchy, up to and including the program level.

WM: A symbolic description of our perception is not necessarily at a higher level.

RM: If what you are talking about are words, phrases and sentences, then I would say that they are symbol perceptions that are associated with experiential perceptions; they aren’t descriptions of perceptions. The word “red” doesn’t describe the experience of red; it is like the address of perceptions associated with that sound (or written) perception.

RM: And I’m pretty sure that many of these symbols are at least program type perceptions. Sentences are symbols for perceptions that differ in the perceptions they point to in terms of both their words and grammatical structure. For example, consider these two sentences:

Call me Ishmael.

Call me, Ishmael.

In the first we perceive a request that you call the person by the name Ishmael; in the second we perceive a request that Ishmael give you a call. The difference in meaning – in the perception each sentence points to – is called out by a comma in the written version and by intonation in the spoken version. The two sentences evoke different perceptions because of a difference in grammar, which seems to me to be a program level perception.
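
As a toy illustration only (my own heuristic, not a model of English grammar), the written comma can be treated as the test in an if/then contingency that selects between those two readings:

```python
def reading(sentence: str) -> str:
    """Return a rough gloss of 'Call me(,) Ishmael' based on the comma."""
    if "," in sentence:
        # 'Call me, Ishmael.' -> vocative: Ishmael is asked to phone the speaker.
        return "request that Ishmael give the speaker a call"
    # 'Call me Ishmael.' -> naming: call the speaker by the name Ishmael.
    return "request to call the speaker by the name Ishmael"

print(reading("Call me Ishmael."))
print(reading("Call me, Ishmael."))
```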

WM: It differs in its attempt to be objective of the self rather than subjective.

RM: That’s way over my head.

WM: You’re right though - some empirical evidence would be handy!

RM: I don’t see why; you seem to be able to do just fine without it;-)

Best

Rick

Rick, would it kill you to encourage people and tell them how you think they could do better rather than discouraging them and telling them everything you think they’re doing wrong? Considering mortality and the importance of PCT, you should be working as hard as you can not to be the top expert in PCT when people brighter than either of us are carrying it forward into the future.

I request that you, Ishmael, call me.
I request that you call me Ishmael.

These are homonymous but different verbs. In one, you can substitute telephone; in the other, you can substitute name. What’s going on here is more fundamental than commas and intonation. Also see A grammar of English on mathematical principles, p. 351, for starters, and sections 3.5 and 3.6 of the same work.
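
Sketched in Python, with glosses of my own (nothing here comes from the grammar beyond the substitution idea itself), the test looks like this:

```python
# Each sense of 'call' admits a different substitute, which shows the two
# verbs are distinct even though they are spelled alike.
sentences = {
    "I request that you, Ishmael, call me.": "telephone",  # call = telephone
    "I request that you call me Ishmael.": "name",         # call = name
}

for sentence, substitute in sentences.items():
    print(sentence.replace("call", substitute))
# -> I request that you, Ishmael, telephone me.
# -> I request that you name me Ishmael.
```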

The actual structure of language—its grammar in fact—is more or less different from the various descriptions of that structure which have been called grammar, sometimes extremely so, and that includes operator grammar. Confusion of formalisms with that which is formalized is still pretty common. Recent work by Stephen Johnson at Columbia and NYU, showing rules etc. as emergent phenomena, is touched on toward the end of that essay (doi 10.1075/hl.00077.nev). Most of the math you probably won’t find relevant to your interests. Suffice to say, it doesn’t support the notion that grammar is control of program perceptions.

Very nice definition, Warren! It is quite packed, so I would like to discuss it in parts. I interpret that the first part tells what consciousness is and the second part tells what its function is?

(1) “Consciousness is the ongoing simulation of the self in the world”

Traditionally in philosophy this idea is referred to by the word reflection: the self reflects its existence in the world. Many think that a kind of vicious-circle problem looms here, because the self (selfness) already seems to require consciousness: if the self exists first and then starts to simulate itself, then the self must already know what it should simulate. In PCT it might be possible to solve this problem by thinking that the existence of the subject/self is at first unconscious control (and that this control is then reflected by another, separate instance). My idea (in addition to that), however, is that control already requires or produces some kind of pre-reflective(?) or elementary consciousness. (See more in my article in Philosophies: Education, Consciousness and Negative Feedback: Towards the Renewal of Modern Philosophy of Education.)

(2) “[Consciousness is] necessary for the development & maintenance of a biologically grounded universal perceptual control system”

So this tells us the function of consciousness. Here I agree. Just as perceptual control developed to maintain homeostasis, consciousness seems to have (evolutionarily) developed to maintain - and develop - perceptual control. There is, however, a remarkable difference between these two developmental steps: perceptual control mainly protects and enables the continuation of homeostasis and cannot much affect the way the homeostatic systems work, whereas consciousness seems to strongly affect and develop the structures and functions of perceptual control. (Of course, there are also the rationalizing and self-deceptive functions of our consciousness…)

(3) “…that uses object manipulation, logic & communication to achieve individual & collective control.”

I interpret that this refers to how consciousness works and to what purpose. Here I suspect that this part applies only partly to the phenomenon of consciousness in general, because part of it defines only the specifically human kind of consciousness. There is much discussion about animal consciousness (see the two references at the end), and the application of logic must be very limited there because of its tight connection to the ability to use human language. So I think it is important to differentiate the level and scope of the consciousness we are talking about: consciousness in general, the specifically human kind of consciousness, or (perhaps different levels of) animal consciousness.

Thank you for discussing this important topic!
Eetu

Thanks Eetu! Yes, I am talking about human consciousness. I will read your article again and make sure I cite where it overlaps with my work in progress!
Talk to you soon,
Warren

Hi Bruce

BN: Rick, would it kill you to encourage people and tell them how you think they could do better rather than discouraging them and telling them everything you think they’re doing wrong?

RM: Would it kill you to take my advice when I tell you how you could do better rather than continuing to do things that way you do them? And I presume that you have stopped beating your wife;-) See, I can do it too;-)

BN: Considering mortality and the importance of PCT, you should be working as hard as you can not to be the top expert in PCT when people brighter than either of us are carrying it forward into the future.

RM: I’m not trying to be the top PCT expert; I’m not trying to be the top anything, really. I’m just trying to inject my understanding of PCT into this discussion group which, from my perspective, has drifted (and continues to drift) farther and farther away from the heart of PCT.

RM: And I am, indeed, aware of mortality (though I admit that I don’t like to think about it all that much). I do believe PCT is important and I would like younger and brighter people to carry it forward. But I would like whatever they carry forward to be a correct understanding of what PCT is and how to study it, which is why I wrote The Study of Living Control Systems.

BN: I request that you, Ishmael, call me.
BN: I request that you call me Ishmael.

BN: These are homonymous but different verbs. In one, you can substitute telephone; in the other, you can substitute name. What’s going on here is more fundamental than commas and intonation. Also see A grammar of English on mathematical principles, p. 351, for starters, and sections 3.5 and 3.6 of the same work.

RM: My point is that these different meanings of “call” are perceived by the person reading the sentences. I know that “call” evokes different meanings in the two sentences; but why? My guess is that it results from the way we perceive the programmatic structure of the two sentences.

BN: The actual structure of language—its grammar in fact—is more or less different from the various descriptions of that structure which have been called grammar, sometimes extremely so, and that includes operator grammar.

RM: I think that is surely true. But all languages have some structure that contributes to the meaning evoked by different sentences. In the sentences above the different meaning of “call” seems to depend on how one perceives that structure. Indeed, the meaning of “call” can be perceived both ways in both sentences depending on how one perceives the structure of the sentence.

RM: And it’s not just the sequence that is responsible for the difference: I can hear “Call me Ishmael” as a request to call him by the name “Ishmael” or as a request for Ishmael to give me a call. I can do this with no change of rhythm or intonation in the three words. It’s like the young/old woman reversible figure illusion; same physical input perceived in two different ways presumably because the different perceptions are constructed in different ways.

BN: Suffice to say, it doesn’t support the notion that grammar is control of program perceptions.

RM: I don’t think grammar is the control of program perceptions; I believe the meaning of sentences involves the control of program perceptions. I think grammar itself is an “emergent” phenomenon (as you say) which is based on people noticing consistencies in the programs that are used to convey different meanings.

Best

Rick

I’m going back to try better to understand just what you’re asking, Warren. You say that you’re

and go on to mention experiential aspects of conflict resolution in MoL. Then

The latter part of this unfortunately focused my attention on “brain processes or anatomy” in the structure and function of an input function (I’m uncomfortable with the term ‘algorithm’ here). But you’re asking in experiential terms about “conditions or processes that are necessary to create” a unique input function.

Eetu referred to a 2018 post about experimental work showing that the ability to combine two sensory modalities (left-right direction plus color) is absent in rats and doesn’t come online in humans until about the age of six, the age when we not only understand and use some language but also grasp, manipulate, and create concepts with the help of language.

Recapping, Charles Fernyhough did the experiments, and Elizabeth Spelke followed up with babies. Both were briefly interviewed in this podcast. A blogger posted a Flash representation of the experiments in a free project space provided by MIT’s Media Lab. There, you can experience the phenomena subjectively.

The cross-modal ‘blue wall’ capability shown in this experiment develops around age 6, about 4.5 years after the emergence of systems concepts. The cerebellum, with its myriad connections into all levels of the hierarchy, can make those cross-modal connections. (I wonder if anyone has looked at the role of the cerebellum in synesthesia.) As Eetu points out, greater cognitive fluency with conceptualization also emerges then.

My subjective experience reviewing, juxtaposing, combining, and manipulating concepts—the ‘stuff’ of cogitation—is that it is very much like surveying, juxtaposing, combining, and manipulating objects—configuration perceptions. There is often imagery of visual configurations which, however, are not easy to describe and ‘make sense’ of. They are not perceptions of familiar environmental objects. There are kinesthetic and proprioceptive perceptions of configurations in body posture and their combinations and transitions in movement. (Is this indeed why we gesture when we speak?)

The patterning that constitutes language is configurational. All the hairy business of formal grammars (logical rules operating on symbolic representations abstracted from the observables of language) attempts to produce those configurations as output. Most of spoken language is not in complete, grammatically correct sentences.

Most experimental and modeling work in PCT has focused on the sensorimotor branch of the hierarchy. In that branch, the perceptual input functions of configuration control systems receive perceptual signals from the sensation level.
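
As a minimal sketch of what that means in the model (the weights and signal values are made up, and the linear input functions are the usual textbook simplification rather than anything from a specific PCT program):

```python
import numpy as np

def sensation_level(intensities, weights):
    """Sensation perceptions as weighted sums of intensity-level signals."""
    return weights @ intensities

def configuration_input_function(sensations, weights):
    """A configuration perception as a (here linear) function of
    sensation-level perceptual signals."""
    return float(weights @ sensations)

intensities = np.array([0.2, 0.9, 0.4])            # intensity-level signals
sensation_w = np.array([[1.0, 0.0, 0.5],
                        [0.0, 1.0, 0.5]])          # two sensation functions
config_w = np.array([0.7, 0.3])                    # one configuration function

s = sensation_level(intensities, sensation_w)
p = configuration_input_function(s, config_w)

# A configuration control system compares its perception with a reference
# and outputs in proportion to the error; that output would become the
# reference signals for the lower-level (sensation) systems.
reference, gain = 1.0, 2.0
error = reference - p
output = gain * error
print(s, p, error, output)
```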

There is evidence that there are other configuration control systems that control inputs from outside the sensorimotor branch of the hierarchy. And there are obviously stages of cognitive development that emerge after the top level of the physical brain architecture comes into being at ca. 75 weeks (ca. 1 y 4 m). These are not hierarchical levels above systems concepts, but rather new uses of perceptual control structures of existing types (i.e. at existing perceptual control levels) for more abstract cognitive purposes.

Because not everyone has access to the Interdisciplinary Handbook of Perceptual Control Theory: LCS IV, I here quote an extended selection from pp. 216-217 of the excellent chapter by Frans Plooij:

Barton (2012) clarified that the key aspect of human cognition is “the adaptation of sensorimotor brain mechanisms to serve new roles in reason and language, while retaining their original function as well” (p. 2098). Because of extended connectivity between the neocortex and the cerebellum, essentially the same kinds of computation appear to underlie sensorimotor and more ‘cognitive’ control processes including speech (p. 2101). Barton (2012) gives examples of computation and control processes at hierarchically different levels concerning events, spatial relationships, sequences and programs. More recently, Barton and Venditti (2014) have even shown that humans and other apes deviated significantly from the general evolutionary trend for neocortex and cerebellum to change in tandem. Humans and other apes have significantly larger cerebella relative to neocortex size than other anthropoid primates. This suggests that the current, almost exclusive emphasis on the neocortex and the forebrain as the locus of advanced cognitive functions may be exaggerated. Instead the cerebellum may play a key role in human cognitive evolution. Recently, Verduzco-Flores and O’Reilly presented a cerebellar architecture “allowing the cerebellum to perform corrections at various levels of a hierarchical organization spanning from individual muscle contractions to complex cognitive operations” (Verduzco-Flores & O’Reilly, 2015). In human adults there is growing evidence that the cerebellum is not limited to sensorimotor control (Manto et al., 2012), but plays important cognitive roles as well (Koziol et al., 2014; Stoodley, 2012; Timmann, Richter, Schoch, & Frings, 2006), including social cognition (Van Overwalle, Baetens, Mariën, & Vandekerckhove, 2014), emotion (Schmahmann, 2010), language (Argyropoulos, 2015; Highnam & Bleile, 2011), and even music and timing (E, Chen, Ho, & Desmond, 2014). In addition, there is evidence that “the cerebellum takes an early role in processing external sensory and internally generated information to influence neocortical circuit refinement during developmental sensitive periods” and thus influences cognitive development (Wang, Kloth, & Badura, 2014). The latter authors propose the ‘developmental diaschisis hypothesis’ that states that “cerebellar dysfunction may disrupt the maturation of distant neocortical circuits.”

The above account indicates that the perceptual levels within the PCT hierarchy may form more complex functional processes where the same kinds of computation that have developed during the sensorimotor stage are used time and again to “serve new roles in reason and language, while retaining their original function as well,” (Barton, 2012) (p. 2098). The latter option is in line with the notion of embodied modes of thought (Barrett, 2011) and with the identification of “the origins of narrative in the innate sensorimotor intelligence of a hypermobile human body” (Delafield-Butt & Trevarthen, 2015). The latter authors “trace the ontogenesis of narrative form from its earliest expression in movement.” In light of this it is interesting to note that Homo sapiens is the only species that has language and the only species that has a life history with a childhood (Locke & Bogin, 2006). Childhood is the interval between infancy and the juvenile period. A great deal of language learning occurs during childhood.