In B:CP, Bill offers a tentative proposal for Program perceptions on analogy to the lower orders, and rejects it:
> I am not sure how to deal with perceptions at this level. If I were to follow the pattern laid down at lower orders, I would assert that one perceives the existence of a program-like structure at this level, compares it with a reference structure, and on the basis of the error alters the lower-order relationships, and so on. But that doesn’t seem properly to fit the way programs work: they involve perceptions, but the perceptions are part of the if-then tests that create the network of contingencies which is the heart of the program. Perhaps a level is missing here.
He rejects this because the perceptions (plural) that are controlled in a program are part of the ‘network of contingencies’. A perception of a program is not what is controlled. Perceiving the structure of a ‘network of contingencies’ from the outside and comparing it to a reference value for such (a structure? structures?) does not account for “the way programs work”. Such a perception could be relevant to a system which creates programs, but it has nothing to do with how a program functions to produce a desired CV as perceptual input to the system that initiates the program.
Looking at the output of a program and comparing it to a reference value for what that output should be seems closer to the mark. Programs in our PCT sense generally have a purpose, a final CV, just as sequences do. However, monitoring a program’s repetitive outputs to make sure the program has not been replaced by another one means looking at the overt behavioral actions that result from conflict between the system that starts program A and the system that starts program B. (These higher systems are invisible in your demo.) The action of pressing the spacebar is disturbingly like that of a supervisor monitoring an employee’s observable work activities and intervening when he perceives the employee playing solitaire on the company computer. It does not account for “the way programs work”, i.e. it accounts for neither the game of solitaire nor the employee’s proper duties.
Bill wonders what is missing. In B:CP there is no Sequence level; it goes straight from Relationships to Programs. Including a Sequence level without choice points or branching (but with interruptions possible) helps clarify how programs are structured and how they function. I think we should first reach some clarity about how Sequences work and how they are structured.
What is missing is that programs and sequences (at least temporal sequences) differ in kind from perceptions of the orders below them. Control systems at levels below Sequences are implemented with a single loop. Sequences and programs comprise multiple loops, each with its own CV. Control of one CV is a prerequisite input requirement for initiating control of the next. The final CV is also the CV controlled by the higher-level system that initiates the sequence or program.
An important difference at these levels is that the error signal which initiates control at the beginning of a sequence or program is not reduced by the perception controlled by the first control loop, nor by any of the subsequent loops; only the perception controlled by the last loop reduces it.
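This chaining can be sketched in code. The sketch below is my own illustration, not anything from B:CP: each step is an elementary control loop with its own CV, control of one step’s CV is the input requirement for starting the next, and the sequence as a whole succeeds only when the final CV is controlled. The names (`control_step`, `control_sequence`) and the simple proportional output are assumptions for illustration.

```python
def control_step(perceive, act, reference, tol=0.01, max_iters=1000):
    """One elementary control loop: act on the error until this
    step's CV matches its reference (within tolerance)."""
    for _ in range(max_iters):
        error = reference - perceive()
        if abs(error) <= tol:
            return True          # this step's CV is now under control
        act(error)               # output acts to reduce the error
    return False                 # CV never achieved; step stalls

def control_sequence(steps):
    """steps: list of (perceive, act, reference) triples.
    The higher-level error persists through every step; only control
    of the LAST step's CV reduces it."""
    for perceive, act, reference in steps:
        if not control_step(perceive, act, reference):
            return False         # prerequisite CV not achieved
    return True                  # final CV controlled
```

Note that `control_sequence` returning early models the point above: satisfying the first loop, or any intermediate loop, does nothing by itself to reduce the initiating system’s error.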
For a specific example, here is a sequence by which one may control a perception of tasting brava sauce.
- Control a perception of 1/3 cup olive oil in a small saucepan.
- Control a perception of the saucepan being over medium heat.
- Control a perception of 1/2 Tbsp of hot smoked paprika and 1-2 Tbsp of sweet smoked paprika stirred into and combining with the olive oil.
- Control a perception of the flour stirred into and combining with the mixture.
- Control a perception of stirring about a minute so the flour becomes slightly toasted.
- Control a perception of very gradually pouring one cup of chicken broth stirred into the mixture.
- Control a perception of velvety and smooth consistency, not thick enough to hold its shape alone. The means of control are adding a bit of flour to thicken, adding a bit of water to thin.
- Control a perception of the saucepan being over low heat.
- Control a perception of stirring occasionally during 3-5 minutes.
- Control a perception of tastiness. The means of control is adding salt, stirring, and sampling to taste.
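The list of steps above can be represented as ordered data, with a simple rule for which CV to control next. This is a minimal sketch of my own (the abbreviated step labels and the function name `next_step` are hypothetical), showing that each step may begin only when the preceding CVs are under control, and that the final CV is the one the initiating system perceives.

```python
# Abbreviated labels standing in for the full CV specifications above.
BRAVA_SEQUENCE = [
    "olive oil in saucepan",
    "saucepan over medium heat",
    "paprika combined with oil",
    "flour combined with mixture",
    "flour slightly toasted",
    "broth stirred in gradually",
    "velvety, pourable consistency",
    "saucepan over low heat",
    "occasional stirring, 3-5 minutes",
    "tastiness",                 # the final CV, perceived by the initiating system
]

def next_step(controlled):
    """Return the first CV not yet under control, or None once the
    final CV has been achieved and the sequence is complete."""
    for cv in BRAVA_SEQUENCE:
        if cv not in controlled:
            return cv
    return None
```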
This is ‘translated’ from a recipe. A recipe is a description of what a skilled cook does. A Spanish housewife would not consult a recipe to make brava sauce, because from experience since childhood she is skilled at making brava sauce. A recipe is a set of instructions addressed to people who are not skilled at making what the recipe describes making (here, brava sauce). That is why it is possible and legitimate to ‘translate’ a recipe into specifications of the succession of controlled variables that the skilled cook controls. It is also why the words of the recipe as normally written, perhaps published and printed, cannot be taken as-is as representative of sequence control or (if there happen to be choice-points) of program control.
Despite the allure and excitement about ‘thinking machines’ and ‘electronic brains’, it is well established that computers and human nervous systems function in very different ways. Many things are easy for computers but challenging for humans, such as extracting cube roots of large numbers. Conversely, many tasks that come easily to humans are challenging for computers, such as perceiving depth information in a single image or summarizing a book or paper. Generalized Chess and Go are computationally intractable games (EXPTIME-hard, harder even than the NP-Complete class), and computers specialized to a particular game must devote enormous computational resources to it. Computer programs are not good models to presume in a PCT investigation of a program level of perceptual control.
As the brava sauce example shows, a sequence can comprise sub-sequences. The example of a personally experienced sequence that I showed in Manchester shows how a sequence can be interrupted by an unrelated sequence and then resume at the point of interruption. The neurochemical mechanisms for sequence control must provide some persistent signal representing the farthest point currently reached in the sequence. The error signal from the system that initiates sequence control is a persistent signal representing that the next step must be carried out, and the next after that, until control of the final CV returns the desired perception. Bill’s diagram for an event (a brief, well-practiced sequence, the word “juice”) is only adequate for recognizing the perception; he did not attempt to show where a reference signal to produce the sequence might come from or where it would enter his proposed structure. Obviously, many words begin with “j”, and all but one of them are not “juice”; likewise at each subsequent point, so only at the conclusion can the perception “juice” be returned to the system that called for it to be produced.
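The requirement of a persistent marker of the farthest point reached can be sketched as follows. This is an illustrative assumption of mine, not a claim about neurochemical mechanism: the point is only that the marker must survive an interruption so the sequence can resume where it left off rather than restarting.

```python
class SequenceController:
    """Sketch: sequence control with a persistent marker of the
    farthest step currently reached."""

    def __init__(self, steps):
        self.steps = steps
        self.pointer = 0          # persistent: farthest point reached

    def advance(self):
        """Stand-in for controlling the current step's CV; on success
        the marker moves on. Returns True when the final CV is reached."""
        if self.pointer < len(self.steps):
            self.pointer += 1
        return self.pointer == len(self.steps)

    def resume_point(self):
        """After an unrelated interruption, the marker is unchanged:
        the sequence resumes at the point of interruption."""
        return self.pointer
```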
Program control has the same issues, so those issues should be solved first for sequence control, and then when we have a good model of sequence control we will have a basis for thinking about the additional requirements for programs.
The key difference between a sequence and a program is that the program contains branches. The current perception is compared to a standard: if it has one value, the first branch is taken; if it has a second value, another branch is taken; and so on for however many values and branches are specified. Each branch is a sequence. What is the test to select one branch rather than the others? It is the same as the test for advancing from one step of a sequence to the next: the CV of step n must be in good control as a perceptual input for starting to control step n+1.
Now what about choice points in a program? Suppose the CV at the conclusion of a sequence is the temperature of a dingus. The logic of the choice point is as follows: if the temperature is below 30° do A; if between 30° and 50° do B; if between 50° and 70° do C; if above 70° do D. The input function for sequence A includes a perception of a dingus with temperature below 30°; that for sequence B includes a perception of a dingus with a temperature between 30° and 50°; and so on. The sequences that are linked together to make a program each have perceptual input requirements keyed to a possible output (final CV) of the preceding sequence. The output of one is linked to the input functions of several, and that makes an if/then choice point in a network.
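The dingus example can be sketched as code. In this sketch (my own illustration; the half-open temperature ranges and the names `BRANCHES` and `select_branch` are assumptions), the “if/then” of the program is nothing over and above which branch sequence’s input function is satisfied by the perception handed on from the preceding sequence.

```python
# Each branch sequence's input function is keyed to a range of the
# preceding CV (temperature). Assumed half-open ranges, since the
# original boundaries (exactly 30°, 50°, 70°) are ambiguous.
BRANCHES = [
    ("A", lambda t: t < 30),
    ("B", lambda t: 30 <= t < 50),
    ("C", lambda t: 50 <= t < 70),
    ("D", lambda t: t >= 70),
]

def select_branch(temperature):
    """Return the branch whose input requirement the current
    perception satisfies; exactly one matches any temperature."""
    for name, input_fn in BRANCHES:
        if input_fn(temperature):
            return name
    return None
```

There is no separate “decision maker” in the sketch: the branch taken is determined by which input function the perception satisfies, which is the point of the paragraph above.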
Continuing to the conclusion of that section in B:CP:
> It is at this level that we think in a logical, deductive manner. I do not necessarily mean formal logic here; programs can be organized to obey any imaginable rules. At the program level we have not only deduction but superstition, grammatical rules, expectations about the consequences of behavior in the physical world (models), experimental procedures, mathematical algorithms, recipes for cooking and chemistry, and the strategies of business, games, conversation, and love-making. Bruner, Goodnow, and Austin used the same terms: “In dealing with the task of conceptualizing arbitrary sequences, human beings behave in a highly patterned, highly ‘rational’ manner.” One man’s rationality may be another man’s insanity, but that is only a matter of choice of programs. A program level is as necessary for a systematic delusion as it is for a physical theory; sometimes the difference is not readily evident simply because both “make sense” from the seventh-order point of view.
These activities are neither perceiving and adjusting the structure of a program, nor monitoring behavioral outputs to ensure that some alternate program or sequence isn’t usurping the available means of output.