A few years ago, I copied the following from B:CP.
This reorganizing system may prove to be no more than a convenient fiction; its functions and properties may some day prove to be aspects of the same systems that become organized.
Since this is the most generalized control system so far considered, it will also operate on the slowest time scale of all—a point to keep in mind as we consider how this system reacts to various events. To the reorganizing system, the disturbances associated with a single trial in an experiment may be as the blink of an eye—barely noticeable.
we will imagine a device that senses the set of quantities in question, and reports them in the form of one or several perceptual signals. Perception is a risky term here, however. Let us merely call such perceptual signals intrinsic signals, saying that they play the role of the reorganizing system’s inner representation of the organism’s intrinsic state. Postulating such signals is a convenient fiction, serving the same purpose as “temperature” serves in representing the kinetic state of molecules in our thinking.
To represent the fact that each intrinsic quantity has a genetically preferred state, we will provide the reorganizing system with intrinsic reference signals. These signals are also convenient fictions, representing the presence of stored information defining the state of the organism (as represented by intrinsic signals) that calls for no action on the part of the reorganizing system. Action is called for only when the intrinsic signals differ from the intrinsic reference signals. This stored reference-signal information may prove to be a message carried in our genes.
When there is a difference between sensed intrinsic state and the intrinsic reference signals, some device must convert this difference into action. As before, we insert a comparison function (a comparator) into the system, a device which emits an intrinsic error signal that drives the output of the system. The intrinsic error signal (perhaps multiple) will be zero only when intrinsic signals representing the state of the organism are all at their reference levels. Thus, the output of the system is driven by a condition of intrinsic error, ceasing only when intrinsic error falls to zero.
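The architecture quoted above is an ordinary control loop whose output is change in the organism's organization rather than directed action on the environment. A minimal sketch, with all names and values mine rather than from B:CP:

```python
import random

# Hypothetical intrinsic variable with a genetically given reference level.
# The numeric value is purely illustrative.
INTRINSIC_REFERENCE = 37.0

def intrinsic_error(intrinsic_signal: float) -> float:
    """Comparator: the difference between the sensed intrinsic state
    and the intrinsic reference signal."""
    return INTRINSIC_REFERENCE - intrinsic_signal

def reorganizing_output(error: float, parameters: list[float],
                        rate: float = 0.1) -> list[float]:
    """Output stage: a nonzero intrinsic error drives random change in
    the parameters of other systems; zero error calls for no action."""
    if error == 0:
        return parameters  # intrinsic signals at reference: do nothing
    # Random (undirected) change, scaled by the size of the error.
    return [p + rate * abs(error) * random.gauss(0.0, 1.0)
            for p in parameters]
```

Note that the output is undirected: the loop only knows *that* something must change, not *what* change will help.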
In the ‘reorg-evolution-back’ paper, Bill says the ‘reorganization system’ concept is “simply the old idea of ‘trial and error’ reified, brought up to date, and described in terms suitable for modeling. The concept is irrefutable.”
In post #8 of this topic, I merely rearticulated the same ‘irrefutable’ concept of variation and selective retention. Comparing evolution and learning, the domain of variation differs (successive generations of a population vs. successive variations in the configuration of control systems). In evolution, variation does not stop while the currently most successful variant remains successful; the other variants simply persist, sporadic and less successful. I suspect the same is true in learning, if only because of internal conflict (a common cause of ‘mistakes’) and because of environmental disturbances, both of which can disclose new possibilities for conscious consideration.
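One way to make variation and selective retention concrete in the control-system domain is the E. coli strategy Bill used in his reorganization demos: retain the current random direction of parameter change while error is falling, and ‘tumble’ to a new random direction when it rises. A toy sketch; the error surface and all names here are illustrative, not taken from any of Bill’s code:

```python
import random

def ecoli_reorganize(error_fn, params, steps=2000, step_size=0.05):
    """Variation and selective retention: keep the current random
    direction of change while error decreases; 'tumble' to a new
    random direction when it increases."""
    direction = [random.gauss(0, 1) for _ in params]
    prev_error = error_fn(params)
    for _ in range(steps):
        params = [p + step_size * d for p, d in zip(params, direction)]
        err = error_fn(params)
        if err >= prev_error:
            # Things got worse: pick a new random direction (tumble).
            direction = [random.gauss(0, 1) for _ in params]
        prev_error = err
    return params

# Illustrative error surface: squared distance from a target configuration.
target = [2.0, -1.0]
error = lambda p: sum((pi - ti) ** 2 for pi, ti in zip(p, target))
```

Run on this surface, the parameters drift toward the target with no gradient information at all, only the retained/abandoned random directions; the unsuccessful variations never stop occurring, they just fail to be retained.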
I have pursued how “its functions and properties may some day prove to be aspects of the same systems that become organized” by asking “what’s in it for the cell?” I touched on that in post 8 and won’t try to rehash those speculations here. That line of investigation suggests specific mechanisms, and accounts both for the restriction of reorganization to those systems which aren’t working well and for the spread of reorganization if the restricted scope of changes is inadequate.
In the 2009 ‘reorg-evolution-back’ paper, Bill alludes to the surmise, which arose in connection with clinical experience with the Method of Levels (MoL), that attention directs reorganization. He sets this aside:
If awareness tended to seek out problem areas, we then had at least one way to keep reorganization focused where it was needed. But this introduced another bit of magic: awareness and its mobility. While those phenomena clearly exist, they are wild cards in any explanatory theory since we can’t explain them. We do not want any more wild cards in our explanatory theories than we absolutely have to have. Even when we have no alternative, they never stop nagging at the theoretician’s conscience.
The error to be avoided here is related to the post hoc ergo propter hoc fallacy and the truism that correlation does not imply causation: the co-occurrence of attention and reorganization does not by itself show that attention directs reorganization.
He then proposes that reorganization begins with control systems in the somatic branch for intrinsic variables, and that, when these cannot regain control by reorganization, reorganization spreads to just those systems in the behavioral branch that are affected by the loss of control.
In the example he gives, however, somatic sensations are inputs to (behavioral) systems that construct a perception we call ‘hunger’ and that control the acquisition and consumption of food. Though he does not say so, this exactly parallels somatic sensations (‘feelings’) serving as inputs to systems in the behavioral branch that construct the perceptions we call emotions (in Bill’s model of emotions). This is not reorganization, because those systems are already in place.
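Bill’s restriction-and-spread proposal can be sketched as a selection rule over a pool of controllers. The class, thresholds, and leaky error average below are my assumptions for illustration, not anything from the paper:

```python
import random

class Controller:
    """Toy stand-in for one control system: a parameter vector plus a
    smoothed (leaky-integrated) measure of its own control error."""
    def __init__(self, n_params: int):
        self.params = [random.gauss(0, 1) for _ in range(n_params)]
        self.avg_error = 0.0

    def note_error(self, err: float, leak: float = 0.1) -> None:
        self.avg_error += leak * (abs(err) - self.avg_error)

def reorganize_where_needed(systems, intrinsic_error,
                            threshold=0.5, spread_threshold=2.0):
    """Apply random parameter changes only to systems that are not
    controlling well; if intrinsic error stays large anyway, spread
    reorganization to every affected system (here: all of them)."""
    if intrinsic_error > spread_threshold:
        targets = list(systems)          # scope widens
    else:
        targets = [s for s in systems if s.avg_error > threshold]
    for s in targets:
        s.params = [p + 0.05 * random.gauss(0, 1) for p in s.params]
    return targets
```

Under this rule, well-functioning systems are left alone until intrinsic error persists, at which point the scope of random change widens.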
The abrupt transition at that point in the paper could do with an explicit reference to the prior discussion of the arm reorganization demo. Reorganization builds new systems from the bottom up, as in his 14-factorial demo of an arm-control system that gains control by pruning unneeded cross-connections among the controllers of 14 joint-angle specifications. The universe of systems subject to reorganization is constrained to the controllers of just those behavioral variables which are not being controlled well. How?
Attention here is from the point of view of control by higher-level systems that observe behavioral outputs and their consequences; the same systems are active in the deliberate learning and practice of skills. (Because they are slower, they can disrupt the smooth operation of control systems that are normally well integrated if they intrude: try analyzing your movements while riding a bicycle and observe yourself wavering and losing your balance.) When control that is normally ‘automatic’ falters, such higher systems ‘pay attention’. A stumble brings your attention from your conversation to the rough terrain you have entered. At higher levels, reorganization and planning can be complementary: variations proposed by reorganization can be tested in imagination by those systems.