I think we have both been looking at this issue of “correct” or “restricted” model wrongly.
If we look at the paradigmatic situation, the PCT model or any other model is a dramatic simplification of what is assumed to be happening inside the organism (or even in the environment). For example, when Powers introduced the concept of a “neural current” he considered it a gross simplification that allowed him to treat the firings within a “neural bundle” (itself a major simplification) mathematically as a “neural current”, which he thought might agree within perhaps 10% of observed values when used in a model of control. He reported being pleasantly surprised when the results were better than that.
It doesn’t matter how deep you go into the “neural bundle”, whatever you say about it will be a simplification. For example, in the bundle concept, a nerve fires at some moment in time and otherwise is silent. For the neural bundle concept, the rate of firings per second matters, because the bundle consists of fibres whose firing rates are highly correlated. But at a slightly less simple level of description, the polarization of the nerve with respect to the peri-neural medium changes both before and after the so-called moment of firing. Concentrations of calcium, sodium, potassium, and more complex molecules begin to matter; individual dendrites perform what we simplify by calling computations; individual synapses have varying sensitivities according to the supplies of so-called messenger molecules. And so forth. What happens in Nature is always simplified when we try to talk about it or use some property of it in modelling.
In the same way, what I called “Rick’s restricted model” of PCT and he calls a “correct model” is neither correct nor restricted. It is a model at a level of simplification that Powers (and many others subsequently) found useful in describing to an acceptable precision various effects observed in experiments or empirical observations. I was wrong to think and write about his model as though it was wrong in any respect that matters for what interests him. It’s wrong only when applied to matters that interest some other people, including me.
Rick’s model that he calls “correct” is just a simplification that works very well at one level of describing Nature. As to whether it represents a particular level of simplification on which Powers settled as optimal, that’s something on which I have no comment (at least not in this message).
An important general point about “scientific simplification” is that it is always important to understand that what you are dealing with IS a simplification of something more complicated, often in ways of which you as yet know nothing. Science, over the course of thousands of years, is fundamentally the change of “nothing” to “something”. Science changes the Storm God into a complicated set of models of how energy is transferred in the atmosphere from a source of heat such as the sun or a storage place such as the ocean or fossil fuel into lightning, rain, and wind.
The PCT model that excludes tolerance, for example, is a simplification of one that includes it. Neither is “correct” if Nature is your guide. One may be more incorrect than the other, purely because the less simplified model accounts for more about Nature than does the simpler model, and the simpler model may give insufficiently accurate results in the areas it is supposed to address.
In the case of “tolerance” in particular, Bill pointed out that in engineering control, a loop with zero tolerance will “jitter” or “hunt”, the error oscillating between positive and negative. If you aren’t interested in that small discrepancy or it is hidden in other sources of experimental or observational “noise”, it is perfectly fine to use a simplified model without it. Sometimes you can even improve the performance of the model by adding external noise rather than tolerance bounds. But if you want to have fast recovery from a disturbance that changes value step-wise, some tolerance in the loop helps. Likewise, if you are interested in how groups of people can coexist without conflict, you will investigate the role of tolerance.
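The jitter Bill described can be shown in a toy simulation. This is a sketch of my own, not any published PCT model: a bang-bang controller that corrects by a fixed step hunts forever around its reference when the tolerance is zero, and goes quiet once the dead zone is wider than one correction step. All names and parameter values here are illustrative.

```python
def run_loop(reference, disturbance, tolerance, steps=200, gain=0.5):
    """Simulate a simple bang-bang control loop.

    The controlled variable starts at 0; each step the controller pushes
    the perceived value (controlled variable plus a constant disturbance)
    toward `reference` by a fixed amount `gain`, unless the error lies
    inside the dead zone `tolerance`. Returns the perceived trajectory.
    """
    cv = 0.0
    history = []
    for _ in range(steps):
        error = reference - (cv + disturbance)
        if abs(error) <= tolerance:
            output = 0.0      # inside the tolerance band: do nothing
        elif error > 0:
            output = gain     # push up by one fixed step
        else:
            output = -gain    # push down by one fixed step
        cv += output
        history.append(cv + disturbance)
    return history

# Zero tolerance: the perceived value overshoots and hunts forever,
# oscillating one step above and one step below the reference.
hunt = run_loop(reference=10.0, disturbance=0.3, tolerance=0.0)

# A tolerance wider than one correction step: the loop settles inside
# the band and stays quiet.
quiet = run_loop(reference=10.0, disturbance=0.3, tolerance=0.6)
```

With these numbers, `hunt` ends up alternating between two values either side of the reference, while `quiet` parks within the tolerance band and never moves again, which is the engineering point about dead zones in a nutshell.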
Rick says he wouldn’t pay attention to “all that stuff” unless there are empirical (in which I think he includes experimental) demonstrations of the effect. Empirical and experimental demonstrations are always important, but they are not always feasible when you get into social situations or into higher-level perceptions. The first explosion of an atom bomb at Alamogordo was described mathematically (and pretty accurately) before the test, and the bomb could not have been built if all the steps had needed to be empirically or experimentally tested. Of course the mathematics had to work with simplified assumptions, just as would any mathematics used to apply PCT to social interactions. The same applies to supercomputer simulations of, say, galactic evolution and the workings of the brain.
My bottom line here is that there is no primacy in Science between observation (empirical or experimental) and theory. Theory suggests the possibility of tests, but the tests will not be done if they are infeasible or (perhaps) if they might cause irreversible damage. All models are simplifications of Nature, and the only appropriate question is whether a particular simplification answers to the purpose of an investigator.
As one of my professors said about theories: “When two schools of thought each contend that the other is wrong, they are probably both right.”