Local memory as habituation of a function?

Hi all,
This message poses a question, “What do you think about memory?”, with some introduction.

In B:CP memory is defined as “the storage and retrieval of information carried by neural signals” (p. 207). In addition it is assumed that memory is decentralized to every neuron or even to every synapse (“memory has to be a local phenomenon”, p. 209; “memory is associated with every synapse!”, p. 210). This means (perhaps) that every neuron (or every synapse) stores and retrieves the information of the signals which have passed through it in the past.

In discussions and applications the information is often reduced to the strength of the signal (Martin is an exception), and I will stick to that convention here. So we can say that memory is the storage and retrieval of the strengths of past neural signals.

Because the strength of a signal (its rate of firing) can and probably does vary continuously, memory cannot store every single value, or at least there would seem to be little sense in doing so. Instead only some selected values are stored. The stored values could be random samples taken at certain intervals; or values recorded when something special happens at the same time in the chemical environment of the neuron or synapse (an effect of emotions); or values which last long or repeat often. The last alternative would mean that memory is some kind of habituation at a still lower (or smaller) level than Hebbian learning.
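
To make the last alternative concrete, here is a rough sketch (my own illustration, not anything from B:CP) of how such habituation might work: a strength is stored only after nearby values have occurred often enough. The function name, the binning tolerance and the threshold are just assumptions for the example:

def habituating_store(signal_stream, tolerance=0.5, threshold=5):
    # Store a signal strength once values near it have occurred
    # 'threshold' times; return the list of stored values.
    counts = {}
    stored = []
    for x in signal_stream:
        key = round(x / tolerance) * tolerance   # bin nearby strengths together
        counts[key] = counts.get(key, 0) + 1
        if counts[key] == threshold:             # lasts long / repeats often
            stored.append(key)
    return stored

print(habituating_store([7.1, 3.0, 6.9, 7.0, 9.5, 7.2, 6.8]))   # -> [7.0]

Here a strength near 7 recurs and gets stored, while the one-off values do not.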

Somewhat inconsistently, B:CP also uses a computer (or card-file) metaphor and talks about memory locations and addresses, which I have difficulty connecting to the idea of local memory. In a (digital) computer model the stored values sit in places (locations) of a separate storage, and every location has an individual address. For example, a value x is stored in a location whose address is a. If you want to retrieve the value x, you have to send a to the storage and it will return x. So instead of x you have to remember its address a. This does not sound very natural or credible to me. Rather, I would think that those locally stored values (the strengths of past signals) are some kind of attractors which increase the probability that signals of similar strength recur. Let me try to explain:

Let’s think that a synapse is (initially) a pass-through function:
y = x
This means the output value (y) is the same as the input value (x). In addition we assume that the values can vary between 0 and 10. Then, for some reason, the value 7 is stored in local memory. This will change the function to something like:
y = x if (x < 5 or x > 9) else 7

This way, around every stored value there will form an area of input values which all produce just the stored value as output. These areas can be fuzzy, so that the nearer the input value is to the stored value, the greater the probability that the output value is the stored value. And the better the stored value is remembered, the larger the area. (Does this also mean that every synapse can be a category detector?)
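
Here is a small sketch of both versions in Python, under the same assumptions as above (stored value 7, area 5 – 9); the halfwidth and strength parameters are only illustrative:

import random

def snap(x, stored=7.0, halfwidth=2.0):
    # Deterministic version: inputs within the area around the stored
    # value all produce the stored value; others pass through unchanged.
    return stored if abs(x - stored) <= halfwidth else x

def fuzzy_snap(x, stored=7.0, halfwidth=2.0, strength=1.0):
    # Fuzzy version: the nearer x is to the stored value, the likelier
    # the output is the stored value; a better-remembered value
    # (larger strength) widens the area.
    p = max(0.0, 1.0 - abs(x - stored) / (halfwidth * strength))
    return stored if random.random() < p else x

print(snap(6.2))         # -> 7.0 (inside the area)
print(snap(3.0))         # -> 3.0 (pass-through)
print(fuzzy_snap(6.8))   # -> 7.0 with probability 0.9

On this reading the deterministic version already behaves like a crude category detector: the whole area 5 – 9 is reported as “7”.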

So what do you think: could basic local memory work like this? How would this kind of memory affect control? Have you perhaps even modelled and tested something like this?

Best
Eetu

In discussions and applications the information is often reduced to the strength of the signal (Martin is an exception), and I will stick to that convention here. So we can say that memory is the storage and retrieval of the strengths of past neural signals.

The strength is determined by the strength of the input(s) to the reference input function from higher-level error output, in combination with local modifications of the neurochemistry and the gate architecture of the synapse.

PCT discussions talk of rates of firing as though the transmission were entirely electrical, but chemical transmission is more common. This opens the process to the intercellular chemical environment, which I believe has important implications for the localization and spread of reorganization.

The information stored at a synapse is a function of the location of that synapse: the locations and ‘information’ of the input sources that provide its perceptual input; its output effectors, controlling perceptual signals immediately below and thence down the hierarchy until the loop is closed through the environment (unless imagined); and the perceptual input functions above to which it contributes its controlled signal. Isolated by itself, a neural rate of firing is no more than a rate of firing; it has significance only with respect to two questions: What controls it? For what does it set or affect the reference signal? The information is in this topology. Evocation of a memory can come from a higher system commanding that perception; the memory is then a transform of the reference input for controlling the remembered signal. The vividness and detail of a memory come from a cascade of lower systems likewise commanding input.
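
If it helps, here is a toy sketch (my own illustration, not a worked-out PCT model) of such evocation as a top-down cascade: each level treats its stored value as a reference and, in imagination mode, routes it back as the perceived signal, and its lower level does the same to supply the detail. The level names and values are placeholders:

class Level:
    def __init__(self, name, remembered, below=None):
        self.name = name
        self.remembered = remembered   # locally stored signal strength
        self.below = below             # next level down, if any

    def evoke(self):
        # Imagination mode: the stored value is routed back as the
        # perceptual signal instead of being controlled through the
        # environment; each lower level then does the same.
        print(f"{self.name}: reference {self.remembered} -> perceived {self.remembered}")
        if self.below is not None:
            self.below.evoke()         # the cascade supplies vividness and detail

top = Level("configuration", 7.0, Level("sensation", 4.2, Level("intensity", 2.5)))
top.evoke()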

Evocation of a memory can also come from association with a current perception. I do not know of a good PCT model of associative memory. The perception that evokes the remembered perception by association may be controlled out of awareness in the somatic branch of the hierarchy, or it may be controlled by the paleomammalian brain (the limbic system), or it may itself be a remembered perception evoked in one of these ways.