[Hans Blom, 970501b]
(Bill Powers (970429.0953 MST))
White noise is defined as containing all frequencies equally. If
the noise power is finite, mathematics says that there is "no"
noise power in a finite frequency band. So there is "nothing" to
cancel.
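The arithmetic behind that claim, sketched in symbols (with $S_0$ denoting the flat power spectral density of ideal white noise):

```latex
P \;=\; \int_{-\infty}^{\infty} S_0 \, df
```

If $S_0 > 0$, this integral diverges. So a flat spectrum over infinite bandwidth with finite total power $P$ forces $S_0 = 0$, and the power in any finite band of width $B$ is then $S_0 B = 0$: "nothing" to cancel.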
Are you sure you're saying this correctly?
Yes.
(Ideal white noise has a flat spectrum over _infinite_ bandwidth, hence infinite total power; insisting on finite power forces the density to zero.)
That's weird!
Yes. Some (all?) mathematical notions are extremely weird. Take a
sine-wave, for instance, as "embodied" in the sin-function. Have you
ever seen one? Of course not; at best something that resembles it in
some ways (a rather constant period between zero crossings, rather
constant positive and negative extrema, a similar shape) but not in
others (absolute constancy of these features, infinite extent). The
sin-function, for instance, is defined over the whole real line, from
minus infinity to plus infinity; it's a Platonic ideal that you will
never see in real life.
Remember Einstein's quote about the relationship between mathematics
and reality!
White noise is such a mathematical notion as well. It does not exist
in the real world, yet it is an extremely useful concept (model?).
If you were to run an electronic noise signal through a resistor,
the resistor would get hot, wouldn't it?
That would depend on the _power_ of the signal, as you well know, not
on its (noise or no noise) _waveshape properties_.
Would you believe that a resistor, all by itself, _generates_ a white
noise signal? Physics says so. It's measurable as well (but only with
appropriate instruments).
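The physics Hans refers to is Johnson-Nyquist (thermal) noise, whose spectral density is essentially flat out to very high frequencies. A minimal sketch of the standard formula v_rms = sqrt(4 k T R Df); the function name and the room-temperature default are my illustrative choices:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (2019 SI exact value)

def thermal_noise_vrms(resistance_ohm, bandwidth_hz, temperature_k=300.0):
    """RMS open-circuit Johnson noise voltage of a resistor,
    v = sqrt(4 k T R Df), measured over bandwidth Df."""
    return math.sqrt(4.0 * k_B * temperature_k * resistance_ohm * bandwidth_hz)

# A 1 Mohm resistor observed over a 10 kHz bandwidth at room temperature
# produces roughly a dozen microvolts rms -- measurable, but only with
# "appropriate instruments":
v = thermal_noise_vrms(1e6, 1e4)
print(f"{v * 1e6:.1f} uV rms")
```

Note that the spectrum is only white up to the point where quantum corrections matter; within any practical instrument bandwidth it is flat, which is why the total measured power stays finite.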
Of course we could simply change the definition of white noise to
something sensible ...
No, let's not. And let's not change the definition of the sin-
function either. They are just too useful :-).
... such as equal power at all frequencies within a given
bandwidth.
That would be called bandwidth-limited white noise. Another very
useful concept. More realistic than white noise, too :-). But less
simple: one has to specify the bandwidth at which it is limited, and
the way in which it is limited. Regrettably, there is no physical
method through which one can construct a signal with equal power up
to frequency f and zero elsewhere; only approximations are known
(using physically realizable but mathematically non-ideal filters).
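A minimal sketch of such a physically realizable but mathematically non-ideal band limit: a first-order low-pass filter applied to white samples rolls off gradually (6 dB per octave) rather than cutting off sharply at some frequency f. The function name and the smoothing constant are illustrative assumptions:

```python
import random

def lowpass_white_noise(n, alpha, seed=0):
    """Approximate bandwidth-limited white noise: independent Gaussian
    samples (discrete-time 'white' noise) fed through a first-order
    low-pass filter y[k] = y[k-1] + alpha * (x[k] - y[k-1]).
    The roll-off is gradual, not a brick wall: this is the
    'physically realizable but non-ideal' filter."""
    rng = random.Random(seed)  # seeded for reproducibility
    y, out = 0.0, []
    for _ in range(n):
        x = rng.gauss(0.0, 1.0)  # white input: flat spectrum, unit variance
        y = y + alpha * (x - y)
        out.append(y)
    return out
```

For unit-variance input, the filtered output variance works out to alpha / (2 - alpha), illustrating how limiting the bandwidth also limits the power that gets through.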
It is all these "real world" approximations that are discarded in
mathematics. Mathematics operates in an "ideal" world, where "ideal"
calculations exist. Translating the results of math back into the
real world is at best a hazardous affair. And translating the real
world into math (modeling) is an equally hazardous affair ;-).
Talking about frequencies and crazy notions, would you believe that
many electrical engineers routinely talk about _negative_ and even
_imaginary_ frequencies? Well, why not? Negative and imaginary
numbers are useful concepts as well. Do these "exist"?
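One way to see why negative frequencies are routine rather than crazy: by Euler's formula, a real cosine is already the sum of a positive-frequency and a negative-frequency complex exponential, cos(wt) = (e^{jwt} + e^{-jwt}) / 2. A small sketch (the particular w and t values are arbitrary):

```python
import cmath
import math

w, t = 2.0, 0.7
pos = cmath.exp(1j * w * t)   # "positive frequency" +w
neg = cmath.exp(-1j * w * t)  # "negative frequency" -w
combined = (pos + neg) / 2

# The imaginary parts cancel, leaving the real cosine:
assert abs(combined - math.cos(w * t)) < 1e-12
```

The negative-frequency half is not an optional fiction: drop it and the signal becomes complex-valued. Whether such mathematical objects "exist" is exactly Hans's question.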
Now that we're back to reality --
Which one?
If the "error" is the difference between reference and _filtered_
perception, that error might be small indeed. But then the next
question is: why _that_ error? And how to choose/tune that filter?
Not my problem. I'm trying to understand a system that already
exists, not design a system.
In "reverse engineering" (trying to understand a system that already
exists) it helps to be aware of or discover the design objectives
(the top level goals). If one can catch them, one frequently does not
have to worry much about low-level details. In software reverse
engineering ("hacking"), for instance, it would be enough to
understand that a certain section of code has the function of
plotting a line on the display from point X to point Y. With this
understanding, any competent programmer would be able to insert some
code that happens to be lying around. The overall result may be a
design that is -- at a low level -- very different from the original
one, yet functionally identical. Great stuff for patent lawyers!
Note that you do something similar when you "hack" sequences of
action potentials in multiple nerve fibers into one "nerve current".
But I doubt that when you do so the designs are still functionally
identical to the same degree.
Maybe the filtering simply represents an unavoidable feature of
neural transmission, or of synaptic and intracellular processes.
Every design has its constraints. It may be built up using vacuum
tubes, transistors, nerves or whatever. It is necessary to be aware
of the constraints that these impose. Yet, at the functional level
the designs may be quite comparable.
Or maybe it has a function shaped by evolution or a Divine Engineer.
That is a helpful thought. To me, at least :-).
If the bandwidth turns out to be variable, that would make a
difference: if the bandwidth of perception turns out to be under the
control of a higher system. We won't know if it's variable until we
find out that it's variable. And then we'll deal with it.
Don't you think perceptual bandwidth is variable? I tend to see more
or better if I pay attention. How shall we deal with attention?
Greetings,
Hans