Lindley, David. _The End of Physics: The Myth of a Unified
Theory_. New York: BasicBooks, a division of HarperCollins.
Lindley works for _Nature_.
It isn't at all clear which parts of the following commentary are
Lindley's and which are Bill's. I haven't read Lindley's book, and am not likely to do
so in the near future. My comments apply to whichever author is
responsible. I apologize for quoting Bill's posting in its entirety,
but given the nature of my comments I did not feel it right to omit any
individual element, lest the omission be considered significant.
Some of the next-to-last paragraph of the book is worth quoting:
The ideal theory of everything, in the minds of the
physicists searching for it, is a mathematical system of
uncommon tidiness and rigor, which may, if all works out
correctly, have the ability to accommodate the physical facts
we know to be true in our world. The mathematical neatness
comes first, the practical explanatory power second. Perhaps
physicists will one day find a theory of such compelling
beauty that its truth cannot be denied; truth will be beauty
and beauty will be truth -- because, in the absence of any
means to make practical tests, what is beautiful is declared
ipso facto to be the truth. (p. 255)
I find this a simplistic view. As I was taught physics in school and
university, and in my later reading of it, physics above all sciences is
based on a very strong interplay between mathematical development and
careful observation. However, it is in general correct that a beautiful
theory (read "simple") is extraordinarily more likely to be close to the
truth than is an ugly one (read "complicated"), but only if the two are
equally successful in accounting for observations. (I posted a precise
but generally misunderstood argument for this statement a year or so
ago, but quite apart from that, it has been pretty well established over
the history of science.)
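One standard way to make this simplicity preference quantitative (not
necessarily the argument referred to above) is an information criterion
such as BIC, which penalizes a model's parameter count when two models
fit the data equally well. The counts and likelihood below are purely
illustrative:

```python
import math

def bic(log_likelihood, k, n):
    """Bayesian Information Criterion: lower is better.
    k = number of free parameters, n = number of observations."""
    return k * math.log(n) - 2.0 * log_likelihood

# Two hypothetical theories fitting 1000 observations equally well:
n = 1000
log_L = -1500.0                      # identical goodness of fit
bic_simple = bic(log_L, k=3, n=n)    # "beautiful": 3 parameters
bic_complex = bic(log_L, k=20, n=n)  # "ugly": 20 parameters

# The simpler theory scores better whenever the fits are equally good.
print(bic_simple, bic_complex)
```

The point of the sketch is only that the preference for simplicity can
be stated as arithmetic rather than aesthetics.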
I would add a corollary with which I think Bill P. will agree--that we
can never know the truth of a theory, so that all we have to go on is
how well the theory agrees with observation, and over how wide a range
of conditions this agreement holds. The main thrust of the rest of this
posting is that physics fares very well according to this criterion.
The "beautiful" theories of physics provide predictions that link
observations in widely different areas (from the everyday viewpoint you
can't get much further apart than the structure of the universe and the
structure of an atomic nucleus, and when it comes to direct observation
you can't get much more dramatic than a supernova explosion). The
"beautiful" theories of physics account for weird and counterintuitive
observations that have no representation in pre-quantum physics. What
does happen in a Josephson junction or a field-effect transistor? How
does a scanning-tunneling microscope work? Going right back to Max
Planck, how does the black-body spectrum of a radiating body come to
pass? (Planck's answer to that one started quantum theory; Einstein's
extension of it to light quanta, applied to the photoelectric effect,
is what got him the Nobel prize.)
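That black-body answer can be checked numerically: Planck's continuous
formula for spectral radiance reproduces Wien's displacement law (peak
wavelength about 2.898e-3 m.K divided by temperature). A minimal
sketch, using the Sun's approximate surface temperature:

```python
import math

h = 6.626e-34   # Planck constant, J s
c = 2.998e8     # speed of light, m/s
k = 1.381e-23   # Boltzmann constant, J/K

def planck(lam, T):
    """Black-body spectral radiance at wavelength lam (m), temperature T (K)."""
    return (2 * h * c**2 / lam**5) / math.expm1(h * c / (lam * k * T))

T = 5778.0  # roughly the Sun's surface temperature, K
lams = [i * 1e-9 for i in range(100, 3000)]   # scan 100 nm .. 3 um
peak = max(lams, key=lambda lam: planck(lam, T))
print(peak)  # ~5.0e-7 m, i.e. ~500 nm, squarely in the visible band
```

The quantum constant h appears only inside a perfectly smooth,
continuous function, yet the spectrum it predicts was inexplicable to
classical physics.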
I do not think that there is any science more devoted to the prediction
and fitting of the N-th decimal place of observation, or one that has
had as much success as has physics in linking the sublime (new stars in
the heavens) to the ridiculous (tiny flashes of light in water tanks
deep underground). Nor is there one for which the development of the
theories from precise observation to precisely realized prediction
relies more closely on exact mathematical procedures. (As opposed to the
comment by Dag Forssell (931217 1050): Mathematics tells us NOTHING
about nature. It is a pure abstraction.)
This book contains a long and careful analysis of the history of
physics -- how it got to be the way it is. Lindley emphasizes
periodically that the complexities of "fundamental" (i.e.,
particle) physics have never been invented for the fun of it;
physicists have always been trying to find the simplest
explanation they could find. The complexity of nature, says
Lindley, simply requires a complex theory. This is a very
charitable view, but I think it is open to question. I'm not sure
that Lindley meant it to be taken seriously.
What is meant by a "complex" theory here? The trend in particle physics
has always been toward simplification, from 92 independent kinds of atom
down to a few tens of "fundamental particles" and from there down to
three families, based on six kinds of quark, related by symmetries
previously understood from entirely different fields of enquiry. Which
is more complex, 92 unrelated elements or six symmetric quarks? Oh,
I'll grant you that the four elements Air, Fire, Water, and Earth were
a lot simpler. Maybe you'd prefer us to stay with them? The only
trouble with them is that there aren't many predictions you can compare
with observations to within even one decimal place. But they are simple.
One hint that he didn't comes in his observation that physicists
seem to treat the current state of particle theory as if it is
"ground zero" -- that is, to be taken as given without any
further attempt to explain why. This puts physics alone among the
sciences in declaring deeper questions to be out of bounds.
I'm not at all clear what is meant by "an attempt to explain why" here.
Is it a requirement to get into God's mind? Other sciences explain
"why" by reference to supporting sciences, physiology to biochemistry,
chemistry to physics. Where is physics to go to ask why? There is no
simpler science to support it, is there? All you have left is a possible
reference to the ineffable. That may satisfy a religious
fundamentalist, whether of Christian, Islamic, Buddhist, or whatever
stripe, but it is hardly a satisfactory objective for Science.
Science is a unity, not a collection of isolated topics each to be
contemplated in glorious isolation. Chemistry is "a science" simply
because it is easier to deal with the phenomena of interest to chemists
in terms of other chemical phenomena, but the "why" of chemistry is to
be found at the simpler level of physics. Some simple chemical effects
can be described in numerical detail by quantum physical computations,
but usually the calculations are beyond the power of our computers. PCT
is "a science" because it makes sense to deal with phenomena of control
in the terms of control, rather than going back to the basic dynamics or
the underlying physics for every different problem. Even so, the
computations rapidly become intractable for the simulation of realistic
numbers of control systems that interact in a single body. And
sometimes one gets a better insight into the "why" of PCT by looking at
the underlying physics, just as one does with chemistry.
As for taking the "standard model" as ground zero, I think Newton's laws
were taken that way for a few centuries, until something better came
along. Newton's laws work pretty well at accounting for a lot of
observations. So does the standard model. That doesn't stop a lot of
people from trying to find something better. If there is a common faith
among many physicists, it probably is that we will never know the
ultimate truth of the universe. It certainly includes the obvious fact
that the standard model is incomplete and quite probably wrong, if only
because it doesn't include gravity and may not account for the dark
matter that seems to have most of the mass of the universe.
There were numerous places during the historical discussions
where I wished that physicists had spent more time asking why
before they settled on an official view. Starting with the
troubles that led to the special theory of relativity, some
physicists seemed to become suddenly impatient with the slow
march of progress; it's as though they wanted to leapfrog the
usual way of going, to skip all the explorations of simple
problems, to get on with it. This is when the theoretical
explorations began to develop their own life, with longer and
longer stretches of uninterrupted computation being used to
bridge longer and longer gaps between experimental results.
The fact that these gaps CAN be bridged is a somewhat astounding
demonstration of the power and accuracy of the models and theories of
physics. It attests mightily to their strength and range of
application, to the power of human thought and to the importance of
mathematics in learning about nature. One of the most amazing of these
bridges from one observation over a mathematical superstructure to
another observation was the detection, in underground detectors, of
neutrinos that escaped from the supernova in the Large Magellanic Cloud,
exactly as predicted from quantum-based analyses of the inner structures
and evolution of large stars, these analyses being derived from results
of laboratory experiments on Earth. And from other observations, we now
"know" that cobalt-56 was generated in the explosion just as had been
mathematically predicted. Magnificent bridges, indeed!
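The cobalt-56 signature is a simple quantitative prediction: nickel-56
made in the explosion decays (half-life about 6.1 days) to cobalt-56,
which decays to stable iron-56 with a half-life of about 77 days, so
the late-time brightness should halve every ~77 days. A sketch of the
cobalt-56 decay, with the half-life as the only physical input:

```python
import math

T_HALF_CO56 = 77.2  # days, approximate half-life of cobalt-56

def co56_remaining(t_days, n0=1.0):
    """Fraction of an initial Co-56 population left after t_days."""
    return n0 * math.exp(-math.log(2) * t_days / T_HALF_CO56)

# Radioactive heating is proportional to the remaining Co-56, so the
# supernova's late light curve should dim by half every ~77 days:
for t in (0.0, 77.2, 154.4):
    print(t, co56_remaining(t))
```

That predicted exponential slope is exactly the kind of bridge between
laboratory nuclear physics and telescope data described above.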
The world of observation and direct experience,
which is the only ultimate anchor for any theoretical framework,
began to fade into the background. A smaller and smaller
percentage of the critical assumptions were put to test; the
number and variety of actual phenomena involved got smaller and smaller.
Who says this, Powers or Lindley? I don't understand it, either way.
In the 19th century the range of actual phenomena considered in physics
was extremely limited as compared to now. Then, physicists dealt well
only with mechanics and heat, and puzzled quite a bit about light
without really coming to any consistent way of describing it. Current
physics takes in all of that, and a great deal more, although I suppose
I have to grant that both then and now it all comes down to what we can
see on meter dials or imagery, or otherwise get in through our senses.
That doesn't mean that the phenomena of physics were or are restricted to
what we see, feel, or hear directly. Those are the phenomena of
psychology. Physics helps us to understand how they occur, but it is
not about them. As with all our perceptions, we derive our physical
concepts eventually from observation, and use them to predict
observation, but physical concepts are not simple sensations, any more
than is our perception of "friendliness" or of "danger."
I wish, for example, that when the quantum nature of some
phenomena was discovered, physicists had taken more time to ask
how this kind of phenomenon might be generated in a continuous
universe, instead of instantly giving up on continuity. There are
many possibilities; think of standing waves in a string, which
occur only in whole-number ratios, yet are completely explainable
in terms of continuous relationships.
Well, that's exactly how I was taught in the mid-50's to see why the
Schroedinger equation predicted quantized values for electron orbitals--
as a kind of standing wave pattern. Just as in a standing wave, there
are only certain discretely separable solutions that are self-consistent
in the neighbourhood of the nucleus. At larger distances, the solution
space is continuous, rather like a travelling wave packet. Maybe they
don't teach it that way now, but they did then (a bit closer to the time
the equation was developed). The quantum nature falls out of the
continuous equation; it isn't imposed on it.
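The point is easy to demonstrate: discretize the continuous Schroedinger
equation for a particle in a box, and the integer-ratio spectrum
(energies proportional to n squared) falls out of the boundary
conditions, with nothing quantized by hand. A sketch in units where
hbar = m = 1:

```python
import numpy as np

# Continuous 1-D Schroedinger equation, -(1/2) psi'' = E psi, on [0, 1]
# with psi = 0 at both walls, discretized by finite differences.
N = 500
dx = 1.0 / (N + 1)
main = np.full(N, 1.0 / dx**2)       # diagonal of -(1/2) d^2/dx^2
off = np.full(N - 1, -0.5 / dx**2)   # off-diagonal couplings
H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

E = np.linalg.eigvalsh(H)[:3]        # lowest three eigenvalues
print(E / E[0])  # ratios close to [1, 4, 9]
```

Discreteness here is a property of the self-consistent solutions, just
like the whole-number harmonics of a string, not something bolted on.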
Perhaps physicists were
still in shock from the discovery of the constancy of the speed
of light; perhaps those who were happy to see the old Newtonian
scheme collapse (something of an exaggeration) were just the sort
who would seize on other apparent breaks with tradition without
asking too closely whether they were also necessary.
?? I don't understand either the purported facts behind this statement
or the pop psychology that "explains" them.
On the surface, the ideas that came out of Copenhagen are very
much in line with PCT. We know only what we can observe; the
universe itself is unknowable. If we can't simultaneously measure
position and momentum, then we must accept that our observed
universe is basically uncertain. If we are limited to the
calculation of probabilities, then the world we are given to
analyze is probabilistic.
The odd thing about this latter assumption is that the main tool
of quantum physics, the Schroedinger wave equation, is basically
a continuous equation, with continuous derivatives. A conscious
decision was made to treat it not as a description of a
continuous phenomenon, but as a description of a probability distribution.
That's not clear. The issue is much more subtle than that. Even as
undergraduates, we were told that there was no unique way to interpret
the amplitudes involved in any particular solution for the electron
orbitals (we didn't deal with anything more complex). We could think of
them as representing the density of the electron at that point in the
orbital, or as a probability amplitude (never a probability as such)
relating to the probability of finding the electron at that point. (And
isn't a probability distribution a continuous phenomenon, anyway?) The
main thing our teachers tried to get across is that we shouldn't think
of the solution as something to be appreciated as if it could be seen
directly, were the scale to be magnified to macroscopic dimensions.
It's conceptually different, and the failure of "classical" physics was
related directly to the assumption that human-scale concepts could be
applied in the same way at atomic scales of space and time (or, in the
case of relativity, at intergalactic scales).
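The "density versus probability amplitude" point can be made concrete
with the hydrogen 1s orbital: the amplitude psi ~ exp(-r/a0) is
continuous everywhere, and the radial probability density derived from
it is a smooth distribution whose peak sits at exactly the Bohr radius.
A sketch in units where a0 = 1:

```python
import math

def radial_density(r):
    """Unnormalized radial probability density, r^2 |psi|^2, for the
    hydrogen 1s orbital psi ~ exp(-r), in units of the Bohr radius."""
    return r**2 * math.exp(-2.0 * r)

rs = [i * 0.001 for i in range(1, 10000)]
peak = max(rs, key=radial_density)
print(peak)  # ~1.0: the most probable radius is the Bohr radius
```

Whichever interpretation one adopts, the mathematical object itself is
as continuous as anything in classical physics.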
All at once, physicists started wearing quantized
and probability-colored glasses, apparently unaware that the same
principle applied: you see the world that is constructed by human
perceptions. The view through these spectacles quickly came to
dominate physics; it was accepted that the world was
fundamentally quantized, not Einsteinian.
Only because this view accounted for lots of observations that had been
otherwise mystifying, and predicted many more that were straight-out
counterintuitive and would otherwise not have been looked for. THOSE
observations are made with human senses.
Lindley notes one of the penalties for this decision: general
relativity (which is about a continuous if distorted universe)
and quantum mechanics remain at odds with each other.
Yes, true. Quite a few people have tried and are trying ways of
developing a theory that contains the observations accounted for by
both. That's what the search for a Grand Unified Theory is about.
(Personally, I think part of the issue is that quantization is
essentially a problem of interaction among entities, whereas relativity
deals with embedding relations, in which the only interaction considered
is that between the observed and a hypothetical observer at a distance,
who does NOT affect the thing observed.) But in what way does general
relativity deal with a distorted universe? As far as I am aware, there
has not been a single observational test that it has failed, despite
some really surprisingly counter-intuitive predictions. As best we can
tell, general relativity is about the universe we see, provided we
don't look at too small a scale or too high an energy.
The Big Bang, according to general relativity, would have had to start
with a singularity. Quantum mechanics can't allow that
singularity to exist: only a finite probability cloud could have existed.
The way quantum mechanics gets around the problem of
singularities is to use a trick that has had to be used often
during its development. Well, the physicists say, we know that
there was no singularity at t = 10^-24 sec, so we'll just
normalize to that time and forget about what happened earlier.
The time up to about 10^-35 sec is now considered quite interesting
(inflation). This concept does come out of quantum-mechanical
considerations. It matters, when considering the precise degree of
anisotropy in the 2.73 K background radiation. And that, in its
turn, ties in with how much hydrogen, deuterium, and helium there is in
the universe, and whether there are three, or four or more families of
fundamental particles, and how the galaxies cluster in space--and lots
more. It all comes right back to the requirement that the predictions
describe the world as we perceive it.
This same problem arose in trying to describe the electron in
quantum-mechanical terms. When the equations were solved, more or
less, infinities immediately cropped up, both in modeling a
single electron and in modeling the distributions of multiple
electrons around the same atom. So someone decided that if the
wave function could be defined at some small distance from the
singularity, we could just forget about the infinities. This was renormalization.
It works, to a high number of decimal places. You are happy with 5%
error in your predictions. They don't like errors of one part in a
million in some of their predictions, and 5% would in many cases be
considered a very serious anomaly. Until a better way of looking at
things is developed, or it is shown to provide predictions that are
significantly in error in places where it shouldn't miss, I guess the
kluge will continue to be used. But I don't think "we'll forget about"
it is an appropriate way to paraphrase "we don't at the moment know how
to deal with" it.
In ordinary physics we have a similar problem. If a gravitation
field falls off as the inverse square of distance, what is the
gravitational field at the center of a mass? Infinity, of course.
For macro phenomena, the solution is easy: you recall that a
planet's mass is distributed, so when the distance shrinks to
less than the radius of the planet, the amount of mass
contributing to the field also shrinks and the field goes to zero
at the center of the planet. This leaves the description
believable all the way from infinite distance to zero distance.
But in quantum mechanics this solution was not available, or for
some reason was not considered. Since everything had to consist
of particles, infinities cropped up everywhere (until string
theories appeared), and one had to find an excuse for this
failing of the theoretical representation, or a way to ignore it.
This, I think, is where physics started getting (a) sloppy, and
(b) mystical. Instead of admitting that there was a problem with
the model, physicists started drawing a veil of mathematics
across the scene. Renormalization was used basically because
without it, the theory failed.
And with it, observations and theory agree so well that the limit in
most (not all) cases is that of our ability to observe. It is not the
theory that fails, but the precision of our instruments. Where the
theory fails is where it fails to say anything at all, such as why
(there's that word) the fundamental particles have the specific masses
they happen to have.
It's a funny veil that reveals so much more than is visible without it!
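The distributed-mass resolution quoted above (the field going to zero
at the center of a planet) is easy to verify: inside a uniform sphere
only the mass interior to r contributes, so the field rises linearly
from zero, peaks at the surface, then falls off as 1/r^2. A sketch for
an idealized uniform-density Earth (the uniform profile is an
illustrative assumption, not the real interior):

```python
G = 6.674e-11   # gravitational constant, m^3 / (kg s^2)
M = 5.97e24     # kg, Earth-like total mass
R = 6.371e6     # m, Earth-like radius

def g(r):
    """Gravitational field magnitude of a uniform-density sphere."""
    if r < R:
        return G * M * r / R**3   # only the interior mass counts
    return G * M / r**2           # usual inverse-square law outside

print(g(0.0))      # 0.0: no singularity at the center
print(g(R))        # ~9.8 m/s^2 at the surface
print(g(2 * R))    # one quarter of the surface value
```

The description stays finite and believable from zero distance to
infinite distance, which is exactly the contrast being drawn with the
point-particle infinities.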
The Schroedinger wave equation was
transformed from a mathematical expression into an illuminated
script on an altar. At that level of analysis, all search for an
alternative description that would not bring up those ugly
infinities was halted. Nobody ever seemed to think that they
might have been created by the theory: by the Schroedinger equation itself.
I just don't think that "nobody ever seemed to think" is a reasonable
statement at all. In fact, if you look back at it, Dirac and Heisenberg
came up with quite different approaches to the same set of observational
problems--places where 19th century physics simply could not address
what our human eyes saw in our instruments. Only much later could it be
seen that each of the three formulations could be represented by any of
the others, and Schroedinger's approach proved the most tractable. The
problems were not created by the theory, so much as that the theory
explained so much that had been inexplicable, even though it did not
explain everything.
All these heretical ideas are mine, not Lindley's. Lindley does
not address the issue of what might have been or what the
critical decisions were in the development of fundamental
physics. In fact, Lindley doesn't speak about the influence of
the very early adoptions of premises in creating the difficulties
that physicists have had ever since. Nor does he remark on the
way in which the world of experimental quantum physics has shrunk
until all it seems concerned with is the discovery of a new
particle at longer and longer intervals.
Not the impression I get from my bathroom reading of "Physics Today."
He does point out that one of the latest gimmicks, supersymmetry,
seems to have put an end to experimental particle physics. As
soon as supersymmetry was invented, every known particle in
existence suddenly acquired an imaginary companion particle. The
least energetic of these new particles might possibly be
observable using a supercollider.
As I understand it, two critical observations would be of the Higgs
boson and of the top quark. If the Higgs boson were to be found, it
should resolve somewhat the question of "why" the other particles have
the masses they do, removing quite a few places where observations now
have a loose degree of freedom, and getting rid of those 5% tolerances
for prediction error that they entail.
Observing the rest of them
would require increasing the collision energy by a factor of
trillions. This means that supersymmetry will just have to remain
a figment of the imagination -- beautiful in the eyes of the
physicist, perhaps, but unverifiable.
The verification of such things is usually in the cosmos, not in the
laboratory. However, some of them are still being sought in cosmic
rays, which provide a sort of adventitious laboratory, as they were in
the days before particle accelerators. The energies can be quite large,
up to 3*10^20 eV in one case (Science, 10 Dec 93, p. 1649), which is
"equivalent to a brick falling on your toe," and a factor of trillions
higher energy than we can get in our accelerators. So, direct
observation is not out of the question, even at those energies.
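The brick comparison is easy to check: 3*10^20 eV converts to roughly
48 joules, which is indeed the kinetic energy of a bricklike mass
dropped from a couple of metres (the brick mass below is my own
illustrative choice, not from the Science article):

```python
EV_TO_J = 1.602e-19  # joules per electron-volt

e_cosmic = 3e20 * EV_TO_J   # ~48 J carried by a single particle
print(e_cosmic)

# A ~2 kg brick dropped from height h lands with energy m * g * h:
m, g = 2.0, 9.81
h = e_cosmic / (m * g)
print(h)  # ~2.4 m: a brick falling on your toe, more or less
```

All of that macroscopic energy arrives in one subatomic particle, which
is what makes the comparison so startling.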
One of the continually astounding beauties of physics is the link
between the ultra-small and the ultra-large. The energies we cannot
achieve in the laboratory were once achieved in the universe, and if
these heavy particles did once matter to the development of the
universe, we should be able to detect the ghosts of their effects. If
they are predicted by theories based on laboratory-scale experiments,
and their effects show up in the observed structure of the universe, I
find that a most exciting prospect.
One cosmic observation we don't know how to deal with at the moment is
that most of the mass of the universe seems to be dark matter that we
can detect only by its gravitational effects on "normal" matter. Does
it require new ideas, or do the current ones fit? Is it an
"observation" that dark matter exists? Its existence is only required
if we are to believe that anything of classical Newtonian physics
remains valid at intermediate scales of time and distance. We could, I
guess, avoid dark matter by invoking magic, in the form of ad-hoc laws
to cover the rotations and interactions of galaxies, but physicists hate
that. They like their few, simple, laws to cover all phenomena,
everywhere. So far, they know well that they haven't achieved this
goal. Some hope they never will, some think we nearly have.
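The dark-matter inference rests on a simple Newtonian calculation: for
a circular orbit, v(r) = sqrt(G*M(r)/r). If the visible mass were all
there is, orbital speeds beyond the luminous disk should fall as
1/sqrt(r); observed rotation curves instead stay flat, which requires
M(r) growing like r. A sketch (the galaxy mass and halo slope below
are illustrative round numbers):

```python
import math

G = 6.674e-11  # gravitational constant, m^3 / (kg s^2)

def v_circular(M_enclosed, r):
    """Circular-orbit speed from Newtonian gravity."""
    return math.sqrt(G * M_enclosed / r)

M_visible = 1e41           # kg, visible mass of a galaxy (illustrative)
halo_density_slope = 1e21  # kg/m: M(r) = slope * r, isothermal-halo style

for r in (1e20, 2e20, 4e20):  # radii beyond the luminous disk, m
    v_keplerian = v_circular(M_visible, r)          # falls as 1/sqrt(r)
    v_halo = v_circular(halo_density_slope * r, r)  # constant: flat curve
    print(r, v_keplerian, v_halo)
```

The same few simple laws either demand the extra unseen mass or must be
patched ad hoc, galaxy by galaxy, and physicists hate the latter.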
There is therefore nothing
left to prevent physicists from completing the grand unified
theory of everything. All that is now required is that it be
internally consistent, like any systematic delusion. Nature need
no longer be consulted.
Oh, what a sour Christmas! Scrooge Powers has taken away my nice shiny
old physics toy. All my life, I have thought that physics was the
business of explaining the world we see, and now I find that it isn't.
It's just rewriting the beautiful music of the spheres. Why did Scrooge
have to tell me that? Oh, well; I hope I have adequately played the
part of the Ghost of Christmas Future, and that Scrooge will begin to
shed joy into the world instead of gloom.