Philosophical probability question

[Hans Blom, 950629]

A cross-post from another list (sci.math). Enjoy (or not, as you like)...

···

From: jdadson@ix.netcom.com (Jive Dadson )
Subject: Re: Philosophical probability question
Date: 27 Jun 1995 02:56:02 GMT

In <3sf6kk$4ri@gozer.inri.com> Charles Yount <cyount> writes:

I'm having an argument with someone here at work about the finer
points of the definition of probability. Here's a scenario:

1. I conceal a coin in one hand and hold both my hands out in front of
me, saying "pick one."

2. You pick a hand.

3. At this moment, what is the *probability* that you picked the one
with the coin?

One response: 50% because based on your current knowledge, you don't
know any more than you knew before I hid the coin.

Another response: 0% or 100% (you simply don't know) because we are
talking about an event that has already happened. Probability only
refers to events in the future. If I show you which hand has the
coin, it shouldn't alter the probability.

What do you think? (Better yet, do you know the answer for sure?)

I know the answer for sure. I started to think about it even before I
knew I was thinking about it.

I had a casino in the bandhall instrument room in high school before
first period, but it was not until 1966 at the age of eighteen that I
became a professional gambler. The Summer of Love. I left my home in
Texas to play poker in the spittoon-furnished public rooms of Seattle.
No it was not legal for me to play at 18, but I got in a game one day
when several old men who supported the place with their card fees were
stuck, and one of them yelled at the chip girl, "Sell the kid some
chips!" After that I was a regular. The shoe shine boy would whistle a
tune when the police came in, and I would promptly remove my chips from
the table and visit the toilet.

My pool playing friend Walter made more money than I did. I attributed
this not only to the fact that he was as good as any pool player in the
country, but also to the fact that poker had an inherent element of
chance that could not be bullied out of existence. Walter had a simpler
way of saying it. He said the chance of winning was always 50-50:
Either you win or you don't. That model worked admirably for Walter,
who never lost a match unless he wanted to.

In the mid seventies, I cashed in on the backgammon craze. When I was
in Houston, I used to play against a couple who regaled themselves in
the finest tie-dye and crystal jewelry. On every toss they would beg
and plead with the dice. I told them they could not do that, because
influencing the dice was against the rules. My motive was to speed the
game. They were unconvinced by my argument. I was obliged to beat them
at their own pace. Those two and many others helped finance my master's
degree in math.

Years later I watched a dice game played by natives in a grass kiosk on
the island of Koh Larn in the Gulf of Siam. The banker would shake a
single, tiny die in a saucer covered with an inverted cup. At that
point Destiny had spoken. There was a betting round. When all players
were content that their bets were correct, given the state of the die
in the saucer, a designated player was given the option of giving the
saucer a tiny tilt, just enough to tip the die one way or the other. He
took his job very seriously, and would give the matter considerable
thought before taking any action. If he did act, there would be
considerable animation among the players. Most would change their bets
or retract them. Some who had not bet previously might get involved.
But I never saw any player add to a bet he had already made. To do so
would have been an admission that his previous bet had been a losing
one.

Obviously, of the two views you proposed, the Thai players took the
second -- or rather everyone other than the banker took that view. It
will not surprise you to learn that the bank enjoyed a considerable
advantage under the rules, and won commensurately.

I will not equivocate. View number one is 100% right. The second view
is 100% wrong. The probability that the hand contains the coin is 50%.

There are different ways to define probability, but the sensible ones
always define it in terms of a knowledge set, typically named "X" or
"K". The symbol P(S), read "the probability of S", is shorthand for
P(S|K), "the probability of S, given K", where K is the common
knowledge about the experiment. Probability is a function of
information. As useful information is obtained, the resulting
probabilities carry more information, as measured by a formula called
"Shannon entropy" (more information corresponds to lower entropy). In
cases where nothing is known about the experiment
other than its definition (e.g. two hands, one coin, or one die in a
saucer, six faces), then it is correct to "distribute the ignorance":
You assign probabilities in a proportion that gives them the minimum
information-theoretic content, maximum entropy.
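
To make "distribute the ignorance" concrete, here is a small Python
sketch (my own illustration, not anything from the literature; the
function name is made up) showing that the uniform assignment has
maximum entropy, i.e. carries the least information:

    import math

    def shannon_entropy(probs):
        # H = -sum(p * log2(p)), in bits; zero-probability terms add nothing
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Two hands, one coin: the "distribute the ignorance" assignment
    print(shannon_entropy([0.5, 0.5]))   # 1.0 bit  -- maximum uncertainty
    print(shannon_entropy([0.9, 0.1]))   # ~0.47 bits -- more informative
    print(shannon_entropy([1.0, 0.0]))   # 0.0 bits -- complete certainty

    # One die in a saucer, six faces: the uniform assignment again wins
    print(shannon_entropy([1.0 / 6] * 6))  # ~2.585 bits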

Nowhere is this principle more in evidence than at the horse racing
track. You can make perfectly accurate probability estimates, based on
the information you have, but there can never be any guarantee that
someone wagering against you does not have information that
renders yours invalid. Aside from quantum mechanical theories that have
no practical effect at the "horse level", there is no way to say that
the "true" odds, given perfect information, would not give probabilty 1
to the eventual winner, and probability zero to the others.

There is a fairly new theory that attempts to come to grips with the
kind of uncertainty described in the other view. It is called
Dempster-Shafer evidence theory. In that theory, two numbers are
assigned to each proposition, a "belief" and a "plausibility". In the
case of your experiment, you can assign only zero belief to the
proposition that the right hand contains the coin, because you have no
evidence to support that proposition, but it is one hundred percent
plausible, because you have no evidence to rule it out either. The jury
is still out on just how useful DS evidence theory may turn out to be.
The thorny practical problem is that there is usually no known
"normalizing" factor with which to calibrate your belief and
plausibility measures, because your awareness of your lack of
information tells you nothing about how much information might be
potentially available. Again it's the horse race problem.
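
To give a feel for how those two numbers behave, here is a rough Python
sketch (my own illustration; the two-element frame and the mass function
are stand-ins for "total ignorance", not anything prescribed by the
theory):

    # Frame of discernment: the coin is in the left or the right hand.
    frame = frozenset({"L", "R"})

    # Mass function for total ignorance: all evidence mass sits on the
    # whole frame rather than on any particular hand.
    m = {frame: 1.0}

    def belief(m, a):
        # Bel(A): mass committed to non-empty subsets wholly inside A
        return sum(v for b, v in m.items() if b and b <= a)

    def plausibility(m, a):
        # Pl(A): mass of subsets that do not contradict A
        return sum(v for b, v in m.items() if b & a)

    right = frozenset({"R"})
    print(belief(m, right))        # 0 -- nothing supports "right hand"
    print(plausibility(m, right))  # 1.0 -- nothing rules it out either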

Finally, probability has nothing to do with the timing of events.
Probability would be just as valid even if we knew that all the events
in the unfolding universe were already completely determined in the
mind of Destiny. It is our lack of knowledge about the events that
gives rise to probability, not the pending nature of the events.

Probability does however have a component that is a measure of
relevance, or "what matters". It is called "conditional independence".
When experience shows that learning the state of a variable B always
renders another variable A of no further use for determining the state
of a third variable C, we say "C is independent of A, given B." The
vague concept we call
"causality" can be attributed to our intuitive understanding of this
relevance relation.
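
A toy Python sketch (my own, with made-up numbers) of a chain
A -> B -> C shows the relevance relation at work: once B is known, A
adds nothing to the prediction of C:

    from itertools import product

    # Made-up chain A -> B -> C: A influences B, B influences C, and A
    # touches C only through B.
    p_a = {0: 0.5, 1: 0.5}
    p_b_given_a = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.3, 1: 0.7}}
    p_c_given_b = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.4, 1: 0.6}}

    # Joint distribution P(A, B, C) implied by the chain.
    joint = {(a, b, c): p_a[a] * p_b_given_a[a][b] * p_c_given_b[b][c]
             for a, b, c in product([0, 1], repeat=3)}

    def p_c_given(c, **fixed):
        # P(C = c | the variables named in 'fixed'),
        # e.g. p_c_given(1, B=1) or p_c_given(1, A=0, B=1)
        def matches(a, b):
            vals = {"A": a, "B": b}
            return all(vals[k] == v for k, v in fixed.items())
        num = sum(p for (a, b, cc), p in joint.items()
                  if cc == c and matches(a, b))
        den = sum(p for (a, b, cc), p in joint.items() if matches(a, b))
        return num / den

    # All three print the same value, 0.6 (up to floating-point rounding):
    # knowing A changes nothing once B is known.
    print(p_c_given(1, B=1))
    print(p_c_given(1, A=0, B=1))
    print(p_c_given(1, A=1, B=1))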

       Dave