Oenobareus

From the Greek meaning 'heavy with wine'
A blog devoted to science and reason
Written after a glass or two of Pinot Noir.

Saturday, October 29, 2011

Don't Ask Marilyn - Part 2


I received the email below from a reader in response to last Sunday's post:
Dr. Vann Priest,

Regarding your Oct. 23, 2011 Oenobareus blog post about "Ask Marilyn":

Like everyone, Marilyn sometimes makes mistakes or writes ambiguously,
but not this time.

The reason is that, according to Marilyn's problem, the writing down
of the series occurs AFTER the 20 die rolls.  In addition, either (a)
or (b) MUST represent the actual result of that 20 die rolls.

Marilyn doesn't give the calculations, but (b) is vastly more likely
to have been the actually rolled series.

If Marilyn's problem had said that the 20 die rolls occurs AFTER
writing down (a) and (b), then your probability calculations for
rolling (b) would be correct.  But then the probability of either
(a) or (b) being the actually rolled series would be spectacularly
unlikely.

Best regards,
[name redacted]

I wish to thank this reader, because the email caused me to go back and consider probability theory more carefully.  I wondered whether knowing that Marilyn actually did write down the sequence of rolls somehow affects the probability.

I no longer own my probability and statistics book from my undergraduate days, so I did a quick Google search and found "Introduction to Probability" by Grinstead and Snell.  This text is published by the American Mathematical Society, is freely available, and may be distributed under the terms of the GNU Free Documentation License.

Knowledge can affect the probability.  [See chapter 4 - 'Conditional Probability']  Back in 1990, Marilyn wrote about the Monty Hall problem.   Many people, including me, were convinced Marilyn was wrong.  It was only after I sat down and carefully considered the effect of knowing what was behind door #3 that I was convinced that Marilyn was indeed correct, and I learned a lesson about conditional probability.  See Example 4.6, p. 136 for a good discussion of the Monty Hall problem.

Here is a conditional probability example from the text - 

Example 4.1 An experiment consists of rolling a die once. Let X be the outcome. Let F be the event {X = 6}, and let E be the event {X > 4}. We assign the distribution function m(ω) = 1/6 for ω = 1, 2, ..., 6. Thus, P(F) = 1/6. Now suppose that the die is rolled and we are told that the event E has occurred. This leaves only two possible outcomes: 5 and 6. In the absence of any other information, we would still regard these outcomes to be equally likely, so the probability of F becomes 1/2, making P(F|E) = 1/2.

So someone rolled a die and told someone else that the outcome was greater than 4.  We now know the outcome was either a five or a six, and since both are equally likely, the probability is 1/2.
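To convince yourself, the conditioning can be simulated.  Here is a quick Monte Carlo sketch (the code and variable names are mine, not from Grinstead and Snell):

```python
import random

random.seed(1)  # reproducible runs

# Estimate P(F|E) = P(X = 6 | X > 4) for a fair die by conditioning:
# roll many times, keep only the rolls where E = {X > 4} occurred,
# and see what fraction of those surviving rolls are sixes.
rolls = [random.randint(1, 6) for _ in range(100_000)]
given_e = [x for x in rolls if x > 4]  # only 5s and 6s survive
p_f_given_e = sum(1 for x in given_e if x == 6) / len(given_e)

print(round(p_f_given_e, 2))  # close to 0.5
```

Roughly a third of the rolls survive the conditioning, and about half of those are sixes, in agreement with Example 4.1.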

Now let's discuss last Sunday's "Ask Marilyn" column.  My analysis of the probability of throwing the two sequences (11111111111111111111 or 66234441536125563152) is correct: both sequences are equally likely to be thrown.
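This equal likelihood is easy to verify with exact rational arithmetic (a minimal sketch of my own):

```python
from fractions import Fraction

# The throws are independent, so ANY particular sequence of 20 rolls of a
# fair die has probability (1/6)^20 -- the identity of the faces is irrelevant.
p_all_ones = Fraction(1, 6) ** 20  # 11111111111111111111
p_jumbled = Fraction(1, 6) ** 20   # 66234441536125563152

print(p_all_ones == p_jumbled)  # True
print(p_all_ones)               # 1/3656158440062976
```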

But what happens when Marilyn tells us one of them actually occurred?  Nothing!  Conditional probability deals with the probability of future events based on knowledge of past events, and there are no future events in this situation.

Now allow me to explain further why the sequence 66234441536125563152 is the more likely choice.  I alluded to it in last Sunday's post.  The reason is entropy.

Entropy is defined to be a measure of the number of possible arrangements of a system.  Each distinct arrangement is called a microstate.  The rolls 11111111111111111111 and 66234441536125563152 are each microstates, and as I have shown, both are equally likely; this equal likelihood of microstates is the fundamental assumption of statistical mechanics.  However, with a thermodynamic system (and dice are a good analogy to a thermodynamic system), we usually do not concern ourselves with which microstate the system is in.  Physicists are concerned with the macrostate of the system, which is specified by a few measurable parameters.  From the Second Law of Thermodynamics, we can infer that the most likely macrostate is the one with the largest number of microstates.

An example is in order.  Consider the air in your room.  To write down the microstate, we would have to know the position and velocity of every molecule; to write down the macrostate, we simply have to measure the temperature, the air pressure, and the volume of the room.  The air fills the entire room because the number of microstates in which the gas fills the room is beyond astronomical.  The air does not settle into the bottom few inches, because the number of those microstates, while still unbelievably large, is tiny compared to the number in which the air fills the room.
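A toy calculation (my own, with made-up numbers) shows how fast these odds collapse.  Suppose each molecule sits, independently, in the bottom tenth of the room with probability 1/10:

```python
# Toy model: each of N molecules independently occupies the bottom 10% of
# the room with probability 0.1. The chance that ALL of them are there at
# the same instant:
N = 100  # a real room holds ~10^27 molecules; even N = 100 is hopeless
p_all_bottom = 0.1 ** N

print(p_all_bottom < 1e-99)  # True -- effectively impossible
```

With a realistic number of molecules, the probability is so small that you would never observe it in the lifetime of the universe.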

Here's the problem with the dice - it's too easy to emphasize either the microstates or the macrostate, and I'm convinced the confusion lies here.  There are 3,656,158,440,062,976 ways (microstates) to throw a die twenty times.  One of those must occur.  Which one?  We have no way to predict.  It could be 11111111111111111111 or 66234441536125563152 or 56241113533264432213 or one of the other 3,656,158,440,062,973 possible rolls.
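The count of microstates quoted above is just 6^20, which a one-liner confirms:

```python
# Each of the 20 rolls has 6 possible outcomes, so there are 6^20 equally
# likely sequences (microstates) in total.
microstates = 6 ** 20
print(f"{microstates:,}")  # 3,656,158,440,062,976
```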

For throwing a die twenty times, the most likely thing (macrostate) to happen is for four numbers to come up three times and two numbers to come up four times, and this is what Marilyn describes as jumbled.  For Marilyn's two choices, I pick 66234441536125563152, because this unlikely roll corresponds to the most likely macrostate.
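The macrostate counting can be made concrete (a sketch of my own; the helper `sequences_with_counts` is hypothetical, not from any text):

```python
from math import factorial
from itertools import permutations

def sequences_with_counts(counts):
    """Number of 20-roll sequences in which face i appears counts[i] times:
    the multinomial coefficient 20! / (c1! c2! ... c6!)."""
    n = factorial(sum(counts))
    for c in counts:
        n //= factorial(c)
    return n

# Macrostate of 11111111111111111111: one face, twenty times.
# Sum over the 6 ways of choosing which face it is.
all_ones_macro = sum(sequences_with_counts(c)
                     for c in set(permutations((20, 0, 0, 0, 0, 0))))

# Macrostate of 66234441536125563152: two faces four times, four faces
# three times. Sum over every assignment of that pattern to the six faces.
jumbled_macro = sum(sequences_with_counts(c)
                    for c in set(permutations((4, 4, 3, 3, 3, 3))))

print(all_ones_macro)  # 6 -- one microstate per face
print(jumbled_macro)   # tens of trillions of microstates
```

The "jumbled" macrostate contains trillions of times more microstates than the "one face twenty times" macrostate, which is exactly why it is the sensible pick.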

After all this, I realize what really irks me about Marilyn's response to the math instructor.  She makes no attempt to explain.  All she writes in defense of her position is an appeal to her readers' sense of what the correct answer is and a restatement that she is right.  If I ever attempt to do in a classroom what she does in this column, my students may print this blog post out, fold it into a paper airplane, and bombard me.  Just wait until my back is turned, so you don't poke my eye out.
