## Thursday, May 04, 2006

Here's a fun puzzle from the same guys who introduced me to the infinite spheres of utility problem. This one's more along the lines of the Doomsday argument, or perhaps the Sleeping Beauty paradox. Anyway, here's the setup: a group of people are sent into a room. Two dice are rolled, and if they land double sixes then everyone in the room gets shot. Otherwise, they're released and the whole procedure is repeated with a new group of ten times as many people. (And so on, until a group gets shot.) You're sent into the room. What is your chance of being shot?

Well, obviously 1/36, right? That's the chance of double sixes being rolled. But note that the vast majority (~90%) of people who enter the room get shot. (This is stipulated -- assume there is an unlimited stock of people, ammo, etc. There's no risk of "running out", no matter how many rounds the game goes on for.) You have no reason to consider yourself one of the lucky 10%. You have the same chance of being shot that anyone else who enters the room does. So you must all have a 9/10 chance of being shot.
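Both figures can be checked by brute force. Here's a minimal Monte Carlo sketch (my own illustration, assuming the game starts each run with a single person and restarts after every shooting):

```python
import random

random.seed(0)

def play_game(start_size=1, factor=10):
    """Run one game to completion; return (people_entered, people_shot, rounds)."""
    group, entered, rounds = start_size, 0, 0
    while True:
        entered += group
        rounds += 1
        # Roll two dice; double sixes means this whole group is shot.
        if random.randint(1, 6) == 6 and random.randint(1, 6) == 6:
            return entered, group, rounds
        group *= factor  # next group is ten times larger

games = [play_game() for _ in range(10_000)]
total_entered = sum(e for e, _, _ in games)
total_shot = sum(s for _, s, _ in games)
total_rounds = sum(r for _, _, r in games)

frac_shot = total_shot / total_entered  # fraction of all participants shot
per_round = len(games) / total_rounds   # empirical chance of double sixes per round
print(frac_shot, per_round)             # roughly 0.9 and roughly 1/36
```

Both answers show up at once: the per-roll chance hovers around 1/36, yet across all participants roughly 90% end up shot, which is exactly the tension described above.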

Very puzzling. I'm inclined to insist that in fact each person only has a 1/36 chance of being shot, even though 90% of the people will be shot. Chance is about causal bases (or something), not frequencies.

Curiously, this means that an immortal with an indefinite overdraft could make a lot of money by betting with every single individual about to enter the room. But then, an immortal with an indefinite overdraft can strategically gamble and profit from almost any repeated game, no matter how poor the odds. (Just keep doubling your bet until you win, then pocket the winnings and start over.) So maybe the same kind of thing is going on here.
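The doubling strategy in that parenthesis can be sketched as follows. This is my own illustration using an even-money coin-flip bet rather than the dice; the function name and parameters are invented for the sketch:

```python
import random

random.seed(1)

def doubling_cycle(p_win=0.5, bankroll=None):
    """Bet 1, double after each loss, stop at the first win.
    With unlimited credit the cycle always nets +1; with a finite
    bankroll it occasionally loses everything wagered so far."""
    stake, lost = 1, 0
    while bankroll is None or stake <= bankroll - lost:
        if random.random() < p_win:
            return stake - lost  # the win repays all prior losses, plus 1
        lost += stake
        stake *= 2
    return -lost                 # credit ran out before a win arrived

# Unlimited overdraft: every completed cycle banks exactly +1.
assert all(doubling_cycle() == 1 for _ in range(1000))
```

The trick only works because no losing streak can outlast an unbounded bankroll; pass a finite `bankroll` and the occasional wipeout exactly cancels the steady trickle of +1s.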

Comments:

1. I think this could be related to observing one infinity while ignoring another.

For example, suppose there were a knockout tournament with an infinite number of rounds, where each round the winner takes the losers' money. In theory everyone loses, so you might conclude that in a sense the money must not exist afterwards. But the average amount of money remains the same every round. It seems strange to suddenly accept that money isn't conserved when conservation is as fundamental a part of each stage as the fact that people keep losing it.

So there are two solutions. Either accept that there is a potential world where double sixes are, in a sense, never rolled, and that this fixes the probabilities (i.e. it is in a sense "infinitely" unlikely, but the number of people involved is large enough to resolve the problem);
OR
Say that the "infinity" just doesn't exist, so the problem doesn't exist. I.e. the rules of probability needn't function in any problem that invokes an "infinity" (such as the implied infinite population) that doesn't exist.

2. What is it exactly that you take to be stipulated? That is, how are you calculating 90%? Aren't you ruling out the possibility of an infinite succession of non-double-sixes? It would seem to me that, given that a shooting has happened, 90% of the people who entered the room got shot. But that doesn't conflict with the chance of being shot being 1/36.

3. That is, I think I disagree with your claim that '90% of the people will be shot'.

4. "Aren't you ruling out the possibility of an infinite succession of non-double-sixes?"

Yeah, good point. I guess we can't justifiably rule out that possibility, and so the apparent paradox dissolves. Nice spotting!

(I wonder whether the original formulation of the paradox took this into account? It may be that I've simply misremembered it.)

5. Incidentally, that also refutes my "keep doubling your bet until you win" gambling strategy. After all, you might never win. (I should've known that "get rich quick" schemes always fail. Even when we stretch the "quick" over eternity. *sigh*)

6. Probability can be very counter-intuitive; I'll never get over once meeting a trader who accepted the gambler's fallacy...

Interesting coda (for me, anyway, and as it's based on five minutes of Wikipedia research, mathematicians, feel free to correct!): a problem arises when you try to calculate the odds of never getting double sixes (or never winning). For a finite number of trials, I'd do that by raising the chance of not getting a double six (or not winning) in one trial to the power of the number of trials. But you can't do that with an infinite number of trials, as the operation is undefined.

7. I think there is a confusion of 1) the probability of a particular individual getting shot and 2) the expected number of people getting shot in a given round. The two are not equivalent, so 1) can have prob of 1/36 and 2) can have prob of 0.9 without any contradiction. They have the relationship of

P(particular individual getting shot) = (expected number of people shot) / (number of people in room).

If we start with a group of 32.4 people (shrugs), both probabilities will be consistent with your figures in the post.

On the other hand, if the game draws from the entire world population randomly with replacement (i.e. a person can play the game twice in different rounds), starting the game with one person will give a very good chance of getting the entire world shot. In that sense, the probability of a particular individual getting shot will be >>90%.

8. Wait, no, ignore the above. Think I got my math wrong. There is still no chance of the expected number of people getting shot reaching 90%. Think Russ Gorman above had it. Confusion of conditional probabilities with unconditional.

9. Arthur - [where do you get your "expected number of people getting shot in a given round" from? Where does the "given round" bit come in? Note that, for any given round, either the entire group dies or no one does. If the former, then the number of people shot equals the number in the room. If the latter, then the group eventually shot will be at least ten times larger than the group in the room. So your fraction "Expected number of people shot / number of people in room" will never be less than 1, and in particular it won't be 0.9.]

Never mind, I wrote the above before you posted your retraction. On your final point, note that the game is meant to have no replacement. Otherwise, as you note, that would remove the paradox right away.

Russ - you can take the limit of the sequence P(n) = { (35/36)^n } as n tends to infinity. It's zero. (That's what we'd intuitively expect: the chance of never winning gets asymptotically smaller as the number of trials increases.) But perhaps you were wanting something a little more direct.
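(For what it's worth, a throwaway numerical check of that limit, in Python:)

```python
# Chance of n consecutive rolls with no double six: (35/36)^n.
p_no_sixes = lambda n: (35 / 36) ** n

print(p_no_sixes(24))    # about 0.51 -- roughly even money over two dozen rolls
print(p_no_sixes(100))   # about 0.06
print(p_no_sixes(1000))  # about 6e-13: vanishingly small, but never exactly zero
```

The sequence shrinks towards zero without ever reaching it at any finite n, which is just the asymptotic behaviour described above.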

10. Richard; I was wondering where the appeal of the thought that the probability of being shot was 0.9 came from (or that we can win by doubling for ever); my rough thought was (and I'm now worrying about demonstrating my mathematical ignorance!) that the conditional probability of being shot given a shooting is 0.9, so if one thought that the probability of a shooting was 1 (i.e. probability of no shooting was zero), one would get that the probability of being shot was 0.9. But the probability of no shooting is not zero. Does that make any sense (or am I babbling!)?

11. *sits around in a huff about not being recognised for saying it first* [as usual]

of course my posts are not usually the clearest things in the world.

BTW (over it already), re the gambler's fallacy, I would very slightly favour a variation saying "a seemingly random event is more likely to occur because it recently happened."
I.e. there has probably never been a perfectly fair coin or pair of dice. But in the obvious cases I expect the expected earnings to be really, really close to zero.

12. The probability, as the problem is stated, is 1/36! Allow me to dare call that the final answer... To elaborate on my justification from a mathematical POV (which I'm generally good at):
The problem states: "You're sent into the room. What is your chance of being shot?", so whatever has happened before or after you got into the room doesn't affect the result of this experiment; the fact that you are in the room is given and not to be debated... You either get shot this time, or never, and for that particular time the chance is 1/36!

The "get rich" strategy for gambling is worth thinking about... Consider calculating the worth of the strategy by finding the sum: sum( Weight(leaf-node) * Probability(leaf-node) ) over all leaf nodes, where Weight is the amount gained from each choice (negative if the amount is lost)... For an infinitely large game tree, we find that the sum is always negative!

I'm not sure whether sum( Weight * Probability ) is a better (or worse) analysis than the simple sum( Probability(leaf-nodes) ), which is equal to 1 for an infinitely large game tree... Remember that by the time the probability of losing equals zero, the amount of money you are gambling for is infinite!

13. I can't quite see where the paradox is. The probability of being shot is 1/36.

How do you derive the 90% figure? It's not the vast majority that get shot, it's 100%.

It seems that you might be sliding into calculating the probability after a certain number of rounds, but then that would be for how many people are likely to get shot. Which sounds to me (not really being an expert on probability theory) like the stopping time for a martingale.

14. It also seemed to me like 1/36 was the obvious answer (I didn't see the paradox until Richard explained it), so let me try to make the other answer seem intuitive.

Suppose that two dice are rolled repeatedly until they land double sixes. Call the roll when that happens the nth roll. There is a computer with an endless list of people on it, and the people don't know where they are on that list. After the dice come up sixes, a large number of people at the top of the list (equal to (10^n)/9, rounded down to the nearest integer) are called into individual cubicles. Then just over 90% of the people in cubicles (10^(n-1) people, all but the first (10^(n-1))/9 people on the list) are shot. You're sent into a cubicle. What is your chance of being shot?

This can be created from the original version through a series of slight modifications:

1. As Richard describes, and we're assuming that the first group has just 1 person.

2. Like 1, except the dice are rolled before the group enters the room, not in front of the group, and the people are told what number came up when after they come into the room.

3. Like 2, except all of the dice rolls take place before any group comes into the room. If sixes come up on the nth roll, then the first n-1 groups that are called in are told the number and let go, and the nth group is called in and shot.

4. Like 3, except there are separate rooms (an unlimited number of them available) and each group enters a room simultaneously. If sixes appear on roll n, then n groups simultaneously enter their respective rooms, and n-1 of those groups are told that they can go free, and the nth group (the largest) is shot.

5. Like 4, except the people do not know what group they are in. That information is stored on a computer that the dice rollers have. Also, each room is separated into individual cubicles, so if a person's group goes into a room then each person walks alone into their own cubicle and never even sees the rest of the group.

I believe that #5 is identical to the alternate scenario that I described. If you give different answers to my scenario and Richard's, at what step do things change?
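Blar's head counts can be checked directly; this sketch just re-derives the figures from the cubicle scenario above (the function name is mine):

```python
def headcount(n):
    """People called in and people shot if double sixes land on roll n,
    using Blar's figures: 10^n/9 called (rounded down), 10^(n-1) shot."""
    called = 10**n // 9
    shot = 10**(n - 1)
    return called, shot

for n in range(1, 7):
    called, shot = headcount(n)
    print(n, called, shot, shot / called)
```

The fraction shot is exactly 1 for n = 1 and just over 90% for every n >= 2, matching the "just over 90%" in the scenario.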

15. Blar, thanks, that's helpful. It looks like step #3 is the crucial one, since it guarantees that a double six was indeed rolled for some group or other (otherwise you wouldn't have entered the room yet).

Neil - The people in earlier groups get set free, not shot, which is why the total proportion shot is only 90%. (100% of the final group get shot, but we're talking about the total here.)

Russ - That sounds plausible. Except that the probability of never winning does appear, mathematically, to be zero. (Perhaps this is one of those odd situations where zero probability events are nevertheless possible? Like drawing an infinite lottery; the winning number had zero probability of being selected, and yet there it is!) That may pose problems for the standard probability formulas, preventing us from inferring that the unconditional Pr = the conditional Pr.

I'm not sure of any of this though. It's all rather confusing.

16. Thanks, that makes things a bit clearer.

But this seems to me to be a case of two different probabilities. The first is the probability of being shot in any one event of the dice being rolled, which is 1/36. The second is the probability of eventually being in a group where sixes are thrown, which is possibly 9/10, but I'm not quite sure, since this is a sum of an infinite series and I can't work it out off the top of my head.

(Since the sample population is presumed to be infinite, the probability would perhaps actually tend to zero.)

FYI
It tends towards .90278 (.9 with probability 35/36, plus 1 with probability 1/36) as the number of rounds tends to infinity, at each incidence of a shooting.
It also happens to be 0% at each incidence of non-shooting.

18. Actually, scrap the second sentence - too late at night, clearly. The only thing is the no-sixes effect above.

19. The problem uses language in a deceptive way, by saying "You're sent into the room," and making you assume you're picking the time to enter the room randomly. Consider the following analogy:

"Supermarkets only have Christmas decorations during December, 1/12 of the year. But, ten times as many people visit the supermarket in December as during all other months combined. You enter the supermarket. What is the chance you see Christmas decorations?"

In this problem, the sentence "you enter the supermarket" makes the reader assume that you're picking a random day to enter the supermarket, making the probability appear to be 1/12. However, this disguises the fact that statistically, "you" are much more likely to be entering the supermarket during December, making the probability you see Christmas decorations actually much higher than 1/12.
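The supermarket numbers can be made explicit; the variable names are my own, and I've normalised the off-season foot traffic to 1:

```python
# Traffic: December gets ten times the visitors of all other months combined.
other_months = 1.0
december = 10.0 * other_months

# A visitor drawn at random from the year's total foot traffic:
p_decorations = december / (december + other_months)
print(p_decorations)  # 10/11, about 0.909 -- not 1/12
```

Sampling uniformly over *visits* rather than over *days* is what pushes the answer from 1/12 up to 10/11, which is the base-rate point being made here.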

20. Followup: So we can see that in the problem, "you" are much more likely to be entering the room to die, due to the statistically larger size of the batches that die. The probability is 90% that you will die.

This is called a "base rate" problem and the "base rate fallacy" that causes people to answer 1/36 (in the original problem) is taught to CIA analysts, among others, so they can avoid doing it.

21. Hi

I agree with John Gathercole's comments about the relation between the probability of dying and the mechanism used to decide when you enter the room. In 2003 I wrote an analysis of the problem that I put on my web site.

http://www.geocities.com/dougtclark/ShootingRoom.htm

I concluded that if you were selected to enter the room as part of a random sample of participants, then your chance of dying was 0.9, whereas if it was predetermined that you would enter the room at a particular 'stage', (e.g. it might be predetermined that you would enter the room when or if the size of the group entering reached 1000) your chance of dying would be 1/36.

22. The confusion is that "90% of the people who play die" ***is not a probability*** - it's simply derived from that fact that the ultimate/final group is always 10 times the size of the penultimate group.

The independent events are ***cohorts*** - which by definition increase in size. Using percentages to express the relative size of the cohorts ***and*** the "odds" for each cohort is classic obfuscation. So a NOT deliberately confusing way of stating this is:

(a) for each cohort, the odds of dying are 1/36 - the odds of an independent roll of dice coming up double sixes

(b) each cohort is 10x the size of the previous - except the ***second*** cohort, sneakily chosen to be 9x the size of the first (1) - thus, each time, the sum of all previous cohorts is 1/9th the size of the current cohort, keeping the 9-to-1 ratio - and the 90% mentioned.
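That 9x second cohort makes the bookkeeping exact; a quick sketch of the construction (cohort sizes 1, 9, 90, 900, ...):

```python
# Build the cohorts: 1, then 9, then each subsequent cohort 10x the last.
cohorts = [1, 9]
while len(cohorts) < 10:
    cohorts.append(cohorts[-1] * 10)

for i in range(2, len(cohorts)):
    previous = sum(cohorts[:i])
    # All previous cohorts together are exactly 1/9th of the current one...
    assert cohorts[i] == 9 * previous
    # ...so if the game ends here, exactly 90% of everyone who played is shot.
    assert cohorts[i] / (previous + cohorts[i]) == 0.9
```

With this variant the "90% of players die" figure holds exactly at every possible stopping point from the second cohort onwards, rather than only approximately.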
