Tuesday, February 07, 2006

A question of probability

Over our smoking breaks ;), Tum and I have had quite a few discussions. Two of them stick out right now because my mind has been mulling over probability and chance:

1. Is there any difference between a pair of dice thrown together and one die thrown twice in a row? This question arose when I was telling Tum how easy it is to make money at the roulette table when you only bet on the outside. You know: odd-even, red-black, and the first, second, or third dozen. (If you don't know what I'm talking about, you can read the article about roulette on Wikipedia.) Anyway, my argument was that since the last 15 outcomes are displayed at the table, it is easy to guess which way the ball is going to fall. She insisted that each spin is a purely random event, and as such does not depend on the previous results (obviously). I was talking about trends.

We simplified the problem to the roll of a die. If a fair die is rolled a hundred times, each face should come up on top about 16 or 17 times on average. Now, if you were shown a streak of six even rolls (2, 4, or 6), what would you bet your money on for the next roll: odd or even? The real answer is that it doesn't matter, because each roll is an independent random event. But what I kept maintaining was that I'd keep putting more and more money (doubling it every time) on an odd roll, because an odd roll is bound to come up sooner rather than later.
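Tum's point is easy to check by brute force. Here's a quick sketch in Python (assuming a fair six-sided die): find every run of six even rolls and tally what the very next roll does.

```python
import random

random.seed(1)

# Gambler's-fallacy check: after a streak of six even rolls,
# how often is the NEXT roll odd?  If rolls are independent,
# the answer should hover around 50%, streak or no streak.
N = 1_000_000
streaks = 0           # times we have just seen six evens in a row
odd_after_streak = 0  # times the very next roll came up odd

streak = 0
for _ in range(N):
    r = random.randint(1, 6)
    if streak >= 6:   # the previous six rolls were all even
        streaks += 1
        if r % 2 == 1:
            odd_after_streak += 1
    streak = streak + 1 if r % 2 == 0 else 0

print(f"streaks of six evens: {streaks}")
print(f"odd on the next roll: {odd_after_streak / streaks:.3f}")
```

The ratio comes out around 0.5: the die has no memory, so the streak tells you nothing about the next roll.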

Eventually we decided that we were talking about different problems, and it really boils down to the statement of the problem. She was talking about the difference between a pair of dice thrown together and the same die thrown in subsequent throws. The former is a single event (as it would be if a line of roulette wheels were spun together), while the latter is a sequence of distinct events: each bet rides on pure chance, and no trend can carry over from one throw to the next.
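As for my doubling plan, it turns out to be the well-known martingale strategy, and it only looks safe until a losing streak outruns your bankroll. A rough sketch, assuming a European wheel (win probability 18/37 on an even-money outside bet), a 1000-unit bankroll, a 1-unit base bet, and a 200-spin session (all of these numbers are illustrative):

```python
import random

random.seed(2)

def martingale_session(bankroll=1000, base_bet=1, spins=200, p_win=18/37):
    """One night at the table: bet on an even-money outcome, double the
    stake after every loss, reset to the base bet after every win.
    Stop early if the next doubled bet can't be covered."""
    bet = base_bet
    for _ in range(spins):
        if bet > bankroll:
            break          # a long losing streak has wiped out the plan
        if random.random() < p_win:
            bankroll += bet
            bet = base_bet
        else:
            bankroll -= bet
            bet *= 2
    return bankroll

finals = [martingale_session() for _ in range(10_000)]
winners = sum(f > 1000 for f in finals)
print(f"sessions ending ahead:  {winners / len(finals):.1%}")
print(f"worst session bankroll: {min(finals)}")
```

Most sessions end slightly ahead, which is exactly why the strategy feels like easy money; the rare long losing streak hands back everything and more, and the house edge keeps the overall expectation negative.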

2. The Monty Hall problem. Tum was reading "The Curious Incident of the Dog in the Night-Time" by Mark Haddon. The puzzle is presented in the book as follows (pasted from Wikipedia):

In this puzzle a player is shown three closed doors; behind one is a car, and behind each of the other two is a goat. The player is allowed to open one door, and will win whatever is behind the door. However, after the player selects a door but before opening it, the game host (who knows what's behind the doors) must open another door, revealing a goat. The host then must offer the player an option to switch to the other closed door. Does switching improve the player's chance of winning the car?

Now think about it before reading on; the answer follows.
.
.
Okay, the answer is yes. We had a lot of fun discussing this problem. While Tum got it instantly, I took some convincing. The answer is counter-intuitive because most of us tend to think of the event as "choosing the door" instead of thinking of the event as switching. So...
  • The player picks goat number 1. The game host picks the other goat. Switching will win the car.
  • The player picks goat number 2. The game host picks the other goat. Switching will win the car.
  • The player picks the car. The game host picks either of the two goats. Switching will lose.
So if you switch, the probability of winning is 2/3, as opposed to 1/3 if you stick with the door you first chose.
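If the three cases above still feel wrong, a simulation settles it. A small sketch of the game in Python (the door-numbering and trial count are just illustrative choices):

```python
import random

random.seed(3)

def monty_hall(switch, trials=100_000):
    """Play the game many times: random car placement, random first pick,
    host opens a goat door the player didn't pick, and the player either
    stays or switches to the remaining closed door."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        # Host opens a door that hides a goat and isn't the player's pick.
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            pick = next(d for d in range(3) if d not in (pick, opened))
        wins += (pick == car)
    return wins / trials

stay_rate = monty_hall(switch=False)
switch_rate = monty_hall(switch=True)
print(f"stay:   {stay_rate:.3f}")    # close to 1/3
print(f"switch: {switch_rate:.3f}")  # close to 2/3
```

Staying wins only when the first pick was the car (1/3 of the time); switching wins in the other 2/3, exactly as the case analysis above predicts.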

You can read more about the Monty Hall problem on Wikipedia.
