Find the lady probability trick

  • Thread starter: Saad Khan
  • Tags: Probability
Really basic trick; I'm quite tired and can't wrap my head around how it works out to a 2/3 chance.

A contestant on a quiz show is asked to choose one of three doors. Behind one of the doors is a lady; behind each of the other two is a toy doll.

The contestant chooses one of the three doors.

The host opens one of the remaining two doors and reveals a toy doll. The host then asks the contestant if they want to stick with their original choice or switch to the other unopened door.

State what you would recommend the contestant do in order to have the greatest probability of choosing the door with the lady. Show working!
 
Saad Khan said:
Really basic trick; I'm quite tired and can't wrap my head around how it works out to a 2/3 chance.

This is just a rehash of the Monty Hall problem: http://en.wikipedia.org/wiki/Monty_Hall_problem

That should help you find some more info about it.

Another hint would be to consider drawing a probability tree diagram for the scenario.
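To make the tree concrete, here's a minimal Python sketch (my own illustration, not part of the original question) that enumerates every branch of the tree, weighting the host's choice when he has two doll doors he could open:

```python
from fractions import Fraction

# Enumerate the branches of the probability tree.
# Doors are 1, 2, 3; by symmetry, assume the contestant always picks door 1.
branches = []
for lady in (1, 2, 3):                              # the lady's door, each with prob 1/3
    doll_doors = [d for d in (2, 3) if d != lady]   # doors the host is allowed to open
    for opened in doll_doors:
        p = Fraction(1, 3) / len(doll_doors)        # host picks 50/50 when he has a choice
        switch_to = ({1, 2, 3} - {1, opened}).pop()
        branches.append((lady, opened, p, switch_to == lady))

p_switch_wins = sum(p for _, _, p, wins in branches if wins)
print(p_switch_wins)       # 2/3
print(1 - p_switch_wins)   # 1/3 if you stick with door 1
```

Adding up the branch probabilities where switching wins gives 2/3, which is exactly what the tree diagram shows.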
 
It's a classic problem (http://en.wikipedia.org/wiki/Monty_Hall_problem). The easiest answer is that when you choose a door there's a 1/3 chance you get the woman, so it's twice as likely that the woman is behind one of the other two doors. Of course, at first you can't know which of the two, but after the host reveals the doll behind one of them, you're basically left with this choice:

Should I stick with my original choice (with a 1/3 chance of finding the woman behind it), or should I choose "one of the other two doors" (which together have a 2/3 chance of delivering the desired lady)? In reality you're not choosing "one of the other two doors" (since the host has already exposed one), but the only door left, which, now that the host has revealed the bogus one of the two, carries that 2/3 chance of having the woman behind it.
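If the counting argument still feels slippery, a quick simulation makes the 1/3 vs. 2/3 split hard to argue with. This is just a rough sketch of my own (the trial count is arbitrary):

```python
import random

def play(switch, trials=100_000):
    """Play the game many times and return the fraction of wins."""
    wins = 0
    for _ in range(trials):
        doors = (1, 2, 3)
        lady = random.choice(doors)
        pick = random.choice(doors)
        # Host opens a door that hides a doll and isn't the contestant's pick.
        opened = random.choice([d for d in doors if d != pick and d != lady])
        if switch:
            pick = next(d for d in doors if d != pick and d != opened)
        wins += (pick == lady)
    return wins / trials

print("stick :", play(switch=False))   # ~0.33
print("switch:", play(switch=True))    # ~0.67
```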
 
It is not hard to understand through a tree diagram; it is the semantics that I'm struggling with.
I'm partly convinced by the solution, but not entirely.
Can someone explain it fully, with both unconditional and conditional probability?

I much appreciate your reply, trollcast.

I just saw nowonda's reply. Yeah, I've understood that much, but what I'm confused about is why the odds change. Even the Wikipedia page says:
To help explain why people thought that the probability of winning both sticking and switching was equal, vos Savant asked readers to consider the case where a little green woman emerges on stage from a UFO at the point that the player has to decide which door to choose, and the host asks the little green woman to point to one of the two unopened doors. In that case, vos Savant points out, the chances that she’ll randomly choose the door with the prize are 1/2, but that is because she lacks the information that a player would have from seeing how the two doors were chosen
I find this part quite critical. The Wikipedia page doesn't clearly examine why so many people wrongly believe the odds to be 1/2 and 1/2.

Can anybody elaborate?

In the 1,000,000-door goat example, what I don't quite understand is how we can assume that the door he did NOT choose has probability 999999/1000000 of winning, while the door he chose is almost CERTAIN to have a goat.
 
Saad Khan said:
It is not hard to understand through a tree diagram; it is the semantics that I'm struggling with.
I'm partly convinced by the solution, but not entirely.
Can someone explain it fully, with both unconditional and conditional probability?

I much appreciate your reply, trollcast.

I just saw nowonda's reply. Yeah, I've understood that much, but what I'm confused about is why the odds change. Even the Wikipedia page says:

I find this part quite critical. The Wikipedia page doesn't clearly examine why so many people wrongly believe the odds to be 1/2 and 1/2.

Can anybody elaborate?

http://www.math.cornell.edu/~mec/2008-2009/TianyiZheng/Conditional.html

That page should give a good explanation of the conditional probabilities in the Monty Hall problem.
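As a rough sketch of the conditional-probability calculation (my own working, and it assumes the contestant picks door 1, the host always opens a doll door, and he flips a fair coin when both remaining doors hide dolls):

```python
from fractions import Fraction

prior = Fraction(1, 3)   # P(lady behind door d) before anything is opened

# Contestant picks door 1.  Likelihood that the host opens door 2,
# given each possible location of the lady:
p_open2_given_lady = {
    1: Fraction(1, 2),   # host could open 2 or 3, coin flip
    2: Fraction(0),      # host never reveals the lady
    3: Fraction(1),      # doors 1 and 3 are off limits, so he must open 2
}

# Unconditional probability that the host opens door 2:
p_open2 = sum(prior * p for p in p_open2_given_lady.values())   # = 1/2

# Bayes' theorem: P(lady behind d | contestant picked 1, host opened 2)
posterior = {d: prior * p_open2_given_lady[d] / p_open2 for d in (1, 2, 3)}
print(posterior)   # {1: Fraction(1, 3), 2: Fraction(0, 1), 3: Fraction(2, 3)}
```

So the 1/3 on the chosen door is unchanged by the reveal, while the whole remaining 2/3 collapses onto the one door the host leaves shut.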
 
Thanks for that, trollcast!

Here are some trivial but cool tricks. If anyone wants to take the time to break down each one, I'd appreciate it! I've figured most of them out by now.

http://nrich.maths.org/1441
 
I may be talking out of my arse, but I'll try an explanation that doesn't require extensive reading, just common sense. As a hint, it's not that "the odds change", magically surging from 33% to 50%; rather, the problem itself changes during the process, because there's more info than we had at the beginning. An example (stupid as it may be) is better, I suppose:

I'm playing a game with you where I either put both my hands in my pockets or keep them both out (chosen randomly), and you have to guess my choice without looking in any way. Obviously, you have a 50/50 chance of guessing what I did. Now suppose you don't look intentionally, but somehow you catch a glimpse of my left hand tucked in my pocket. Although you didn't have access to all the information (you only saw my left hand, not both), that info is enough for you to reassess the probabilities you initially assigned to the two cases (hands in pockets: 50%, hands out of pockets: 50%). Obviously, the probability for "hands in pockets" has now magically increased to 100%, because you took the new information into account.

The example is quite crude, but the principle is similar: new relevant info alters the problem, so you're not actually playing the same game. The game where you choose one of three doors, the host reveals a bogus door (new relevant info), and then asks whether you want to change your choice is a completely different game from one where you choose one of three doors and the host asks whether you want to change without showing you a bogus door (no new relevant info).
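To put numbers on the "different game" point, here's a small sketch (entirely my own, and it assumes that in the no-reveal game a contestant who "changes" just picks one of the other two doors at random):

```python
import random

def with_reveal(trials=100_000):
    """Host opens a doll door, contestant switches to the remaining door."""
    wins = 0
    for _ in range(trials):
        lady, pick = random.randint(1, 3), random.randint(1, 3)
        opened = random.choice([d for d in (1, 2, 3) if d not in (pick, lady)])
        final = next(d for d in (1, 2, 3) if d not in (pick, opened))
        wins += (final == lady)
    return wins / trials

def without_reveal(trials=100_000):
    """No door is opened; contestant 'changes' to one of the other two at random."""
    wins = 0
    for _ in range(trials):
        lady, pick = random.randint(1, 3), random.randint(1, 3)
        final = random.choice([d for d in (1, 2, 3) if d != pick])
        wins += (final == lady)
    return wins / trials

print("switch after reveal   :", with_reveal())     # ~0.67
print("change with no reveal :", without_reveal())  # ~0.33
```

With the reveal, switching wins about 2/3 of the time; without it, changing your pick leaves you at the original 1/3, precisely because no new information arrived.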
 
Let the doors be marked A, B, C. You select A. One of the remaining doors is opened to show a doll. You go for the other. Why?
Because you win if the prize is behind either B or C. So, by switching, you are effectively betting on two doors, while by sticking with your first choice you bet on one door. The resulting winning probabilities are 2/3 and 1/3 respectively.
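A two-line check of that counting (just a sketch of the same argument):

```python
# The contestant picks door A; switching wins whenever the lady is behind B or C.
wins_by_switching = sum(1 for lady in "ABC" if lady != "A")
print(wins_by_switching, "out of 3 equally likely cases")   # 2 out of 3
```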
 
Yet another approach:

If you select a door at random, you will pick a door with a goat behind it 2/3 of the time over the long run, and in those games switching wins; 1/3 of the time you will pick the car, and in those games switching loses. In other words, you're twice as likely to have selected a door with a goat behind it as one with the car behind it, so switching wins 2/3 of the time over the long run.

My best understanding is that Marilyn vos Savant did not pose the problem clearly enough to eliminate ambiguity, i.e., the problem was not well-posed.
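The same long-run counting covers the 1,000,000-door version Saad asked about. A rough sketch (door and trial counts are my own choices; it relies on the fact that once the host has opened every other losing door, switching wins exactly when the first pick was wrong):

```python
import random

def switch_win_rate(n_doors, trials=10_000):
    """N-door game: host opens every losing door except one, contestant switches."""
    wins = 0
    for _ in range(trials):
        prize = random.randrange(n_doors)
        pick = random.randrange(n_doors)
        # The host leaves the first pick and one other door closed; if the pick
        # missed, that other door must hide the prize, so switching wins
        # exactly when the first pick was wrong.
        wins += (pick != prize)
    return wins / trials

print(switch_win_rate(3))           # ~0.667
print(switch_win_rate(1_000_000))   # ~1.0, i.e. 999999/1000000 in the limit
```

That's where the 999999/1000000 for the unchosen door comes from: the chosen door keeps its original 1/1000000, and all the rest piles onto the one door the host leaves shut.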
 
Thanks for the replies, everyone. I finally have a better grasp of it.
 