It's a really basic trick, but I'm quite tired and can't wrap my head around how it works out to a 2/3 chance.
This is just a rehash of the Monty Hall problem: http://en.wikipedia.org/wiki/Monty_Hall_problem
That should help you find some more info about it.
Another hint would be to draw a probability tree diagram for the scenario.
It's a classic problem (http://en.wikipedia.org/wiki/Monty_Hall_problem), the easiest answer is that when you choose a door there's a 1/3 chance you get the woman, so it's more likely (twice as likely) that the woman is behind one of the other two doors. Of course, at first you can't know which one of the two, but after the host reveals the doll behind one of the doors, you're basically left with this choice:
Should I stick with my original choice (with a 1/3 chance of finding the woman behind it) or should I choose "one of the other two doors" (which in total have a 2/3 chance of delivering the desired lady)? In reality you're not choosing "one of the other two doors" (since the host already exposed one), but actually the only door left, which now, after the host revealed the bogus one of the two, has a 2/3 chance of having the woman behind it.
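The "1/3 stick vs. 2/3 switch" argument above can be checked by brute force: there are only three equally likely places the prize can be, so we can just enumerate them. A minimal Python sketch (not from the thread, assumptions noted in comments):

```python
# Enumerate the three equally likely prize positions behind doors 0, 1, 2,
# assuming (WLOG) you always pick door 0 first. The host then opens a losing
# door you didn't pick, and "switch" means taking the one remaining door.
def play(prize, switch):
    pick = 0
    # Host opens some door that is neither your pick nor the prize.
    opened = next(d for d in (0, 1, 2) if d != pick and d != prize)
    if switch:
        pick = next(d for d in (0, 1, 2) if d != pick and d != opened)
    return pick == prize

stick_wins = sum(play(prize, switch=False) for prize in (0, 1, 2))
switch_wins = sum(play(prize, switch=True) for prize in (0, 1, 2))
print(stick_wins, switch_wins)  # prints "1 2": sticking wins 1 of 3, switching 2 of 3
```

Sticking wins only in the single case where your first pick was right; switching wins in the other two.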
It is not hard to understand through a tree diagram; it is the semantics that I'm struggling with.
I'm somewhat convinced by the solution, but not entirely.
Can someone explain it with unconditional and conditional probability? Explain it in full?
I much appreciate your reply, trollcast.
Just saw nowonda's reply. Yeah, I've understood as much, but what I'm confused about is why the odds change. Even on the Wikipedia page:
I find this part to be quite critical. The Wikipedia page doesn't clearly examine why many people wrongly believe the odds to be 1/2 and 1/2.
Can anybody elaborate?
In the 1,000,000-door goat example, what I don't quite understand is how we can assume the door he did NOT choose has a 999999/1000000 probability of winning, while the door he chose is almost CERTAIN to have a goat.
That page should give a good explanation of the conditional probabilities in the Monty Hall problem.
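Since conditional probability was asked about above, here's one way the Bayes calculation can be sketched (my own sketch, not from that page). Say you pick door A and the host opens door B; the question is the posterior probability for each door:

```python
from fractions import Fraction

# You pick door A; the host opens door B.
# P(host opens B | prize behind X) under the standard rules:
#   prize at A -> host picks B or C at random      -> 1/2
#   prize at B -> host never opens the prize door  -> 0
#   prize at C -> host is forced to open B         -> 1
prior = Fraction(1, 3)  # each door equally likely a priori
likelihood = {"A": Fraction(1, 2), "B": Fraction(0), "C": Fraction(1)}

# Bayes' rule: posterior = prior * likelihood / evidence
evidence = sum(prior * lk for lk in likelihood.values())
posterior = {door: prior * lk / evidence for door, lk in likelihood.items()}
print(posterior)  # A: 1/3 (stick), B: 0, C: 2/3 (switch)
```

The asymmetry comes from the host: if the prize is behind C he *must* open B, while if it's behind your door A he only opens B half the time, and that is exactly what tilts the posterior to 2/3 for switching.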
thanks for that trollcast!
Here are some trivial but cool tricks; if anyone wants to take the time and break down each one, I'd appreciate it! I've figured most of them out by now.
I may be talking out of my arse, but I'll try an explanation that doesn't require extensive reading, just common sense. As a hint, it's not that "the odds change", magically surging from 33% to 50%; rather, the problem itself changes during the process, because there's more info than we had in the beginning. An example (stupid as it may be) is better, I suppose:
I'm playing a game with you, where I put both my hands in my pockets or I keep them both out (randomly), and you have to guess my choice, without looking in any way. Obviously, you have a 50/50 chance of guessing what I did. Now suppose you don't look intentionally, but somehow you catch a glimpse of my left hand tucked in my pocket. And although you didn't have access to all the information (i.e. you only saw my left hand, not both), that info is enough for you to make a re-assessment of the probabilities you initially assigned for the two cases (hands in pockets - 50% and hands out of the pockets - 50%). Obviously, now the probability for "hands in pockets" magically increased to 100%, because you took into account the new available information.
The example is quite silly, but the principle is similar: new relevant info alters the problem, so you're not actually playing the same game. The game where you choose one of three doors, the host reveals a bogus door (new relevant info), and then asks if you want to change your choice is a completely different game from one where you choose one of three doors and the host asks if you want to change without showing you a bogus door (no new relevant info).
Let the doors be marked A, B, and C. You select A. One of the remaining doors is opened to show it is empty, and you go for the other. Why?
Because you win if the prize is behind either B or C. So by switching you are effectively betting on two doors, while by sticking with your first choice you bet on one. The resulting winning probabilities are 2/3 and 1/3 respectively.
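The same "bet on all the other doors" argument scales to n doors, which also speaks to the 1,000,000-door confusion earlier in the thread: with n doors, switching wins exactly when your first pick was wrong, i.e. with probability (n-1)/n. A quick simulation sketch (my own, using that shortcut):

```python
import random

# With n doors, the host opens all remaining losing doors, so switching
# takes the single door left, which holds the prize unless your first
# pick was already right. Hence switching wins iff the first pick was wrong.
def switch_win_rate(n, trials, rng):
    wins = 0
    for _ in range(trials):
        prize = rng.randrange(n)
        pick = rng.randrange(n)
        wins += (pick != prize)  # first pick wrong -> switching wins
    return wins / trials

rng = random.Random(0)
print(switch_win_rate(3, 100_000, rng))        # close to 2/3
print(switch_win_rate(1_000_000, 1_000, rng))  # close to 0.999999
```

With a million doors your first pick is almost certainly a goat, so the one door the host leaves closed carries essentially all of the 999999/1000000 probability.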
Yet another approach:
If you select doors at random, you will choose a door with a goat behind it 2/3 of the time over the long run, in which case switching wins; the other 1/3 of the time you will have selected the car, in which case switching loses. So in 2/3 of the games you play over the long run, it makes sense to change. IOW, you're twice as likely to have selected a door with a goat behind it than the one with the car behind it, so switching wins 2/3 of the time.
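That long-run frequency claim can be checked with a quick Monte Carlo simulation in which the host's behaviour is modelled explicitly rather than assumed away (a sketch of mine, not from the thread):

```python
import random

# Simulate one full game: random prize, random pick, then the host opens
# a goat door that isn't the contestant's pick, chosen at random when he
# has two options. "switch" means taking the one remaining closed door.
def trial(switch, rng):
    doors = [0, 1, 2]
    prize = rng.choice(doors)
    pick = rng.choice(doors)
    opened = rng.choice([d for d in doors if d != pick and d != prize])
    if switch:
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == prize

rng = random.Random(42)
n = 100_000
switch_rate = sum(trial(True, rng) for _ in range(n)) / n
stick_rate = sum(trial(False, rng) for _ in range(n)) / n
print(f"switch: {switch_rate:.3f}, stick: {stick_rate:.3f}")  # ~0.667 vs ~0.333
```

Over many games the switching strategy settles near 2/3 and sticking near 1/3, matching the argument above.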
My best understanding is that Marilyn vos Savant did not pose the problem clearly enough to eliminate ambiguity, i.e., the problem was not well-posed.
Thanks for the replies, everyone. I finally have a better grasp of it.