
To illustrate just how I get myself into trouble, please forgive the following abysmal attempt at formulating a question. Let's take a coin flip. The probability of landing on heads when a coin is flipped is 1/2. Now, if I ask "what does that mean?", someone might say it means the chance of landing on heads is 0.5. Someone might say, less precisely, that the coin "will land on heads half the time." Or they might say that "on average, the coin will land on heads 50% of the time." I say "sure, okay, I can buy that," but then I keep thinking. I may flip a coin 100 times and find that I get heads every single time, right? And there is nothing wrong with this. And someone might say that they can tell me the probability of such a thing happening...and assure me that it is small. And this makes sense to me. But then I start to wonder...wasn't it supposed to be 50%? I could even ask: what are the chances that if I flip a coin 100 times, I will get exactly 50 heads? I don't know the answer, but I know it's obviously not 100%. I'm sure 50 is the most likely number of heads to observe (whatever that means), but even the most likely number may not be a very likely outcome in an absolute sense.
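To put an actual number on that last question, here is a quick sketch in Python (assuming independent, fair flips): the chance of exactly k heads in n flips is the binomial term C(n, k) / 2^n, so the most likely count, 50 heads out of 100, still happens less than 8% of the time.

```python
from math import comb

def prob_exact_heads(n: int, k: int) -> float:
    """Probability of exactly k heads in n independent fair flips:
    C(n, k) / 2**n (binomial distribution with p = 1/2)."""
    return comb(n, k) / 2 ** n

p50 = prob_exact_heads(100, 50)
print(p50)  # roughly 0.0796 -- the single most likely count, yet under 8%
```

So "most likely number of heads" and "likely in an absolute sense" really are different things.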

So just what exactly is meant by the statement "the probability of heads is 0.5"? How can one verify such a statement empirically? Or how can that statement lead to predictions about what a set of coin flips will look like? With any given set, no matter how large, how can you know that you are not observing an unlikely series? Aren't all series of coin flips equally likely? H-H-H is no more or less likely than H-T-T or H-T-H, right? So how can H-H-H...100 times in a row be considered unlikely? Isn't it just as likely as any other sequence of outcomes?
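The tension between "every sequence is equally likely" and "all heads is unlikely" can be made concrete by brute force on a tiny case (a sketch, assuming three independent fair flips): every individual sequence has probability exactly 1/8, yet the number of heads across sequences is far from uniform.

```python
from itertools import product
from collections import Counter

# All 2**3 = 8 sequences of three fair flips are equally likely (1/8 each)...
sequences = ["".join(s) for s in product("HT", repeat=3)]

# ...but the *counts* of heads they produce are not uniform:
# one and two heads each arise in 3 sequences; all-heads and
# no-heads arise in only 1 sequence each.
head_counts = Counter(seq.count("H") for seq in sequences)
print(head_counts)
```

It is the count of heads, not any particular sequence, that the binomial "bump" around 50% describes; H-H-H is improbable only as a member of the small class "all heads," not relative to any other specific sequence.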

I know this isn't a well-formulated question, but I hope you can see the difficulty I have, and I hope a bit of discussion will help enlighten me. Thanks for taking the time...