# Does probability depend on the amount of information one has?

1. Nov 20, 2007

### philosophe

The following is the account of an argument that I had with my friend the other day:

At 7:45 PM my friend asked me how late the mail room would be open. I said that I did not know, but that it was unlikely the mail room would be open that late. He then said that he was not busy at the moment, so he might as well go and check; he came back five minutes later with his mail and said that it was in fact open. I said that although it happened to be open, the probability of the mail room being open at 7:45 PM was quite low, so taking that chance wasn't necessarily the best decision. But my friend argues that the probability of the mail room being open at 7:45 PM was 100%, because there is no instance in which he could go down to the mail room at 7:45 PM and find it closed!

Compare this to a multiple choice quiz with four options for every question. Intuitively it seems that putting down A, B, C or D would give you a 25% chance of being correct. But if the answer is B and you mark it D, then you have a 0% chance of being correct. How are the two cases different, if they are different?
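The two views of the quiz question can be made concrete with a short simulation (a minimal sketch; the 25% figure assumes the answer and the guess are each uniformly random over the four options):

```python
import random

random.seed(0)
options = ["A", "B", "C", "D"]

# Unconditional view: before anything is known, a uniformly random
# guess against a uniformly random answer is correct 1/4 of the time.
trials = 100_000
hits = sum(random.choice(options) == random.choice(options) for _ in range(trials))
print(round(hits / trials, 2))  # ≈ 0.25

# Conditional view: once the answer (B) and the guess (D) are both
# fixed, there is no randomness left -- the guess is simply wrong.
answer, guess = "B", "D"
print(1.0 if guess == answer else 0.0)  # 0.0
```

The difference between the two printed numbers is exactly the difference between the two cases: the 25% describes your uncertainty before the answer is known, while the 0% describes the situation after both answer and guess are fixed.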

Basically I want to get to the question of whether or not probability depends on the amount of information one has.

Last edited: Nov 20, 2007
2. Nov 20, 2007

### Krusty

What you're trying to do is estimate a population parameter (the probability that the mail room will be open on any arbitrary day at 7:45) based on a sample of size one.
The estimate of the population parameter is still unbiased, so your friend is right in saying his estimate is 100% since there is no evidence that it is ever closed, but an estimate built on a single observation has enormous variance and is essentially worthless on its own.
By his logic, if you toss a coin and it comes up heads, all subsequent tosses of the same coin must be heads, since the coin has never shown a tail.
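One standard way to see why a sample of size one shouldn't yield a 100% estimate is Laplace's rule of succession, which starts from a uniform prior over the unknown probability (this is an illustrative sketch, not something from the thread):

```python
def laplace_estimate(successes: int, trials: int) -> float:
    """Rule-of-succession estimate: (k + 1) / (n + 2).

    A uniform prior over the unknown probability keeps a small-sample
    estimate away from the extremes 0 and 1.
    """
    return (successes + 1) / (trials + 2)

# Naive frequency estimate after seeing the mail room open once: 1/1 = 100%.
print(1 / 1)                   # 1.0

# With a uniform prior, one "open" observation gives only 2/3.
print(laplace_estimate(1, 1))  # ≈ 0.667

# As data accumulates, the prior's influence fades:
print(laplace_estimate(9, 10))  # ≈ 0.833
```

The same formula handles the coin example: one head gives an estimate of 2/3 for heads, not certainty, so subsequent tails remain perfectly possible.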

3. Nov 21, 2007

### CRGreathouse

You're talking about prior probability (the chance that it would be open on a given day at that time) and he's just talking about probability.

4. Nov 21, 2007

### Chris Hillman

The buzzwords you want are "the Bayesian interpretation of probability" and "prior information". The late Edward Jaynes was working on his magnum opus when he died; fortunately it has been made available over the web and makes for fascinating reading; see this website.

From your handle I guess you might be an 18th-century natural philosopher (i.e., what we'd now call a scientist), but if you are a philosophy student: I have argued that the most important problem in philosophy, which philosophers are almost entirely ignoring (in preference to far less momentous problems like space and time), is the problem of the interpretation of probability (and statistics). I have also pointed out that a result in ergodic Ramsey theory widely regarded as among the most important theorems of the previous century is almost unknown to the public: