What Is the Probability That Player 1 with More Coins Wins in a Coin Toss?

Flexington
Consider 2 players, of which player 1 has one more coin than player 2. Both throw all their coins simultaneously and observe the number of heads.
If all coins are fair, what is the probability that player 1 obtains more heads than player 2?
Equal a priori probabilities give every coin a 1/2 chance of heads, and assuming the coins are distinguishable my gut tells me the answer is a half. However, I am much more inclined to agree with solid mathematics, of which I have none..

Thank you
 
Well, my intuition says that the probability will depend on the number of coins you throw. But let's do some mathematical analysis on the thing.

Let X be Binomial(n+1,1/2) distributed and let Y be Binomial(n,1/2) distributed. The question you ask is the probability P(X>Y). I will do this in two ways:

First we notice that

\begin{eqnarray}
P(X>Y)
& = & \sum_{y=0}^{n}{ P(X>y,\, Y=y) }\\
& = & \sum_{y=0}^{n}{ \sum_{x=y+1}^{n+1}{ P(X=x,\, Y=y) } }\\
& = & \sum_{y=0}^{n}{ \sum_{x=y+1}^{n+1}{ P(X=x)P(Y=y) } }\\
& = & \sum_{y=0}^{n}{ \sum_{x=y+1}^{n+1}{ \binom{n+1}{x}\left(\frac{1}{2}\right)^{n+1}\binom{n}{y}\left(\frac{1}{2}\right)^{n} } }\\
& = & \left(\frac{1}{2}\right)^{2n+1}\sum_{y=0}^{n}{ \sum_{x=y+1}^{n+1}{ \binom{n+1}{x}\binom{n}{y} } }
\end{eqnarray}
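Here is a quick numerical evaluation of that double sum, just to see what it gives (a minimal Python sketch of my own; the function name is arbitrary):

```python
from math import comb

def p_more_heads(n):
    """Exact P(X > Y) for X ~ Binomial(n+1, 1/2) and Y ~ Binomial(n, 1/2),
    evaluated directly from the double sum above."""
    total = sum(comb(n + 1, x) * comb(n, y)
                for y in range(n + 1)
                for x in range(y + 1, n + 2))
    return total / 2 ** (2 * n + 1)

for n in (1, 2, 5, 10, 50):
    print(n, p_more_heads(n))   # prints 0.5 for every n tried
```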

Now, if I didn't make a mistake, then this is the exact probability. But simplifying the sum in closed form is pretty hard (I have no idea how to do it). So, I'll present another way to find the probability, using the Central Limit Theorem.

We wish to find the probability P(X-Y>0). If n is large, then X~Normal((n+1)/2,(n+1)/4), and Y~Normal(n/2,n/4). Thus X-Y~Normal(1/2,(2n+1)/4). So if Z is standard normal, then

P(X-Y>0)\sim P\left(Z>\frac{-1/2}{\sqrt{(2n+1)/4}}\right)=P\left(Z>-\frac{1}{\sqrt{2n+1}}\right)

So, if n is large, then this probability is close to P(Z>0)=1/2, which reinforces your original intuition...
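To see how quickly that approximation settles down, you can just evaluate P(Z > -1/sqrt(2n+1)) for a few values of n (a small sketch, assuming SciPy is available for the normal tail probability):

```python
from math import sqrt
from scipy.stats import norm

# CLT approximation from above: P(X - Y > 0) is roughly P(Z > -1/sqrt(2n+1))
for n in (1, 10, 100, 10_000, 1_000_000):
    print(n, norm.sf(-1 / sqrt(2 * n + 1)))
```

The values sit above 1/2 for small n (where the normal approximation is rough) and drift down toward 1/2 as n grows.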
 
Coins don't care who throws them. There is a 1/2 chance of heads on any coin, regardless of who tosses it. If one player has more coins than the other, then his chances of getting more heads than the other player are greater.

Let's say player 1 has 100 coins and player 2 has 10. The expectation is that player 1 gets 50 heads and player 2 gets 5 heads.

"One more" doesn't lead to that extreme a result, and I'm sure micromass's analysis is right that as the number of coins gets very large, the difference gets very small. Still, the odds are that the player with more coins will get more heads.
 
No, the probability is always 1/2. Using micromass's definitions, write X = Z + B, where Z has the same distribution as Y and is independent of it, and B is Bernoulli(1/2), independent of both. Conditioning on B:

P[X>Y] = (1/2)(P[Z>Y] + P[Z+1>Y])
= (1/2)(P[Z>Y] + P[Z>=Y])
= (1/2)(P[Z>Y] + P[Z<=Y])   (since Z and Y are i.i.d., P[Z>=Y] = P[Y>=Z] = P[Z<=Y])
= 1/2
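For anyone who wants to check this numerically, here's a small exact-enumeration sketch (helper names are mine) that evaluates the right-hand side for one particular n:

```python
from math import comb
from fractions import Fraction
from itertools import product

def binom_half_pmf(n):
    """pmf of Binomial(n, 1/2) as a dict {k: probability}, kept exact."""
    return {k: Fraction(comb(n, k), 2 ** n) for k in range(n + 1)}

def prob(event, n):
    """P[event(Z, Y)] for independent Z, Y ~ Binomial(n, 1/2)."""
    pmf = binom_half_pmf(n)
    return sum(pmf[z] * pmf[y] for z, y in product(pmf, repeat=2) if event(z, y))

n = 7  # player 2's coin count; any value works
p_gt  = prob(lambda z, y: z > y, n)     # P[Z > Y]
p_geq = prob(lambda z, y: z >= y, n)    # P[Z >= Y], which equals P[Z + 1 > Y]
print(Fraction(1, 2) * (p_gt + p_geq))  # P[X > Y]; prints 1/2
```

Changing n doesn't change the printed answer, which is exactly what the symmetry argument says.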
 
bpet said:
No, the probability is always 1/2. Using micromass's definitions, put X = Z + B where Z ~ Y (i.i.d.) and B is Bernoulli(1/2). So

P[X>Y] = (1/2)(P[Z>Y]+P[Z+1>Y])
= (1/2)(P[Z>Y]+P[Z>=Y])
= (1/2)(P[Z>Y]+P[Z<=Y])
= 1/2

bpet, THINK for a minute. Let's say one guy has 1 coin and the other guy has 2 coins. How can you possibly say they will on average flip the same number of heads?
 
phinds said:
bpet, THINK for a minute. Let's say one guy has 1 coin and the other guy has 2 coins. How can you possibly say they will on average flip the same number of heads?

That's not what the formula is saying, nor what the original question was asking.
 
Can you not read? It says "Consider 2 players, of which 1 has one more coin than player 2".
 
Could you explain your reasoning?
 
bpet said:
Could you explain your reasoning?

Yeah, sorry I got snippy.

The problem says that there are two players and that one of them has one more coin than the other. Discounting the case where one of them doesn't even HAVE a coin, the lowest case is 1 coin and 2 coins. On the average the 1-coin guy will get 1/2 of a head and the 2-coin guy will get 1 head.

At 100 coins and 101 coins, it's 50 and 50.5, which isn't much of a difference, but of course it IS still technically true that the plus-one-coin guy will, on average, get more heads.
 
phinds said:
Yeah, sorry I got snippy.

The problem says that there are two players and that one of them has one more coin than the other. Discounting the case where one of them doesn't even HAVE a coin, the lowest case is 1 coin and 2 coins. On the average the 1-coin guy will get 1/2 of a head and the 2-coin guy will get 1 head.

At 100 coins and 101 coins, it's 50 and 50.5, which isn't much of a difference, but of course it IS still technically true that the plus-one-coin guy will, on average, get more heads.

Having a higher expected score doesn't make a player more likely to win (even though the law of large numbers says he should accumulate a higher total over many games), especially here where player 2 wins ties.

In fact, in the following example, player 1 has a higher expected number of points but less than half chance of winning: player 1 gets 0 points with probability 2/3 and 4 points otherwise; player 2 always gets 1 point.

Back to the 2 vs 1 case, it's easy to verify directly: with X ~ Binomial(2, 1/2) and Y ~ Binomial(1, 1/2), P[X>Y] = P[Y=0, X>=1] + P[Y=1, X=2] = (1/2)(3/4) + (1/2)(1/4) = 1/2.
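Both of those claims are easy to check by direct enumeration; here's a short sketch using exact fractions (variable names are just illustrative):

```python
from fractions import Fraction as F

# Counterexample: player 1 scores 0 w.p. 2/3 and 4 w.p. 1/3; player 2 always scores 1.
p1 = {0: F(2, 3), 4: F(1, 3)}
p2 = {1: F(1)}
mean1 = sum(k * p for k, p in p1.items())
p1_wins = sum(pa * pb for a, pa in p1.items() for b, pb in p2.items() if a > b)
print(mean1, p1_wins)   # 4/3 (higher than player 2's 1), yet only a 1/3 chance of winning

# The 2-coin vs 1-coin game: X ~ Binomial(2, 1/2), Y ~ Binomial(1, 1/2).
px = {0: F(1, 4), 1: F(1, 2), 2: F(1, 4)}
py = {0: F(1, 2), 1: F(1, 2)}
print(sum(px[x] * py[y] for x in px for y in py if x > y))   # 1/2
```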
 