Coin tosses, values, and what I'm missing

Thread starter: member 428835
Suppose we toss a coin until we get HHH. On average this will take 14 tosses.

Now how many HHH sequences would we expect if we tossed the coin 14 times? A run of HHH occupies 3 consecutive tosses, so in 14 tosses there are 14 - 3 + 1 = 12 possible starting positions. Each window of three tosses has probability ##0.5^3## of being all heads, so the expected count is ##12 \cdot 0.5^3 = 1.5 \neq 1##. I'm wondering why these two numbers aren't the same. I'm definitely misunderstanding something here.

As a follow-up: we know it takes 14 tosses on average to get HHH. So suppose we play a game where you pay $1 per coin toss; for the game to be fair, when you make HHH you would receive $14 (after you make HHH the game restarts, so you can't let it ride). But now suppose I charge you $10 for a 10-toss game and pay you $10 if you make HHH, with the game ending as soon as you do. It seems to me the expected number of HHH sequences in 10 tosses is ##(10-2) \cdot 0.5^3 = 1##, so I'm pretty confused. Any help?
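To see that these are genuinely different quantities, here is a quick Monte Carlo sanity check (a sketch; the function and variable names are my own):

```python
import random

random.seed(0)
TRIALS = 200_000

def tosses_until_hhh():
    """Toss a fair coin until three heads in a row appear; return the toss count."""
    run, n = 0, 0
    while run < 3:
        n += 1
        run = run + 1 if random.random() < 0.5 else 0
    return n

def hhh_count(n_tosses):
    """Count HHH windows among the n_tosses - 2 overlapping 3-toss windows."""
    s = [random.random() < 0.5 for _ in range(n_tosses)]
    return sum(s[i] and s[i + 1] and s[i + 2] for i in range(n_tosses - 2))

wait = sum(tosses_until_hhh() for _ in range(TRIALS)) / TRIALS
count = sum(hhh_count(14) for _ in range(TRIALS)) / TRIALS
print(wait)   # close to 14: average tosses to the first HHH
print(count)  # close to 1.5: average number of HHH windows in 14 tosses
```

Both estimates land on the analytic values, so 14 and 1.5 answer two different questions rather than contradicting each other.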
 
When we repeat sets of 14 tosses, the average number of HHH sequences per set is 1.5. When we toss repeatedly until the first HHH appears, the average number of tosses is 14. These are two different quantities, so I think we can distinguish them.
 
joshmccraney said:
Suppose we toss a coin until we get HHH. On average this will take 14 tosses.

Now how many HHH sequences would we expect if we tossed the coin 14 times? A run of HHH occupies 3 consecutive tosses, so in 14 tosses there are 14 - 3 + 1 = 12 possible starting positions. Each window of three tosses has probability ##0.5^3## of being all heads, so the expected count is ##12 \cdot 0.5^3 = 1.5 \neq 1##. I'm wondering why these two numbers aren't the same. I'm definitely misunderstanding something here.
The main thing you are missing is independence. As long as the trials are independent the numbers should agree. For example, if you organise the game into separate sets of three tosses, then you have standard binomial trials with ##p = \frac 1 8##.

In the game you describe the trials are not independent. Once you get the first sequence of HHH you have a 50-50 chance of getting another one on the next turn. The sequences of HHH tend to clump together: you have a longish waiting time for the first one (or after a T), then often two or more together.

Note that for the expected time to get HHH you are always starting the game from scratch (effectively a T on a zeroth toss). Whereas, if you start from a random point in a sequence, then you may be starting with H or HH.
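The clumping is easy to see numerically. In one long toss sequence, the unconditional chance that a 3-toss window is HHH is 1/8, but immediately after an HHH window only one more head is needed, so the chance jumps to 1/2 (a sketch; variable names are my own):

```python
import random

random.seed(1)
N = 1_000_000  # one long sequence of fair-coin tosses

s = [random.random() < 0.5 for _ in range(N)]
windows = [s[i] and s[i + 1] and s[i + 2] for i in range(N - 2)]

# Unconditional chance a window is HHH: should be near 1/8.
p_any = sum(windows) / len(windows)

# Chance the window starting one toss later is also HHH, given the current
# window is HHH: only one extra head is needed, so it should be near 1/2.
after = [windows[i + 1] for i in range(len(windows) - 1) if windows[i]]
p_after = sum(after) / len(after)

print(p_any)    # near 0.125
print(p_after)  # near 0.5
```

The two estimates differ by a factor of four, which is exactly the clumping described above: once one HHH appears, further ones follow much more readily than from a cold start.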

joshmccraney said:
As a follow-up: we know it takes 14 tosses on average to get HHH. So suppose we play a game where you pay $1 per coin toss; for the game to be fair, when you make HHH you would receive $14 (after you make HHH the game restarts, so you can't let it ride). But now suppose I charge you $10 for a 10-toss game and pay you $10 if you make HHH, with the game ending as soon as you do. It seems to me the expected number of HHH sequences in 10 tosses is ##(10-2) \cdot 0.5^3 = 1##, so I'm pretty confused. Any help?
By linearity of expectation, the expected number of complete HHH windows in 10 tosses is indeed ##8 \cdot 0.5^3 = 1##, even starting from scratch. But the game pays only once and then ends, so what matters is the probability of getting at least one HHH in 10 tosses. Because the sequences clump, that probability is only about 0.51 (504 of the 1024 length-10 sequences contain no HHH). The game effectively starts with the zeroth toss being a T, which is the worst possible starting point.
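A sketch of the 10-toss game (hypothetical variable names): each of the 8 complete windows contributes ##1/8## to the expected count, but the game pays only once, so the relevant number is the chance of at least one HHH:

```python
import random

random.seed(2)
TRIALS = 200_000

wins = 0   # games with at least one HHH somewhere in the 10 tosses
total = 0  # HHH windows summed over all games
for _ in range(TRIALS):
    s = [random.random() < 0.5 for _ in range(10)]
    c = sum(s[i] and s[i + 1] and s[i + 2] for i in range(8))
    total += c
    wins += c > 0

avg_count = total / TRIALS  # near 1: eight windows, each with probability 1/8
p_win = wins / TRIALS       # near 0.508 = 1 - 504/1024
print(avg_count, p_win)
```

So a $10 payout is worth only about $5.08 in expectation on a $10 stake: the HHH windows clump, and an expected count of 1 does not mean you win once per game.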
 
PS Here is an illustrative example. Suppose the trials follow a strict repeating pattern of eight failures followed by two successes, so the game always looks like FFFFFFFFSSFFFFFFFFSS ...

The expected number of successes in a window of 10 consecutive trials starting at a random point is 2. And, if we start at a random point in the sequence, the probability that the next trial is a success is ##0.2##. But at the start of the game it always takes 9 trials to get the first success.
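Because the pattern is deterministic, the two numbers can be read off directly (a small sketch; the names are my own):

```python
# The strict pattern above: eight failures then two successes, repeated.
pattern = "FFFFFFFFSS" * 100

# Successes in the next 10 trials from each of the 10 possible offsets:
# every length-10 window of a period-10 pattern hits each position once.
per_window = [pattern[k:k + 10].count("S") for k in range(10)]
avg_successes = sum(per_window) / 10
print(avg_successes)   # 2.0 successes per 10 trials, from any offset

# From the very start of the game, trials needed for the first success:
first_success = pattern.index("S") + 1
print(first_success)   # 9
```

The long-run rate (2 successes per 10 trials) says nothing about the wait from the game's fixed starting point, which is exactly the distinction in the coin-toss question.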
 
Makes tons of sense, thanks!
 