Autocorrelation of a Bernoulli Coin Flipping Experiment


Discussion Overview

The discussion revolves around the autocorrelation properties of a Bernoulli coin flipping experiment, specifically focusing on the mathematical derivation and interpretation of autocorrelation values for different lags. Participants explore the implications of the probability of heads (p) and how it affects the correlation between successive flips.

Discussion Character

  • Technical explanation
  • Mathematical reasoning
  • Debate/contested

Main Points Raised

  • One participant defines a random variable x[n] for a Bernoulli process and calculates its mean as (2p - 1), questioning the mathematical derivation of autocorrelation values for lag m = 0 and m ≠ 0.
  • Another participant notes that if p = 0.5, then the correlation for m = 0 is 1, while for m ≠ 0, the flips are uncorrelated, prompting a question about the confusion regarding this outcome.
  • A participant reiterates the autocorrelation values, emphasizing the second-order property and seeking clarification on why the calculation differs for m = 0 compared to m ≠ 0.
  • There is a discussion about whether the probabilities calculated for x[n] and x[n+m] are the same for m = 0 and m ≠ 0, with some participants expressing uncertainty about the expression of p in this context.
  • One participant mentions that if the flips are independent, the auto-covariance is zero for n ≠ k, leading to a correlation of zero, while for n = k, it relates to the variance of the process.

Areas of Agreement / Disagreement

Participants express confusion and seek clarification on the mathematical aspects of autocorrelation, particularly regarding the differences in values for different lags. There is no consensus on the interpretation of these calculations, and multiple viewpoints are presented.

Contextual Notes

Participants reference the independence of flips and the definitions of covariance and correlation, but there are unresolved questions about the implications of these definitions in the context of the Bernoulli process.

tworitdash
I am confused at one point. The coin flipping Bernoulli process has a probability p of getting HEADS and a probability 1 - p of getting TAILS. Let's define a random variable x[n] that takes the value +1 for HEADS and -1 for TAILS. The mean (expectation) of x[n] is (2p - 1), which I can derive by summing each value weighted by its probability: (+1)(p) + (-1)(1 - p). However, for the autocorrelation, E{x[n+m]x[n]} = 1 when the lag m = 0, but (2p - 1)^2 when m != 0. I get it when I think about the physical meaning, but mathematically how is this calculation done? Since the autocorrelation is a second-order property it becomes (2p - 1)^2, yet not for m = 0. Why and how?

E{x[n]} = 2p - 1
E{x[n+m]x[n]} = 1 for [m = 0]
E{x[n+m]x[n]} = (2p - 1)^2 for [m != 0]
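The two cases can also be checked numerically. Below is a minimal Monte Carlo sketch, assuming NumPy; the values p = 0.7, the lag m = 3, and the sample size are illustrative choices, not anything fixed by the problem:

```python
import numpy as np

# Illustrative parameters (not from the original problem statement).
rng = np.random.default_rng(0)
p = 0.7
N = 1_000_000

# x[n] = +1 for HEADS (probability p), -1 for TAILS (probability 1 - p).
x = np.where(rng.random(N) < p, 1.0, -1.0)

# Lag m = 0: x[n] * x[n] = 1 for every flip, so the average is exactly 1.
r0 = np.mean(x * x)

# Lag m != 0: the two flips are independent, so the average of the
# product should be close to (2p - 1)^2 = 0.16 for p = 0.7.
m = 3
rm = np.mean(x[:-m] * x[m:])

print(r0)
print(rm, (2 * p - 1) ** 2)
```

The m = 0 estimate equals 1 exactly (every product is +1), while the m != 0 estimate fluctuates around (2p - 1)^2 with an error on the order of 1/sqrt(N).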
 
If p = 0.5, then 2p - 1 = 0. That means that a flip is correlated with itself (##m=0##) but any two different flips (##m\neq 0##) are completely uncorrelated. That's what you'd expect. What is confusing you about that?
 
tworitdash said:
That is, E{x[n+m]x[n]} = 1 when m = 0, but (2p - 1)^2 when m != 0. I get it when I think about the physical meaning, but mathematically how is this calculation done? Since the autocorrelation is a second-order property it becomes (2p - 1)^2, yet not for m = 0. Why and how?
Presumably, you calculated the probabilities that x[n]=1 and x[n+m]=1; x[n]=1 and x[n+m]=-1; x[n]=-1 and x[n+m]=1; and x[n]=-1 and x[n+m]=-1. Are those probabilities the same when ##m=0## and ##m\ne 0##?
 
vela said:
Presumably, you calculated the probabilities that x[n]=1 and x[n+m]=1; x[n]=1 and x[n+m]=-1; x[n]=-1 and x[n+m]=1; and x[n]=-1 and x[n+m]=-1. Are those probabilities the same when ##m=0## and ##m\ne 0##?
Yes, it should be the same when m = 0, shouldn't it? But is it an expression involving p, or just 1?
 
tworitdash said:
I am confused at one point. The coin flipping Bernoulli process has a probability p of getting HEADS and a probability 1 - p of getting TAILS. Let's define a random variable x[n] that takes the value +1 for HEADS and -1 for TAILS. The mean (expectation) of x[n] is (2p - 1), which I can derive by summing each value weighted by its probability: (+1)(p) + (-1)(1 - p). However, for the autocorrelation, E{x[n+m]x[n]} = 1 when the lag m = 0, but (2p - 1)^2 when m != 0. I get it when I think about the physical meaning, but mathematically how is this calculation done? Since the autocorrelation is a second-order property it becomes (2p - 1)^2, yet not for m = 0. Why and how?

E{x[n]} = 2p - 1
E{x[n+m]x[n]} = 1 for [m = 0]
E{x[n+m]x[n]} = (2p - 1)^2 for [m != 0]

If the successive flips are "independent", the ##X[n]## results are uncorrelated; that is, the auto-covariance is
$$\text{Cov}(X[n], X[k]) = 0 \; \text{for} \; n \neq k.$$ Since
$$\text{Cor}(X[n],X[k]) \equiv \frac{\text{Cov}(X[n],X[k])}{\sigma_{X[n]} \sigma_{X[k]}}, $$
(where ##\sigma_X## is the standard deviation of ##X##) it follows that the correlation is zero as well.
Remember: the covariance ##\text{Cov}## is defined as
$$\text{Cov}(X[n],X[k]) \equiv E[ (X[n]-E X[n]) (X[k] - E X[k])], $$
and this evaluates to
$$\text{Cov}(X[n],X[k]) = E (X[n] X[k] ) - (E X[n]) (E X[k]).$$ For independent flips with ##n \neq k## we have ##E(X[n] X[k]) = (E X[n])(E X[k])##, so this is zero. For ##n = k## the covariance reduces to the variance; since ##X^2[n] = 1## for every outcome,
$$ \text{Var} (X[n]) = E( X[n]- E X[n])^2 = E(X^2[n]) - (E X[n])^2 = 1 - (2p-1)^2 = 4p(1-p).$$
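Both facts, the near-zero covariance of distinct flips and the variance 4p(1-p), can be verified with a short simulation. A sketch assuming NumPy, with an illustrative p = 0.3 (so 4p(1-p) = 0.84):

```python
import numpy as np

# Illustrative parameters (not taken from the thread).
rng = np.random.default_rng(1)
p = 0.3
x = np.where(rng.random(500_000) < p, 1.0, -1.0)

# Sample covariance between a flip and the next one (n != k):
# independent flips, so this should be near zero.
cov = np.cov(x[:-1], x[1:])[0, 1]

# Sample variance of a single flip: should approach 4p(1-p).
var = np.var(x)

print(cov)
print(var, 4 * p * (1 - p))
```

The lag-1 pairing is just one convenient choice of n != k; any other fixed lag would behave the same way for independent flips.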
 
