Autocorrelation of a Bernoulli Coin Flipping Experiment

In summary, the coin-flipping Bernoulli process has mean E{x[n]} = 2p − 1, and its autocorrelation equals 1 at lag 0 and (2p − 1)^2 at any lag m not equal to 0. This is because successive flips of a Bernoulli process are independent, so the covariance (and hence the correlation) between different flips is zero.
  • #1
tworitdash
I am confused at one point. The coin-flipping Bernoulli process has probability p of getting HEADS and probability 1 − p of getting TAILS. Let's define a random variable x[n] that takes the value +1 for HEADS and −1 for TAILS. The mean, or expectation, of x[n] is then 2p − 1, which I can derive by summing each value weighted by its probability: (+1)·p + (−1)·(1 − p). However, for the autocorrelation, E{x[n+m]x[n]} = 1 when m = 0, and it becomes (2p − 1)^2 when m ≠ 0. I basically get it when I try to understand the physical meaning, but mathematically how is this calculation done? Because the autocorrelation is a second-order property, it becomes (2p − 1)^2. However, not for m = 0. Why and how?

$$E\{x[n]\} = 2p-1$$
$$E\{x[n+m]\,x[n]\} = 1 \quad \text{for } m = 0$$
$$E\{x[n+m]\,x[n]\} = (2p-1)^2 \quad \text{for } m \neq 0$$
 
  • #2
If p = 0.5, then 2p - 1 = 0. That means that a flip is correlated with itself (##m=0##) but any two different flips (##m\neq 0##) are completely uncorrelated. That's what you'd expect. What is confusing you about that?
 
  • #3
tworitdash said:
E{x[n+m]x[n]} = 1 when m = 0, and it becomes (2p − 1)^2 when m ≠ 0. I basically get it when I try to understand the physical meaning, but mathematically how is this calculation done? Because the autocorrelation is a second-order property, it becomes (2p − 1)^2. However, not for m = 0. Why and how?
Presumably, you calculated the probabilities that x[n]=1 and x[n+m]=1; x[n]=1 and x[n+m]=-1; x[n]=-1 and x[n+m]=1; and x[n]=-1 and x[n+m]=-1. Are those probabilities the same when ##m=0## and ##m\ne 0##?
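For concreteness, those four joint probabilities can be tabulated; this is a sketch assuming the flips at distinct times are independent, with the ##m = 0## column being just the single-flip distribution (since ##x[n+m]## and ##x[n]## are then the same flip):
$$
\begin{array}{l|c|c}
 & m \neq 0 & m = 0 \\ \hline
P(x[n]=+1,\; x[n+m]=+1) & p^2 & p \\
P(x[n]=+1,\; x[n+m]=-1) & p(1-p) & 0 \\
P(x[n]=-1,\; x[n+m]=+1) & (1-p)p & 0 \\
P(x[n]=-1,\; x[n+m]=-1) & (1-p)^2 & 1-p
\end{array}
$$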
 
  • #4
vela said:
Presumably, you calculated the probabilities that x[n]=1 and x[n+m]=1; x[n]=1 and x[n+m]=-1; x[n]=-1 and x[n+m]=1; and x[n]=-1 and x[n+m]=-1. Are those probabilities the same when ##m=0## and ##m\ne 0##?
Yes, it should be the same when m = 0, shouldn't it? But is it an expression in p, or just 1?
 
  • #5
tworitdash said:
However, for the autocorrelation, E{x[n+m]x[n]} = 1 when m = 0, and it becomes (2p − 1)^2 when m ≠ 0. [...] mathematically how is this calculation done? [...] Why and how?

If the successive flips are "independent", the ##X[n]## results are uncorrelated; that is, the auto-covariance is
$$\text{Cov}(X[n], X[k]) = 0 \; \text{for} \; n \neq k.$$ Since
$$\text{Cor}(X[n],X[k]) \equiv \frac{\text{Cov}(X[n],X[k])}{\sigma_{X[n]} \sigma_{X[k]}}, $$
(where ##\sigma_X## is the standard deviation of ##X##) it follows that the correlation is zero as well.
Remember: the covariance ##\text{Cov}## is defined as
$$\text{Cov}(X[n],X[k]) \equiv E[ (X[n]-E X[n]) (X[k] - E X[k])], $$
and this evaluates to
$$\text{Cov}(X[n],X[k]) = E (X[n] X[k] ) - (E X[n]) (E X[k]).$$ For a Bernoulli process this is zero when ##n \neq k##, because independence gives ##E(X[n] X[k]) = E(X[n])\, E(X[k])##. For ##n = k## the covariance reduces to the variance:
$$ \text{Var} (X[n]) = E( X[n]- E X[n])^2 = E(X^2[n]) - (E X[n])^2 = 4p(1-p).$$
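For completeness, here is a direct computation of the autocorrelation itself, a sketch that simply sums the product ##x[n+m]\,x[n]## over the four joint outcomes. For ##m \neq 0##, independence gives
$$E\{x[n+m]\,x[n]\} = (+1)(+1)\,p^2 + (+1)(-1)\,p(1-p) + (-1)(+1)\,(1-p)p + (-1)(-1)\,(1-p)^2 = \big(p - (1-p)\big)^2 = (2p-1)^2,$$
while for ##m = 0## the product is ##x[n]^2##, which equals 1 for both outcomes (##(+1)^2 = (-1)^2 = 1##), so ##E\{x[n]^2\} = 1## regardless of ##p##.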
 

1. What is autocorrelation in a Bernoulli coin flipping experiment?

Autocorrelation in a Bernoulli coin-flipping experiment is the expected product of the outcomes of two flips a fixed lag apart, E{x[n+m]x[n]}. In other words, it is a measure of how related the outcome of one flip is to the outcome of another flip m steps away.

2. How is autocorrelation calculated in a Bernoulli coin flipping experiment?

When the probabilities are known, the autocorrelation is computed as an expectation: sum the product x[n+m]x[n] over all joint outcomes, weighted by their probabilities. For the ±1 coin this gives 1 at lag 0 and (2p − 1)^2 at any other lag. From observed data it can be estimated with the sample autocorrelation function, or, after subtracting the mean and normalizing, with the Pearson correlation coefficient between the sequence and a lagged copy of itself, as sketched below.
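Below is a minimal sketch in Python (assuming NumPy is available; the helper names, the choice p = 0.3, and the lags are illustrative, not from the thread) that simulates a biased coin mapped to ±1 and compares the sample estimate of E{x[n+m]x[n]} with the theoretical values 1 and (2p − 1)^2:

```python
import numpy as np

def bernoulli_pm1(n_flips, p, rng):
    """Simulate n_flips coin flips: +1 for HEADS (probability p), -1 for TAILS."""
    return np.where(rng.random(n_flips) < p, 1.0, -1.0)

def autocorr_estimate(x, m):
    """Sample estimate of E{x[n+m] x[n]} at lag m (no mean subtraction)."""
    if m == 0:
        return np.mean(x * x)
    return np.mean(x[m:] * x[:-m])

rng = np.random.default_rng(0)
p = 0.3
x = bernoulli_pm1(1_000_000, p, rng)

print("sample mean:", x.mean(), "theory:", 2 * p - 1)
for m in (0, 1, 5):
    theory = 1.0 if m == 0 else (2 * p - 1) ** 2
    print(f"lag {m}: estimate {autocorr_estimate(x, m):.4f}, theory {theory:.4f}")
```

With a large number of flips, the lag-0 estimate is exactly 1 (since x[n]^2 = 1 for every flip) and the nonzero-lag estimates cluster around (2p − 1)^2 = 0.16, matching the values derived in the thread above.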

3. What does it mean if the autocorrelation coefficient is close to 0 in a Bernoulli coin flipping experiment?

If the autocorrelation coefficient is close to 0 in a Bernoulli coin flipping experiment, it means there is no significant correlation between outcomes at different lags. This is consistent with the coin flips being independent of each other, with no pattern or trend in the sequence of outcomes.

4. Can autocorrelation affect the results of a Bernoulli coin flipping experiment?

Yes, autocorrelation can affect the results of a Bernoulli coin flipping experiment. If there is significant autocorrelation at nonzero lags, the outcomes of the flips are not independent: some underlying pattern or trend links one flip to the next. This violates the Bernoulli assumption, can bias the results, and affects the validity of the experiment.

5. How can autocorrelation be reduced in a Bernoulli coin flipping experiment?

Autocorrelation in a Bernoulli coin flipping experiment can be reduced by ensuring that the coin flips are truly random and independent of each other. This can be achieved by using a properly balanced coin, flipping it with the same force and in the same manner each time, and recording the results carefully. Increasing the number of flips does not remove dependence between flips, but it does make the estimated autocorrelation more reliable, so any genuine dependence is easier to detect.
