
Autocorrelation of a Bernoulli Coin Flipping Experiment

  • #1
I am confused at one point. A coin-flipping Bernoulli process has probability p of getting HEADS and probability 1 - p of getting TAILS. Define a random variable x[n] that takes the value +1 on HEADS and -1 on TAILS. The mean (expected value) of x[n] is 2p - 1, which I can derive by summing each value weighted by its probability: (+1)·p + (-1)·(1 - p). However, for the autocorrelation I am told that E{x[n+m]x[n]} = 1 when m = 0, and (2p - 1)^2 when m ≠ 0. I roughly understand the physical meaning, but how is this calculated mathematically? Since the autocorrelation is a second-order property, it becomes (2p - 1)^2, but not for m = 0. Why, and how?


E{x[n]} = 2p − 1

E{x[n+m]x[n]} = 1 for m = 0
E{x[n+m]x[n]} = (2p − 1)^2 for m ≠ 0
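The two claimed values can be checked numerically. The sketch below simulates the process with NumPy; the choice p = 0.3, lag m = 3, and the sample size are arbitrary, made up here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

p = 0.3          # probability of HEADS (arbitrary choice for the check)
N = 200_000      # number of flips

# x[n] = +1 for HEADS (prob p), -1 for TAILS (prob 1-p)
x = rng.choice([1, -1], size=N, p=[p, 1 - p])

# Sample autocorrelation E{x[n+m] x[n]} at lags m = 0 and m = 3
r0 = np.mean(x * x)            # lag 0: x[n]^2 = 1 for every flip
r3 = np.mean(x[3:] * x[:-3])   # lag 3: product of independent flips

print(r0)                    # exactly 1.0
print(r3, (2*p - 1)**2)      # close to (2p - 1)^2 = 0.16
```

The lag-0 estimate is exactly 1 regardless of p, because each product term x[n]·x[n] is (±1)² = 1.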
 

Answers and Replies

  • #2
RPinPA
Science Advisor
Homework Helper
If p = 0.5, then 2p - 1 = 0. That means that a flip is correlated with itself (##m=0##) but any two different flips (##m\neq 0##) are completely uncorrelated. That's what you'd expect. What is confusing you about that?
 
  • #3
vela
Staff Emeritus
Science Advisor
Homework Helper
Education Advisor
E{x[n+m]x[n]} = 1 when m = 0, and (2p − 1)^2 when m ≠ 0. ... but mathematically how is this calculation done?
Presumably, you calculated the probabilities that x[n]=1 and x[n+m]=1; x[n]=1 and x[n+m]=-1; x[n]=-1 and x[n+m]=1; and x[n]=-1 and x[n+m]=-1. Are those probabilities the same when ##m=0## and ##m\ne 0##?
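The four-case calculation vela suggests can be written out directly. A minimal sketch (p = 0.3 is an arbitrary value for illustration): for m ≠ 0 the two flips are independent, so each joint probability factors; for m = 0 both factors are the same flip, so only the (+1, +1) and (−1, −1) cases can occur.

```python
from itertools import product

p = 0.3  # arbitrary HEADS probability for illustration

prob = {1: p, -1: 1 - p}  # P(x[n] = +1) and P(x[n] = -1)

# m != 0: flips independent, so P(x[n]=a, x[n+m]=b) = prob[a] * prob[b]
r_lag = sum(a * b * prob[a] * prob[b] for a, b in product([1, -1], repeat=2))

# m == 0: x[n+m] and x[n] are the SAME flip; only (+1,+1) and (-1,-1) occur
r_zero = sum(a * a * prob[a] for a in [1, -1])

print(r_lag, (2*p - 1)**2)  # p^2 - 2p(1-p) + (1-p)^2 = (2p - 1)^2
print(r_zero)               # p + (1 - p) = 1.0
```

So the answer to the question above is: no, the joint probabilities are not the same in the two cases, and that difference is exactly where the 1 versus (2p − 1)^2 split comes from.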
 
  • #4
Are those probabilities the same when ##m=0## and ##m\ne 0##?
Yes, it should be the same when m = 0, shouldn't it? But is it an expression in p, or just 1?
 
  • #5
Ray Vickson
Science Advisor
Homework Helper
Dearly Missed
E{x[n+m]x[n]} = 1 for m = 0
E{x[n+m]x[n]} = (2p − 1)^2 for m ≠ 0 ... but mathematically how is this calculation done?
If the successive flips are "independent", the ##X[n]## results are uncorrelated; that is, the auto-covariance is
$$\text{Cov}(X[n], X[k]) = 0 \; \text{for} \; n \neq k.$$ Since
$$\text{Cor}(X[n],X[k]) \equiv \frac{\text{Cov}(X[n],X[k])}{\sigma_{X[n]} \sigma_{X[k]}}, $$
(where ##\sigma_X## is the standard deviation of ##X##) it follows that the correlation is zero as well.
Remember: the covariance ##\text{Cov}## is defined as
$$\text{Cov}(X[n],X[k]) \equiv E[ (X[n]-E X[n]) (X[k] - E X[k])], $$
and this evaluates to
$$\text{Cov}(X[n],X[k]) = E (X[n] X[k] ) - (E X[n]) (E X[k]).$$ For a Bernoulli process this is zero when ##n \neq k##, since independence gives ##E(X[n]X[k]) = (E X[n])(E X[k])##.


For ##n = k## the covariance reduces to the variance:
$$ \text{Var} (X[n]) = E( X[n]- E X[n])^2 = E(X^2[n]) - (E X[n])^2 = 1 - (2p-1)^2 = 4p(1-p).$$
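The algebraic identity at the end of this derivation can be sanity-checked numerically, since E(X²[n]) = 1 for a ±1-valued variable. A quick sketch over a few arbitrary values of p:

```python
# Check the identity Var(X[n]) = E(X^2[n]) - (E X[n])^2 = 1 - (2p-1)^2 = 4p(1-p)
for p in [0.1, 0.25, 0.5, 0.9]:
    EX = 2*p - 1     # E X[n] = (+1)p + (-1)(1-p)
    EX2 = 1.0        # E X^2[n] = 1, since X[n] = +/-1 always
    var = EX2 - EX**2
    assert abs(var - 4*p*(1 - p)) < 1e-12

print("identity holds")  # Var(X[n]) matches 4p(1-p) at every tested p
```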
 
