MHB Bivariate discrete random variable

Summary
The discussion concerns the correlation coefficient between N and X, where a biased coin with P(H) = 1/3 is tossed N times, N follows a Poisson distribution with mean 1, and X is the number of heads obtained. Since the tosses and N are independent, the joint distribution factors as P(X=x, N=n) = P(X=x|N=n)P(N=n). The variance of N is immediate, while the mean and variance of X and the covariance of X and N require summing over the joint distribution; the key observation is that in each double sum one can sum over x first, using the conditional distribution of X given N = n. These moments then yield the correlation coefficient.
Yankel
Hello

I am trying to solve this problem:

A coin is given with probability 1/3 for heads (H) and 2/3 for tails (T).
The coin is tossed N times, where N is a Poisson random variable with E(N) = 1. The tosses and N are independent. Let X be the number of heads (H) in the N tosses. What is the correlation coefficient of X and N?

So I started by building a table as if it were a finite problem, just to see how it behaves, but that didn't get me far. Since there is independence, every event satisfies P(X=x, N=n) = P(X=x|N=n)·P(N=n), so this is like a tree-diagram sample space. To find the correlation, I need the covariance and the two variances. The variance of N is easy: it equals 1. How do I find the rest?

Thanks !
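(For anyone who wants to experiment with the table the poster describes: the factorization P(X=x, N=n) = P(X=x|N=n)·P(N=n) can be tabulated directly. A minimal Python sketch, assuming that independent tosses make X given N = n a Binomial(n, 1/3) variable; the truncation point n = 40 is an arbitrary choice where the Poisson(1) tail is negligible.)

```python
import math

lam, p = 1.0, 1 / 3        # E(N) = 1, P(head) = 1/3

def f_N(n):
    """Poisson pmf: P(N = n)."""
    return math.exp(-lam) * lam**n / math.factorial(n)

def f_joint(x, n):
    """P(X = x, N = n) = P(X = x | N = n) * P(N = n),
    with X | N = n ~ Binomial(n, p) because the tosses are independent."""
    return math.comb(n, x) * p**x * (1 - p) ** (n - x) * f_N(n)

# Sanity check: the joint pmf should sum to 1 (Poisson tail truncated at n = 40).
total = sum(f_joint(x, n) for n in range(41) for x in range(n + 1))
print(total)  # ≈ 1.0
```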
 
Yankel said:
(original question quoted above)

You have \(\bar{N}\), \(\sigma_N\) and the joint distribution, so:

$$ \bar{X} = \sum_{n=0}^{\infty}\sum_{x=0}^{n} x\, f_{X,N}(x,n)=\sum_{n=0}^{\infty} \frac{n}{3}\,f_N(n)=\frac{1}{3}\bar{N}$$

$$\sigma^2_X= \sum_{n=0}^{\infty}\sum_{x=0}^{n} (x-\bar{X})^2 f_{X,N}(x,n)=\sum_{n=0}^{\infty}\left[\frac{2n}{9}+\frac{(n-\bar{N})^2}{9}\right]f_N(n)=\frac{2}{9}\bar{N}+\frac{\sigma^2_N}{9}$$

(the inner sum over \(x\) gives the conditional variance \(\frac{2n}{9}\) plus the squared offset \(\left(\frac{n}{3}-\bar{X}\right)^2=\frac{(n-\bar{N})^2}{9}\) of the conditional mean from \(\bar{X}\))

$${\rm{Cov}}(X,N)= \sum_{n=0}^{\infty}\sum_{x=0}^{n} (x-\bar{X})(n-\bar{N}) f_{X,N}(x,n)=\sum_{n=0}^{\infty}\frac{(n-\bar{N})^2}{3}f_N(n)=\frac{\sigma^2_N}{3}$$

so:

$$\rho_{X,N}=\frac{{\rm{Cov}}(X,N)}{\sigma_X \sigma_N}=\ ...$$

The key idea here is that in each double summation you can always sum over \(x\) first, using the conditional distribution of \(X\) given \(N=n\).
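These double sums can also be checked numerically by truncating the Poisson series. A small sketch (not from the thread; the truncation point N_MAX = 60 is an arbitrary choice beyond which the Poisson(1) tail is negligible):

```python
import math

lam, p = 1.0, 1 / 3        # E(N) = Var(N) = 1, P(head) = 1/3
N_MAX = 60                 # truncation point for the Poisson(1) series

def f_N(n):
    return math.exp(-lam) * lam**n / math.factorial(n)

def f_joint(x, n):
    # P(X = x, N = n), with X | N = n ~ Binomial(n, p)
    return math.comb(n, x) * p**x * (1 - p) ** (n - x) * f_N(n)

pairs = [(x, n) for n in range(N_MAX) for x in range(n + 1)]

mean_N, var_N = lam, lam   # Poisson mean and variance
mean_X = sum(x * f_joint(x, n) for x, n in pairs)
var_X = sum((x - mean_X) ** 2 * f_joint(x, n) for x, n in pairs)
cov = sum((x - mean_X) * (n - mean_N) * f_joint(x, n) for x, n in pairs)
rho = cov / math.sqrt(var_X * var_N)

print(mean_X, var_X, cov, rho)  # 1/3, 1/3, 1/3, 1/sqrt(3) ≈ 0.577
```

The numbers agree with the formulas above: \(\bar X = \frac{1}{3}\), \(\sigma_X^2 = \frac{2}{9} + \frac{1}{9} = \frac{1}{3}\), \({\rm Cov}(X,N) = \frac{1}{3}\), so \(\rho_{X,N} = \frac{1}{\sqrt{3}}\).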

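The same answer can be cross-checked by straight simulation. A Monte Carlo sketch using only the standard library (the Poisson sampler and the trial count are illustrative choices, not part of the thread):

```python
import math
import random

random.seed(12345)
lam, p, trials = 1.0, 1 / 3, 200_000

def sample_poisson(lam):
    # Knuth's multiplication method; adequate for small lam
    limit, k, prod = math.exp(-lam), 0, 1.0
    while True:
        k += 1
        prod *= random.random()
        if prod <= limit:
            return k - 1

xs, ns = [], []
for _ in range(trials):
    n = sample_poisson(lam)
    x = sum(random.random() < p for _ in range(n))  # heads among n tosses
    xs.append(x)
    ns.append(n)

mx, mn = sum(xs) / trials, sum(ns) / trials
var_x = sum((x - mx) ** 2 for x in xs) / trials
var_n = sum((n - mn) ** 2 for n in ns) / trials
cov = sum((x - mx) * (n - mn) for x, n in zip(xs, ns)) / trials
rho = cov / math.sqrt(var_x * var_n)
print(rho)  # should land near 1/sqrt(3) ≈ 0.577
```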
 
