Runty_Grunty
I know assignment questions and such aren't meant to be placed in here, but there's no spot in the "homework section" that's meant for statistics. Move this to the appropriate section if necessary; I couldn't find it.
Here I have an image of the questions we are meant to answer.
The first five parts have been finished, courtesy of a helper from http://www.mathhelpforum.com/math-help/f8/using-joint-probability-mass-functions-multiple-parts-162646.html (I give him credit for this).
a)
It is clear from the definition of a marginal pmf that the two are the same: the joint pmf [tex]P(X=i,Y=j)=\frac{\alpha}{(1+i+j)!}[/tex] is symmetric in [tex]i[/tex] and [tex]j[/tex], so summing out either variable gives the same function.
[tex]P_X (X=k)=\sum_{j=0}^\infty P(X=k,Y=j) = \sum_{j=0}^\infty P(X=j,Y=k) = P_Y (Y=k)[/tex]
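As a numerical sanity check, here's a minimal Python sketch (my own, not part of the original solution; the helper name `joint`, the truncation bound `N`, and the value [tex]\alpha=e^{-1}[/tex] taken from part c) are assumptions for illustration) confirming the two marginals agree:
[code]
import math

ALPHA = math.exp(-1)   # alpha = e^{-1}, derived in part c)
N = 60                 # truncation point for the infinite sums (my choice)

def joint(i, j):
    # Joint pmf from the problem: P(X=i, Y=j) = alpha / (1+i+j)!
    return ALPHA / math.factorial(1 + i + j)

# Marginals obtained by summing out the other variable (truncated at N).
p_x = [sum(joint(i, j) for j in range(N)) for i in range(10)]
p_y = [sum(joint(i, j) for i in range(N)) for j in range(10)]

for k in range(10):
    assert math.isclose(p_x[k], p_y[k])
print("marginals agree for k = 0..9")
[/code]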
b)
Let [tex]S=X+Y[/tex]. To calculate [tex]P(S=k)[/tex], consider the first few values of [tex]k[/tex]:
[tex]k=0 \implies X=0,Y=0 \text{ so } P(S=0)=P(X=0,Y=0)=\frac{\alpha}{(1+0+0)!}=\frac{\alpha}{0!}[/tex]
by the definition of factorials, [tex]0!=1!=1[/tex].
[tex]k=1 \implies X=0,Y=1;X=1,Y=0 \text{ so } P(S=1)=2 \cdot P(X=1,Y=0)=2\cdot \frac{\alpha}{(1+1+0)!} = \frac{2\alpha}{1\cdot 2}=\frac{\alpha}{1!}[/tex]
Since [tex]P(X=1,Y=0)=P(X=0,Y=1)[/tex].
[tex]k=2 \implies X=0,Y=2;X=2,Y=0;X=1,Y=1 \text{ so } P(S=2)= 2 \cdot \frac{\alpha}{(1+2+0)!}+\frac{\alpha}{(1+1+1)!}=\frac{3\cdot \alpha}{3!}=\frac{\alpha}{2!}[/tex]
In general, there are [tex]k+1[/tex] pairs [tex](i,k-i)[/tex] with [tex]X+Y=k[/tex], each of probability [tex]\frac{\alpha}{(k+1)!}[/tex], so [tex]P(S=k)=\frac{(k+1)\alpha}{(k+1)!}=\frac{\alpha}{k!}[/tex], which is the given formula.
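To see the pattern continue past [tex]k=2[/tex], a short check (same sketch conventions as above, nothing here is from the original solution) sums the joint pmf along each diagonal [tex]i+j=k[/tex]:
[code]
import math

ALPHA = math.exp(-1)  # any positive alpha would do for this identity

def joint(i, j):
    return ALPHA / math.factorial(1 + i + j)

# P(S=k) is the sum of the k+1 joint probabilities with i + j = k.
for k in range(15):
    p_s = sum(joint(i, k - i) for i in range(k + 1))
    assert math.isclose(p_s, ALPHA / math.factorial(k))
print("P(S=k) = alpha/k! holds for k = 0..14")
[/code]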
c)
For any pmf, the sum over all possible values equals 1, so we can solve for [tex]\alpha[/tex]:
[tex]1=\sum_{k=0}^\infty \frac{\alpha}{k!}=\alpha \cdot e^1[/tex]
Therefore, [tex]\alpha=e^{-1}[/tex], so S is Poisson distributed with parameter [tex]\lambda =1[/tex].
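A one-liner confirms the normalization numerically (truncating the series at 100 terms, again just a sketch):
[code]
import math

total = sum(math.exp(-1) / math.factorial(k) for k in range(100))
print(total)  # ~1.0, so alpha = e^{-1} indeed makes P(S=k) a valid pmf
[/code]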
d)
[tex]P(X=0)=\sum_{y=0}^\infty \frac{e^{-1}}{(1+y)!}=e^{-1}\sum_{m=1}^\infty \frac{1}{m!}=e^{-1}(e-1)=1-e^{-1}[/tex]
If X and Y were independent, then by the definition of independence we would have
[tex]P(X=0,Y=0)=P(X=0)P(Y=0)[/tex]
LHS: [tex]P(X=0,Y=0)=P(S=0)=e^{-1}\approx 0.3679[/tex]
RHS: [tex]P(X=0)P(Y=0)=(1-e^{-1})^2\approx 0.3996 \neq \text{LHS}[/tex], using [tex]P(Y=0)=P(X=0)[/tex] since X and Y have the same distribution.
Hence, they are not independent.
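The same sketch makes the failure of independence concrete (N = 60 truncation, as before; all names are my own):
[code]
import math

ALPHA = math.exp(-1)
N = 60

def joint(i, j):
    return ALPHA / math.factorial(1 + i + j)

p_x0 = sum(joint(0, j) for j in range(N))  # P(X=0) = 1 - e^{-1}
lhs = joint(0, 0)                          # P(X=0, Y=0) = e^{-1}
rhs = p_x0 * p_x0                          # P(X=0) * P(Y=0), using P(Y=0) = P(X=0)
print(f"{lhs:.4f} vs {rhs:.4f}")           # 0.3679 vs 0.3996 -- not equal
[/code]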
e)
Since [tex]S \sim \text{Poi}(1)[/tex], its expected value is [tex]\lambda =1[/tex].
By part a), X and Y have the same distribution and hence the same expected value, so
[tex]E(S)=1=E(X)+E(Y)=2\cdot E(X)[/tex]
Therefore, [tex]E(X)=0.5[/tex].
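And a double sum over the (truncated) joint pmf recovers this value directly, under the same sketch assumptions:
[code]
import math

ALPHA = math.exp(-1)
N = 60

def joint(i, j):
    return ALPHA / math.factorial(1 + i + j)

# E(X) = sum over i, j of i * P(X=i, Y=j), truncated at N terms each.
e_x = sum(i * joint(i, j) for i in range(N) for j in range(N))
print(round(e_x, 6))  # 0.5
[/code]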
f) and g) have not been answered yet, so I could use a hand on them. Also, if any mistakes have been made, can you point them out?