Using joint probability mass functions

Runty_Grunty
I know assignment questions and such aren't meant to be placed in here, but there's no spot in the "homework section" that's meant for statistics. Move this to the appropriate section if necessary; I couldn't find one.

Here is an image of the questions we are meant to answer.
[Attachment: Assignment3Q4.jpg — the assignment questions. From the working below, the joint pmf is P(X=x,Y=y)=\frac{\alpha}{(1+x+y)!} for integers x,y \geq 0.]


The first five parts have been finished, courtesy of a helper from http://www.mathhelpforum.com/math-help/f8/using-joint-probability-mass-functions-multiple-parts-162646.html (credit to him for this).
a)
It is clear from the definition of a marginal pmf that the two marginals are the same: the joint pmf \frac{\alpha}{(1+x+y)!} is symmetric in x and y, so summing out either variable gives the same function. For any k,
P_X(k)=\sum_{j=0}^\infty P(X=k,Y=j)=\sum_{i=0}^\infty P(X=i,Y=k)=P_Y(k)
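A quick numerical sanity check of this (a minimal sketch, not part of the assignment): truncating the infinite sums at N = 50 is an assumption here, justified by the factorial decay of the tail.

```python
from math import factorial, exp

alpha = exp(-1)  # value from part (c); equality of the marginals holds for any constant
N = 50           # truncation point for the infinite sums (assumption of this sketch)

def joint(x, y):
    return alpha / factorial(1 + x + y)

for k in range(5):
    p_x = sum(joint(k, j) for j in range(N))  # P(X = k)
    p_y = sum(joint(i, k) for i in range(N))  # P(Y = k)
    print(k, p_x, p_y)  # the two columns agree
```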

b)
Letting S=X+Y, we calculate P(S=k) case by case:
k=0 \implies X=0,Y=0 \text{ so } P(S=0)=P(X=0,Y=0)=\frac{\alpha}{(1+0+0)!}=\frac{\alpha}{0!}
since 0!=1!=1 by the definition of the factorial.

k=1 \implies (X,Y)\in\{(0,1),(1,0)\} \text{ so } P(S=1)=2\cdot P(X=1,Y=0)=\frac{2\alpha}{(1+1+0)!}=\frac{2\alpha}{1\cdot 2}=\frac{\alpha}{1!}
since P(X=1,Y=0)=P(X=0,Y=1).
k=2 \implies (X,Y)\in\{(0,2),(2,0),(1,1)\} \text{ so } P(S=2)=2\cdot \frac{\alpha}{(1+2+0)!}+\frac{\alpha}{(1+1+1)!}=\frac{3\alpha}{3!}=\frac{\alpha}{2!}
In general there are k+1 pairs (x,k-x) with x+y=k, each with probability \frac{\alpha}{(1+k)!}, so P(S=k)=\frac{(k+1)\alpha}{(k+1)!}=\frac{\alpha}{k!}, which is the given formula.
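A small sketch verifying this pattern numerically (alpha = e^{-1} is taken from part (c); any positive constant would show the same pattern):

```python
from math import factorial, exp

alpha = exp(-1)  # from part (c); the pattern holds for any positive constant

def joint(x, y):
    return alpha / factorial(1 + x + y)

for k in range(6):
    p_s = sum(joint(x, k - x) for x in range(k + 1))  # sum over the k+1 pairs with x+y=k
    print(k, p_s, alpha / factorial(k))  # both columns match: P(S=k) = alpha/k!
```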

c)
For any pmf, the sum over all possible values equals 1, so we can solve for \alpha:
1=\sum_{k=0}^\infty \frac{\alpha}{k!}=\alpha \cdot e^1
Therefore \alpha=e^{-1}, so S is Poisson distributed with parameter \lambda =1.
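A one-line numerical check (the truncation at N = 50 is an assumption of this sketch):

```python
from math import factorial, exp

alpha = exp(-1)
N = 50  # truncation point (assumption of this sketch)

total = sum(alpha / factorial(k) for k in range(N))
print(total)  # ~1.0, so alpha = e^{-1} normalizes the pmf of S
```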

d)
P(X=0)=\sum_{y=0}^\infty \frac{e^{-1}}{(1+y)!}=e^{-1}\sum_{y=1}^\infty \frac{1}{y!}=e^{-1}(e^1-1)=1-e^{-1}
If X and Y were independent, we would have
P(X=0,Y=0)=P(X=0)P(Y=0)
LHS: P(X=0,Y=0)=P(S=0)=e^{-1}\approx 0.3679
RHS: P(X=0)P(Y=0)=(1-e^{-1})^2\approx 0.3996 \neq \text{LHS}, using P(Y=0)=P(X=0) since X and Y have the same distribution.
Hence, they are not independent.
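The same comparison done numerically (again truncating the marginal sum at N = 50 as an assumption):

```python
from math import factorial, exp

alpha = exp(-1)
N = 50  # truncation point (assumption of this sketch)

p_x0 = sum(alpha / factorial(1 + y) for y in range(N))  # P(X=0) = 1 - e^{-1}
lhs = alpha          # P(X=0, Y=0) = P(S=0) = e^{-1}
rhs = p_x0 ** 2      # P(X=0) * P(Y=0), using P(Y=0) = P(X=0) from (a)
print(lhs, rhs)      # ~0.3679 vs ~0.3996, so X and Y are not independent
```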

e)
Since S \sim \text{Poi}(1), its expected value is \lambda =1.
Using the fact from a) that X and Y have the same distribution, and hence the same expected value,
E(S)=1=E(X)+E(Y)=2\cdot E(X)
Therefore, E(X)=0.5.
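And a direct numerical check of E(X) from the joint pmf (both sums truncated at N = 50, an assumption of this sketch):

```python
from math import factorial, exp

alpha = exp(-1)
N = 50  # truncation point for both sums (assumption of this sketch)

e_x = sum(x * alpha / factorial(1 + x + y)
          for x in range(N) for y in range(N))
print(e_x)  # ~0.5, matching E(X) = E(S)/2
```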

f) and g) have not been answered yet, so I could use a hand with them. Also, if any mistakes have been made above, could you point them out?
 
Did you try the hint for (f)? How far did you get?

For (g), use the fact that P(X > Y) = P(X < Y). (Why?) Then observe that exactly one of {X = Y}, {X < Y}, and {X > Y} is true.
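If it helps, here's a minimal numerical sketch of that hint (not a full solution; the truncation at N = 50 is an assumption, justified by the factorial decay). Since the three events partition the sample space and P(X > Y) = P(X < Y), we get P(X > Y) = (1 - P(X = Y))/2, with P(X = Y) = \sum_{k=0}^\infty \frac{\alpha}{(1+2k)!}.

```python
from math import factorial, exp

alpha = exp(-1)
N = 50  # truncation point (assumption of this sketch)

p_eq = sum(alpha / factorial(1 + 2 * k) for k in range(N))  # P(X = Y)
print(p_eq, (1 - p_eq) / 2)  # P(X = Y) ~ 0.4323 and P(X > Y) ~ 0.2838
```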
 