Using joint probability mass functions

In summary, the conversation discusses assignment questions and finding the appropriate section for statistics. It also mentions a helper who provided answers for the first five parts of the assignment. The formula for calculating P(S=k) is derived, and it is determined that S is Poisson distributed with parameter \lambda = 1. The independence of X and Y is addressed, and it is concluded that they are not independent. The expected value of X is calculated to be 0.5. Parts f) and g) have not been answered yet, and further assistance is requested. For (f), trying the given hint is suggested; for (g), it is noted that P(X > Y) = P(X < Y) and that exactly one of the events {X = Y}, {X < Y}, and {X > Y} occurs.
  • #1
Runty_Grunty
I know assignment questions and such aren't meant to be placed in here, but there's no spot in the "homework section" that's meant for statistics. Move this to the appropriate section if necessary; I couldn't find it.

Here I have an image of the questions we are meant to answer.
[Attachment: Assignment3Q4.jpg]

The first five parts have been finished, courtesy of a helper from http://www.mathhelpforum.com/math-help/f8/using-joint-probability-mass-functions-multiple-parts-162646.html (credit to him for this).
a)
It is clear from the definition of a marginal pmf that the two marginals are the same, because the joint pmf [tex]P(X=i,Y=j)=\frac{\alpha}{(1+i+j)!}[/tex] is symmetric in its arguments; summing it over either index gives the same function of the remaining one:
[tex]P_X(k)=\sum_{j=0}^\infty P(X=k,Y=j) = \sum_{i=0}^\infty P(X=i,Y=k) = P_Y(k)[/tex]
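A quick numerical check of this symmetry (my own Python sketch, not part of the original thread; it assumes the joint pmf [tex]P(X=i,Y=j)=\alpha/(1+i+j)![/tex] above, with the value of [tex]\alpha[/tex] taken from part c) below):
[code]
import math

ALPHA = math.exp(-1)  # value of alpha, derived in part (c) below

def joint_pmf(i, j):
    """Assumed joint pmf from the assignment: P(X=i, Y=j) = alpha / (1+i+j)!."""
    return ALPHA / math.factorial(1 + i + j)

# The marginals agree term by term, since joint_pmf(k, j) == joint_pmf(j, k)
for k in range(5):
    p_x = sum(joint_pmf(k, j) for j in range(60))  # truncated; terms decay factorially
    p_y = sum(joint_pmf(i, k) for i in range(60))
    print(k, p_x, p_y)  # the two columns match
[/code]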

b)
Letting [tex]S=X+Y[/tex], we calculate [tex]P(S=k)[/tex] case by case:
[tex]k=0 \implies X=0,Y=0 \text{ so } P(S=0)=P(X=0,Y=0)=\frac{\alpha}{(1+0+0)!}=\frac{\alpha}{0!}[/tex]
since [tex]0!=1!=1[/tex].

[tex]k=1 \implies X=0,Y=1 \text{ or } X=1,Y=0 \text{ so } P(S=1)=2 \cdot P(X=1,Y=0)=2\cdot \frac{\alpha}{(1+1+0)!} = \frac{2\alpha}{2!}=\frac{\alpha}{1!}[/tex]
since [tex]P(X=1,Y=0)=P(X=0,Y=1)[/tex].
[tex]k=2 \implies X=0,Y=2;\; X=2,Y=0; \text{ or } X=1,Y=1 \text{ so } P(S=2)= 2 \cdot \frac{\alpha}{(1+2+0)!}+\frac{\alpha}{(1+1+1)!}=\frac{3\alpha}{3!}=\frac{\alpha}{2!}[/tex]
In general, the event [tex]S=k[/tex] consists of the [tex]k+1[/tex] pairs [tex](i,k-i)[/tex] for [tex]i=0,\dots,k[/tex], each with probability [tex]\frac{\alpha}{(1+k)!}[/tex], so [tex]P(S=k)=(k+1)\cdot\frac{\alpha}{(k+1)!}=\frac{\alpha}{k!}[/tex], which is the given formula.

c)
For any pmf, we know that the sum over all possible values equals 1. So we can solve for [tex]\alpha[/tex]
[tex]1=\sum_{k=0}^\infty \frac{\alpha}{k!}=\alpha \cdot e^1[/tex]
Therefore, [tex]\alpha=e^{-1}[/tex], so S is Poisson distributed with parameter [tex]\lambda =1[/tex].
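As a sanity check on (b) and (c), here is a short numerical sketch (mine, not from the thread; same assumed joint pmf as above):
[code]
import math

ALPHA = math.exp(-1)  # the value found in part (c)

def joint_pmf(i, j):
    """Assumed joint pmf: P(X=i, Y=j) = alpha / (1+i+j)!."""
    return ALPHA / math.factorial(1 + i + j)

# (b): summing over the k+1 pairs (i, k-i) should give P(S=k) = alpha / k!
for k in range(6):
    p_s = sum(joint_pmf(i, k - i) for i in range(k + 1))
    print(k, p_s, ALPHA / math.factorial(k))  # columns agree: the Poisson(1) pmf

# (c): the whole pmf sums to 1 (truncated; the neglected tail is factorially small)
total = sum(joint_pmf(i, j) for i in range(40) for j in range(40))
print(total)  # ~1.0
[/code]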

d)
[tex]P(X=0)=\sum_{y=0}^\infty \frac{e^{-1}}{(1+y)!}=e^{-1}\sum_{y=1}^\infty \frac{1}{(y)!}=e^{-1}(e^1-1)=1-e^{-1}[/tex]
By the definition of independence,
[tex]P(X=0,Y=0)=P(X=0)P(Y=0)[/tex]
LHS: [tex]P(X=0,Y=0)=P(S=0)=e^{-1}\approx 0.3679[/tex]
RHS: [tex]P(X=0)P(Y=0)=(1-e^{-1})^2\approx 0.3996 \neq \text{LHS}[/tex], where [tex]P(Y=0)=P(X=0)=1-e^{-1}[/tex] because X and Y have the same distribution (part a).
Hence, they are not independent.
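The same check can be run numerically (again my own sketch with the assumed joint pmf):
[code]
import math

ALPHA = math.exp(-1)

def joint_pmf(i, j):
    """Assumed joint pmf: P(X=i, Y=j) = alpha / (1+i+j)!."""
    return ALPHA / math.factorial(1 + i + j)

# Marginal P(X=0) by summing the joint pmf over y (truncated sum)
p_x0 = sum(joint_pmf(0, y) for y in range(60))
print(p_x0, 1 - math.exp(-1))      # both ~0.6321, matching part (d)
print(joint_pmf(0, 0), p_x0 ** 2)  # 0.3679 vs 0.3996: X and Y are not independent
[/code]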

e)
Since [tex]S \sim \text{Poi}(1)[/tex], its expected value is [tex]\lambda =1[/tex].
By part a), X and Y have the same distribution and hence the same expected value, so
[tex]E(S)=1=E(X)+E(Y)=2\cdot E(X)[/tex]
Therefore, [tex]E(X)=0.5[/tex].
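And a numerical check of E(X) (my own sketch, same assumptions):
[code]
import math

ALPHA = math.exp(-1)

def joint_pmf(i, j):
    """Assumed joint pmf: P(X=i, Y=j) = alpha / (1+i+j)!."""
    return ALPHA / math.factorial(1 + i + j)

# E(X) = sum over (i, j) of i * P(X=i, Y=j); the truncation error is negligible
e_x = sum(i * joint_pmf(i, j) for i in range(40) for j in range(40))
print(e_x)  # ~0.5, as derived above
[/code]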

f) and g) have not been answered yet, so I could use a hand on them. Also, if any mistakes have been made, can you point them out?
 
  • #2
Did you try the hint for (f)? How far did you get?

For (g), use the fact that P(X > Y) = P(X < Y). (Why?) And then observe that exactly one of {X = Y}, {X < Y}, and {X > Y} occurs.
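For what it's worth, the hint for (g) can be checked numerically (a sketch of mine, not part of the reply; it only verifies the symmetry and partition argument, it doesn't derive the closed form):
[code]
import math

ALPHA = math.exp(-1)

def joint_pmf(i, j):
    """Assumed joint pmf: P(X=i, Y=j) = alpha / (1+i+j)!."""
    return ALPHA / math.factorial(1 + i + j)

N = 40  # truncation; the neglected tail is factorially small

# Since joint_pmf(i, j) == joint_pmf(j, i), we get P(X > Y) == P(X < Y),
# and {X = Y}, {X < Y}, {X > Y} partition the sample space.
p_eq = sum(joint_pmf(i, i) for i in range(N))
p_gt = sum(joint_pmf(i, j) for i in range(N) for j in range(i))  # pairs with j < i
print(p_gt, (1 - p_eq) / 2)  # the two values agree
[/code]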
 

1. What is a joint probability mass function?

A joint probability mass function (PMF) is a function that assigns a probability to each combination of values of two or more discrete random variables. It is used to calculate the probability of a specific joint outcome when multiple random variables are involved.

2. How is a joint PMF different from a regular PMF?

A regular PMF only deals with a single random variable, while a joint PMF takes into account multiple random variables. It calculates the probability of a specific combination of outcomes occurring for each variable.

3. How do you calculate the joint PMF?

To build a joint PMF, you list the possible pairs of outcomes for the random variables and assign each pair a probability, with the probabilities over all pairs summing to 1. A marginal PMF is then recovered by summing the joint PMF over the other variable. Note that the joint PMF equals the product of the individual (marginal) PMFs only when the variables are independent, which is not the case in the thread above.
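As a toy illustration (made-up variables and numbers, not from the page):
[code]
# Hypothetical joint pmf for two 0/1-valued random variables A and B
joint = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.2, (1, 1): 0.3}

# Marginals: sum the joint pmf over the other variable
p_a = {a: sum(p for (i, j), p in joint.items() if i == a) for a in (0, 1)}
p_b = {b: sum(p for (i, j), p in joint.items() if j == b) for b in (0, 1)}

# The joint pmf factors into the product of marginals only under independence
for (a, b), p in joint.items():
    print((a, b), p, p_a[a] * p_b[b])  # e.g. 0.3 vs 0.25: A and B are dependent
[/code]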

4. What is the purpose of using a joint PMF?

The joint PMF is useful for understanding the relationship between multiple random variables: from it you can compute marginal and conditional distributions, expectations, and probabilities of joint events, such as the independence check above.

5. Can a joint PMF be used for continuous variables?

No, a joint PMF is only applicable to discrete random variables. For continuous variables, a joint probability density function (PDF) would need to be used instead.
