Mathman23
Hi Guys,
I have this probability problem where I have become stuck, so I hope there is somebody in here who can give me some advice :)
It goes something like this:
The two-dimensional discrete stochastic vector (X,Y) has the probability function P_{X,Y}, which is
P(X=x,Y=y) = \left\{ \begin{array}{ll} \frac{c\, e^{-\lambda} \lambda^{y}}{y!} & \textrm{where} \ x \in \{-2,-1,0,1\} \ \textrm{and} \ y \in \{0,1,\ldots\} \\ 0 & \textrm{otherwise.} \end{array} \right.
where \lambda > 0 and c > 0
(a) The support is supp P_{X,Y} = \{-2,-1,0,1\} \times \{0,1,2,\ldots\}.
(b)
The probability functions P_X and P_Y for X and Y are
P(X=x) = \left\{ \begin{array}{ll} c & \textrm{where} \ x \in \{-2,-1,0,1\} \\ 0 & \textrm{otherwise.} \end{array} \right.
P(Y=y) = \left\{ \begin{array}{ll} \frac{4c\, e^{-\lambda} \lambda^{y}}{y!} & \textrm{where} \ y \in \{0,1,\ldots\} \\ 0 & \textrm{otherwise.} \end{array} \right.
This is done by showing that Y \sim \mathrm{Pois}(\lambda); for example, its mean works out to \lambda:
\mathrm{E}[Y] = \sum_{y=0}^{\infty} y\, \frac{e^{-\lambda} \lambda^{y}}{y!} = \sum_{y=1}^{\infty} \frac{e^{-\lambda} \lambda^{y}}{(y-1)!} = \lambda \sum_{y=1}^{\infty} \frac{e^{-\lambda} \lambda^{y-1}}{(y-1)!} = \lambda \sum_{v=0}^{\infty} \frac{e^{-\lambda} \lambda^{v}}{v!} = \lambda
where v = y-1.
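To actually get the marginals, my attempt (please check) is to sum the joint pmf over the other variable, using the exponential series \sum_{y=0}^{\infty} \frac{\lambda^{y}}{y!} = e^{\lambda}:
P(X=x) = \sum_{y=0}^{\infty} \frac{c\, e^{-\lambda} \lambda^{y}}{y!} = c\, e^{-\lambda} e^{\lambda} = c, \qquad P(Y=y) = \sum_{x \in \{-2,-1,0,1\}} \frac{c\, e^{-\lambda} \lambda^{y}}{y!} = \frac{4c\, e^{-\lambda} \lambda^{y}}{y!}.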
I have two questions:
(c) I need to find the constant 'c'. How do I go about doing that?
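My own guess here (please correct me if I am wrong): since the joint probabilities must sum to 1, summing over the four x-values and all y should pin c down:
\sum_{x \in \{-2,-1,0,1\}} \sum_{y=0}^{\infty} \frac{c\, e^{-\lambda} \lambda^{y}}{y!} = 4c \sum_{y=0}^{\infty} \frac{e^{-\lambda} \lambda^{y}}{y!} = 4c = 1 \quad \Rightarrow \quad c = \tfrac{1}{4}.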
(d)
If \lambda = 1, then P(X=Y) = \frac{1}{2} e^{-1}. Any hints on how I show that?
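My rough idea (assuming c = \tfrac{1}{4} from (c) is right): X and Y can only be equal on the values they have in common, namely 0 and 1, so
P(X=Y) = P(X=0, Y=0) + P(X=1, Y=1) = \frac{c\, e^{-1} \cdot 1^{0}}{0!} + \frac{c\, e^{-1} \cdot 1^{1}}{1!} = 2c\, e^{-1} = \tfrac{1}{2} e^{-1}.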
Sincerely and God bless You
Fred