Prove that $\lim_{y \rightarrow \infty} F_{X,Y}(x,y) = F_X(x)$
oyth94 said:Prove that $\lim_{y \rightarrow \infty} F_{X,Y}(x,y) = F_X(x)$
TheBigBadBen said:What is F(X,Y) here? Is this the cumulative distribution function?
Are we given any information about X and Y in the problem?
chisigma said:By definition...
$$F_{X,Y} (x,y) = P \{X<x,Y<y\}$$
... and because, as $y$ tends to infinity, the events $\{Y<y\}$ increase to the whole sample space, so that $P \{Y<y\}$ tends to 1 by continuity of probability, we have...
$$\lim_{y \rightarrow \infty} F_{X,Y} (x,y) = P \{X<x\} = F_{X} (x)$$
Kind regards
$\chi$ $\sigma$
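As a numerical sanity check (my choice of distribution, not part of the problem): take $X$ and $Y$ independent Exponential(1), so $F_{X,Y}(x,y) = (1-e^{-x})(1-e^{-y})$, and watch the joint cdf approach $F_X(x)$ as $y$ grows.

```python
import math

# Illustration only: X, Y independent Exponential(1), so the joint cdf
# factors as F_{X,Y}(x,y) = (1 - e^-x)(1 - e^-y).
def joint_cdf(x, y):
    return (1 - math.exp(-x)) * (1 - math.exp(-y))

def marginal_cdf(x):
    return 1 - math.exp(-x)

x = 1.3
for y in (1, 5, 20, 50):
    # the gap F_X(x) - F_{X,Y}(x,y) = F_X(x) * e^-y shrinks to 0
    print(f"y = {y:>2}: F_XY = {joint_cdf(x, y):.12f}")
print(f"F_X(x)  = {marginal_cdf(x):.12f}")
```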
oyth94 said:So I did it correctly? No steps skipped?
oyth94 said:There is a similar question, but this one is not a limit question; it does involve joint probability, independence, etc.
The question is:
$$F_{X,Y}(x,y) \leq F_X(x), F_Y(y)$$
I know that to find $F_X(x)$ from the joint we integrate with respect to $y$, and to find $F_Y(y)$ we integrate with respect to $x$. To check independence we multiply the two together and see if the product equals $F_{X,Y}(x,y)$.
But I'm not sure if this question is regarding independence or something else. How must I go about proving this question?
TheBigBadBen said:I don't think that the premise of the question is true in general; you would have to provide more information about the distributions of $X$ and $Y$.
oyth94 said:This was all that was given in the question, so I am confused now... Or can we prove it by contradiction, if possible?
TheBigBadBen said:I realize what you conceivably could have meant (and probably did mean) is
$$
\text{for any }y:\; F_{X,Y}(x,y)\leq F_X(x) \text{ AND }\\
\text{for any }x:\; F_{X,Y}(x,y)\leq F_Y(y)
$$
Is this what you meant to prove? Then yes, we can prove this by integration, as you rightly mentioned.
Please, please, please: try to be clearer in the future about what you mean, even if it makes your post a little longer.
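A sketch of one way to see these two inequalities directly from monotonicity of probability (using the thread's strict-inequality convention for the cdf), rather than by integration:

```latex
\{X < x,\; Y < y\} \subseteq \{X < x\}
\quad\Longrightarrow\quad
F_{X,Y}(x,y) = P\{X < x,\, Y < y\} \leq P\{X < x\} = F_X(x)
```

and by the symmetric inclusion into $\{Y < y\}$, also $F_{X,Y}(x,y) \leq F_Y(y)$.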
TheBigBadBen said:Sorry, I was a little hasty with that; let me actually be sure about what the question states. My understanding is that you are to prove that for any $X,Y$ with some joint cdf $F_{X,Y}$ and where $X$ and $Y$ are not necessarily independent, we can state that
$$
F_{X,Y}(x,y)\leq F_X(x)\cdot F_Y(y)
$$
Or, phrased in a different way:
$$
P(X< x \text{ and } Y< y) \leq P(X < x)\cdot P(Y < y)
$$
If the above is what you meant, I would pose the following counterargument: we could equivalently state
$$
P(X< x|Y<y) \cdot P(Y< y) \leq P(X < x)\cdot P(Y < y) \Rightarrow \\
P(X<x|Y<y) \leq P(X<x)
$$
and that simply isn't true for all distributions. That is, prior knowledge of another variable can increase the probability of an event. Would you like a counter-example to this claim?
If that's not what you meant, or if there's more information about $X$ and $Y$ that you're leaving out, do say so.
oyth94 said:For the counterargument, why did you use conditional probability? And why did you multiply by $P(Y<y)$?
oyth94 said:Hi, my apologies, this is actually what I meant to say. So how does it work after integration? Am I doing: the integral from 0 to $y$ of $F_X(x)\,dy$ multiplied by the integral from 0 to $x$ of $F_Y(y)\,dx$, to get the integral of $F_{X,Y}(x,y)$? Okay, something is wrong; I don't think that makes sense, does it?