Schlotkins
Good evening:
I have a probability proof question that is driving me crazy. I feel
like I must have forgotten an easy trick. Any help is GREATLY
appreciated. Here's the setup:
Let's assume a, b are independent random variables from the cumulative
distribution F, with F continuous so that ties have probability zero.
I think it's safe to say:
P(a > b) = 0.5
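My reasoning for that step is the usual symmetry argument (just a sketch, relying on the continuity assumption above so that P(a = b) = 0):

\[
P(a > b) = P(b > a) \quad \text{since } (a,b) \text{ and } (b,a) \text{ have the same joint distribution,}
\]
\[
P(a > b) + P(b > a) = 1 \;\Longrightarrow\; P(a > b) = \tfrac{1}{2}.
\]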
Now, let's assume x, y are independent random variables from CDF G,
with G also continuous. Again:
P(x > y) = 0.5
Assume the pair (a, b) is independent of the pair (x, y). Now it seems
straightforward that:
P(x + a > y + b) = 0.5
but I don't know how to show it without assuming a particular distribution type.
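To at least convince myself the claim is plausible, I ran a quick Monte Carlo check. This is only a sanity-check sketch; the normal choice for F and the exponential choice for G are arbitrary assumptions of mine, not part of the problem:

import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# a, b i.i.d. from F (here: standard normal, an arbitrary choice)
a = rng.normal(size=n)
b = rng.normal(size=n)

# x, y i.i.d. from G (here: exponential with mean 1, also arbitrary)
x = rng.exponential(size=n)
y = rng.exponential(size=n)

# Empirical estimate of P(x + a > y + b); it should come out close to 0.5
print((x + a > y + b).mean())

A check like this works for whatever particular F and G I plug in, but of course it isn't a proof for the general case, which is what I'm after.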
Again, any help is appreciated.
Thank you,
Chris