atrus_ovis
Homework Statement
A PC generates "random" numbers from [0, 1], programmed so that the distribution function F(x) of a continuous random variable X satisfies:
F(x) =
0,     x < 0
x,     0 <= x < 0.25
0.25,  0.25 <= x < 0.5
x^2,   0.5 <= x < 1
1,     1 <= x
The problem then asks for the probabilities that X lies in various ranges.
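To make sure I'm reading the piecewise definition correctly, here is a minimal Python sketch of it (the function names F and prob_interval are my own, and I'm assuming F is meant as a cumulative distribution function):

def F(x: float) -> float:
    """Piecewise F(x) exactly as given in the problem statement."""
    if x < 0:
        return 0.0
    if x < 0.25:
        return x
    if x < 0.5:
        return 0.25
    if x < 1:
        return x ** 2
    return 1.0

def prob_interval(a: float, b: float) -> float:
    """P(a < X <= b) = F(b) - F(a), valid if F is the CDF of X."""
    return F(b) - F(a)

if __name__ == "__main__":
    print(F(0.75))                    # 0.5625 = 0.75**2
    print(prob_interval(0.25, 0.75))  # 0.3125 = 0.5625 - 0.25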
Homework Equations
-
The Attempt at a Solution
My question is: why is F(x) = 1 for x >= 1?
Isn't that, you know, nonsensical?
And how, for example, will I compute P(X > 0.75) when F(x) = 1 for x >= 1?
(edit: or is capital F(x) the standard notation for the cumulative distribution function?)
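If capital F(x) is indeed the CDF, then as far as I understand P(X > a) = 1 - F(a), so P(X > 0.75) = 1 - F(0.75) = 1 - 0.75^2 = 0.4375; the values of F past x = 1 would never enter into it.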