# Random Variable and Distribution Function Relationship

Oct 16, 2013

### shoeburg

I'm in a probability theory class and I feel like I'm missing something fundamental between random variables and their distribution functions. I was given the following questions:

1) Let θ be uniformly distributed on [0,1]. For each distribution function F, define G(y) = sup{x : F(x) ≤ y}. Prove that G(θ) has distribution function F.

My attempt: G(θ) = sup{x : F(x) ≤ θ}, i.e., G(θ) is the largest x satisfying F(x) ≤ θ, which isn't too surprising since both F(x) and θ lie in [0,1]. So for any value of θ in [0,1], G(θ) = sup{x : Pr(ω : X(ω) ≤ x) ≤ θ}, and this is where I get stuck. It's weird: I totally believe the statement intuitively, but I can't figure out how to prove or explain it, so I think I'm missing some details connecting distribution functions and random variables.
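To convince myself numerically (not a proof), here is a small simulation for one concrete choice of F that the problem doesn't specify: the Exp(1) distribution, F(x) = 1 - e^(-x). For this continuous, strictly increasing F, the sup in G(y) = sup{x : F(x) ≤ y} is attained at the ordinary inverse, G(y) = -ln(1 - y).

```python
import random
import math

# Assumed example: F is the Exp(1) CDF, F(x) = 1 - e^{-x}.
# Then F(x) <= y  iff  x <= -ln(1 - y), so G(y) = -ln(1 - y).
random.seed(0)
n = 100_000

# Draw theta ~ Uniform[0,1] and apply G.
# random.random() is in [0, 1), so 1 - theta is in (0, 1] and the log is safe.
samples = [-math.log(1.0 - random.random()) for _ in range(n)]

# If G(theta) really has distribution F, the empirical CDF of the
# samples should match F(x) = 1 - e^{-x} at every x.
for x in [0.5, 1.0, 2.0]:
    empirical = sum(s <= x for s in samples) / n
    theoretical = 1.0 - math.exp(-x)
    print(f"x = {x}: empirical CDF = {empirical:.4f}, F(x) = {theoretical:.4f}")
```

The empirical CDF lands within simulation noise of F at each test point, which is exactly the inverse-transform-sampling idea the problem is asking you to prove in general.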

2) Let X have the continuous distribution function F. Prove that F(X) has the uniform distribution on [0,1].

My attempt: I guess F(X) is a composite function, taking sample-space points to the reals and then to [0,1]? Or does F(X) = Pr(X ≤ X)? That doesn't make sense to me. The problem also asks what happens if F is not continuous.
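Again just as a sanity check rather than a proof, here is a simulation of the second claim with the same assumed example distribution, X ~ Exp(1): draw X, push it through its own CDF, and see whether U = F(X) looks uniform on [0,1].

```python
import random
import math

# Assumed example: X ~ Exp(1), so F(x) = 1 - e^{-x} is continuous.
random.seed(1)
n = 100_000
xs = [random.expovariate(1.0) for _ in range(n)]

# U = F(X) is the composite: sample point -> X(omega) -> F(X(omega)) in [0,1].
us = [1.0 - math.exp(-x) for x in xs]

# If F(X) ~ Uniform[0,1], then Pr(F(X) <= u) should equal u for every u.
for u in [0.25, 0.50, 0.75]:
    empirical = sum(v <= u for v in us) / n
    print(f"u = {u}: empirical Pr(F(X) <= u) = {empirical:.4f}")
```

Note that continuity matters: if F has a jump at some point x0, then F(X) never takes values in the gap between the left limit F(x0-) and F(x0), so it cannot be uniform on all of [0,1] in the discontinuous case.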

If anyone can help me out, either in or outside the context of these problems, I'd really appreciate it. Maybe I'm getting too caught up thinking about which spaces these functions map into, and measures and sigma-algebras and whatnot... Probability theory is kicking my butt, haha.