PDF of function of 3 continuous, uniform random variables?

Summary: The thread proves that (XY)^Z, where X, Y, and Z are independent continuous uniform random variables on (0,1), is itself uniformly distributed on (0,1). The original poster had computed the probability density function (pdf) of W = XY, arriving at -ln(w), but was stuck on showing that W^Z is uniform. The suggested approach computes P(W^Z ≤ x) = E[P(W^Z ≤ x | W)], treating W as a constant by independence, and integrates; the result is x, confirming the uniform distribution. A second reply rederives the pdf of XY by the same conditioning technique, rather than from a picture.
Phillips101
Hi. The question is:

Given X, Y and Z are all continuous, independent random variables uniformly distributed on (0,1), prove that (XY)^Z is also uniformly distributed on (0,1).

I worked out the pdf of W = XY; I think it's -ln(w). I have no idea at all how to show that W^Z is U(0,1).

What do I integrate? How do I know how to combine the pdfs, or what the limits are? What substitutions should I make, if any? I just don't know how to tackle this sort of problem at all. The pdf I have for W came from a picture, not from any real understanding of what I was doing.

Thanks for any help :)
 
Fix x in (0,1). We could start with

P(W^Z ≤ x) = E[P(W^Z ≤ x | W)].

Since Z and W are independent, we can calculate P(W^Z ≤ x | W) by treating W as a constant. In this case, if W > x, then the probability is 0. Otherwise, W^Z ≤ x iff Z ln(W) ≤ ln(x), i.e. Z ≥ ln(x)/ln(W) (dividing by ln(W) < 0 flips the inequality), which has probability 1 - ln(x)/ln(W). Hence,

\begin{align*}
E[P(W^Z \le x \mid W)] &= E\left[\left(1 - \frac{\ln(x)}{\ln(W)}\right)1_{\{W\le x\}}\right]\\
&= \int_0^x \left(1 - \frac{\ln(x)}{\ln(w)}\right)(-\ln(w))\,dw.
\end{align*}

Now do the integral and check that the result is x.
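Carrying out the integral explicitly: the integrand simplifies to ln(x) - ln(w), so

\begin{align*}
\int_0^x \left(1 - \frac{\ln(x)}{\ln(w)}\right)(-\ln(w))\,dw
&= \int_0^x \bigl(\ln(x) - \ln(w)\bigr)\,dw\\
&= x\ln(x) + \bigl[w - w\ln(w)\bigr]_0^x\\
&= x\ln(x) + x - x\ln(x) = x,
\end{align*}

using w ln(w) → 0 as w → 0. Thus P(W^Z ≤ x) = x for every x in (0,1), which is exactly the U(0,1) cdf.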
 
Phillips101 said:
The pdf I have for W came from a picture, not any real understanding of what I was doing.
We can do this the same way. If w ∈ (0,1), then

\begin{align*}
P(W\le w) &= P(XY \le w)\\
&= E[P(XY\le w \mid Y)]\\
&= E\left[P\left(X\le \frac{w}{Y} \,\Big|\, Y\right)\right].
\end{align*}

If Y ≤ w, then the probability is 1; otherwise, it is w/Y. Thus,

\begin{align*}
P(W\le w) &= E\left[1_{\{Y\le w\}} + \frac{w}{Y}\,1_{\{Y > w\}}\right]\\
&= P(Y \le w) + \int_w^1 \frac{w}{y}\,dy\\
&= w + w(\ln 1 - \ln w)\\
&= w - w\ln(w).
\end{align*}

To get the density, we differentiate, which gives -ln(w).
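Explicitly,

\frac{d}{dw}\bigl(w - w\ln(w)\bigr) = 1 - \bigl(\ln(w) + 1\bigr) = -\ln(w).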
 
Thanks a lot, that's really very useful.

James
 
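As a quick numerical sanity check on the result, here is a minimal Monte Carlo sketch (not from the thread; it assumes numpy is available) that samples (XY)^Z and compares its empirical cdf against the U(0,1) cdf F(t) = t:

import numpy as np

# Monte Carlo check: sample W = (X*Y)**Z for X, Y, Z i.i.d. U(0,1)
# and compare the empirical cdf of W with the U(0,1) cdf, F(t) = t.
rng = np.random.default_rng(0)
n = 1_000_000
x, y, z = rng.uniform(size=(3, n))
w = (x * y) ** z

# At each test point t, the empirical P(W <= t) should be close to t.
for t in (0.1, 0.25, 0.5, 0.75, 0.9):
    print(f"P(W <= {t}) ~ {np.mean(w <= t):.4f} (expected {t})")

With a million samples, each printed frequency should match its expected value to within a few parts in a thousand.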