How Do You Find the PDF of a Ratio of Exponential Random Variables?

Summary
The discussion focuses on finding the probability density function (PDF) of the random variable Z = X/(X + Y), where X and Y are independent exponential random variables with parameter 1. The initial approach derives the cumulative distribution function (CDF) F_Z(z) by integrating the joint PDF of X and Y over the appropriate region. However, a critical error is noted: when z exceeds 1, the inequality manipulation used in the calculation reverses direction, invalidating that step. An alternative method is suggested, based on the relationship between Z and the ratio of Y to X, which may simplify the derivation of the PDF. The conversation emphasizes handling the inequalities correctly and exploring transformations of the random variables involved.
dionysian

Homework Statement


Let X and Y be two independent random variables each exponentially distributed with parameter 1. Define a new random variable:

Z = \frac{X}{X + Y}

Find the PDF of Z


Homework Equations





The Attempt at a Solution


\begin{align*}
F_Z(z) &= P(Z < z) = P\left( \frac{X}{X + Y} < z \right) = P\left( X \le \frac{zY}{1 - z} \right) \\
F_Z(z) &= \int_0^\infty \int_0^{\frac{zy}{1 - z}} f_{XY}(x, y)\, dx\, dy \\
f_{XY}(x, y) &= f_X(x) f_Y(y) \\
F_Z(z) &= \int_0^\infty \int_0^{\frac{zy}{1 - z}} f_X(x) f_Y(y)\, dx\, dy = \int_0^\infty \int_0^{\frac{zy}{1 - z}} e^{-x} e^{-y}\, dx\, dy = \int_0^\infty e^{-y} \left[ \int_0^{\frac{zy}{1 - z}} e^{-x}\, dx \right] dy \\
F_Z(z) &= \int_0^\infty e^{-y} \left[ -e^{-\frac{zy}{1 - z}} + 1 \right] dy = \int_0^\infty \left( -e^{-y} e^{-\frac{zy}{1 - z}} + e^{-y} \right) dy = \int_0^\infty \left( -e^{-\frac{y(1 - z) + zy}{1 - z}} + e^{-y} \right) dy \\
F_Z(z) &= \int_0^\infty \left( -e^{-\frac{y}{1 - z}} + e^{-y} \right) dy = \left. (1 - z) e^{-\frac{y}{1 - z}} \right|_0^\infty - \left. e^{-y} \right|_0^\infty = z
\end{align*}
Now I know that if I take the derivative of this I will get the "pdf", but it's obviously wrong. Any thoughts?
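
As a quick numerical sanity check on a derivation like this, one can simulate Z directly and compare the empirical CDF with the candidate CDF; here is a minimal sketch, assuming NumPy is available (the seed, sample size, and evaluation points are arbitrary choices, not anything from the thread):

```python
import numpy as np

# Monte Carlo sanity check (sketch): sample Z = X/(X + Y) for independent
# Exp(1) variables X and Y, then compare the empirical CDF of Z with the
# candidate CDF F_Z(z) = z obtained in the attempt above.
rng = np.random.default_rng(0)          # seed chosen arbitrarily
n = 1_000_000                           # sample size chosen arbitrarily
x = rng.exponential(scale=1.0, size=n)
y = rng.exponential(scale=1.0, size=n)
z_samples = x / (x + y)

for z in (0.1, 0.25, 0.5, 0.75, 0.9):
    empirical = (z_samples < z).mean()  # estimate of P(Z < z)
    print(f"z = {z:.2f}: empirical {empirical:.4f}  vs  candidate {z:.4f}")
```

Agreement or disagreement at a few interior points of (0, 1) usually makes it clear whether the algebra went wrong.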
 
dionysian said:
P\left( \frac{X}{X + Y} < z \right) = P\left( X \le \frac{zY}{1 - z} \right)

This step is invalid if z > 1. (The inequality gets reversed in that case.)

But you could instead write

P\left( \frac{X}{X + Y} < z \right) = P\left( Y > \frac{X(1 - z)}{z} \right) = 1 - P\left( Y \le \frac{X(1 - z)}{z} \right)

I'm not sure if that will be any more helpful, but at least it's correct.
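
For what it's worth, a sketch of how that expression could be set up, assuming 0 < z < 1 (so no inequality flips) and conditioning on X:

F_Z(z) = 1 - P\left( Y \le \frac{X(1 - z)}{z} \right) = 1 - \int_0^\infty e^{-x} \left( 1 - e^{-\frac{x(1 - z)}{z}} \right) dx,

which leaves only elementary exponential integrals to evaluate.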

I wonder if it would be helpful to work with the reciprocal:

\frac{1}{Z} = \frac{X + Y}{X} = 1 + \frac{Y}{X}

It shouldn't be hard to work out the pdf of

\frac{Y}{X}

as it is the quotient of two independent random variables. Adding 1 just shifts the pdf to the right by 1. Then do you know how to find the pdf of the reciprocal of a random variable with known pdf?
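
For reference, a sketch of the general formulas that route would lean on, stated for independent, continuous, positive random variables (nothing here is specific to the exponential case): the quotient W = Y/X has density

f_W(w) = \int_0^\infty x\, f_X(x)\, f_Y(wx)\, dx, \qquad w > 0,

and if V > 0 has density f_V, then the reciprocal U = 1/V has density

f_U(u) = \frac{1}{u^2}\, f_V\!\left( \frac{1}{u} \right), \qquad u > 0.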
 