Confusion about determining distribution of sum of two random variables

  • Thread starter: psie
  • Tags: Probability theory
psie
Homework Statement
Let ##X## and ##Y## be independent r.v. such that ##X\in U(0,1)## and ##Y\in U(0,\alpha)##. Find the density function of ##Z=X+Y##. Remark: Note that there are two cases: ##\alpha\geq 1## and ##\alpha <1##.
Relevant Equations
The relevant equation is that the pdf of the sum of two continuous random variables is a convolution.
Let's recall the densities of ##X## and ##Y##:
\begin{align}
f_X(x)=\mathbf{1}_{(0,1)}(x), \quad f_Y(y)=\frac{1}{\alpha}\mathbf{1}_{(0,\alpha)}(y)
\end{align}
Let ##z\in (0,1+\alpha)##. Then ##f_Z(z)## is given by the convolution:
\begin{align}
f_Z(z)=\int_\mathbb{R} f_X(t)f_Y(z-t)\,dt
\end{align}
Both ##f_X## and ##f_Y## vanish outside their supports, so we check where each factor is nonzero. ##f_X(t)## is nonzero when ##0<t<1##, and ##f_Y(z-t)## is nonzero when ##0<z-t<\alpha##, i.e. when ##z-\alpha<t<z##. All of these inequalities hold at once precisely when ##\max\{z-\alpha,0\}<t<\min\{1,z\}##. Hence:
\begin{align*}
f_Z(z)=\int_{\mathbb{R}}f_X(t)f_Y(z-t)\,dt=\int^{\min\{1,z\}}_{\max\{z-\alpha,0\}}\frac{1}{\alpha}\,dt=\frac{\min\{1,z\}- \max\{z-\alpha,0\}}{\alpha}
\end{align*}
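As a quick numerical sanity check (a minimal numpy sketch of my own, not part of the derivation; ##\alpha=0.6## is chosen only for illustration), this expression matches a histogram of simulated values of ##X+Y## on ##(0,1+\alpha)##:

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = 0.6          # illustrative value; try alpha >= 1 as well
n = 1_000_000

# Simulate Z = X + Y with X ~ U(0,1), Y ~ U(0,alpha) independent
z_samples = rng.uniform(0.0, 1.0, n) + rng.uniform(0.0, alpha, n)

def f_Z(z, alpha):
    """The convolution formula from the post, intended for z in (0, 1 + alpha)."""
    return (np.minimum(1.0, z) - np.maximum(z - alpha, 0.0)) / alpha

# Empirical density on (0, 1 + alpha) vs. the formula at the bin centres
hist, edges = np.histogram(z_samples, bins=50, range=(0.0, 1.0 + alpha), density=True)
centres = 0.5 * (edges[:-1] + edges[1:])
print(np.max(np.abs(hist - f_Z(centres, alpha))))  # small, up to Monte Carlo noise
```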

Now, what troubles me is the remark in the problem statement. I don't see that there are two cases to consider. To me, the density is simply the one given in the last equation, no?
 
Consider the case ##\alpha = 1## and ##z = -1##. Your function would be
$$
f_Z(-1) = \min(1,-1) - \max(-1-1,0) = -1 - 0 = -1.
$$
Is this reasonable?
 
psie said:
Relevant Equations: The relevant equation is that the pdf of the sum of two continuous random variables is a convolution.
Nitpick: convolution of _independent_ random variables.
 
Orodruin said:
Consider the case ##\alpha = 1## and ##z = -1##. Your function would be
$$
f_Z(-1) = \min(1,-1) - \max(-1-1,0) = -1 - 0 = -1.
$$
Is this reasonable?
How can ##z=-1##? If ##\alpha=1##, then ##z\in (0,2)##, no?
 
psie said:
How can ##z=-1##? If ##\alpha=1##, then ##z\in (0,2)##, no?
Exactly. The distribution should be zero there. Your expression is not only non-zero, but negative.
 
Orodruin said:
Exactly. The distribution should be zero there. Your expression is not only non-zero, but negative.
Ok. But couldn’t we simply say that the expression for ##f_Z(z)## I gave is the distribution for ##z\in (0,1+\alpha)## and ##0## otherwise?
 
psie said:
Ok. But couldn’t we simply say that the expression for ##f_Z(z)## I gave is the distribution for ##z\in (0,1+\alpha)## and ##0## otherwise?
Sure, but that is still kind of breaking it up into cases. So is using the min/max functions.
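For reference, restricting to ##z\in(0,1+\alpha)## and unpacking the min/max does produce exactly those cases; a sketch for ##\alpha\geq 1## (for ##\alpha<1## the flat middle piece instead sits on ##(\alpha,1)## with height ##1##):
$$
f_Z(z)=\frac{\min\{1,z\}-\max\{z-\alpha,0\}}{\alpha}=
\begin{cases}
\dfrac{z}{\alpha}, & 0<z<1,\\
\dfrac{1}{\alpha}, & 1\leq z<\alpha,\\
\dfrac{1+\alpha-z}{\alpha}, & \alpha\leq z<1+\alpha.
\end{cases}
$$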
 