What is the distribution of the sum of two random vectors?

AI Thread Summary
The discussion revolves around deriving the distribution of the sum of two random vectors defined by their magnitudes and angles, both of which are uniformly distributed. The initial calculations focus on finding the probability density function (PDF) for the components of the vectors, particularly using the cosine and sine of the angles. Participants highlight the complexity of the problem, suggesting that the convolution of the individual distributions may be necessary, and mention the potential use of polar coordinates to simplify calculations. There is a consensus that while the PDF is important, starting with the cumulative distribution function (CDF) could be more manageable. The conversation emphasizes the need for a deeper understanding of probability theory to tackle the problem effectively.
thapyhap
I am trying to derive the distribution for the sum of two random vectors, such that:

\begin{align}
X &= L_1 \cos \Theta_1 + L_2 \cos \Theta_2 \\
Y &= L_1 \sin \Theta_1 + L_2 \sin \Theta_2
\end{align}

With:

\begin{align}
L_1 &\sim \mathcal{U}(0, m_1) \\
L_2 &\sim \mathcal{U}(0, m_2) \\
\Theta_1 &\sim \mathcal{U}(0, 2\pi) \\
\Theta_2 &\sim \mathcal{U}(0, 2\pi)
\end{align}

In other words: two vectors, each with a uniformly random direction, and each with a magnitude uniformly distributed between zero and $m_1$ or $m_2$, respectively. Is this even worth trying to calculate analytically?
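Whatever the analysis turns out to be, a Monte Carlo simulation gives a ground truth to check candidate answers against. A minimal Python sketch, with hypothetical example magnitudes $m_1 = 1$ and $m_2 = 2$:

```python
# A Monte Carlo sanity check for whatever comes out of the analysis.
# The magnitudes m1 = 1, m2 = 2 are hypothetical example values.
import numpy as np

rng = np.random.default_rng(0)
n, m1, m2 = 1_000_000, 1.0, 2.0

L1 = rng.uniform(0, m1, n)
L2 = rng.uniform(0, m2, n)
T1 = rng.uniform(0, 2 * np.pi, n)
T2 = rng.uniform(0, 2 * np.pi, n)

X = L1 * np.cos(T1) + L2 * np.cos(T2)
Y = L1 * np.sin(T1) + L2 * np.sin(T2)

print(X.mean(), Y.mean())    # ~0 by symmetry
print(np.mean(X**2 + Y**2))  # E[X^2 + Y^2] = (m1^2 + m2^2)/3 ~ 1.667
```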

I've tried to break the problem down into simpler parts. First, I calculated the PDF of $S_1 = \cos \Theta_1$ as:

$$f_{S_1}(s_1) = \frac{1}{\pi \sqrt{1 - s_1^2}}, \qquad -1 < s_1 < 1$$
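One way to sanity-check this density without plotting: integrating it gives the CDF $F_{S_1}(s) = 1 - \arccos(s)/\pi$, which can be compared against an empirical CDF from samples. A minimal check:

```python
# Check f_{S1} via its CDF, F(s) = 1 - arccos(s)/pi, against samples.
import numpy as np

rng = np.random.default_rng(1)
s1 = np.cos(rng.uniform(0, 2 * np.pi, 1_000_000))

for s in (-0.9, -0.5, 0.0, 0.5, 0.9):
    print(f"s = {s:+.1f}: empirical {np.mean(s1 <= s):.4f}, "
          f"analytic {1 - np.arccos(s) / np.pi:.4f}")
```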

Then I thought: if we ignore $L_1$ and $L_2$, how can I find the PDF of $S_1 + S_2 = \cos \Theta_1 + \cos \Theta_2$? I thought I could try multiplying the characteristic functions of $S_1$ and $S_2$, so I tried taking the Fourier transform of $f_{S_1}(s_1)$ in both MATLAB and Mathematica, but MATLAB just choked on it, and Mathematica returned something involving the Hankel function that looks too complex to use.

On Wikipedia I found the arcsine distribution, which has a CDF similar to $F_{S_1}$. It is a special case of the Beta distribution, for which Wikipedia does give the characteristic function, but I'm not sure I can use it, given that the CDF of the arcsine distribution is slightly different from mine. However, this leads me to believe that the characteristic function of $S_1$ is tractable.
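(It is tractable: for $\Theta$ uniform on $(0, 2\pi)$, $E[e^{it\cos\Theta}] = J_0(t)$, the Bessel function of the first kind, by its standard integral representation. A quick numerical check, sketched with SciPy:

```python
# E[exp(i t cos(Theta))] = J0(t) for Theta ~ U(0, 2*pi): substituting
# s = cos(theta) turns the CF integral into the standard integral
# representation of the Bessel function J0.
import numpy as np
from scipy.integrate import quad
from scipy.special import j0

def cf(t):
    # (1/pi) * int_0^pi cos(t*cos(theta)) d(theta); the imaginary part
    # vanishes because the density of cos(Theta) is symmetric about 0.
    val, _ = quad(lambda th: np.cos(t * np.cos(th)) / np.pi, 0, np.pi)
    return val

for t in (0.5, 1.0, 2.0, 5.0):
    print(t, cf(t), j0(t))  # the two columns agree
```

Consequently the characteristic function of $S_1 + S_2$ is $J_0(t)^2$, which is what inverts to the elliptic-integral density found later in the thread.)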

I really don't know anything about probability, I'm just reading Wikipedia and trying to make some sense of this problem. I would really appreciate someone telling me where to look next, or at least that what I'm trying to do is analytically impossible!
 
Do you not know the rules for adding and multiplying probability density functions?
[edit]... hmmm, I think I misread: you are finding the distribution of the final values from adding 4 random numbers together.

found a discussion that may have some leads for you...
http://www.mathworks.com/matlabcentral/newsreader/view_thread/91982
... I'll have to think some more.

Basically: the probability distribution of the sum of two or more independent random variables is the convolution of their individual distributions.
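As a concrete illustration of that rule (not this thread's problem, just the principle): the sum of two independent $\mathcal{U}(0,1)$ variables has the triangular density $\min(z, 2-z)$ on $[0,2]$, and a discretized numerical convolution recovers it. A sketch:

```python
# The convolution rule in action on a simple case: U(0,1) + U(0,1)
# has the triangular density min(z, 2 - z) on [0, 2].
import numpy as np

dx = 0.001
x = np.arange(0, 1, dx)
f = np.ones_like(x)             # density of U(0,1) on its support

g = np.convolve(f, f) * dx      # discretized convolution on [0, 2)
z = np.arange(len(g)) * dx

print(np.max(np.abs(g - np.minimum(z, 2 - z))))  # ~dx discretization error
```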
 
Perhaps it's easier to consider $(X,Y)$ in polar coordinates; e.g., symmetry arguments show that the angle of $(X,Y)$ is uniformly distributed. The magnitude is a little trickier, but, say, the CDF of $X^2 + Y^2$ could be written as a triple integral of an indicator function and then simplified somewhat.
 
Simon Bridge said:
Basically: the probability distribution of the sum of two or more independent random variables is the convolution of their individual distributions.

Yeah, but the convolution of two distributions corresponds to the product of their characteristic functions, i.e., the Fourier transforms of their PDFs. Mathematica gave me a nice solution for this today. For $Z = \cos \Theta_1 + \cos \Theta_2$:

$$f_Z(z) = \frac{K\left(1 - \frac{z^2}{4}\right)}{\pi^2}$$

where $K(m)$ is the complete elliptic integral of the first kind, with the argument taken as the parameter $m = k^2$ (the convention of Mathematica's EllipticK).
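This density can be checked against simulation. Note the convention caveat: SciPy's ellipk, like Mathematica's EllipticK, takes the parameter $m = k^2$, so the argument $1 - z^2/4$ is passed through unchanged. A sketch:

```python
# Monte Carlo check of f_Z(z) = ellipk(1 - z^2/4) / pi^2 at a few points,
# estimating the density by counting samples in a small window.
import numpy as np
from scipy.special import ellipk

rng = np.random.default_rng(2)
n = 2_000_000
z_samples = (np.cos(rng.uniform(0, 2 * np.pi, n))
             + np.cos(rng.uniform(0, 2 * np.pi, n)))

for z in (0.5, 1.0, 1.5):
    h = 0.01  # half-width of the density-estimation window
    empirical = np.mean(np.abs(z_samples - z) < h) / (2 * h)
    print(f"z = {z}: empirical {empirical:.4f}, "
          f"analytic {ellipk(1 - z**2 / 4) / np.pi**2:.4f}")
```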

bpet said:
Perhaps it's easier to consider $(X,Y)$ in polar coordinates; e.g., symmetry arguments show that the angle of $(X,Y)$ is uniformly distributed. The magnitude is a little trickier, but, say, the CDF of $X^2 + Y^2$ could be written as a triple integral of an indicator function and then simplified somewhat.

Could you elaborate a bit more? I am not sure I see how this ends up simplifying things; it seems like I would have to do all the same calculations to get the magnitude. I can't find a nice way to calculate the distribution of $L_1 \cos \Theta_1$. Wikipedia has an article on calculating the product of distributions, which I thought would be easy, considering that the PDF of a uniform random variable is so simple, but I didn't really understand the calculus.

The article gives the PDF of $Z = XY$ for two random variables $X$ and $Y$ with PDFs $f_X$ and $f_Y$:

$$f_Z(z) = \int f_X(x)\, f_Y\!\left(\frac{z}{x}\right) \frac{1}{|x|}\, dx$$

So I tried working it as follows, with $Z = L_1 \cos \Theta_1$:

\begin{align}
f_Z(z) &= \int \frac{1}{\pi\, m_1\, |x| \sqrt{1 - \left(\frac{x}{z}\right)^2}}\, dx \\
&= \frac{\log(x) - \log\left(\sqrt{\frac{z^2 - x^2}{z^2}} + 1\right)}{\pi m_1}
\end{align}

What does it mean that this is still a function of x? I have no clue what to try next.
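(The leftover $x$ signals that only an antiderivative was computed; the product formula is a definite integral whose limits come from the supports, with $f_{L_1}$ nonzero on $(0, m_1)$ and $f_{S_1}(z/x)$ requiring $|x| > |z|$. Note also that the formula calls for $z/x$, not $x/z$, inside the square root. With those corrections the integral has a closed form; a sketch, assuming the hypothetical value $m_1 = 1$:

```python
# The definite integral with limits applied.  With f_{L1} supported on
# (0, m1) and f_{S1}(z/x) requiring |x| > |z|, the integrand
# 1/(pi*m1*x*sqrt(1-(z/x)^2)) = 1/(pi*m1*sqrt(x^2-z^2)) is integrated
# over x in (|z|, m1).  The value m1 = 1 is a hypothetical example.
import numpy as np
from scipy.integrate import quad

m1 = 1.0

def f_Z(z):
    val, _ = quad(lambda x: 1 / (np.pi * m1 * np.sqrt(x**2 - z**2)),
                  abs(z), m1)
    return val

# The same integral in closed form: arccosh(m1/|z|) / (pi*m1).
for z in (0.1, 0.3, 0.7):
    print(z, f_Z(z), np.arccosh(m1 / abs(z)) / (np.pi * m1))
```
)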
 
thapyhap said:
Could you elaborate a bit more? [...] What does it mean that this is still a function of x? I have no clue what to try next.

Standard convolution formulas are not likely to be of much use for this approach because X and Y are dependent.

Also, don't worry about the PDF just yet; it's trivial to calculate (if it exists) once you've got the CDF.

A CDF can be written as the expected value of a Boolean indicator function, which for this example will be a 4d integral; if you consider the squared magnitude like I suggested, this can be simplified to a 3d integral by symmetry arguments (or a 1d integral for the case $L_1 = L_2 = 1$). A Monte Carlo version of this idea is sketched after this post.

Symmetry arguments again show that the magnitude and angle are independent, but, if you must, you can calculate the joint PDF of $X$ and $Y$ by differentiating the CDF and transforming from polar coordinates back to Cartesian.
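A minimal Monte Carlo rendering of the indicator-function idea above, with hypothetical magnitudes $m_1 = m_2 = 1$:

```python
# The CDF of the squared magnitude as the expectation of an indicator,
# F(r) = P(X^2 + Y^2 <= r^2) = E[1{X^2 + Y^2 <= r^2}], estimated by
# Monte Carlo (hypothetical m1 = m2 = 1).
import numpy as np

rng = np.random.default_rng(3)
n, m1, m2 = 1_000_000, 1.0, 1.0

L1 = rng.uniform(0, m1, n)
L2 = rng.uniform(0, m2, n)
T1 = rng.uniform(0, 2 * np.pi, n)
T2 = rng.uniform(0, 2 * np.pi, n)

R2 = (L1 * np.cos(T1) + L2 * np.cos(T2))**2 \
   + (L1 * np.sin(T1) + L2 * np.sin(T2))**2

for r in (0.5, 1.0, 1.5, 2.0):
    print(r, np.mean(R2 <= r**2))  # estimate of P(X^2 + Y^2 <= r^2)
```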
 
I'm curious whether using the PDF will give a two-variable integration in a straightforward way.

Since the density function has the same value at all points on a circle of radius R, we may as well compute that value at the point (x = R, y = 0).

To break down how the sum of two vectors can land at (R, 0), we can consider vertical lines through points on the x-axis. There is an interval [x_min, x_max] of values x1 for which the end of the first vector can land on the vertical line through (x1, 0); the values x_min, x_max are functions of m1, m2, R. On such a vertical line, there is an interval [y_min, y_max] of values y1 for which the endpoint of the first vector can land at (x1, y1) and still allow the second vector to go from (x1, y1) to (R, 0). These bounds are functions of x1, m1, m2, R.

The bounds [x_min, x_max] together with the bounds [y_min, y_max] determine some sort of geometric figure (not a rectangle, since y_min and y_max are functions of x1).

If we knew the joint density J_cartesian of (x1, y1, x2, y2) (with (x2, y2) representing the components of the second vector), we could integrate J_cartesian(x1, y1, R - x1, -y1) over the above geometric figure as a double integral in the variables x1, y1. (At least that's my intuition; granted, it's dangerous to reason about problems using PDFs.)

Since we don't know J_cartesian(x1, y1, x2, y2), we can use a change of variables that expresses the vectors (x1, y1) and (R - x1, -y1) in polar coordinates. In polar coordinates, the joint density J_polar(L1, theta1, L2, theta2) is just the product of 4 constants. The complications come from writing the bounds of integration in terms of the polar variables and from the "volume element" introduced by the change of variables. (It looks like we would be using a 2D "area element", since J_polar is evaluated as a function of only two variables. Is that correct?)

(As an aside, the endpoint of the first vector won't land uniformly distributed over the area of a circle of radius m1. The first vector isn't "a random vector" in that sense of "random".)
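That aside is easy to verify numerically: with L ~ U(0, m) and a uniform angle, the radius of the endpoint has CDF r/m, whereas a point uniform over the disk of radius m would have CDF (r/m)^2. A sketch:

```python
# The endpoint of one vector is not uniform over the disk: its radius L
# has CDF r/m (linear), not (r/m)^2 as a uniform point on the disk would.
import numpy as np

rng = np.random.default_rng(4)
m = 1.0
L = rng.uniform(0, m, 1_000_000)  # the angle is irrelevant to the radius

for r in (0.25, 0.5, 0.75):
    print(r, np.mean(L <= r), r / m, (r / m)**2)
```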
 