Sum of two random variables - kind of

In summary, the code produces a histogram of the magnitude of the vector z, where the minimum and maximum are at 0.75 and 1.25, respectively, and the histogram is symmetric about 1.
  • #1
jmckennon
I'm sitting here with an interesting problem that I can't seem to figure out. I'm given two random variables

X=a*exp(j*phi)
Y=b

where both a and b are known constants.

phi is uniformly distributed on the interval [0,2pi)

Define a third random variable Z = X + Y.

My goal is to find the magnitude of the resulting vector.

At first, I thought this was an easy problem that could be solved using convolution. That doesn't work here, since phi makes X a random vector. I tried using MATLAB to help solve it: I wrote an m-file that attempted the convolution directly, and it failed. I then tried turning X into a Toeplitz matrix and doing the convolution as a matrix multiplication, but that failed too.

Can anyone help me out?
 
  • #2
Does anyone have an idea of how to do this?
 
  • #3
I think you should clarify what you are trying to find.

Comments: Y=b, where b is constant, so Y is not particularly random.
Z is a complex random variable, uniformly distributed over a circle of radius a, centered at b.
What more do you want to know about Z?
 
  • #4
I'm essentially trying to find the pdf of Z and of its magnitude. I want to run a Monte Carlo simulation to verify my results and make a histogram of the simulated magnitudes in MATLAB for various values of a and b. The way phi is declared in MATLAB is phi = rand(1,1000).*2*pi;
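For instance, a minimal Monte Carlo sketch of that plan, using the definitions from the first post (the values a = 1 and b = 0.25 here are just illustrative assumptions):

a = 1; b = 0.25;                 % illustrative constants (not specified in the problem)
phi = rand(1,1000).*2*pi;        % phi uniform on [0, 2*pi)
Z = a.*exp(1i*phi) + b;          % Z = X + Y, a vector of complex samples
hist(abs(Z), 50)                 % histogram of the magnitude |Z|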
 
  • #5
Does that clarify things? I appreciate the help
 
  • #6
Z is complex, so the usual concept of a probability distribution [F(x) = P(X ≤ x)] can't be used, since the random variable X has to be real. |Z|, being real, will have a probability distribution.
 
  • #7
I understand this, but obtaining the actual solution is where I'm stuck. I'm looking for the probability density function of Z so that I can create a histogram of the values of the magnitude of Z for various a and b values.
 
  • #8
Ordinary probability density functions are derivatives of ordinary distribution functions, which require real-valued random variables. For a complex-valued random variable you would need a two-dimensional density function treating the real and imaginary parts as (dependent) random variables.
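As a small visual sketch of that point (illustrative values of a and b, with Z defined as in the first post): the samples of (Re Z, Im Z) fall on a circle, so the two components are completely dependent and |Z| is the natural real quantity to study.

a = 1; b = 0.25;                    % illustrative values
phi = 2*pi*rand(1,2000);            % phi uniform on [0, 2*pi)
Z = a*exp(1i*phi) + b;              % Z = X + Y as in the first post
plot(real(Z), imag(Z), '.')         % samples lie on a circle of radius a centered at (b, 0)
axis equal                          % the real and imaginary parts are fully dependent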
 
  • #9
I think you're misunderstanding my question a bit; I'll try to clarify. In MATLAB code, phi = rand(1,1000).*2*pi; this makes X a random vector, not a random variable. If it were a random variable, things would be much easier. I'm having trouble addressing the magnitude of the density function of a complex random vector X that has a constant Y added to it. I appreciate your help, though!
 
  • #10
As far as I can tell, the density function for a random vector can only be expressed as a joint density function of its components.
 
  • #11
I've made progress on this one, but I'm confused about part of the theory behind it. Here is my MATLAB code.
>> a = 1;
>> b = 0.25;
>> phi = 2*pi*rand(1,10000);      % 10,000 samples of phi, uniform on [0, 2*pi)
>> z = a + b*exp(j*phi);          % note: a and b play swapped roles relative to post #1
>> hist(abs(z),100)               % histogram of |z| with 100 bins
This code produces the histogram I was looking for. It is a U-shaped histogram with its smallest value at 0.75, its largest at 1.25, and it looks to be symmetric about 1. I'm trying to come up with an expression for |Z| in terms of a and b.

My biggest question, and what would really help me out the most, is whether someone could provide a geometric or linear-algebra argument for why this problem is relevant to the problem of the eigenvectors of two random matrices (the area I'm tip-toeing my way into learning). I'm having trouble understanding this piece.
 
  • #12
z = 1 + 0.25(cos φ + i sin φ)
|z|² = (1 + 0.25 cos φ)² + (0.25 sin φ)² = 17/16 + 0.5 cos φ

You should be able to do the rest. The min and max for |z| agree with what you observed.
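One way to finish, as a minimal sketch (assuming z = a + b·exp(jφ) with φ uniform on [0, 2π) and a, b > 0): in general |z|² = a² + b² + 2ab cos φ, so |z| ranges over [|a − b|, a + b]. Writing R = |z| and using the density of cos φ, which is 1/(π√(1 − u²)) on (−1, 1), a change of variables gives

f_R(r) = 2r / (π √(4a²b² − (r² − a² − b²)²)),   |a − b| < r < a + b,

which blows up at both endpoints, matching the U shape of the histogram. A quick MATLAB check of this formula against the simulation (variable names here are my own):

a = 1; b = 0.25;
phi = 2*pi*rand(1,100000);
r = abs(a + b*exp(1i*phi));                     % Monte Carlo samples of |z|
[counts, centers] = hist(r, 100);               % raw histogram
binw = centers(2) - centers(1);
bar(centers, counts/(numel(r)*binw), 1)         % rescale the bars to estimate a pdf
hold on
rr = linspace(abs(a-b)+1e-3, a+b-1e-3, 500);    % avoid the singular endpoints
fr = 2*rr ./ (pi*sqrt(4*a^2*b^2 - (rr.^2 - a^2 - b^2).^2));
plot(rr, fr, 'r', 'LineWidth', 1.5)             % analytic density overlaid in red
hold off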
 

1. What is the definition of a sum of two random variables?

The sum of two random variables is a new random variable that is calculated by adding the values of the two original random variables. It represents the combined outcome of two separate events or phenomena.

2. How is the sum of two random variables calculated?

For two independent random variables, the distribution of the sum is obtained by convolving their individual distributions: the density of the sum is the integral of the product of one density with the other density at the shifted argument, f_{X+Y}(z) = ∫ f_X(x) f_Y(z − x) dx.
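As a minimal numerical sketch of that convolution (an illustrative example: two independent Uniform(0,1) variables, whose sum has a triangular density on (0, 2)):

dx = 0.001;
x  = 0:dx:1;
fX = ones(size(x));             % density of Uniform(0,1)
fY = ones(size(x));             % density of a second, independent Uniform(0,1)
fZ = conv(fX, fY)*dx;           % numerical convolution approximates the density of X + Y
z  = 0:dx:2;                    % support of the sum (same length as fZ)
plot(z, fZ)                     % triangular density peaking at z = 1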

3. Can the sum of two random variables be a random variable itself?

Yes, the sum of two random variables is a new random variable and can have its own probability distribution, mean, and variance. This is because the sum of two random variables is a combination of two separate outcomes, and thus can have its own unique characteristics.

4. What are some real-world applications of the sum of two random variables?

The sum of two random variables can be used in various fields such as finance, engineering, and statistics. For example, in finance, the sum of two random variables can represent the total return of a portfolio consisting of two different assets. In engineering, it can represent the combined strength of two materials. In statistics, it can be used to model the sum of two independent variables in a regression analysis.

5. What is the difference between the sum of two independent random variables and the sum of two dependent random variables?

For two independent random variables, the distribution of the sum is obtained by convolving their individual distributions, since the two variables do not influence each other. For two dependent random variables, the individual distributions are not enough: the distribution of the sum must be computed from their joint distribution, because the outcomes of the two variables are related.
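A small numerical illustration of that difference (my own example, using the identity Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y)):

n = 100000;
x = randn(1, n);                               % standard normal
yIndep = randn(1, n);                          % independent of x
yDep   = 0.8*x + sqrt(1 - 0.8^2)*randn(1, n);  % correlated with x (corr ~ 0.8), still unit variance
var(x + yIndep)                                % close to 2   = 1 + 1 + 2*0
var(x + yDep)                                  % close to 3.6 = 1 + 1 + 2*0.8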
