I'm sitting here with an interesting problem that I can't seem to figure out. I'm given two random variables, X = a*exp(j*phi) and Y = b, where a and b are known constants and phi is uniformly distributed on the interval [0, 2*pi). A third random variable is Z = X + Y. My goal is to find the magnitude of the resulting vector, |Z|, and its distribution. At first I thought this was an easy problem that could be solved by convolution. That doesn't work here, since phi makes X a complex-valued random variable (a random phasor), and convolving the one-dimensional densities only applies to sums of independent real-valued scalars. I tried using MATLAB to help solve it: I wrote an m-file that attempted the convolution directly, and it failed. I also tried turning X into a Toeplitz matrix and doing the convolution as a matrix multiplication, but that failed too. Can anyone help me out?
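To make the setup concrete, here is a small Monte Carlo sketch of what I'm after (in Python rather than MATLAB; the values a = 3 and b = 2 are just illustrative placeholders, not part of the actual problem). If I expand Z = (b + a*cos(phi)) + j*a*sin(phi), the magnitude seems to collapse to |Z| = sqrt(a^2 + b^2 + 2*a*b*cos(phi)), bounded between |a - b| and a + b, and the simulation agrees:

```python
import numpy as np

# Illustrative constants only -- in my problem a and b are some known values.
a, b = 3.0, 2.0

rng = np.random.default_rng(0)
phi = rng.uniform(0.0, 2.0 * np.pi, size=100_000)  # phi ~ U[0, 2*pi)

# Z = X + Y with X = a*exp(j*phi) a random phasor and Y = b a constant.
Z = a * np.exp(1j * phi) + b
R = np.abs(Z)

# Candidate closed form: |Z| = sqrt(a^2 + b^2 + 2*a*b*cos(phi)).
R_closed = np.sqrt(a**2 + b**2 + 2.0 * a * b * np.cos(phi))
```

What I really want, though, is the density of R itself, not just samples of it.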