# Joint probability distr. of 2 non-independent variables given only marginal distr.

1. Dec 12, 2011

### estebanox

Hi,
This is my first post in one of these forums; I hope someone can help me with this. Thanks in advance for reading!

I'm trying to find the joint probability distribution of two non-independent variables given only their marginal distributions. In fact, I'm interested in the joint distribution of two random variables X and Y, both of which are transformations of the same (normally distributed) random variable Z. More specifically, I want the joint density f(x,y) of X and Y, given that I know:
• X=Z+u
• Y=exp(Z)
• Where Z~N(0,1), u~N(0,1) and COV(Z,u)=0
I know the problem would be trivial if Y were a linear transformation of Z, because the joint distribution would simply be multivariate normal (i.e. N(0,Σ), where the covariance matrix Σ can be expressed in terms of Var[Z]). In my case the problem is trickier because it involves the joint of a normal and a log-normal distribution.
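This setup is straightforward to simulate. A minimal Monte Carlo sketch (my own illustration, assuming the definitions above) that draws (X, Y) pairs and checks the empirical covariance against the implied value Cov(X, Y) = E[Z e^Z] = √e:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

z = rng.standard_normal(n)   # Z ~ N(0,1)
u = rng.standard_normal(n)   # u ~ N(0,1), independent of Z
x = z + u                    # X = Z + u  (normal)
y = np.exp(z)                # Y = exp(Z) (log-normal)

# The dependence shows up in the covariance:
# Cov(X, Y) = Cov(Z, e^Z) = E[Z e^Z] = sqrt(e) ≈ 1.6487
print("empirical Cov(X, Y):", np.cov(x, y)[0, 1])
print("theoretical sqrt(e):", np.sqrt(np.e))
```

The two numbers should agree to roughly two decimal places at this sample size.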

I hope it makes sense. Any help or hints (also to express the problem more clearly) are very much appreciated.

Thanks!

2. Dec 12, 2011

### mathman

Re: Joint probability distr. of 2 non-independent variables given only marginal distr

Is u a random variable or a constant? It makes a big difference!

3. Dec 12, 2011

### Stephen Tashi


Other questions:

Is u independent of Z, or do you only know COV(Z,u) = 0?

It's clear you can get an answer by Monte Carlo simulation. Are you looking for deterministic numerical methods to get an answer? Or are you only interested in a symbolic expression for the joint density of X and Y?

Do you actually need the joint density or are you only after certain moments of the joint density, like COV(X,Y)?

4. Dec 13, 2011

### estebanox


Thanks for this. In response to your questions:
• u is a random variable: u~N(0,1)
• u is independent of Z (it's an independent error term in the signal X)

Now, with respect to what I'm really looking for, I actually want a solution to E[Z| X>c] where c is a constant. In other words, I need the expected value of the truncated bivariate distribution, in terms of "c". Ideally I would like to get a symbolic expression, but if it's not possible, I could work with a numerical solution.
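For what it's worth, under these assumptions (Z, X) is jointly normal with X ~ N(0, 2) and Cov(Z, X) = 1, so E[Z | X > c] does have a symbolic answer via the inverse Mills ratio. A minimal sketch (my own check, assuming the setup as stated) comparing the closed form against simulation:

```python
import numpy as np
from scipy.stats import norm

# Closed form (my derivation from the stated assumptions): (Z, X) is
# bivariate normal with X ~ N(0, 2) and Cov(Z, X) = 1, so
#   E[Z | X > c] = E[X | X > c] / 2
#                = phi(c/sqrt(2)) / (sqrt(2) * (1 - Phi(c/sqrt(2))))
c = 0.5
closed_form = norm.pdf(c / np.sqrt(2)) / (np.sqrt(2) * norm.sf(c / np.sqrt(2)))

# Monte Carlo check of the same quantity
rng = np.random.default_rng(1)
z = rng.standard_normal(2_000_000)
x = z + rng.standard_normal(2_000_000)
monte_carlo = z[x > c].mean()

print(closed_form, monte_carlo)
```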

Thanks again.

5. Dec 14, 2011

### Stephen Tashi


I don't know which bivariate distribution you are referring to. Z is univariate, isn't it?

Can we compute E[Z | X > c] by conditioning on two cases: Z >= c and Z < c?

Let $\phi$ be the density of N(0,1).

E[Z | Z >=c] is proportional to $\int_{c}^{\infty} z \phi(z) dz$

E[ Z | (Z < c) & (X > c)] = E[Z | ( c-1 < Z < c) & (u >= c-Z)]
My intuitive guess is that this is proportional to
$$\int_{c-1}^c \left(z + \frac{1-(z-c)}{2}\right) \phi(z)\, dz$$

(I'm guessing that we can account for the conditional expectation of the uniform distribution by adding the term $\frac{1-(z-c)}{2}$ )

6. Dec 19, 2011

### estebanox


I'm really sorry; I made a mistake... I meant the expectation of the lognormal (*Y*) conditional on the normal: E[Y | X>c]. But yes, you're right that Z itself is univariate; the bivariate distribution I had in mind is the joint of (X, Y).
In any case, I follow your reasoning, and splitting the analytical solution into cases seems the best way around it.
Thanks
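For completeness, E[Y | X > c] also admits a closed form under the stated assumptions: since e^z φ(z) = √e φ(z − 1), one gets E[Y · 1{X > c}] = √e Φ((1 − c)/√2) and P(X > c) = Φ(−c/√2), hence E[Y | X > c] = √e Φ((1 − c)/√2) / Φ(−c/√2). A minimal sketch (my own derivation, not spelled out in the thread) checking this against simulation:

```python
import numpy as np
from scipy.stats import norm

# Closed form (my derivation, using e^z * phi(z) = sqrt(e) * phi(z - 1)):
#   E[Y * 1{X > c}] = sqrt(e) * Phi((1 - c)/sqrt(2)),  P(X > c) = Phi(-c/sqrt(2))
#   => E[Y | X > c] = sqrt(e) * Phi((1 - c)/sqrt(2)) / Phi(-c/sqrt(2))
c = 1.0
closed_form = np.sqrt(np.e) * norm.cdf((1 - c) / np.sqrt(2)) / norm.cdf(-c / np.sqrt(2))

# Monte Carlo check of the same quantity
rng = np.random.default_rng(2)
z = rng.standard_normal(2_000_000)
x = z + rng.standard_normal(2_000_000)
monte_carlo = np.exp(z)[x > c].mean()

print(closed_form, monte_carlo)
```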