Joint Probability Distribution of Two Non-Independent Variables Given Marginals

estebanox
Hi,
This is my first post on these forums; I hope someone can help me with this. Thanks in advance for reading!

I'm trying to find the joint probability distribution of two non-independent variables given only their marginal distributions. Specifically, I'm interested in the joint distribution of two random variables X and Y, both of which are transformations of the same (normally distributed) random variable Z. I want the joint density f(x,y) of X and Y, given that I know:
• X=Z+u
• Y=exp(Z)
• where Z ~ N(0,1), u ~ N(0,1), and COV(Z,u) = 0
I know the problem would be trivial if Y were a linear transformation, because the joint would simply be a multivariate normal distribution (i.e. N(0,Σ), where the covariance matrix Σ can be expressed in terms of VAR[Z]). In my case the problem is trickier because it involves the joint of a normal and a log-normal distribution.
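(For reference, and assuming u is fully independent of Z rather than just uncorrelated: a change of variables on (Z, X) seems to give the joint density directly, since Y = exp(Z) forces Z = ln(Y). With \phi the N(0,1) density, this would be f_{X,Y}(x,y) = \frac{1}{y} \phi(\ln y) \phi(x - \ln y) for y > 0, and 0 otherwise. Treat this as a sketch; I'm not sure it's the most useful form.)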

I hope this makes sense. Any help or hints (including on how to state the problem more clearly) are very much appreciated.


Thanks!
 


Is u a random variable or a constant? It makes a big difference!
 


Other questions:

Is u independent of Z or do you only know COV(Z,u) = 0 ?

It's clear you can get an answer by a Monte Carlo simulation. Are you looking for deterministic numerical methods to get an answer? Or are you only interested in a symbolic expression for the joint density of X and Y?

Do you actually need the joint density or are you only after certain moments of the joint density, like COV(X,Y)?
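For instance, a minimal Monte Carlo sketch (Python/NumPy; variable names are placeholders of my own) that estimates COV(X, Y) under the setup you describe, assuming u is independent of Z:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Z and u drawn as independent standard normals, per the stated setup.
z = rng.standard_normal(n)
u = rng.standard_normal(n)

x = z + u        # X = Z + u, so X ~ N(0, 2)
y = np.exp(z)    # Y = exp(Z), a standard lognormal

# Sample covariance of X and Y (off-diagonal entry of the 2x2 covariance matrix).
cov_xy = np.cov(x, y)[0, 1]
print(cov_xy)    # if u is independent of Z, this should be near exp(1/2) ≈ 1.65
```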
 


Thanks for this. In response to your questions:
  • u is a random variable: u~N(0,1)
  • u is independent of Z (it's an independent error term in the signal X)

Now, with respect to what I'm really looking for: I actually want a solution for E[Z | X > c], where c is a constant. In other words, I need the expected value of the truncated bivariate distribution, in terms of c. Ideally I would like a symbolic expression, but if that's not possible, I could work with a numerical solution.
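For concreteness, a rough Monte Carlo sketch of E[Z | X > c] (Python/NumPy; names are placeholders), with the standard truncated-bivariate-normal formula included for comparison on the assumption that u is independent of Z:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 1_000_000
c = 1.0  # example threshold

z = rng.standard_normal(n)
u = rng.standard_normal(n)
x = z + u  # X = Z + u ~ N(0, 2)

# Monte Carlo estimate: average Z over the draws where X exceeds c.
mc_estimate = z[x > c].mean()

# For jointly normal (Z, X) with Cov(Z, X) = 1 and sd(X) = sqrt(2), the standard
# truncated-normal (inverse Mills ratio) result gives
# E[Z | X > c] = (Cov(Z, X) / sd(X)) * phi(c / sd(X)) / Phi(-c / sd(X)).
sd_x = np.sqrt(2.0)
closed_form = (1.0 / sd_x) * norm.pdf(c / sd_x) / norm.cdf(-c / sd_x)

print(mc_estimate, closed_form)  # the two numbers should roughly agree
```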

Thanks again.
 


estebanox said:
a solution for E[Z | X > c], where c is a constant. In other words, I need the expected value of the truncated bivariate distribution, in terms of c.

I don't know which bivariate distribution you are referring to. Z is univariate, isn't it?

Can we compute E[Z | X > c] by conditioning on two cases: Z >= c and Z < c?

Let \phi be the density of N(0,1).

E[Z | Z >=c] is proportional to \int_{c}^{\infty} z \phi(z) dz

E[ Z | (Z < c) & (X > c)] = E[Z | ( c-1 < Z < c) & (u >= c-Z)]
My intuitive guess is that this is proportional to
\int_{c-1}^c (z + \frac{1-(z-c)}{2}) \phi(z) dz

(I'm guessing that we can account for the conditional expectation of the uniform distribution by adding the term \frac{1-(z-c)}{2} )
 


I'm really sorry; I made a mistake: I meant the expectation of the lognormal (*Y*) conditional on the normal, E[Y | X > c]. But yes, Y is not bivariate.
In any case, I understand your reasoning, and splitting the analytical solution into cases is the best way around it.
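For the record, here is the symbolic route I will try, assuming u is fully independent of Z (the algebra needs double-checking):

E[Y | X > c] = E[e^Z 1\{Z + u > c\}] / P(Z + u > c)

Conditioning on Z first, E[e^Z 1\{Z + u > c\}] = E[e^Z \Phi(Z - c)] = \int e^z \Phi(z - c) \phi(z) dz = e^{1/2} \int \Phi(w + 1 - c) \phi(w) dw (using e^z \phi(z) = e^{1/2} \phi(z - 1) and substituting w = z - 1) = e^{1/2} \Phi\left(\frac{1-c}{\sqrt{2}}\right), where the last step uses \int \Phi(a + w) \phi(w) dw = \Phi(a/\sqrt{2}). Since X ~ N(0,2), P(X > c) = \Phi(-c/\sqrt{2}), which would give

E[Y | X > c] = e^{1/2} \Phi\left(\frac{1-c}{\sqrt{2}}\right) / \Phi\left(-\frac{c}{\sqrt{2}}\right)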
Thanks
 
