Joint Probability Distribution of Two Non-Independent Variables Given Marginals


Discussion Overview

The discussion revolves around finding the joint probability distribution of two non-independent random variables, X and Y, given their marginal distributions. The variables are defined in relation to a normally distributed random variable Z, and the focus is on the complexities arising from the relationship between a normal and a log-normal distribution.

Discussion Character

  • Exploratory
  • Technical explanation
  • Mathematical reasoning
  • Debate/contested

Main Points Raised

  • One participant seeks to determine the joint probability distribution f(X,Y) given the transformations X=Z+u and Y=exp(Z), where Z is normally distributed and u is an independent random variable.
  • Another participant questions whether u is a random variable or a constant, noting the significance of this distinction.
  • Further inquiries are made regarding the independence of u from Z, and whether the original poster is looking for deterministic numerical methods or a symbolic expression for the joint density.
  • The original poster clarifies that u is a random variable independent of Z and expresses a desire for a solution to E[Z| X>c], where c is a constant.
  • One participant suggests conditioning on two cases to compute E[Z| X > c] and proposes an integral involving the density of the normal distribution.
  • The original poster later corrects a misunderstanding, indicating that they meant to find E[Y| X>c] instead of E[Z| X>c], acknowledging the complexity of the analytical solution.

Areas of Agreement / Disagreement

Participants express differing views on the nature of the joint distribution and the appropriate methods for finding expectations. There is no consensus on the best approach to solve the problem, and the discussion remains unresolved regarding the exact formulation of the joint distribution and the expectations sought.

Contextual Notes

Participants note the importance of the independence of variables and the implications of conditioning on different cases. There are unresolved mathematical steps regarding the expectations and the nature of the joint distribution.

estebanox
Hi,
This is my first post in one of these forums; I hope someone can help me with this. Thanks in advance for reading!

I'm trying to find the joint probability distribution of two non-independent variables given only their marginal distributions. In fact, I'm interested in the joint distribution of two random variables X and Y, both of which are related to transformations of the same (normally distributed) random variable Z. More specifically, I want the joint density f(x,y) of X and Y, given that I know:
• X=Z+u
• Y=exp(Z)
• Where Z~N(0,1), u~N(0,1) and COV(Z,u)=0
I know the problem would be trivial if Y were a linear transformation, because the joint would simply be a multivariate normal distribution (i.e. N(0,Σ), where the covariance matrix Σ could be expressed in terms of VAR[Z]). In my case the problem is trickier because it involves the joint of a normal and a log-normal distribution.

I hope it makes sense. Any help or hints (also to express the problem more clearly) are very much appreciated.


Thanks!
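Since Y = exp(Z) pins down Z = ln Y, the joint density asked for here actually follows from a change of variables; a sketch, assuming Z and u are independent N(0,1) as stated:

```latex
f_{X,Y}(x,y)
= f_{Z,u}\bigl(\ln y,\; x - \ln y\bigr)\,
  \left|\frac{\partial(z,u)}{\partial(x,y)}\right|
= \frac{1}{y}\,\phi(\ln y)\,\phi(x - \ln y), \qquad y > 0,
```

where \phi is the standard normal density and the 1/y Jacobian factor comes from z = ln y.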
 


Is u a random variable or a constant? It makes a big difference!
 


Other questions:

Is u independent of Z or do you only know COV(Z,u) = 0 ?

It's clear you can get an answer by a Monte-Carlo simulation. Are you looking for deterministic numerical methods to get an answer? Or are you only interested in a symbolic expression for the joint density of X and Y?

Do you actually need the joint density or are you only after certain moments of the joint density, like COV(X,Y)?
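As a sketch of the Monte-Carlo route mentioned above (helper names are hypothetical, and Z, u are assumed iid N(0,1) as in the original post), the snippet below simulates (X, Y) pairs and estimates COV(X,Y); for this setup Cov(X,Y) = Cov(Z, e^Z) = sqrt(e) ≈ 1.649, which the sample estimate should approach.

```python
import math
import random

def sample_xy(n, seed=0):
    # Draw (X, Y) pairs from X = Z + u, Y = exp(Z), with Z, u iid N(0, 1).
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        z = rng.gauss(0.0, 1.0)
        u = rng.gauss(0.0, 1.0)
        pairs.append((z + u, math.exp(z)))
    return pairs

def cov_xy(pairs):
    # Sample covariance of the simulated (X, Y) pairs.
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    return sum((x - mx) * (y - my) for x, y in pairs) / (n - 1)

pairs = sample_xy(200_000)
print(cov_xy(pairs))  # should land near Cov(Z, e^Z) = sqrt(e) ≈ 1.649
```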
 


Thanks for this. In response to your questions:
  • u is a random variable: u~N(0,1)
  • u is independent from Z (it's an independent error term in signal X)

Now, with respect to what I'm really looking for, I actually want a solution to E[Z| X>c] where c is a constant. In other words, I need the expected value of the truncated bivariate distribution, in terms of "c". Ideally I would like to get a symbolic expression, but if it's not possible, I could work with a numerical solution.

Thanks again.
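For E[Z | X > c] specifically, note that (Z, X) is jointly normal here (X = Z + u, with Var(X) = 2 and Cov(Z, X) = 1), so the standard truncated-normal (inverse Mills ratio) identity should give a closed form. A sketch with hypothetical helper names, checked against simulation:

```python
import math
import random

SQRT2 = math.sqrt(2.0)

def phi(x):
    # Standard normal density.
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def norm_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / SQRT2))

def cond_mean_Z_closed(c):
    # E[Z | X > c] = (Cov(Z,X)/sd(X)) * phi(c/sd(X)) / P(X > c),
    # with sd(X) = sqrt(2) and Cov(Z, X) = 1.
    alpha = c / SQRT2
    return (1.0 / SQRT2) * phi(alpha) / (1.0 - norm_cdf(alpha))

def cond_mean_Z_mc(c, n=400_000, seed=1):
    # Monte-Carlo check: average Z over draws satisfying Z + u > c.
    rng = random.Random(seed)
    total, count = 0.0, 0
    for _ in range(n):
        z = rng.gauss(0.0, 1.0)
        if z + rng.gauss(0.0, 1.0) > c:
            total += z
            count += 1
    return total / count

print(cond_mean_Z_closed(1.0), cond_mean_Z_mc(1.0))
```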
 


estebanox said:
a solution to E[Z| X>c] where c is a constant. In other words, I need the expected value of the truncated bivariate distribution, in terms of "c".

I don't know which bivariate distribution you are referring to. Z is univariate isn't it?

Can we compute E[Z| X > c] by conditioning on two cases: Z >= c and Z < c ?

Let \phi be the density of N(0,1).

E[Z | Z >=c] is proportional to \int_{c}^{\infty} z \phi(z) dz

E[ Z | (Z < c) & (X > c)] = E[Z | ( c-1 < Z < c) & (u >= c-Z)]
My intuitive guess is that this is proportional to
\int_{c-1}^c (z + \frac{1-(z-c)}{2}) \phi(z) dz

(I'm guessing that we can account for the conditional expectation of the uniform distribution by adding the term \frac{1-(z-c)}{2} )
 


I'm really sorry; I made a mistake... I meant the expectation of the lognormal (Y) conditional on the normal: E[Y | X>c]. But yes, Y is not bivariate.
In any case, I understand your reasoning and splitting the analytical solution by cases is the best way around it.
Thanks
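The case-splitting idea also works numerically for E[Y | X > c]. For the record, a completing-the-square computation (under the stated assumption that Z and u are iid N(0,1)) suggests the candidate closed form E[e^Z 1{X>c}] = sqrt(e) · Φ((1−c)/√2), with P(X>c) = Φ(−c/√2). The sketch below (hypothetical helper names) compares that candidate to a Monte-Carlo estimate:

```python
import math
import random

SQRT2 = math.sqrt(2.0)

def norm_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / SQRT2))

def cond_mean_Y_closed(c):
    # Candidate closed form: E[exp(Z) 1{X>c}] = sqrt(e) * Phi((1-c)/sqrt(2)),
    # divided by P(X > c) = Phi(-c/sqrt(2)), where X = Z + u ~ N(0, 2).
    return math.sqrt(math.e) * norm_cdf((1.0 - c) / SQRT2) / norm_cdf(-c / SQRT2)

def cond_mean_Y_mc(c, n=400_000, seed=2):
    # Monte-Carlo estimate of E[Y | X > c] with Y = exp(Z), X = Z + u.
    rng = random.Random(seed)
    total, count = 0.0, 0
    for _ in range(n):
        z = rng.gauss(0.0, 1.0)
        if z + rng.gauss(0.0, 1.0) > c:
            total += math.exp(z)
            count += 1
    return total / count

print(cond_mean_Y_closed(1.0), cond_mean_Y_mc(1.0))
```

As c → −∞ the conditioning becomes vacuous and the formula collapses to E[Y] = sqrt(e), which is a quick sanity check on the algebra.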
 
