Joint Probability Distribution of Two Non-Independent Variables Given Marginals

In summary, the thread asks for the expected value of a lognormal variable conditional on a related normal variable exceeding a constant, E[Y | X > c], in terms of the constant c.
  • #1
estebanox
Hi,
This is my first post in one of these forums; I hope someone can help me with this. Thanks in advance for reading!

I'm trying to find the joint probability distribution of two non-independent variables given only their marginal distributions. In fact, I'm interested in the joint distribution of two random variables X and Y, both of which are related to transformations of the same (normally distributed) random variable Z. More specifically, I want the joint density f(x,y) of X and Y, given that I know:
• X=Z+u
• Y=exp(Z)
• Where Z~N(0,1), u~N(0,1) and COV(Z,u)=0
I know the problem would be trivial if Y were a linear transformation, because the joint would simply be a multivariate normal distribution (i.e. (X,Y) ~ N(0,Σ), where the covariance matrix Σ could be expressed in terms of VAR[Z]). In my case the problem is trickier because it involves the joint of a normal and a log-normal distribution.
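(For reference, one concrete route under these assumptions: (X, Z) is jointly normal with VAR[X] = 2, VAR[Z] = 1 and COV(X,Z) = 1, and since [itex] Z = \ln Y [/itex], a change of variables gives, for [itex] y > 0 [/itex],
[tex] f_{X,Y}(x,y) = \frac{1}{y}\, f_{X,Z}(x, \ln y), [/tex]
where [itex] f_{X,Z} [/itex] is the bivariate normal density with the moments above.)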

I hope it makes sense. Any help or hints (also to express the problem more clearly) are very much appreciated.


Thanks!
 
  • #2


Is u a random variable or a constant? It makes a big difference!
 
  • #3


Other questions:

Is u independent of Z, or do you only know COV(Z,u) = 0?

It's clear you can get an answer by a Monte-Carlo simulation. Are you looking for deterministic numerical methods to get an answer? Or are you only interested in a symbolic expression for the joint density of X and Y?

Do you actually need the joint density or are you only after certain moments of the joint density, like COV(X,Y)?
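For concreteness, here is a minimal Monte-Carlo sketch of that route in Python (assuming u is independent of Z, with both N(0,1); the sample size and seed are arbitrary illustrative choices). For COV(X,Y) the exact value is [itex] \sqrt{e} \approx 1.6487 [/itex], since COV(X,Y) = COV(Z, e^Z) = E[Z e^Z] = [itex] \sqrt{e} [/itex].

[code=python]
import numpy as np

rng = np.random.default_rng(seed=0)  # arbitrary seed for reproducibility
n = 1_000_000                        # arbitrary sample size

# Draw Z and u independently from N(0,1), then form X and Y
z = rng.standard_normal(n)
u = rng.standard_normal(n)
x = z + u       # X = Z + u ~ N(0, 2)
y = np.exp(z)   # Y = exp(Z), lognormal

# Sample covariance of (X, Y); the exact value is sqrt(e) ~ 1.6487
print(np.cov(x, y)[0, 1])
[/code]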
 
  • #4


Thanks for this. In response to your questions:
  • u is a random variable: u~N(0,1)
  • u is independent of Z (it's an independent error term in the signal X)

Now, with respect to what I'm really looking for, I actually want a solution to E[Z| X>c] where c is a constant. In other words, I need the expected value of the truncated bivariate distribution, in terms of "c". Ideally I would like to get a symbolic expression, but if it's not possible, I could work with a numerical solution.

Thanks again.
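(A side note on the problem exactly as stated: (Z, X) is bivariate normal with E[Z | X] = X/2 and X ~ N(0, 2), so the standard truncated-normal formula would give
[tex] E[Z \mid X > c] = \tfrac{1}{2} E[X \mid X > c] = \frac{\phi(c/\sqrt{2})}{\sqrt{2}\, \Phi(-c/\sqrt{2})}, [/tex]
with [itex] \phi, \Phi [/itex] the standard normal density and CDF. The lognormal version of the question is taken up below.)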
 
  • #5


estebanox said:
a solution to E[Z| X>c] where c is a constant. In other words, I need the expected value of the truncated bivariate distribution, in terms of "c".

I don't know which bivariate distribution you are referring to. Z is univariate isn't it?

Can we compute E[Z | X > c] by conditioning on two cases: Z >= c and Z < c?

Let [itex] \phi [/itex] be the density of N(0,1).

E[Z | Z >=c] is proportional to [itex] \int_{c}^{\infty} z \phi(z) dz [/itex]

E[ Z | (Z < c) & (X > c)] = E[Z | ( c-1 < Z < c) & (u >= c-Z)]
My intuitive guess is that this is proportional to
[tex] \int_{c-1}^c \left( z + \frac{1-(z-c)}{2} \right) \phi(z)\, dz [/tex]

(I'm guessing that we can account for the conditional expectation of the uniform distribution by adding the term [itex] \frac{1-(z-c)}{2} [/itex] )
 
  • #6


I'm really sorry; I made a mistake: I meant the expectation of the lognormal (Y) conditional on the normal, i.e. E[Y | X > c]. But yes, you're right that it's univariate, not bivariate.
In any case, I understand your reasoning, and splitting the analytical solution into cases is the best way around it.
Thanks
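For completeness: under the stated assumptions a closed form appears to be available for the corrected target. Conditioning on Z and using [itex] e^z \phi(z) = \sqrt{e}\, \phi(z-1) [/itex] together with [itex] E[\Phi(W+d)] = \Phi(d/\sqrt{2}) [/itex] for [itex] W \sim N(0,1) [/itex],
[tex] E[Y \mid X > c] = \frac{E[e^Z \Phi(Z-c)]}{P(X > c)} = \frac{\sqrt{e}\; \Phi\!\left( \frac{1-c}{\sqrt{2}} \right)}{\Phi\!\left( -\frac{c}{\sqrt{2}} \right)}. [/tex]
A quick Monte-Carlo sanity check in Python (the cutoff c = 1.0, seed, and sample size are arbitrary illustrative choices):

[code=python]
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(seed=0)
n = 1_000_000
c = 1.0  # arbitrary illustrative cutoff

# Simulate Z, u ~ N(0,1) independently; X = Z + u, Y = exp(Z)
z = rng.standard_normal(n)
u = rng.standard_normal(n)
x, y = z + u, np.exp(z)

mc = y[x > c].mean()  # Monte-Carlo estimate of E[Y | X > c]
exact = np.sqrt(np.e) * norm.cdf((1 - c) / np.sqrt(2)) / norm.cdf(-c / np.sqrt(2))
print(mc, exact)  # these should agree to a couple of decimal places
[/code]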
 

FAQ

1. What is a joint probability distribution?

A joint probability distribution is a statistical description of the probability that two or more random variables simultaneously take particular values.

2. What are non-independent variables?

Non-independent variables are variables that have some degree of statistical dependence on each other. In other words, the value of one variable can affect, or carry information about, the value of the other.

3. How is the joint probability distribution calculated?

For independent variables, the joint probability is simply the product of the individual probabilities. In general, however, it is built from the chain rule: the probability of one variable multiplied by the conditional probability of the other given it. The result can be represented using a joint probability table or a joint probability function.
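For example, with two discrete variables this reads
[tex] P(X = x, Y = y) = P(X = x)\, P(Y = y \mid X = x), [/tex]
which reduces to the simple product [itex] P(X=x) P(Y=y) [/itex] exactly when X and Y are independent.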

4. What are marginal probabilities?

Marginal probabilities are the probabilities of a single variable occurring, without taking into account the other variables. These probabilities are calculated by summing or integrating the joint probabilities over the other variables.
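In symbols, for a continuous joint density [itex] f_{X,Y} [/itex],
[tex] f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\, dy, [/tex]
with a sum in place of the integral in the discrete case.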

5. Can the joint probability distribution be used to make predictions?

Yes, the joint probability distribution can be used to make predictions about the likelihood of multiple events occurring together. It can also be used to calculate conditional probabilities, which can aid in making predictions about future events.
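For instance, the conditional density used in such predictions is obtained directly from the joint and marginal densities:
[tex] f_{Y \mid X}(y \mid x) = \frac{f_{X,Y}(x, y)}{f_X(x)}, [/tex]
defined wherever [itex] f_X(x) > 0 [/itex].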
