Joint/Conditional distribution

In summary, to evaluate P(X > a, X - Y > 0), condition on Y using the law of total probability: compute P(X > a, X > y) for each possible value y, weight it by the density of Y, and integrate.
  • #1
OFDM
I'm having a problem evaluating a distribution:

Suppose X and Y are chi-square random variables, and a is some constant greater than 0. X and Y are independent, but not identically distributed (they have different degrees of freedom).
I want to find

P(X>a, X-Y>0). So I use the multiplication rule P(A,B) = P(A|B)P(B) to write

P(X>a,X-Y>0)
=P(X>a | X-Y > 0)*P(X-Y>0)
=P(X>a| X>Y)*P(X>Y)

Now I have expressions for P(X>a) and P(X>Y), but I am at a loss as to how to evaluate the conditional probability P(X>a | X>Y).

I figured out that if Y were a constant y (rather than a random variable), then I could write

$$P(X>a \mid X>y) \;=\; \begin{cases} 1 & \text{if } y \ge a \\[4pt] \dfrac{P(X>a)}{P(X>y)} & \text{if } y < a \end{cases}$$

But this does not help evaluate the distribution, because it requires knowledge of the value of the random variable Y.

I also tried to write

P(X>a,X-Y>0)
=P(X-Y > 0|X>a)*P(X>a)
=P(X>Y| X>a)*P(X>a)

So to evaluate P(X>Y| X>a) I write

$$P(X>Y \mid X>a) \;=\; \frac{1}{P(X>a)} \int_a^{\infty} \!\! \int_0^{x} f_X(x)\, f_Y(y)\; dy\, dx$$

(using independence to factor the joint density as $f_{XY}(x,y) = f_X(x) f_Y(y)$, and dividing by $P(X>a)$ to get the conditional from the joint).

But this gives an ugly expression that I cannot relate simply to P(X>Y) or P(X>a).
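(For reference, one reduction that follows from independence alone: the inner integral over $y$ is just the CDF $F_Y(x)$, so

$$P(X>a,\ X>Y) \;=\; \int_a^{\infty} f_X(x)\, F_Y(x)\, dx,$$

and the conditional probability is this divided by $P(X>a)$. This is a single integral that can be evaluated numerically, though it still does not reduce to $P(X>Y)$ and $P(X>a)$ alone.)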

Any help will be much appreciated.
 
  • #2
The best approach in this case is the law of total probability: condition on the value of Y and average over its distribution. One caveat: since Y (and hence X - Y) is continuous, the "sum over possible values" should be an integral against a density rather than a discrete sum. Conditioning on Y and using the independence of X and Y:

$$P(X>a,\ X-Y>0) \;=\; \int_0^{\infty} P(X>a,\ X>y)\, f_Y(y)\, dy \;=\; \int_0^{\infty} P\big(X>\max(a,y)\big)\, f_Y(y)\, dy.$$

Splitting the integral at $a$ and writing $\bar F_X = 1 - F_X$ for the chi-square survival function,

$$P(X>a,\ X-Y>0) \;=\; \bar F_X(a)\, F_Y(a) \;+\; \int_a^{\infty} \bar F_X(y)\, f_Y(y)\, dy.$$

This evaluates the probability without needing knowledge of the realized value of Y.
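If it helps, here is a minimal numerical sanity check in Python with NumPy/SciPy. The degrees of freedom dfX, dfY and the threshold a are made-up example values, not from the problem; the script compares the single-integral formula above against a direct Monte Carlo estimate.

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

# Hypothetical example parameters (not from the thread):
# X ~ chi-square with dfX degrees of freedom, Y ~ chi-square with dfY.
dfX, dfY, a = 5, 3, 4.0

# P(X > a, X > Y) = sf_X(a) * cdf_Y(a) + integral_a^inf sf_X(y) * pdf_Y(y) dy
term1 = stats.chi2.sf(a, dfX) * stats.chi2.cdf(a, dfY)
term2, _ = quad(lambda y: stats.chi2.sf(y, dfX) * stats.chi2.pdf(y, dfY),
                a, np.inf)
p_formula = term1 + term2

# Monte Carlo check: sample X and Y independently, count the joint event.
rng = np.random.default_rng(0)
x = stats.chi2.rvs(dfX, size=1_000_000, random_state=rng)
y = stats.chi2.rvs(dfY, size=1_000_000, random_state=rng)
p_mc = np.mean((x > a) & (x > y))

print(f"integral formula: {p_formula:.5f}")
print(f"Monte Carlo:      {p_mc:.5f}")
```

The two estimates should agree to about three decimal places at this sample size.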
 

What is joint/conditional distribution?

A joint distribution gives the probability of two or more random variables taking values together; a conditional distribution gives the probability of one variable given the observed value of another.

How is joint/conditional distribution different from marginal distribution?

A joint distribution describes the probability of multiple variables occurring together, while a marginal distribution describes a single variable on its own, obtained by summing or integrating the joint distribution over the other variables.

What is the formula for calculating joint/conditional distribution?

The formula for joint/conditional distribution is P(A,B) = P(A|B) * P(B), where P(A,B) is the joint probability of A and B occurring together, P(A|B) is the conditional probability of A occurring given B, and P(B) is the probability of B occurring.
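For example, if P(B) = 0.5 and P(A|B) = 0.4, then P(A,B) = 0.4 × 0.5 = 0.2.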

What is the difference between joint probability and joint distribution?

Joint probability is the probability of two or more events occurring together, while joint distribution is a function that assigns probabilities to different combinations of outcomes for two or more random variables.

How are joint/conditional distributions used in data analysis?

Joint/conditional distributions are used to understand the relationship between multiple variables and to make predictions about the likelihood of certain outcomes based on the occurrence of other variables. They are commonly used in statistical modeling and machine learning algorithms.
