Joint Probability of Sum Normal

In summary, the joint probability of X>a, Y>b, and Z>c is given by P(Z>c | X>a and Y>b)·P(X>a and Y>b).
  • #1
zli034
I don't know whether anyone can help me out with this. It's just something I'm asking myself, not homework, I must say.

Let's define X and Y as two independent standard normal random variables, and let Z = X + Y.

For a real number a, we know P(X>a), the probability that X is greater than a.
For a real number b, we know P(Y>b), the probability that Y is greater than b.
For a real number c, we know P(Z>c), the probability that Z is greater than c.

These are simple things.

We can also determine the joint probability P(X>a and Y>b), the probability that X is greater than a and Y is greater than b. Since X and Y are independent, this joint probability is still simple to compute.

What about the joint probability P(X>a and Y>b and Z>c)? Because Z is correlated with both X and Y, I don't know how to do this. Thanks for any help.
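For concreteness, the simple quantities above can be checked numerically. A minimal sketch in Python, assuming SciPy is available; the thresholds a, b, c are arbitrary example values:

[code]
# A minimal sketch of the "simple" quantities above, assuming SciPy is
# available; the thresholds a, b, c are arbitrary example values.
from scipy.stats import norm

a, b, c = 0.5, 0.3, 1.5

p_x = norm.sf(a)                  # P(X > a), X ~ N(0, 1)
p_y = norm.sf(b)                  # P(Y > b), Y ~ N(0, 1)
p_z = norm.sf(c, scale=2 ** 0.5)  # P(Z > c), Z = X + Y ~ N(0, 2)
p_xy = p_x * p_y                  # P(X > a and Y > b), by independence

print(p_x, p_y, p_z, p_xy)
[/code]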
 
  • #2
I think you can use the relationship [tex] P(X>a \text{ and } Y>b \text{ and }Z>c) = P(Z>c | X>a \text{ and }Y>b)P(X>a \text{ and }Y>b) [/tex]
 
  • #3
wayneckm said:
I think you can use the relationship [tex] P(X>a \text{ and } Y>b \text{ and }Z>c) = P(Z>c | X>a \text{ and }Y>b)P(X>a \text{ and }Y>b) [/tex]

This relationship is certainly true. However, I still don't get this part: P(Z>c | X>a and Y>b).
When a+b ≥ c, P(Z>c | X>a and Y>b) = 1. But what about a+b < c? How do I express this step function? Or does no such function exist?
 
  • #4
Put y on the vertical axis, x on the horizontal.

y = b is a horizontal line.
x = a is a vertical line.
x+y = c is a downward-sloping line from (x,y) = (0,c) to (c,0).

As long as x > a and y > b, x+y > c is automatically satisfied whenever c ≤ a+b. For those values, Prob(x>a, y>b, z>c) = Prob(x>a, y>b).

If c > a+b, you need to account for the additional constraint z > c. In that case, Prob(x>a, y>b, z>c) is given by the double integral of the joint pdf of X and Y, f(x,y) = f(x)f(y), first with respect to x, from x = max(a, c - y) to infinity, then with respect to y, from y = b to infinity.
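A minimal numerical sketch of this double integral, in Python and assuming SciPy is available (the thresholds a, b, c are arbitrary example values with c > a+b):

[code]
# Numerically evaluate P(X>a, Y>b, Z>c) as the double integral described
# above, assuming SciPy; a, b, c are arbitrary example thresholds.
import numpy as np
from scipy import integrate
from scipy.stats import norm

a, b, c = 0.5, 0.3, 1.5  # example values with c > a + b

def joint_pdf(x, y):
    # Joint density of two independent standard normals: f(x, y) = f(x) f(y)
    return norm.pdf(x) * norm.pdf(y)

# Outer variable: y from b to infinity.
# Inner variable: x from max(a, c - y) to infinity.
prob, err = integrate.dblquad(
    joint_pdf,
    b, np.inf,                # limits for y
    lambda y: max(a, c - y),  # lower limit for x
    lambda y: np.inf,         # upper limit for x
)
print(f"P(X>a, Y>b, Z>c) ≈ {prob:.6f} (estimated error {err:.1e})")

# When c <= a + b the constraint z > c is automatic, so the probability
# reduces to P(X > a) * P(Y > b):
print("P(X>a) * P(Y>b) =", norm.sf(a) * norm.sf(b))
[/code]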
 
  • #5


I can provide some insights on how to approach this problem. First, it's important to understand the concept of joint probability. Joint probability is the likelihood of two or more events occurring simultaneously. In this case, we are interested in finding the joint probability of X being greater than a, Y being greater than b, and Z being greater than c.

To find this joint probability, we need to use the concept of conditional probability. Conditional probability is the probability of an event occurring given that another event has already occurred. In this case, we can find the joint probability by first finding the conditional probability of Z being greater than c given that X is greater than a and Y is greater than b. This can be written as P(Z>c|X>a and Y>b).

To find this conditional probability, we can use the joint probability density function of X and Y. Since X and Y are independent standard normal random variables, their joint density is f(x,y) = (1/2π)e^(-(x^2+y^2)/2). Using this density, the joint probability is

P(X>a and Y>b and Z>c) = ∫∫ f(x,y) dx dy, where x runs from max(a, c - y) to infinity and y runs from b to infinity,

and the conditional probability P(Z>c|X>a and Y>b) is obtained by dividing this by P(X>a and Y>b) = P(X>a)P(Y>b).

Once we have the conditional probability, we can use the multiplication rule to find the joint probability P(X>a and Y>b and Z>c). This can be written as P(X>a and Y>b and Z>c) = P(Z>c|X>a and Y>b) * P(X>a and Y>b). Since X and Y are independent, P(X>a and Y>b) = P(X>a) * P(Y>b).

Therefore, the joint probability P(X>a and Y>b and Z>c) can be found from the conditional probability together with the joint probability of X and Y. This approach works for correlated random variables in general, not just for the sum of normal random variables.
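As a sanity check, the decomposition can be verified by simulation. A minimal Monte Carlo sketch in Python, assuming NumPy is available (the thresholds are arbitrary example values):

[code]
# A quick Monte Carlo check of the decomposition
# P(X>a, Y>b, Z>c) = P(Z>c | X>a, Y>b) * P(X>a, Y>b),
# assuming NumPy; thresholds are arbitrary examples.
import numpy as np

rng = np.random.default_rng(0)
a, b, c = 0.5, 0.3, 1.5
n = 2_000_000

x = rng.standard_normal(n)
y = rng.standard_normal(n)
z = x + y

joint_xy = (x > a) & (y > b)
joint_xyz = joint_xy & (z > c)

p_xy = joint_xy.mean()                           # estimate of P(X>a, Y>b)
p_xyz = joint_xyz.mean()                         # estimate of P(X>a, Y>b, Z>c)
p_z_given_xy = joint_xyz.sum() / joint_xy.sum()  # estimate of P(Z>c | X>a, Y>b)

print("P(X>a, Y>b, Z>c)                ≈", p_xyz)
print("P(Z>c | X>a, Y>b) * P(X>a, Y>b) ≈", p_z_given_xy * p_xy)
[/code]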
 

1. What is the definition of joint probability of sum normal?

The joint probability of sum normal is a statistical concept that measures the likelihood that two or more events involving normally distributed variables occur together. It takes into account the combined effect of each variable on the overall probability.

2. How is the joint probability of sum normal calculated?

When the variables are independent, the joint probability is calculated by multiplying the individual probabilities of each variable together, since the occurrence of one event does not affect the occurrence of another. When the variables are dependent, as with a sum Z = X + Y and its summands, the joint probability must instead be computed from the joint distribution, for example by the integration described above.

3. What is an example of a situation where joint probability of sum normal is used?

An example of using joint probability of sum normal is in risk analysis for insurance companies. They may use it to calculate the likelihood of multiple events, such as a car accident and a house fire, happening at the same time.

4. How is the concept of joint probability of sum normal related to the central limit theorem?

The central limit theorem states that as the sample size increases, the distribution of the sample mean approaches a normal distribution. It is related to the joint probability of sum normal because it explains why sums of independent random variables tend to be normally distributed; in this thread, the sum Z = X + Y of two independent standard normals is itself exactly normal, with mean 0 and variance 2.

5. Can the joint probability of sum normal be greater than 1?

No, the joint probability of sum normal cannot be greater than 1. Probabilities always lie between 0 and 1, and a joint probability can never exceed any of the individual (marginal) probabilities involved.
