Explicit joint probability distribution.

In summary, the thread discusses the joint probability distribution of three variables X, Y, and Z, where Y and Z each depend on X. By the chain rule of probability, P(x,y,z) = P(x)P(y|x)P(z|x,y). The question is how to compute P(z|x,y) from the given data, and the original poster wonders whether P(z|x,y) simplifies to P(z|x) because Y and Z are conditionally independent given X. The responder agrees with this assumption, so the joint probability distribution becomes P(x,y,z) = P(x)P(y|x)P(z|x).
  • #1
carllacan
Hi!

Suppose we have two variables Y and Z that both depend on a third one, X. We are given P(x), P(y|x) and P(z|x). According to the chain rule of probability, the joint distribution is P(x,y,z) = P(x)P(y|x)P(z|x,y).

But how can we compute P(z|x,y) with the given data?

Since Y does not depend on Z directly, I "feel" that P(z|x,y) = P(z|x)P(x), but I can't find a logical reason for it.

Can you lend me a hand?

Thank you for your time.
 
  • #2
Are y and z independent of each other?
 
  • #3
Stephen Tashi said:
Are y and z independent of each other?

This actually comes from an exercise in a book, and neither P(y|z) nor P(z|y) is given, so I assume they are.
 
  • #4
If z and y are independent given x, then P(z|x,y) is just P(z|x). (NOT "P(z|x)P(x)", which is the joint P(x,z).)
 
  • #5
And the joint probability distribution would simply be P(x,y,z) = P(x)P(y|x)P(z|x), right?

Which makes sense, because Y and Z are, so to speak, symmetric in the causal network, so they should also appear symmetrically in this expression.
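As a quick sanity check, here is a minimal numerical sketch of that conclusion. The probability tables below are invented purely for illustration: it builds P(x,y,z) = P(x)P(y|x)P(z|x) for binary variables and confirms both that the result sums to 1 and that the P(z|x,y) recovered from it equals P(z|x).

```python
import numpy as np

# Hypothetical probability tables for binary X, Y, Z (numbers invented for illustration).
P_x = np.array([0.3, 0.7])                     # P(x)
P_y_given_x = np.array([[0.8, 0.2],            # P(y|x): row = x, column = y
                        [0.4, 0.6]])
P_z_given_x = np.array([[0.5, 0.5],            # P(z|x): row = x, column = z
                        [0.1, 0.9]])

# Joint distribution under the assumption that Y and Z are conditionally
# independent given X: P(x,y,z) = P(x) P(y|x) P(z|x).
P_xyz = P_x[:, None, None] * P_y_given_x[:, :, None] * P_z_given_x[:, None, :]
print(P_xyz.sum())                             # 1.0: a valid distribution

# Recover P(z|x,y) from the joint and confirm it equals P(z|x) for every (x, y).
P_xy = P_xyz.sum(axis=2)                       # P(x,y)
P_z_given_xy = P_xyz / P_xy[:, :, None]        # P(z|x,y) = P(x,y,z) / P(x,y)
for x in range(2):
    for y in range(2):
        assert np.allclose(P_z_given_xy[x, y], P_z_given_x[x])
print("P(z|x,y) equals P(z|x) for all x, y")
```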
 

What is an explicit joint probability distribution?

An explicit joint probability distribution is a mathematical function that assigns a probability to each possible combination of values for two or more random variables. It describes the likelihood of each possible outcome occurring when multiple events are considered together.
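As a small illustration of "a probability for each combination of values", here is a toy joint distribution written out explicitly; the variable names and numbers are made up for this sketch.

```python
# A toy explicit joint distribution over two variables, weather and umbrella
# (numbers invented for illustration): every combination of values gets its
# own probability.
joint = {
    ("sun",  "umbrella"):    0.05,
    ("sun",  "no umbrella"): 0.55,
    ("rain", "umbrella"):    0.30,
    ("rain", "no umbrella"): 0.10,
}

# The probabilities over all combinations must sum to 1.
assert abs(sum(joint.values()) - 1.0) < 1e-12
print(joint[("rain", "umbrella")])  # probability that it rains AND an umbrella is carried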

How is an explicit joint probability distribution different from a marginal probability distribution?

An explicit joint probability distribution considers the probabilities of multiple events occurring together, while a marginal probability distribution only considers the probabilities of individual events. A marginal distribution can always be recovered from the joint by summing (or integrating) over the other variables, but the joint cannot in general be reconstructed from the marginals alone, so the joint distribution gives a more complete picture of the relationship between the random variables.
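A short sketch of that marginalization step, reusing the hypothetical weather/umbrella table from above (numbers invented for illustration):

```python
from collections import defaultdict

# Hypothetical joint distribution over (weather, umbrella); numbers invented.
joint = {
    ("sun",  "umbrella"):    0.05,
    ("sun",  "no umbrella"): 0.55,
    ("rain", "umbrella"):    0.30,
    ("rain", "no umbrella"): 0.10,
}

# The marginal distribution of the weather alone is obtained by summing the
# joint probabilities over every value of the other variable.
marginal_weather = defaultdict(float)
for (weather, umbrella), p in joint.items():
    marginal_weather[weather] += p

print(dict(marginal_weather))  # {'sun': 0.6, 'rain': 0.4} (up to rounding)
```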

How are probabilities calculated for an explicit joint probability distribution?

When the events are independent, the joint probability is the product of the individual probabilities: for example, if event A has probability 0.4 and event B has probability 0.5, and the two are independent, the joint probability is 0.4 × 0.5 = 0.2. When the events are not independent, the chain rule must be used instead, e.g. P(A and B) = P(A)P(B|A), as in the thread above.
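A tiny worked comparison of the two cases; the probabilities are made up for illustration.

```python
# Independent events: the joint probability is the product of the marginals.
p_a = 0.4                            # P(A), invented value
p_b = 0.5                            # P(B), invented value
p_ab_independent = p_a * p_b         # P(A and B) = 0.2 when A and B are independent

# Dependent events: use the chain rule P(A and B) = P(A) * P(B | A).
p_b_given_a = 0.7                    # P(B | A), a made-up conditional probability
p_ab_dependent = p_a * p_b_given_a   # 0.28

print(p_ab_independent, p_ab_dependent)
```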

What is the difference between a discrete and continuous explicit joint probability distribution?

A discrete explicit joint probability distribution is used when the random variables can only take on a finite or countable set of values, while a continuous one is used when the variables can take on any value within a given range. In the continuous case, probabilities are obtained by integrating a joint density rather than summing over individual outcomes.
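A brief sketch of the sum-versus-integral distinction; both the discrete table and the choice of a uniform density on the unit square are assumptions made just for this example.

```python
import numpy as np

# Discrete case: the joint distribution is a table of probabilities summing to 1.
discrete_joint = np.array([[0.1, 0.2],
                           [0.3, 0.4]])
print(discrete_joint.sum())     # 1.0

# Continuous case (sketch): probabilities come from integrating a joint density.
# Here the density is f(x, y) = 1 on the unit square, and a midpoint Riemann sum
# over an n-by-n grid recovers total probability 1.
n = 200
cell_area = (1.0 / n) ** 2
total = sum(1.0 * cell_area for _ in range(n * n))
print(total)                    # approximately 1.0 (floating-point rounding aside)
```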

How is an explicit joint probability distribution used in data analysis?

Explicit joint probability distributions are used to model and analyze the relationship between multiple variables in a dataset. They can help identify patterns and correlations between variables, and can be used to make predictions about the likelihood of certain outcomes occurring. This information can be used to inform decision making and improve overall understanding of a system or phenomenon.
