Multiple events and using Bayes' theorem

In summary: Bayes' theorem gives P(H|J, G, L) = P(J, G, L|H)*P(H)/P(J, G, L); the factor P(J, G, L|H) can be broken down with the chain rule, but you also need enough information to compute P(J, G, L) itself.
  • #1
skyflashings
I am not sure if I am doing this correctly, so please check my attempt:

I have four discrete random variables: G, H, J, L

Say, I want to find P(H|J, G, L).

Then can I write it as P(H|J, G, L) = P(J, G, L|H)*P(H) = P(J|G, L, H)*P(G|L, H)*P(L|H)*P(H)

Are those equivalent? I can't find an example exactly like this one; I just want to make sure that it works this way. Thanks!
 
  • #2
skyflashings said:
Then can I write it as P(H|J, G, L) = P(J, G, L|H)*P(H)

No. P(J,G,L | H) P(H) is equal to P(H,J,G,L), which is not the same as P(H|J,G,L).
[tex] P(H| J \cap G \cap L) = \frac{ P(H \cap J \cap G \cap L)}{P( J \cap G \cap L)} [/tex]

There are various ways to rewrite the right hand side.

For example, the numerator can be written as:

[tex] P( H \cap J \cap G \cap L) = P(H | J \cap G \cap L) P(J \cap G \cap L) [/tex]
[tex] = P(H| J \cap G \cap L) P(J | G \cap L) P(G \cap L) [/tex]
[tex] = P(H| J \cap G \cap L) P(J | G \cap L) P(G | L) P(L) [/tex]

which is similar to what you had in mind. You could also write it as you did, with P(H) as the last factor.
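
Written out in full, the version with P(H) as the last factor (the factorization from post #1) is:

[tex] P( H \cap J \cap G \cap L) = P(J | G \cap L \cap H) P(G | L \cap H) P(L | H) P(H) [/tex]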
 
  • #3
Ok, I see how it unfolds now.

One of the reasons I wanted to get to that final form is that I only have distributions for P(H), P(J|H), P(G|L,H), and P(L|H), and there is a conditional independence assumption that P(J|H) = P(J|G, L, H).

So, I need to find P(H|J, G, L) in terms of those.

If I were to follow your steps to solve for P(H|J, G, L), would it look something like:

P(H|J, G, L) = P(H, J, G, L) / (P(H|J, G, L)*P(J|G, L)*P(G|L)*P(L))

I'm not sure how to proceed at that point...
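
If I combine the factorization at the end of post #2 with my conditional independence assumption P(J|G, L, H) = P(J|H), I think the numerator reduces to

[tex] P( H \cap J \cap G \cap L) = P(J | H) P(G | L \cap H) P(L | H) P(H) [/tex]

which only uses the distributions I have, so it seems to be the denominator P(J, G, L) that I can't see how to get.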

Thanks for the help, I hope I can wrap my head around this.
 
  • #4
I think you need more information to calculate P(H| J, G, L). You need to be able to calculate P(J,G,L) without the condition that the event H is true. If you know the problem has an answer, think about whether there is other known information.
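
For instance, if the possible values of H and a way to evaluate each P(J, G, L | H = h) are known (that would be the kind of extra information I mean), the denominator could be obtained by the law of total probability, summing over those values:

[tex] P(J \cap G \cap L) = \sum_{h} P(J \cap G \cap L | H = h) P(H = h) [/tex]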
 
  • #5


Your attempt has the right ingredients, but it is missing the normalizing factor. Bayes' theorem states that the probability of an event A given an event B can be calculated by multiplying the probability of B given A, P(B|A), by the prior probability of A, P(A), and dividing by the probability of B, P(B).

In your case the events are G, H, J, and L, and you want the probability of H given J, G, and L, so the formula is P(H|J, G, L) = P(J, G, L|H)*P(H)/P(J, G, L). Breaking P(J, G, L|H) down into P(J|G, L, H)*P(G|L, H)*P(L|H) is valid; that step is known as the chain rule (or product rule) of probability.

So once you include the division by P(J, G, L) and use the correct values for each probability, you should be able to get the correct result.
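
As a quick sanity check, here is a minimal Python sketch (the joint distribution and the observed values are made up purely for illustration) that confirms numerically that this route gives the same answer as the definition of conditional probability:

[code]
# A toy numerical check of P(H|J,G,L) = P(J,G,L|H) P(H) / P(J,G,L),
# with P(J,G,L|H) factored by the chain rule as in the thread.
# The joint distribution below is random and purely illustrative.
import itertools
import random

random.seed(0)

# A random joint distribution P(H, J, G, L) over binary outcomes.
outcomes = list(itertools.product([0, 1], repeat=4))
weights = [random.random() for _ in outcomes]
total = sum(weights)
joint = {o: w / total for o, w in zip(outcomes, weights)}

def prob(event):
    """Sum the joint probability over all outcomes (H, J, G, L) satisfying `event`."""
    return sum(p for o, p in joint.items() if event(*o))

# Observed values of J, G, L; we want P(H = 1 | J = j, G = g, L = l).
j, g, l = 1, 0, 1

# Directly from the definition of conditional probability.
direct = (prob(lambda H, J, G, L: (H, J, G, L) == (1, j, g, l))
          / prob(lambda H, J, G, L: (J, G, L) == (j, g, l)))

# Via Bayes' theorem, with the likelihood factored by the chain rule:
# P(J,G,L|H) = P(J|G,L,H) * P(G|L,H) * P(L|H).
p_h = prob(lambda H, J, G, L: H == 1)
p_j_given_glh = (prob(lambda H, J, G, L: (H, J, G, L) == (1, j, g, l))
                 / prob(lambda H, J, G, L: (H, G, L) == (1, g, l)))
p_g_given_lh = (prob(lambda H, J, G, L: (H, G, L) == (1, g, l))
                / prob(lambda H, J, G, L: (H, L) == (1, l)))
p_l_given_h = prob(lambda H, J, G, L: (H, L) == (1, l)) / p_h
evidence = prob(lambda H, J, G, L: (J, G, L) == (j, g, l))

bayes = p_j_given_glh * p_g_given_lh * p_l_given_h * p_h / evidence

print(direct, bayes)  # the two values agree
[/code]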
 

1. What is Bayes' theorem?

Bayes' theorem is a mathematical formula that describes the probability of an event occurring, based on prior knowledge of related events.
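
In its simplest two-event form it reads:

[tex] P(A|B) = \frac{P(B|A) P(A)}{P(B)} [/tex]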

2. How is Bayes' theorem used in science?

Bayes' theorem is commonly used in science to update the probability of an event occurring as new evidence or information is obtained. It is particularly useful in situations where there are multiple events or variables that may influence the outcome.

3. What is the difference between prior and posterior probabilities in Bayes' theorem?

Prior probability refers to our initial belief or knowledge about the likelihood of an event occurring. Posterior probability, on the other hand, is the updated probability after taking into account new evidence or information.
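
In terms of the formula above, for example:

[tex] \underbrace{P(A|B)}_{\text{posterior}} = \frac{P(B|A) \underbrace{P(A)}_{\text{prior}}}{P(B)} [/tex]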

4. How does Bayes' theorem account for multiple events?

Bayes' theorem uses conditional probabilities to take into account the influence of multiple events on the overall probability of an outcome. It allows us to update our beliefs as we gather more information about each event.
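
For example, for the four variables discussed in this thread:

[tex] P(H | J \cap G \cap L) = \frac{P(J \cap G \cap L | H) P(H)}{P(J \cap G \cap L)} [/tex]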

5. Can Bayes' theorem be applied to real-world situations?

Yes, Bayes' theorem has numerous applications in real-world situations, including in fields such as medicine, finance, and artificial intelligence. It is a powerful tool for making predictions and decisions based on uncertain or incomplete information.
