Reverse Conditional Probabilities

  1. Oct 25, 2012 #1
    I've written a modified mutation algorithm for which I am trying to derive an analytical probability model. The basic algorithm works like this:

    1. The probability of mutation is P(M) = 0.01.
    2. If mutation occurs, then:
    a. The probability that the mutation is type A is P(A|M) = 0.50
    b. The probability that the mutation is type B is P(B|M) = 0.40
    c. The probability that the mutation is type C is P(C|M) = 0.10

    My algorithm requires that P(A|M) + P(B|M) + P(C|M) = 1.

    Now, I'm trying to derive what P(A), P(B), and P(C) are, but since it has been a long time since I've had a course in probability, I'm at a bit of a loss. My guess is to use Bayes' rule, but I'm not sure how I should be applying it.

    My numerical MATLAB model suggests values of about 0.01 for P(M) (which is known), 0.005 for P(A), 0.004 for P(B), and 0.001 for P(C). This leads me to believe Bayes' rule does not apply, but my understanding is that it does...
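    For reference, here is a minimal Monte Carlo sketch of the kind of check I'm describing (not my actual model code; the variable names are just illustrative):

    Code:
        % Monte Carlo estimate of P(A), P(B), P(C) for the two-step mutation scheme
        N = 1e7;                         % number of simulated individuals
        pM = 0.01;                       % P(M)
        pTypeGivenM = [0.50 0.40 0.10];  % P(A|M), P(B|M), P(C|M); must sum to 1

        u = rand(N, 1);
        mutated = u < pM;                % step 1: does a mutation occur?

        v = rand(N, 1);                  % step 2: which mutation type?
        edges = cumsum(pTypeGivenM);     % [0.50 0.90 1.00]
        isA = mutated & (v < edges(1));
        isB = mutated & (v >= edges(1)) & (v < edges(2));
        isC = mutated & (v >= edges(2));

        fprintf('P(A) ~ %.4f  P(B) ~ %.4f  P(C) ~ %.4f\n', ...
                mean(isA), mean(isB), mean(isC));
        % prints values close to 0.0050, 0.0040, 0.0010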

    Can anyone provide me some help?
     
  2. Oct 25, 2012 #2

    jedishrfu

    Staff: Mentor

    Here's some info on Bayesian inference calculations that may help:

    http://en.wikipedia.org/wiki/Bayesian_inference

    Notice that Bayes' rule gives P(A|M) = P(M|A) * P(A) / P(M). It seems that P(M|A) = 1 (a type-A mutation can only happen if a mutation occurs), and you already have P(A|M) and P(M), so you should be able to solve for P(A).
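    For example, with the numbers above and assuming P(M|A) = 1, rearranging gives P(A) = P(A|M) * P(M) / P(M|A) = 0.50 * 0.01 / 1 = 0.005, and likewise P(B) = 0.40 * 0.01 = 0.004 and P(C) = 0.10 * 0.01 = 0.001, which agrees with the MATLAB estimates in post #1.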
     