Regarding Bayesian analysis/inference/predictions

Javaxcore
I am forever hearing heady claims that Bayesian (something or other) can help people make better decisions and overall get closer to the truth of things.

However, I have yet to discover an article, audio lecture, or anything else that really explains in layman's terms how to use or even understand this Bayesian stuff. Please help?
 
Thank you a lot. It is extremely hard to get your head round.
 
Bayesian probability and inference are basically conditional probability applied to probability models and statistics (which means taking data and finding estimates, inferences, hypotheses, and models from that data).

Where the assumptions behind the conditional distributions come from, and how they originate, is where it gets philosophical.

If you understand conditional probability very well, then Bayesian statistics is actually not that difficult.

In Bayesian statistics, the parameters of a distribution (say, mu and sigma in a Normal distribution) have their own distributions. It's an extra level of abstraction, and what often happens in Bayesian statistics is that you try to find a prior distribution that makes sense both from a data-driven and a deductive (strictly logical) point of view. Methods are derived to get estimators for various statistics given various distributions, and you also look at MCMC methods, which can be used to simulate really complex distributions.

With conditional distributions, you have to look at how one event changes another when they are not independent. If you can exhaust the state space of a random variable and do the calculations to get the conditional distribution, then you can do a lot of these problems, because that's all it is: conditional probability applied to statistics.
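
To make the "parameters have their own distribution" idea concrete, here is a minimal Python sketch (my illustration, not from the thread) that grid-approximates the posterior for mu of a Normal with sigma assumed known; the data and the Normal(5, 3) prior are made up for the example.

```python
import numpy as np

# Grid approximation of the posterior for mu of a Normal(mu, sigma),
# with sigma assumed known. The prior over mu is itself a
# distribution -- the "extra level of abstraction" described above.

sigma = 2.0                            # assumed known
data = np.array([4.8, 5.3, 4.9, 5.6, 5.1])  # made-up observations

mu_grid = np.linspace(0, 10, 1001)     # candidate values for mu

# Prior: Normal(5, 3) belief about mu (an arbitrary illustrative choice)
prior = np.exp(-0.5 * ((mu_grid - 5.0) / 3.0) ** 2)

# Likelihood of the data at each candidate mu
likelihood = np.ones_like(mu_grid)
for x in data:
    likelihood *= np.exp(-0.5 * ((x - mu_grid) / sigma) ** 2)

# Posterior is proportional to prior * likelihood; normalize over the grid
posterior = prior * likelihood
posterior /= posterior.sum()

print("posterior mean of mu:", (mu_grid * posterior).sum())
```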
 
The core of Bayesian statistics is Bayes' rule, which is an elementary fact of probability theory: posterior odds = prior odds times likelihood ratio. Let H and K be two hypotheses, and D be some data. Then notice that
P(H | D) / P(K | D) = P(H) / P(K) * P(D | H) / P(D | K)
P(H) / P(K) is called the prior odds: your initial relative degree of belief in H versus K.
P(D | H) / P(D | K) is called the likelihood ratio or Bayes factor: the ratio between the chances of getting the data D under the two hypotheses H and K.
P(H | D) / P(K | D) is called the posterior odds: your final relative degree of belief in H versus K, after seeing the data D.

If you don't like odds (ratios of probabilities), we can state it in terms of straight probabilities as "posterior is proportional to prior times likelihood".

So for instance, if you start by believing it 100 times more likely that H is true than that K is true, but then you observe data D which is 1,000,000 times more likely if K is true than if H is true, then you should revise your beliefs about H and K. The correct revision: you now believe it 10,000 times more likely that K is true than that H is true.
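
As a quick check of that arithmetic, here is a tiny Python sketch (my illustration) of "posterior odds = prior odds times likelihood ratio":

```python
# Worked example from above: H starts out favored 100:1,
# but the data D favor K by a factor of 1,000,000.

prior_odds_H_vs_K = 100.0            # initial belief: H 100x more likely than K
likelihood_ratio = 1.0 / 1_000_000   # P(D|H) / P(D|K): D favors K a million to one

posterior_odds_H_vs_K = prior_odds_H_vs_K * likelihood_ratio
print(posterior_odds_H_vs_K)         # 0.0001
print(1 / posterior_odds_H_vs_K)     # 10000.0 -> K now 10,000x more likely than H
```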
 
Bayesian analysis just tells you how to adjust your guesses as more information becomes available. You start with a guess at something based on certain (prior) probabilities. When you get more information, you should update your probabilities (posterior) and your guess. To do otherwise would be dumb. This gives you a connection between probability and information theory: as you get more information, you adjust your probabilities.
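
Here is a minimal sketch of that updating loop (my illustration, with made-up coin flips): a Beta(1, 1) prior on a coin's heads probability, updated one observation at a time. Beta is conjugate to the Bernoulli likelihood, so each update just increments two counts.

```python
# Sequential Bayesian updating: each new flip turns the current
# posterior into the prior for the next observation.

flips = [1, 0, 1, 1, 0, 1, 1, 1]   # hypothetical data: 1 = heads, 0 = tails

a, b = 1.0, 1.0                     # Beta(1, 1) = uniform prior over p
for flip in flips:
    a += flip                       # one more observed head
    b += 1 - flip                   # or one more observed tail
    print(f"after {flip}: posterior mean of p = {a / (a + b):.3f}")
```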
 
Bayesian analysis is quite useful in cybernetics, including the cybernetics of human systems. I'm thinking, for example, of the work of Stafford Beer.
 