MHB Prior probability distributions

AI Thread Summary
The discussion centers on estimating a parameter k that lies in a finite interval (a;b) from a series of measurements X with known standard deviations. The original poster asks what the Jeffreys prior and Bernardo's reference prior are for this setup, and whether these priors guarantee that the posterior distribution stays close to the Maximum Likelihood Estimate. The thread highlights how the choice of prior interacts with the shape of the likelihood in Bayesian parameter estimation.
lotharson
Hi folks.

I've a question.

Let k be a parameter which must be estimated. It lies within the interval (a;b), a and b being finite real numbers.

Let us further assume we have a series of measurements X with known standard deviations.
X is a complex function of k.

What are the Jeffreys prior and Bernardo's prior for this problem?

Many thanks for your answer :-)
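
For reference, here is a minimal sketch of the standard construction, under the assumption (not stated above) that the measurements are independent Gaussians ##X_i \sim N(f_i(k), \sigma_i^2)## with known ##\sigma_i## and smooth mean functions ##f_i##. The Jeffreys prior is proportional to the square root of the Fisher information, truncated to the interval:

$$I(k)=\sum_i \frac{\bigl(f_i'(k)\bigr)^2}{\sigma_i^2},\qquad \pi_J(k)\propto\sqrt{I(k)}\,\mathbf{1}_{(a,b)}(k).$$

For a regular one-dimensional parameter such as this one, Bernardo's reference prior coincides with the Jeffreys prior.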
 
Is there any guarantee that these priors follow the shape of the likelihood closely enough that the posterior distribution gives a result close to the Maximum Likelihood Estimate?
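
One way to probe this empirically is a quick numerical check. The sketch below is not from the thread: it assumes a made-up nonlinear model f(k), Gaussian noise with the known standard deviations, and compares the MLE on (a, b) with the posterior mode under the truncated Jeffreys prior on a grid. Every concrete choice in it (f, sigma, the interval, the grid resolution) is hypothetical.

```python
# Toy check: does the posterior mode under the Jeffreys prior track the MLE?
# All model choices below are hypothetical illustrations.
import numpy as np

a, b = 0.5, 3.0                      # finite interval for k
k_true = 1.7
sigma = np.array([0.3, 0.3, 0.5])    # known standard deviations

def f(k):
    # toy nonlinear model: one mean value per measurement channel
    return np.array([k**2, np.exp(-k), np.sin(2.0 * k)])

def fprime(k, h=1e-5):
    # numerical derivative of f with respect to k (central differences)
    return (f(k + h) - f(k - h)) / (2.0 * h)

rng = np.random.default_rng(0)
X = f(k_true) + rng.normal(0.0, sigma)   # simulated measurements

grid = np.linspace(a, b, 2001)

# log-likelihood on the grid (independent Gaussian errors, known sigma)
loglik = np.array([-0.5 * np.sum(((X - f(k)) / sigma) ** 2) for k in grid])

# Jeffreys prior: pi(k) proportional to sqrt(Fisher information) on (a, b)
fisher = np.array([np.sum(fprime(k) ** 2 / sigma ** 2) for k in grid])
log_prior = 0.5 * np.log(fisher)

log_post = loglik + log_prior

k_mle = grid[np.argmax(loglik)]
k_map = grid[np.argmax(log_post)]
print(f"MLE on (a,b):              k = {k_mle:.4f}")
print(f"Posterior mode (Jeffreys): k = {k_map:.4f}")
```

When the data are informative the likelihood dominates and the two estimates typically agree closely; when the likelihood is broad relative to (a, b), the truncation and the slope of the prior can still shift the posterior mode, so there is no general guarantee of exact agreement.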
 