# Insights Scientific Inference P3: Balancing predictive success with falsifiability - Comments

1. May 12, 2016

### bapowell

2. May 12, 2016

### Greg Bernhardt

This series really is phenomenal!

3. May 12, 2016

### stevendaryl

Staff Emeritus
I second Greg's comment.

But it occurred to me that we really have no basis at all for assigning $P(H)$, the a priori probability of a hypothesis. I suppose that at any given time, there are only a handful of hypotheses that have actually been developed to the extent of making testable predictions, so maybe you can just weight them all equally?

4. May 12, 2016

### stevendaryl

Staff Emeritus
In the article, it's not $P(H)$ but $P(H | \mathcal{M})$, but I'm not sure that I understand the role of $\mathcal{M}$ here.

5. May 12, 2016

### Hornbein

There should be a basis for assigning the prior. It's just not part of the math.

You could collect data showing that 1% of the population has AIDS. That would be your prior for an individual having the condition.
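To make that concrete: with a 1% prior, Bayes' theorem shows how much (or how little) a single positive test result moves the probability. A minimal sketch, where the test's sensitivity and false-positive rate are made-up numbers for illustration, not from the discussion:

```python
# Bayes' theorem with a 1% population prior, as in the comment above.
prior = 0.01            # P(condition): 1% of the population
sensitivity = 0.95      # P(positive | condition) -- assumed for illustration
false_pos = 0.05        # P(positive | no condition) -- assumed for illustration

# P(positive) by the law of total probability
p_positive = sensitivity * prior + false_pos * (1 - prior)

# P(condition | positive) by Bayes' theorem
posterior = sensitivity * prior / p_positive
print(round(posterior, 3))  # 0.161 -- a positive test is far from conclusive
```

Even with a fairly accurate test, the low prior keeps the posterior down around 16%, which is why the choice of prior matters so much.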

6. May 12, 2016

### stevendaryl

Staff Emeritus
Okay, I was thinking of a different type of "hypothesis": a law-like hypothesis such as Newton's law of gravity, or the hypothesis that AIDS is caused by HIV. I don't know how you would assign a prior to such things.

7. May 13, 2016

### bapowell

You can think of the hypothesis as being the value of a certain parameter, like the curvature of the universe. The model is the underlying theory relating that parameter to the observation, and should include prior information like the range of the parameter.

8. May 14, 2016

### anorlunda

Suppose we observe something that is totally unrelated to the underlying theory. What is $\mathcal{M}$ in that case? What is $P(H \mid \mathcal{M})$ in that case?

Edit: I should have asked what $P(O \mid \mathcal{M})$ is in that case.

9. May 14, 2016

### bapowell

$\mathcal{M}$ can be thought of as the underlying theory, which in practice is a set of equations relating the observable quantities to a set of parameters (together with constraints on those parameters, like the ranges of permitted values). If an observation is made that is not well accommodated by the model $\mathcal{M}$, then we will find low posterior probabilities for the parameters of the model, $P(H \mid O)$. This is a signal that we either need to consider additional parameters within $\mathcal{M}$, or consider a new $\mathcal{M}$ altogether.
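Here is a toy numerical version of that idea (the model, parameter grid, and data are all invented for illustration): the model says each observation is the parameter value plus Gaussian noise, with the prior range of the parameter built into the model. When the data fall outside anything the model can produce, the marginal likelihood of the data under the model collapses:

```python
import math

def evidence(observations, thetas, sigma=0.5):
    """Marginal likelihood of the observations under a toy model:
    each observation is theta plus Gaussian noise sigma, with a flat
    prior over the grid `thetas` (the prior range is part of the model)."""
    likes = []
    for theta in thetas:
        loglike = sum(-0.5 * ((o - theta) / sigma) ** 2 for o in observations)
        likes.append(math.exp(loglike))
    return sum(likes) / len(likes)

thetas = [i / 10 for i in range(0, 11)]   # prior range: theta in [0, 1]
good_data = [0.4, 0.5, 0.6]               # well accommodated by the model
bad_data = [5.0, 5.1, 4.9]                # outside anything the model allows

# The marginal likelihood is astronomically smaller for the second data
# set -- the numerical signal that we need extra parameters or a new model.
print(evidence(good_data, thetas))
print(evidence(bad_data, thetas))
```

The normalized posterior over the grid always sums to one, so the tell-tale of a bad model here is the tiny overall likelihood, not the shape of the posterior.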

Last edited: May 16, 2016
10. May 16, 2016