# A How To Do Fisher Forecasting For Constraining Parameters

1. Jun 9, 2017

### xdrgnh

I'm doing a Fisher forecast to satisfy my experimental physics requirement. I've never done anything like this before, so I'm somewhat in the dark about how to proceed. I've done some reading on Fisher forecasting, and I'd like to outline how I would do it and then ask whether you can tell me if I'm right.

1. I have a list of observables and how they vary with the parameters whose error bars I want to know.

2. I also have a list of upcoming experiments that can constrain these parameters, along with their expected noise levels and instrumental uncertainties.

3. I have covariance matrices from past experiments that constrain these parameters.

4. I then form my Fisher matrix for each upcoming experiment. Assuming Gaussian errors, this is
$$F_{ij} = \sum_n \frac{1}{\sigma_n^2} \frac{\partial \mu_n}{\partial \theta_i} \frac{\partial \mu_n}{\partial \theta_j},$$
where $\mu_n$ is the $n$-th observable, $\sigma_n$ its expected noise, and $\theta_i$, $\theta_j$ are the parameters.

5. Once I have those matrices from all of my upcoming experiments, I add them together, along with the inverses of the covariance matrices from past experiments (Fisher matrices, being inverse covariances, add). The sum is my total Fisher information matrix.

6. To find the uncertainty on each of my parameters, I invert the total Fisher matrix and compute $\sigma_i = \sqrt{(F^{-1})_{ii}}$; all I need to worry about are the diagonal elements of this inverse.
And then I think I'm done. Can anyone tell me whether it's really this simple, or am I missing a whole bunch of stuff? I've never done data analysis before, and I need to be fully done by the end of this month.
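For what it's worth, the recipe outlined above can be sketched numerically. Here is a minimal NumPy version; the derivatives, noise levels, and prior covariance are random placeholders, not real data, and the same steps translate directly to Mathematica:

```python
import numpy as np

# Toy sketch (all numbers are placeholders, not real data): 5 parameters,
# Gaussian errors, one upcoming experiment, plus a prior from past data.
rng = np.random.default_rng(0)
n_obs, n_par = 600, 5

# dmu[n, i] = d(observable n)/d(parameter i) at the fiducial model.
dmu = rng.normal(size=(n_obs, n_par))   # stand-in for real model derivatives
sigma = np.full(n_obs, 0.1)             # stand-in for forecast noise levels

# F_ij = sum_n (1/sigma_n^2) (dmu_n/dtheta_i)(dmu_n/dtheta_j)
F_new = (dmu / sigma[:, None] ** 2).T @ dmu

# Prior information enters as the inverse of the past covariance matrix;
# Fisher matrices (inverse covariances) add, covariances do not.
C_prior = np.eye(n_par)                 # stand-in for a published covariance
F_total = F_new + np.linalg.inv(C_prior)

C_forecast = np.linalg.inv(F_total)     # forecast covariance matrix
errors = np.sqrt(np.diag(C_forecast))   # 1-sigma marginalized uncertainties
print(errors)
```

The key point the sketch encodes is that information is combined by summing Fisher matrices (inverse covariances), not by summing covariances.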

Thank you

2. Jun 9, 2017

### xdrgnh

Oh, and the model in the equation is a modified Friedmann equation, and I'm assuming all of my errors are Gaussian in nature.

3. Jun 10, 2017

### kimbyd

Right, the diagonal elements of the inverse of the Fisher information matrix give the uncertainties on the individual variables. This is because the inverse of the Fisher information matrix is the covariance matrix, whose diagonal elements are just the variances (take their square roots to get the error bars).

In practice you have to be careful with this kind of calculation, because if your Fisher information matrix has any very small eigenvalues, the inversion can become numerically unstable. Typically this means you are using a model with more parameters than your experiments can constrain. There are ways to deal with this, but it's always a bit finicky.
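One quick diagnostic is to look at the eigenvalue spread of the Fisher matrix before inverting it. A toy NumPy sketch, where the matrix is made up so that its first two rows are almost linearly dependent:

```python
import numpy as np

# Toy unstable Fisher matrix: rows 1 and 2 are nearly proportional,
# so one eigenvalue is almost zero (matrix is illustrative only).
F = np.array([[4.0, 2.0, 0.0],
              [2.0, 1.0 + 1e-12, 0.0],
              [0.0, 0.0, 9.0]])

eigvals = np.linalg.eigvalsh(F)   # F is symmetric
print(eigvals)

# A huge spread in eigenvalues means inverting F amplifies round-off
# error: the model has a direction the data barely constrain.
spread = eigvals.max() / eigvals.min()
print(spread)
```

If the spread is enormous, inverting the matrix gives effectively meaningless uncertainties in the degenerate direction.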

4. Jun 10, 2017

### xdrgnh

I'm trying to constrain 5 parameters, and I'll be doing all of this in Mathematica. In total I have roughly 600 data points from past experiments. In your experience, how long does it take to perform a Fisher forecast?

5. Jun 10, 2017

### kimbyd

In terms of calculation time, it should take a couple of seconds at the most (likely much less). All of the time will be spent on figuring out precisely how to do it, and getting it to work.

The number of data points isn't as important as what those data points do to the uncertainties. You could have ten million data points and there could still be a problem if you have variables (or a combination of variables) that aren't affected by the data.

To take one example, if you make use of supernova data, and allow the absolute magnitude of the supernovae to be a free parameter, then you cannot measure $H_0$ from the supernova data at all. You can get $H_0$ only if you combine the supernova data with something else (such as an estimate of the absolute magnitude of the supernovae, or of $H_0$ from a different source such as the CMB).
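That degeneracy is easy to see in a toy Fisher matrix for low-redshift supernovae, where the apparent magnitude is $m = M + 5\log_{10}(cz/H_0) + 25$ (all numbers below are illustrative; $M$ is the absolute magnitude):

```python
import numpy as np

# Every supernova depends on M and H0 only through M - 5 log10(H0),
# so the (M, H0) Fisher matrix from supernovae alone is singular.
H0 = 70.0
z = np.linspace(0.01, 0.1, 50)    # illustrative low-z sample
sigma_m = 0.15                    # illustrative magnitude error per SN

dm_dM = np.ones_like(z)                               # dm/dM = 1 for every SN
dm_dH0 = np.full_like(z, -5.0 / (np.log(10.0) * H0))  # same for every SN

D = np.column_stack([dm_dM, dm_dH0])
F = D.T @ D / sigma_m**2

# The two columns of D are proportional, so F has rank 1 and cannot be
# inverted: H0 stays unconstrained until extra information breaks the tie.
print(np.linalg.det(F))
```

Adding an external prior on $M$ or on $H_0$ adds a term to the diagonal of $F$ and makes it invertible, which is exactly the "combine with something else" step.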

6. Jun 11, 2017

### xdrgnh

Well, I'm using supernova data, GRB data, BAO data, and CMB data. I haven't begun yet because my adviser first wants me to outline the Fisher forecast method to her. Once she thinks I understand it well enough, she'll give me papers on the upcoming experiments, and then I'll be on my way to Fisher forecasting on Tuesday of this upcoming week. Are the variables that are not affected by the data the nuisance parameters you're supposed to marginalize over?

7. Jun 11, 2017

### kimbyd

That sounds like a good combination of data sets; it should constrain most cosmological parameters well.

As for nuisance parameters, I think you're misunderstanding slightly. The absolute magnitude is a nuisance parameter because it isn't one that helps you understand the expansion of the universe being measured. And yes, you have to marginalize over it.

What I was describing, however, was a degeneracy in the data. And while the degeneracy I described was between a nuisance parameter and a parameter of interest ($H_0$), degeneracies can occur between any parameters. Ideally you won't have this problem; if you do, it can be tricky to deal with. I'd forge ahead for now assuming it won't happen. Just bear in mind that it might, and the way you can tell is by looking at the eigenvalues of your Fisher matrix: if any of them are much, much smaller than the others, you've got a degeneracy (how much smaller depends on the calculation method, but you're likely to have issues if one eigenvalue is more than about $10^{10}$ times another).
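A sketch of that eigenvalue test, on a made-up two-parameter Fisher matrix built to have one degenerate direction (the labels and numbers are hypothetical):

```python
import numpy as np

# Toy Fisher matrix: the data fix only one combination of (M, H0),
# so one eigenvalue is vastly smaller than the other.
labels = ["M", "H0"]                 # hypothetical parameter labels
v = np.array([1.0, -0.031])          # the one combination the data fix
F = 2222.0 * np.outer(v, v) + 1e-8 * np.eye(2)

vals, vecs = np.linalg.eigh(F)
ratio = vals.max() / abs(vals.min())
print(ratio)                         # far beyond the ~1e10 warning level

if ratio > 1e10:
    # The eigenvector of the smallest eigenvalue names the combination
    # of parameters the data cannot pin down.
    worst = vecs[:, np.argmin(np.abs(vals))]
    print(dict(zip(labels, worst.round(3))))
```

Reading off the eigenvector of the smallest eigenvalue is useful beyond the warning itself, since it tells you which parameter combination needs extra data or a prior.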