How To Do Fisher Forecasting For Constraining Parameters (Graduate)

The discussion focuses on performing a Fisher Forecast for experimental physics, detailing the process of constructing a Fisher Information Matrix from observables, upcoming experiments, and covariance matrices from past experiments. The diagonal elements of the inverse Fisher Information Matrix give the variances of the parameters being analyzed. Caution is advised regarding small eigenvalues in the Fisher matrix, which can indicate instability and poorly constrained parameter combinations. The participant plans to use various data sources, including supernova and CMB data, and is preparing to outline the method for their advisor. Overall, the conversation emphasizes the importance of understanding the underlying data relationships and potential degeneracies in parameter estimation.
xdrgnh
I'm doing a Fisher Forecast to satisfy my experimental physics requirement. I've never done anything like this before, so I'm kind of in the dark about how to proceed. I have done some reading on Fisher Forecasting and would like to outline how I would do it, and then ask if you can tell me whether I'm right.

  1. I have a list of observables and how they vary with the parameters I want error bars for.
  2. I also have a list of upcoming experiments which can constrain these parameters, with their expected noise levels and instrumental uncertainties.
  3. I have covariance matrices from experiments done in the past that constrain these parameters.
  4. I then form my Fisher matrix with the following equation (for independent Gaussian errors):
    ##F_{ij} = \sum_n \frac{1}{\sigma_n^2} \frac{\partial \mu_n}{\partial \theta_i} \frac{\partial \mu_n}{\partial \theta_j}##
    where ##\mu_n## are the observables, ##\theta_i## the parameters, and ##\sigma_n## the expected noise on each observable.

  5. Once I have the Fisher matrices for each of my upcoming experiments, I invert the covariance matrices from the past experiments to turn them into Fisher matrices, and add everything together. The sum of all of these is my total Fisher Information Matrix.
  6. In order to find the uncertainty on each of my parameters, once I have the Fisher Information Matrix I compute
    ##\sigma_1 = \sqrt{(F^{-1})_{11}}##
    where the 11 generalizes to ii, so all I need to worry about are the diagonal elements of this inverse.
And then I think I'm done. Can anyone tell me if it is really this simple, or am I missing a whole bunch of stuff? I have never done data analysis before and I need to be fully done by the end of this month.

Thank you
 
Oh, and the model in the equation is a modified Friedmann equation, and I'm assuming all of my errors are Gaussian in nature.
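The steps outlined above can be sketched in a few lines of numpy. This is a minimal illustration, not the actual modified Friedmann model: the derivatives, noise levels, and past-experiment covariance below are made-up numbers.

```python
import numpy as np

def fisher_matrix(dmu_dtheta, sigma):
    """Gaussian Fisher matrix: F_ij = sum_n (1/sigma_n^2) dmu_n/dtheta_i dmu_n/dtheta_j.

    dmu_dtheta : (n_obs, n_params) derivatives of each observable with
                 respect to each parameter, evaluated at the fiducial model.
    sigma      : (n_obs,) expected 1-sigma noise per observable.
    """
    weighted = dmu_dtheta / sigma[:, None] ** 2   # divide each row by sigma_n^2
    return dmu_dtheta.T @ weighted                # (n_params, n_params)

# Toy example: 3 observables, 2 parameters (numbers are assumptions).
dmu = np.array([[1.0, 0.5],
                [0.2, 1.0],
                [0.8, 0.1]])
noise = np.array([0.1, 0.2, 0.1])

F_forecast = fisher_matrix(dmu, noise)

# Past experiments enter through their covariance matrices: a covariance
# matrix inverts to a Fisher matrix, and independent Fisher matrices add.
C_past = np.array([[0.04, 0.01],
                   [0.01, 0.09]])
F_total = F_forecast + np.linalg.inv(C_past)

# Forecast 1-sigma uncertainties: square roots of the diagonal of the inverse.
sigmas = np.sqrt(np.diag(np.linalg.inv(F_total)))
print(sigmas)
```

The key property used here is that Fisher information from independent data sets is additive, so everything is combined before the final inversion.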
 
xdrgnh said:
I'm doing a Fisher Forecast to satisfy my experimental physics requirement. I've never done anything like this before, so I'm kind of in the dark about how to proceed. I have done some reading on Fisher Forecasting and would like to outline how I would do it, and then ask if you can tell me whether I'm right.

  1. I have a list of observables and how they vary with the parameters I want error bars for.
  2. I also have a list of upcoming experiments which can constrain these parameters, with their expected noise levels and instrumental uncertainties.
  3. I have covariance matrices from experiments done in the past that constrain these parameters.
  4. I then form my Fisher matrix with the following equation (for independent Gaussian errors): ##F_{ij} = \sum_n \frac{1}{\sigma_n^2} \frac{\partial \mu_n}{\partial \theta_i} \frac{\partial \mu_n}{\partial \theta_j}##
  5. Once I have the Fisher matrices for each of my upcoming experiments, I invert the covariance matrices from the past experiments to turn them into Fisher matrices, and add everything together. The sum of all of these is my total Fisher Information Matrix.
  6. In order to find the uncertainty on each of my parameters, once I have the Fisher Information Matrix I compute ##\sigma_1 = \sqrt{(F^{-1})_{11}}##, where the 11 generalizes to ii, so all I need to worry about are the diagonal elements of this inverse.
And then I think I'm done. Can anyone tell me if it is really this simple, or am I missing a whole bunch of stuff? I have never done data analysis before and I need to be fully done by the end of this month.

Thank you

Thanks you
Right, the diagonal elements of the inverse of the Fisher information matrix give the uncertainties on the individual variables. This is because the inverse of the Fisher Information Matrix is the covariance matrix, whose diagonal elements are just the variances (so the 1σ uncertainties are their square roots).

In practice you have to be careful with this kind of calculation, because if your Fisher information matrix has any really small eigenvalues it can make the calculations unstable. Typically this means that you are using a model with more variables than your experiment can constrain. There are ways to deal with this, but it's always a bit finicky.
 
kimbyd said:
Right, the diagonal elements of the inverse of the Fisher information matrix give the uncertainties on the individual variables. This is because the inverse of the Fisher Information Matrix is the covariance matrix, whose diagonal elements are just the variances (so the 1σ uncertainties are their square roots).

In practice you have to be careful with this kind of calculation, because if your Fisher information matrix has any really small eigenvalues it can make the calculations unstable. Typically this means that you are using a model with more variables than your experiment can constrain. There are ways to deal with this, but it's always a bit finicky.
I'm trying to constrain 5 parameters, and I will be doing all of this in Mathematica. In total, from past experiments, I have roughly 600 data points. In your experience, how long does it take to perform a Fisher Forecast?
 
xdrgnh said:
I'm trying to constrain 5 parameters, and I will be doing all of this in Mathematica. In total, from past experiments, I have roughly 600 data points. In your experience, how long does it take to perform a Fisher Forecast?
In terms of calculation time, it should take a couple of seconds at the most (likely much less). All of the time will be spent on figuring out precisely how to do it, and getting it to work.

The number of data points isn't as important as what those data points do to the uncertainties. You could have ten million data points and there could still be a problem if you have variables (or a combination of variables) that aren't affected by the data.

To take one example, if you make use of supernova data, and allow the absolute magnitude of the supernovae to be a free parameter, then you cannot measure ##H_0## from the supernova data at all. You can get ##H_0## only if you combine the supernova data with something else (such as an estimate of the absolute magnitude of the supernovae, or of ##H_0## from a different source such as the CMB).
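The supernova example can be made concrete with a toy low-redshift Hubble law, ##m(z) = M + 5\log_{10}(cz/H_0) + 25##. Since ##\partial m/\partial M## and ##\partial m/\partial H_0## are both constant across the sample, the two derivative vectors are exactly proportional and the Fisher matrix is singular. A sketch (the fiducial values and magnitude scatter are assumptions, not real data):

```python
import numpy as np

# Illustrative low-z distance modulus: m(z) = M + 5*log10(c*z/H0) + 25,
# using d_L ~ c*z/H0 in Mpc (valid only for small redshift).
c = 299792.458            # speed of light in km/s
z = np.linspace(0.01, 0.1, 50)
H0, M = 70.0, -19.3       # assumed fiducial values
sigma_m = 0.15            # assumed magnitude scatter per supernova

# Derivatives of m with respect to the parameters (H0, M):
dm_dH0 = -5.0 / (np.log(10.0) * H0) * np.ones_like(z)  # same for every SN
dm_dM = np.ones_like(z)                                 # also constant

D = np.column_stack([dm_dH0, dm_dM])    # (n_SN, 2) derivative matrix
F = D.T @ D / sigma_m**2                # Gaussian Fisher matrix

vals = np.linalg.eigvalsh(F)            # ascending eigenvalues
print(vals)
```

The smallest eigenvalue is zero to numerical precision: supernovae alone only constrain the combination ##M + 5\log_{10}(1/H_0)##, which is why an external handle on ##M## or ##H_0## is needed.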
 
kimbyd said:
In terms of calculation time, it should take a couple of seconds at the most (likely much less). All of the time will be spent on figuring out precisely how to do it, and getting it to work.

The number of data points isn't as important as what those data points do to the uncertainties. You could have ten million data points and there could still be a problem if you have variables (or a combination of variables) that aren't affected by the data.

To take one example, if you make use of supernova data, and allow the absolute magnitude of the supernovae to be a free parameter, then you cannot measure ##H_0## from the supernova data at all. You can get ##H_0## only if you combine the supernova data with something else (such as an estimate of the absolute magnitude of the supernovae, or of ##H_0## from a different source such as the CMB).
Well, I'm using supernova data, GRB data, BAO data and CMB data. I haven't begun yet because my adviser first wants me to outline the Fisher Forecast method for her. Once she thinks I understand it well enough, she will give me papers for upcoming experiments, and then I'll be on my way to Fisher Forecasting on Tuesday of this upcoming week. Those variables that are not affected by the data, are those the nuisance parameters you are supposed to marginalize over?
 
That sounds like a good combination of data sets that should work well to constrain most cosmological parameters.

As for nuisance parameters, I think you're misunderstanding slightly. The absolute magnitude is a nuisance parameter because it doesn't help you understand the expansion of the universe that's being measured. And yes, you have to marginalize over it.

What I was describing, however, was a degeneracy in the data. And while I described a degeneracy between a nuisance parameter and a parameter of interest (##H_0##), degeneracies can occur between any combination of parameters. Ideally you won't have this problem. If you do, it can be tricky to deal with. I'd forge ahead assuming it won't happen for now. Just bear in mind that it might, and the way you can tell is by looking at the eigenvalues of your Fisher matrix. If any of them are much, much smaller than the others, you've got a degeneracy (how much smaller depends upon the calculation method, but you're likely to have issues if one eigenvalue is more than about ##10^{10}## times another).
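Putting the two points together, a forecast might check the eigenvalue ratio before inverting, and then marginalize over the nuisance parameter simply by inverting the full Fisher matrix. A numpy sketch with a made-up 3-parameter Fisher matrix:

```python
import numpy as np

# Toy 3-parameter Fisher matrix (invented numbers): parameters 0 and 1 are
# of interest, parameter 2 is a nuisance parameter (e.g. absolute magnitude).
F = np.array([[200.0,  30.0,  80.0],
              [ 30.0, 150.0,  20.0],
              [ 80.0,  20.0, 120.0]])

# Sanity check: compare the largest and smallest eigenvalues first.
vals = np.linalg.eigvalsh(F)
assert vals.max() / vals.min() < 1e10, "near-degenerate Fisher matrix"

C = np.linalg.inv(F)

# Marginalizing over the nuisance parameter is automatic: invert the FULL
# Fisher matrix, then read off the diagonal of the covariance.
sigma_marginalized = np.sqrt(np.diag(C))[:2]

# Fixing the nuisance parameter instead (pretending it is known exactly)
# means deleting its row and column BEFORE inverting; this always gives
# smaller, over-optimistic error bars.
sigma_fixed = np.sqrt(np.diag(np.linalg.inv(F[:2, :2])))

print(sigma_marginalized, sigma_fixed)
```

The marginalized uncertainties are never smaller than the fixed-parameter ones, which is a useful cross-check on any forecast code.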
 