Undergrad Fisher matrix - equivalence or not between sequences

SUMMARY

The discussion focuses on the equivalence of two sequences in Fisher's formalism for parameter estimation. The first sequence involves starting from the full Fisher matrix, inverting it to obtain the covariance matrix, marginalizing a parameter, and then projecting the new Fisher matrix. The second sequence begins similarly but projects the Fisher matrix first before marginalizing. The participants debate whether both sequences yield the same final Fisher matrix and discuss the implications of parameter marginalization and change of variables in matrix algebra.

PREREQUISITES
  • Understanding of Fisher matrix and covariance matrix concepts
  • Familiarity with Jacobian matrices and their role in parameter transformations
  • Knowledge of matrix inversion techniques
  • Basic principles of marginalization in statistical estimation
NEXT STEPS
  • Study the properties of Fisher matrices and covariance matrices in detail
  • Learn about Jacobian transformations and their applications in parameter estimation
  • Explore matrix algebra techniques, particularly focusing on inversion and marginalization
  • Investigate the Schur Complement and its relevance in blocked matrix multiplication
USEFUL FOR

Statisticians, data scientists, and researchers involved in parameter estimation and statistical modeling, particularly those working with Fisher matrices and covariance analysis.

fab13
I am currently studying Fisher's formalism as part of parameter estimation.

From this documentation :

[Attached image: Iw60x.png — excerpt from the referenced documentation]

They state that the Fisher matrix is the inverse of the covariance matrix. Initially, one builds a "full" matrix that takes all the parameters into account.

1) Projection : we can perform what is called a projection, that is to say, express this matrix in another basis by using the Jacobian matrix, which involves the derivatives of the starting parameters with respect to the new parameters that one chooses. The result is called the "projected matrix".

2) Marginalization : once we have this projected matrix, we can perform another operation, the "marginalization of a parameter": that is to say, we delete from the projected matrix the row and the column corresponding to the marginalized parameter.

Finally, once I have the projected matrix, I can invert it to obtain the covariance matrix associated with the new parameters.
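The projection and inversion steps described above can be sketched numerically. The following is a minimal check, assuming a hypothetical 3-parameter model with a randomly generated Fisher matrix and a square, invertible Jacobian (all names and sizes here are illustrative, not from the original post): for an invertible Jacobian, inverting the projected Fisher matrix gives the same result as projecting the covariance matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-parameter example: build a symmetric positive-definite
# "full" Fisher matrix F and a (generically) invertible Jacobian J.
A = rng.normal(size=(3, 3))
F = A @ A.T + 3 * np.eye(3)                   # SPD by construction
J = rng.normal(size=(3, 3)) + 3 * np.eye(3)   # generically invertible

# Projection into the new parameter basis: F' = J^T F J
F_proj = J.T @ F @ J

# For a square, invertible Jacobian, inversion commutes with projection:
# (J^T F J)^{-1} = J^{-1} F^{-1} J^{-T}, i.e. C' = J^{-1} C J^{-T}
C = np.linalg.inv(F)
C_proj = np.linalg.inv(J) @ C @ np.linalg.inv(J).T
assert np.allclose(np.linalg.inv(F_proj), C_proj)
```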

Now, I would like to know whether the two following sequences give the same final matrix :

1st sequence :

1.1) Starting from the Fisher matrix "full"

1.2) Invert "full" fisher Matrix to get Covariance matrix

1.3) Marginalize the covariance matrix with respect to a parameter (or even several, but I am interested first in only one), that is to say, remove the row and column corresponding to the parameter one wants to marginalize.

1.4) Invert new Covariance matrix to get new Fisher

1.5) Project the new Fisher matrix into the new basis of parameters by forming the product: ## F_{\kappa \lambda} = \sum_{i,j} \, J_{i \kappa} \, F_{ij} \, J_{j \lambda} ##

2nd sequence :

2.1) Starting from the Fisher matrix "full"

2.2) Projecting with the Jacobian matrix

2.3) Invert to get new Covariance matrix

2.4) Marginalize the new covariance matrix, i.e. remove the row and column corresponding to the parameter one wants to marginalize.

2.5) Invert to have the new Fisher matrix.

Will I get the same final Fisher matrix at the end of step 1.5) and step 2.5) ?

If this is the case, how could I prove it analytically ?

Maybe the second sequence is not the right one to obtain equivalence with the first ?
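Before attempting an analytical proof, a quick numerical experiment can probe the question. This is only a sketch with a hypothetical 3-parameter model; in particular, the reduced Jacobian used in step 1.5 (here taken as the top-left 2×2 block of J) is an illustrative choice, since the question leaves it unspecified.

```python
import numpy as np

rng = np.random.default_rng(1)

def marginalize_last(M):
    """Delete the last row and column (marginalization in covariance space)."""
    return M[:-1, :-1]

A = rng.normal(size=(3, 3))
F_full = A @ A.T + 3 * np.eye(3)              # SPD "full" Fisher matrix
J = rng.normal(size=(3, 3)) + 3 * np.eye(3)   # Jacobian mixing all parameters

# 1st sequence: invert -> marginalize -> invert -> project
C_full = np.linalg.inv(F_full)
F_marg = np.linalg.inv(marginalize_last(C_full))
J_red = J[:-1, :-1]                           # one possible reduced Jacobian
F_seq1 = J_red.T @ F_marg @ J_red

# 2nd sequence: project -> invert -> marginalize -> invert
F_proj = J.T @ F_full @ J
C_proj = np.linalg.inv(F_proj)
F_seq2 = np.linalg.inv(marginalize_last(C_proj))

# For a generic Jacobian that mixes the marginalized parameter with the
# others, the two sequences disagree.
assert not np.allclose(F_seq1, F_seq2)
```

The special case in which the two orders do agree is discussed later in the thread.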
 

fab13 said:
I am currently studying Fisher's formalism as part of parameter estimation...

Now, I would like if the 2 following sequences gives the same final matrix :

1st sequence : 1.1) start from the "full" Fisher matrix; 1.2) invert it to get the covariance matrix; 1.3) marginalize a parameter (remove its row and column); 1.4) invert to get the new Fisher matrix; 1.5) project into the new basis of parameters.

2nd sequence ...

If it were me, I'd start by modelling all of part 1 with basic matrix algebra.

As a hint for ##1.3)##, consider what happens when you have ##\mathbf \Sigma##, an ##n \times n## covariance matrix, and ##\mathbf S##, the ##n \times n## identity matrix with its final column deleted. Now consider what happens when you compute

##\mathbf S^T \mathbf{\Sigma S}##

Up to a graph isomorphism, this is the deletion routine (i.e. you can assume WLOG that you always want to delete the final row and column).
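The hint can be verified directly with numpy (a sketch with a hypothetical 4 × 4 covariance matrix):

```python
import numpy as np

rng = np.random.default_rng(2)

n = 4
Sigma = rng.normal(size=(n, n))
Sigma = Sigma @ Sigma.T + n * np.eye(n)   # an SPD covariance matrix

# S: the n x n identity with its final column deleted, so S is n x (n-1)
S = np.eye(n)[:, :-1]

# S^T Sigma S deletes the last row and column of Sigma
assert np.allclose(S.T @ Sigma @ S, Sigma[:-1, :-1])
```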

- - - - -
After this, I'd then compare it with blocked multiplication using the Schur Complement which has "??" next to it in your text blurb.
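The Schur complement connection can also be checked numerically: inverting the marginalized covariance matrix gives the same result as taking the Schur complement of the deleted block directly in the Fisher matrix, without ever forming the full covariance. A sketch with a hypothetical 4-parameter Fisher matrix:

```python
import numpy as np

rng = np.random.default_rng(3)

n = 4
A = rng.normal(size=(n, n))
F = A @ A.T + n * np.eye(n)     # SPD Fisher matrix

# Block F as [[F11, F12], [F21, F22]], with F22 the entry for the
# parameter to be marginalized (WLOG the last one)
F11, F12 = F[:-1, :-1], F[:-1, -1:]
F21, F22 = F[-1:, :-1], F[-1:, -1:]

# Sequence-1 route: invert, delete the last row/column, invert again
C = np.linalg.inv(F)
F_marg = np.linalg.inv(C[:-1, :-1])

# Schur complement of F22 in F: by the block-inverse identity,
# ((F^{-1})_{11})^{-1} = F11 - F12 F22^{-1} F21
schur = F11 - F12 @ np.linalg.inv(F22) @ F21
assert np.allclose(F_marg, schur)
```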

Finally, write up the second sequence as well in terms of matrix algebra. Pinning down exactly what's going on with the Jacobian is going to be key.
 
@StoneTemplePython . Thanks for your quick answer.

I have a simple question concerning the matrix algebra relating the 2 sequences: in the first case, if I choose a parameter ##\alpha## to marginalize over in the covariance matrix (obtained from the full initial Fisher matrix), which new parameter ##\beta'## do I have to take in the second case, when I marginalize on the inverse of the projected Fisher matrix (i.e. the initial Fisher matrix expressed in the new basis of parameters via the Jacobian)?

Regards
 
fab13 said:
... which new parameter ##\beta'## have I got to take in the second case when I want to marginalize on the inverse of the projected Fisher matrix ... ?

This really falls on you. It isn't clear to me what change of variables you want to do, or why. Writing this out in full, starting with sequence 1 as I suggested above, is the way forward. For an arbitrary change of variables, your deletion routine effected via ##\mathbf S## is going to lead to problems. The right way to do this is to run your deletion / marginalization routine on the variables you want to get rid of; considering the problem for an arbitrary change of variables will not get you anywhere.

You may consider the case where you want to marginalize ##x_n##: if you do a change of variables on all the others except ##x_n## -- i.e. on ##\{x_1, ..., x_{n-1}\}## -- it should be ok. (Verify this with blocked matrix algebra.)

But if you do a change of variables incorporating ##x_n## -- so that, say, each new variable is a non-trivial convex combination of ##x_1, ..., x_{n-1}, x_n## -- then you can't possibly hope to carry out the same deletion / marginalization via ##\mathbf S##.
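This special case can be checked numerically: with a block-diagonal Jacobian that leaves ##x_n## untouched, the two sequences from the original question agree. A sketch with a hypothetical 4-parameter model (matrix sizes and names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)

n = 4
M = rng.normal(size=(n, n))
F_full = M @ M.T + n * np.eye(n)      # SPD "full" Fisher matrix

# Change of variables on x_1..x_{n-1} only; x_n (the parameter to be
# marginalized) is left untouched, so J is block diagonal.
A = rng.normal(size=(n - 1, n - 1)) + n * np.eye(n - 1)  # invertible block
J = np.zeros((n, n))
J[:-1, :-1] = A
J[-1, -1] = 1.0

# 1st sequence: invert -> marginalize -> invert -> project with A
C_full = np.linalg.inv(F_full)
F_seq1 = A.T @ np.linalg.inv(C_full[:-1, :-1]) @ A

# 2nd sequence: project with J -> invert -> marginalize -> invert
C_proj = np.linalg.inv(J.T @ F_full @ J)
F_seq2 = np.linalg.inv(C_proj[:-1, :-1])

# When the change of variables does not mix x_n, both orders agree.
assert np.allclose(F_seq1, F_seq2)
```

The reason is that for block-diagonal ##J##, the projected covariance is ##J^{-1} C J^{-T}##, whose upper-left block is ##A^{-1} C_{11} A^{-T}##; inverting it gives exactly ##A^T C_{11}^{-1} A##, which is sequence 1.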
 
