Is the Likelihood Function a Multivariate Gaussian Near a Minimum?


Discussion Overview

The discussion concerns the behavior of the likelihood function near a minimum, in the context of chi-square minimization as described in Bevington's Data Reduction and Error Analysis (Chapter 8.1). Participants explore whether the likelihood function can be treated as a multivariate Gaussian and under what conditions that holds.

Discussion Character

  • Technical explanation
  • Debate/contested

Main Points Raised

  • One participant references a formula from a textbook suggesting that the likelihood function becomes Gaussian near a minimum, questioning if this is a general case or a simplification assuming zero correlation between parameters.
  • Another participant agrees that the Gaussian approximation holds when only one parameter is varied while others are fixed at their minimum values.
  • A follow-up question seeks clarification on whether the likelihood function becomes a multivariate Gaussian when considering correlations among parameters away from the minimum.
  • One participant asserts that a multivariate Gaussian can describe the function in a small region around the minimum.

Areas of Agreement / Disagreement

Participants generally agree that the likelihood function can be approximated as Gaussian under certain conditions, but there is uncertainty regarding the implications of correlations among parameters and whether this holds true away from the minimum.

Contextual Notes

The discussion does not resolve the mathematical details or provide a definitive proof of the claims regarding the likelihood function's behavior near a minimum.

kelly0303
Hello! I am reading Data Reduction and Error Analysis by Bevington, 3rd Edition, and in Chapter 8.1 ("Variation of ##\chi^2## Near a Minimum") he states that for a sufficiently large data set the likelihood function becomes a Gaussian function of each parameter, with the mean being the value that minimizes the chi-square: $$P(a_j)=Ae^{-(a_j-a_j')^2/2\sigma_j^2}$$ where ##A## is a function of the other parameters, but not ##a_j##. Is this the general formula, or is it a simplification that assumes zero correlation between the parameters? From some examples later I suspect this is just a particular case, and I assume the most general formula would be a multivariate Gaussian, but he doesn't explicitly state this anywhere. Can someone tell me what the general formula is? Also, can someone point me towards a proof? Thank you!
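One case where the Gaussian form is exact rather than approximate is a model that is linear in its parameters: there ##\chi^2## is exactly quadratic, so ##e^{-\chi^2/2}## is exactly a multivariate Gaussian in the parameters. A minimal numerical sketch (toy straight-line data, not from the thread) checks this by comparing the exact ##\chi^2## at a displaced point with the quadratic form built from the Hessian:

```python
import numpy as np

# Toy straight-line fit y = a0 + a1*x with constant errors. Because the model
# is linear in (a0, a1), chi^2 is exactly quadratic, so exp(-chi^2/2) is
# exactly a bivariate Gaussian in the parameters.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
sigma = 0.1  # assumed per-point error
y = 1.0 + 2.0 * x + rng.normal(0.0, sigma, x.size)

def chi2(a0, a1):
    return np.sum(((y - (a0 + a1 * x)) / sigma) ** 2)

# Least-squares minimum via the design matrix (normal equations)
A = np.vstack([np.ones_like(x), x]).T / sigma
b = y / sigma
best, *_ = np.linalg.lstsq(A, b, rcond=None)
chi2_min = chi2(*best)

# Hessian of chi^2 is the constant matrix 2 A^T A for a linear model
H = 2.0 * A.T @ A

# At any displacement d from the minimum the quadratic form is exact:
# chi2(best + d) = chi2_min + 0.5 * d^T H d  (no higher-order terms)
d = np.array([0.3, -0.2])
exact = chi2(*(best + d))
quad = chi2_min + 0.5 * d @ H @ d
print(exact, quad)  # identical up to floating-point roundoff
```

For a nonlinear model the same expansion only holds approximately, in a neighborhood of the minimum where higher-order terms are negligible.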
 
mfb
It is true if you only consider changes in ##a_j## and fix all other parameters to their values at the minimum, but not otherwise.
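The distinction here can be made concrete: with correlations, the width you get by slicing through the minimum (other parameters fixed) differs from the width you get by letting the other parameters float. A small sketch with a hypothetical 2-parameter curvature matrix (my numbers, chosen only to have nonzero correlation):

```python
import numpy as np

# Hypothetical curvature matrix at the minimum, defined so that the local
# likelihood is proportional to exp(-0.5 * d^T H d); the off-diagonal term
# encodes correlation between the two parameters.
H = np.array([[4.0, 2.0],
              [2.0, 3.0]])

# Width of parameter 0 with the OTHER parameter fixed at its minimum value
# (the slice that the single-parameter Gaussian formula describes):
sigma_fixed = 1.0 / np.sqrt(H[0, 0])

# Full uncertainty on parameter 0 with the other parameter free:
# covariance is the inverse of H, and its diagonal gives the marginal widths.
cov = np.linalg.inv(H)
sigma_marginal = np.sqrt(cov[0, 0])

print(sigma_fixed, sigma_marginal)  # marginal width is larger when correlated
```

With no correlation (off-diagonal zero) the two widths coincide, which is why the one-parameter formula can look general in uncorrelated examples.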
 
mfb said:
It is true if you only consider changes in ##a_j## and fix all other parameters to their values at the minimum, but not otherwise.
Thank you! Does it become a multivariate Gaussian, though, away from the minimum (if there are correlations)? Or does this formula apply only at the minimum?
 
A multivariate Gaussian can describe the function in a small region around the minimum.
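The "small region" qualifier can be checked numerically. A sketch with a hypothetical nonlinear one-parameter model (my toy example, not from Bevington) compares the exact ##\Delta\chi^2## with the quadratic (Gaussian) approximation built from the curvature at the minimum:

```python
import numpy as np

# Nonlinear toy model y = exp(-a*x): chi^2 is not globally quadratic in a,
# but near the minimum exp(-chi^2/2) is well approximated by a Gaussian.
x = np.linspace(0.0, 2.0, 40)
a_true = 1.3
y = np.exp(-a_true * x)  # noiseless toy data, so the minimum sits at a_true
sigma = 0.05             # assumed per-point error

def chi2(a):
    return np.sum(((y - np.exp(-a * x)) / sigma) ** 2)

# Numerical curvature d^2(chi^2)/da^2 at the minimum (central difference)
h = 1e-4
curv = (chi2(a_true + h) - 2.0 * chi2(a_true) + chi2(a_true - h)) / h**2

# The quadratic approximation is excellent for a small step ...
d_small = 0.01
near_exact = chi2(a_true + d_small) - chi2(a_true)
near_quad = 0.5 * curv * d_small**2

# ... but degrades for a large step away from the minimum
d_large = 0.5
far_exact = chi2(a_true + d_large) - chi2(a_true)
far_quad = 0.5 * curv * d_large**2

print(near_exact, near_quad)  # nearly equal
print(far_exact, far_quad)    # visibly different
```

This is the usual justification for the Gaussian picture: Taylor-expand ##\chi^2## to second order about the minimum, so the likelihood ##\propto e^{-\chi^2/2}## is locally a (multivariate) Gaussian whose covariance is the inverse of half the ##\chi^2## Hessian.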
 
