Can Maximum Likelihood Estimation Yield Multiple Solutions?

SUMMARY

The discussion centers on the potential for multiple solutions in Maximum Likelihood Estimation (MLE) when estimating parameters such as 'a' and 'sigma'. Chen asks whether the estimators are unique when the likelihood is maximized with respect to both parameters, particularly when the variance of the statistical error is unknown. The response notes that if the likelihood function is everywhere strictly concave, it has at most one maximum, which rules out multiple solutions.

PREREQUISITES
  • Understanding of Maximum Likelihood Estimation (MLE)
  • Familiarity with likelihood functions and their properties
  • Knowledge of statistical distributions, particularly the exponential distribution
  • Basic concepts of concavity in mathematical functions
NEXT STEPS
  • Study the properties of concave functions in optimization contexts
  • Explore the implications of unknown variance in MLE using statistical software
  • Learn about the behavior of likelihood functions in different statistical models
  • Investigate numerical methods for maximizing likelihood functions
USEFUL FOR

Statisticians, data scientists, and students of statistical methods who are working with Maximum Likelihood Estimation and seeking to understand the implications of parameter estimation in statistical models.

Chen
Hi,

I'm taking a basic course in statistical methods, and we recently learned about maximum likelihood estimation. We defined the likelihood as a function of some parameter 'a' and found the estimator of 'a' by maximizing the likelihood with respect to it.

As an example, we took the results of some fictional experiment in which the measurements 't' are distributed according to exp(-at) (i.e., with density a*exp(-at) for t >= 0), with a statistical error that is normally distributed with a known variance. We defined the likelihood function and maximized it numerically in order to find the estimated value of 'a'.
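For concreteness, here is a minimal sketch of that kind of numerical maximization. It is only an illustration, not the course's actual code: it assumes Python with NumPy/SciPy and models each measurement as an exponential draw with rate 'a' plus Gaussian noise of known 'sigma', which is SciPy's exponnorm (exponentially modified Gaussian) distribution.

```python
# Minimal sketch: numerical MLE of 'a' with sigma known.
# Model: t_obs = T + eps, with T ~ Exponential(rate=a) and eps ~ N(0, sigma^2),
# so t_obs follows the exponentially modified Gaussian: scipy's exponnorm
# with shape K = 1/(a*sigma) and scale = sigma.
import numpy as np
from scipy.stats import exponnorm
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
a_true, sigma = 2.0, 0.3                       # sigma is treated as known here
# numpy's exponential() takes the scale (mean) parameter, i.e. 1/rate
data = rng.exponential(1 / a_true, 500) + rng.normal(0, sigma, 500)

def neg_log_likelihood(a):
    """Negative log-likelihood of the data as a function of 'a' alone."""
    return -exponnorm.logpdf(data, 1 / (a * sigma), scale=sigma).sum()

res = minimize_scalar(neg_log_likelihood, bounds=(0.01, 20), method="bounded")
print("estimated a:", res.x)
```

With sigma fixed, the optimization is one-dimensional, and for this model the search is well behaved.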

However, in this example we assumed that we know the variance of the statistical error. If this value of 'sigma' is not known, then we must maximize the likelihood function with respect to both 'a' and 'sigma'. And now comes my question: isn't it possible that we will get more than one solution to this extremum problem? Couldn't two different sets of 'a' and 'sigma' give rise to the same distribution that we saw in the experiment? If that happens, which are the "real" estimators? And besides, even if we deal with only one parameter 'a', couldn't the likelihood function have maxima at two or more values of it?

Sorry about the lengthy post... :smile:

Thanks,
Chen
 
All the points you have raised are good ones; it all depends on the likelihood function. If the likelihood is everywhere strictly concave, then it has at most one maximum and you won't have these problems.
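To make the concavity remark concrete with the thread's own example: for a pure exponential model with density a*exp(-at), the log-likelihood of n measurements is L(a) = n*ln(a) - a*sum(t_i). Its second derivative, -n/a^2, is negative for every a > 0, so L is strictly concave and the maximum a_hat = n/sum(t_i) is unique. Once the Gaussian noise term and an unknown sigma enter, concavity is no longer automatic; a cheap practical probe is multi-start optimization: if runs from scattered starting points all converge to the same (a, sigma), that is evidence (though not proof) of a single maximum. Here is a sketch of that check, under the same illustrative exponnorm model as above.

```python
# Sketch: joint MLE of (a, sigma) from several random starting points,
# as a practical probe for multiple local maxima.
import numpy as np
from scipy.stats import exponnorm
from scipy.optimize import minimize

rng = np.random.default_rng(1)
data = rng.exponential(1 / 2.0, 500) + rng.normal(0, 0.3, 500)

def neg_log_likelihood(params):
    a, sigma = params
    if a <= 0 or sigma <= 0:          # keep the optimizer in the valid region
        return np.inf
    return -exponnorm.logpdf(data, 1 / (a * sigma), scale=sigma).sum()

# ten random starting points spread over a plausible (a, sigma) box
starts = rng.uniform([0.5, 0.05], [5.0, 1.0], size=(10, 2))
fits = [minimize(neg_log_likelihood, s, method="Nelder-Mead") for s in starts]
best = min(fits, key=lambda f: f.fun)
print("a_hat, sigma_hat:", best.x)
# If every run lands on (nearly) the same point, the likelihood most
# likely has a single maximum for this data set.
```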
 
