Maximum Likelihood Estimation

SUMMARY

The discussion centers on Maximum Likelihood Estimation (MLE) in statistics, specifically how to interpret the maximization of the likelihood function. Participants clarify that the maximum likelihood estimator, here $\hat{\theta}=\frac{4}{5}$, gives the parameter value under which the observed data are most probable for the assumed distribution. The example provided illustrates how to derive the likelihood function from observed data and emphasizes its utility in predicting population characteristics based on an assumed distribution.

PREREQUISITES
  • Understanding of Likelihood functions in statistics
  • Familiarity with probability distributions, particularly normal distribution
  • Basic knowledge of statistical parameters such as mean ($\mu$) and standard deviation ($\sigma$)
  • Experience with Maximum Likelihood Estimation techniques
NEXT STEPS
  • Explore the derivation of Likelihood functions for various distributions
  • Learn about the implications of MLE in real-world data analysis
  • Investigate the use of software tools like R or Python for performing MLE
  • Study the differences between MLE and Bayesian estimation methods
USEFUL FOR

Statisticians, data analysts, and researchers interested in parameter estimation and predictive modeling using statistical methods.

mathmari
Hey! :o

I am looking at the likelihood function. I have understood how we define that function, but once we have found the maximum of the likelihood function, what is the meaning of that?

What information do we have once we have found the $\theta$ that maximizes the likelihood function $L$?

(Wondering)
 
mathmari said:
Hey! :o

I am looking at the likelihood function. I have understood how we define that function, but once we have found the maximum of the likelihood function, what is the meaning of that?

What information do we have once we have found the $\theta$ that maximizes the likelihood function $L$?

Hey mathmari! (Smile)

In statistics we typically try to find a probability distribution that describes a population.
It helps us to understand the population, and moreover to make predictions.
Often enough that's a normal distribution with a mean value and a standard deviation.
It's usually fairly straightforward to find an approximation for those particular parameters.

More generally, a population is described by the assumption of a distribution combined with the parameters defining it, like a normal distribution characterized by $\mu$ and $\sigma$. But we can also have different distributions and parameters, and then it may not be so straightforward to find them.
Maximizing the likelihood function means we find the parameter values that make the observed data most likely, given an assumed distribution. (Thinking)
 
I like Serena said:
In statistics we typically try to find a probability distribution that describes a population.
It helps us to understand the population, and moreover to make predictions.
Often enough that's a normal distribution with a mean value and a standard deviation.
It's usually fairly straightforward to find an approximation for those particular parameters.

More generally, a population is described by the assumption of a distribution combined with the parameters defining it, like a normal distribution characterized by $\mu$ and $\sigma$. But we can also have different distributions and parameters, and then it may not be so straightforward to find them.
Maximizing the likelihood function means we find the parameter values that make the observed data most likely, given an assumed distribution. (Thinking)

Let's consider a specific example.

An employee starts work around 8:00 am. The actual beginning of duty varies by up to $2$ minutes either way. We have the following distribution:

$X$ = beginning of duty (difference from 8:00 in minutes):   $-2$   $-1$   $1$   $2$
$\mathbb{P}(X=x)$:   $0.2\,\theta$   $0.3\,\theta$   $0.5\,\theta$   $1-\theta$
For $10$ consecutive working days, the following values have occurred:
\begin{equation*}-1, \ \ \ 2, \ -2, \ -2, \ \ \ 1, \ \ \ 1, \ \ \ 2, \ -1, \ \ \ 1, \ -1\end{equation*}

From these data we get the likelihood function:
\begin{equation*}L(\theta \mid -1, 2, -2, -2, 1, 1, 2, -1, 1, -1 ) = 0.000135\cdot \theta^8\cdot (1-\theta)^2 \end{equation*}
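One way to find the maximizer is via the log-likelihood, which peaks at the same $\theta$ (a short derivation from the function above):
\begin{align*}
\ln L(\theta) &= \ln(0.000135) + 8\ln\theta + 2\ln(1-\theta), \\
\frac{d}{d\theta}\ln L(\theta) &= \frac{8}{\theta} - \frac{2}{1-\theta} = 0
\quad\Longrightarrow\quad 8(1-\theta) = 2\theta
\quad\Longrightarrow\quad \theta = \frac{8}{10} = \frac{4}{5}.
\end{align*}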

The maximum Likelihood estimator is $\hat{\theta}=\frac{4}{5}$.
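As a cross-check, the maximizer can also be found numerically. A minimal Python sketch using a NumPy grid search (the function name `likelihood` and the grid resolution are my own choices, not from the thread):

```python
import numpy as np

# Likelihood of the 10 observations above: the value -2 occurs 2 times,
# -1 occurs 3 times, 1 occurs 3 times, and 2 occurs 2 times.
def likelihood(theta):
    return (0.2 * theta)**2 * (0.3 * theta)**3 * (0.5 * theta)**3 * (1 - theta)**2

# Grid search over theta in (0, 1).
thetas = np.linspace(0.001, 0.999, 9981)
theta_hat = thetas[np.argmax(likelihood(thetas))]
print(theta_hat)  # ≈ 0.8 = 4/5
```

A grid search is crude but transparent; in practice one would maximize the log-likelihood analytically or with an optimizer.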

Do we then calculate each probability $\mathbb{P}(X=x_i)$ with this value of $\theta$, so that we have predictions for what time duty begins? (Wondering)
 
mathmari said:
Do we then calculate each probability $\mathbb{P}(X=x_i)$ with this value of $\theta$, so that we have predictions for what time duty begins? (Wondering)

Yes.
Now we have found the most likely parameters, and with them a fitted distribution, based on an assumed distribution with an unknown parameter.
To be fair, it looks like a somewhat unrealistic distribution. Then again, it's just an example of how maximum likelihood works. (Thinking)
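Concretely, plugging $\hat{\theta}=\frac{4}{5}$ back into the table gives the fitted probabilities. A small Python sketch (the dictionary layout is my own):

```python
# Fitted distribution of X (difference to 8:00 in minutes)
# under the model above, with theta set to its MLE 4/5.
theta_hat = 4 / 5

probs = {
    -2: 0.2 * theta_hat,  # P(X = -2) = 0.16
    -1: 0.3 * theta_hat,  # P(X = -1) = 0.24
     1: 0.5 * theta_hat,  # P(X =  1) = 0.40
     2: 1 - theta_hat,    # P(X =  2) = 0.20
}

# The fitted probabilities sum to 1, as a distribution should.
assert abs(sum(probs.values()) - 1.0) < 1e-12
print(probs)
```

Under this fit, the employee is most likely to start one minute late ($\mathbb{P}(X=1)=0.4$).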
 
