Maximum of the Likelihood Function

Discussion Overview

The discussion revolves around the concept of Maximum Likelihood Estimation (MLE) in statistics, specifically focusing on the interpretation of the maximum of the likelihood function and its implications for understanding a population's distribution. The scope includes theoretical aspects of MLE, practical examples, and predictions based on statistical models.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Mathematical reasoning

Main Points Raised

  • Some participants express curiosity about the meaning of finding the maximum of the likelihood function and what information it conveys about the parameter $\theta$.
  • One participant explains that maximizing the likelihood function helps in finding the most likely approximation of parameters given an assumed distribution, often leading to predictions about a population.
  • A specific example involving the beginning of duty times for employees is presented, illustrating how to construct a likelihood function based on observed data.
  • Another participant confirms that calculating probabilities with the estimated $\theta$ allows for predictions regarding the timing of duties, while also noting that the example may represent a somewhat unrealistic distribution.

Areas of Agreement / Disagreement

Participants generally agree on the process of MLE and its purpose in estimating parameters, but there is no consensus on the realism of the example distribution presented. The discussion remains exploratory without definitive conclusions.

Contextual Notes

The discussion includes assumptions about the underlying distribution and the parameters involved, which may not be explicitly stated. The example provided may not reflect practical scenarios, and the limitations of the model are acknowledged but not resolved.

mathmari
Hey! :o

I am looking at the likelihood function. I have understood how we define that function, but once we have found the maximum of the likelihood function, what is the meaning of that?

What information do we have once we have found the $\theta$ for which the likelihood function $L$ is maximized?

(Wondering)
 
mathmari said:
Hey! :o

I am looking at the likelihood function. I have understood how we define that function, but once we have found the maximum of the likelihood function, what is the meaning of that?

What information do we have once we have found the $\theta$ for which the likelihood function $L$ is maximized?

Hey mathmari! (Smile)

In statistics we typically try to find a probability distribution that describes a population.
It helps us to understand the population, and moreover to make predictions.
Often enough that's a normal distribution with a mean value and a standard deviation.
It's usually fairly straightforward to find an approximation for those particular parameters.

More generally, a population is described by an assumed distribution combined with the parameters that define it, like a normal distribution characterized by $\mu$ and $\sigma$. But we can also have other distributions and other parameters, and then it may not be so straightforward to find them.
Maximizing the likelihood function means we find the most likely values of the parameters given an assumed distribution. (Thinking)
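To make that last point concrete, here is a minimal sketch (the sample data below is made up for illustration, not from the thread): for a normal distribution the likelihood is maximized in closed form by the sample mean and the root mean squared deviation.

```python
import numpy as np

# Hypothetical sample, assumed to come from a normal distribution.
data = np.array([7.9, 8.1, 8.0, 8.2, 7.8, 8.0, 8.1, 7.9])

# For the normal distribution, maximizing the likelihood has a closed form:
# mu_hat is the sample mean, sigma_hat the (biased) root mean squared deviation.
mu_hat = data.mean()
sigma_hat = np.sqrt(((data - mu_hat) ** 2).mean())

print(mu_hat, sigma_hat)
```

For other families there is usually no closed form, and the maximum has to be found by calculus or by numerical optimization.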
 
I like Serena said:
In statistics we typically try to find a probability distribution that describes a population.
It helps us to understand the population, and moreover to make predictions.
Often enough that's a normal distribution with a mean value and a standard deviation.
It's usually fairly straightforward to find an approximation for those particular parameters.

More generally, a population is described by an assumed distribution combined with the parameters that define it, like a normal distribution characterized by $\mu$ and $\sigma$. But we can also have other distributions and other parameters, and then it may not be so straightforward to find them.
Maximizing the likelihood function means we find the most likely values of the parameters given an assumed distribution. (Thinking)

Let's consider a specific example.

An employee starts work around 8:00 am. The actual beginning of duty varies by up to $2$ minutes in either direction. We have the following:

| $X$ = Beginning of duty (difference to 8 o'clock in minutes) | $-2$ | $-1$ | $1$ | $2$ |
| --- | --- | --- | --- | --- |
| $\mathbb{P}(X=x)$ | $0.2\,\theta$ | $0.3\,\theta$ | $0.5\,\theta$ | $1-\theta$ |
For $10$ consecutive working days, the following values have occurred:
\begin{equation*}-1, \ \ \ 2, \ -2, \ -2, \ \ \ 1, \ \ \ 1, \ \ \ 2, \ -1, \ \ \ 1, \ -1\end{equation*}

From this information we get the likelihood function:
\begin{equation*}L(-1, 2, -2, -2, 1, 1, 2, -1, 1, -1 \mid \theta ) = 0.000135\cdot \theta^8\cdot (1-\theta)^2 \end{equation*}

The maximum Likelihood estimator is $\hat{\theta}=\frac{4}{5}$.
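For completeness (a step not spelled out in the post), the estimator follows by setting the derivative of the log-likelihood to zero:
\begin{align*}
\ell(\theta) &= \ln L = \ln(0.000135) + 8\ln\theta + 2\ln(1-\theta), \\
\ell'(\theta) &= \frac{8}{\theta} - \frac{2}{1-\theta} = 0 \quad\Longrightarrow\quad 8(1-\theta) = 2\theta \quad\Longrightarrow\quad \hat{\theta} = \frac{8}{10} = \frac{4}{5}.
\end{align*}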

Do we calculate each probability $\mathbb{P}(X=x_i)$ with this value of $\theta$, and then we have the predictions for what time the duty begins? (Wondering)
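The likelihood function and its maximizer above can be checked numerically; a minimal sketch (the data and the model are from the example, the grid search is just one way to do it):

```python
import numpy as np

# The 10 observed differences to 8:00 (from the example).
obs = [-1, 2, -2, -2, 1, 1, 2, -1, 1, -1]

# Assumed model from the table: each probability depends on the single parameter theta.
def likelihood(theta):
    p = {-2: 0.2 * theta, -1: 0.3 * theta, 1: 0.5 * theta, 2: 1 - theta}
    value = 1.0
    for x in obs:
        value *= p[x]
    return value

# Grid search over admissible values of theta; the peak sits at theta = 4/5.
grid = np.linspace(0.001, 0.999, 999)
theta_hat = grid[np.argmax([likelihood(t) for t in grid])]
print(theta_hat)
```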
 
mathmari said:
Do we calculate each probability $\mathbb{P}(X=x_i)$ with this value of $\theta$, and then we have the predictions for what time the duty begins? (Wondering)

Yes.
Now we have found the most likely distribution within the assumed family with an unknown parameter.
To be fair, it looks like a somewhat unrealistic distribution. Then again, it's just an example of how maximum likelihood works. (Thinking)
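The prediction step described above can be sketched in a few lines: plugging $\hat{\theta} = 4/5$ into the assumed model gives the fitted probability for each possible beginning of duty (the variable names are mine).

```python
theta_hat = 4 / 5  # the maximum likelihood estimate from the example

# Plugging the estimate into the assumed model gives the fitted distribution,
# i.e. the predicted probabilities for each beginning of duty:
# approximately 0.16, 0.24, 0.40, 0.20 for x = -2, -1, 1, 2.
fitted = {-2: 0.2 * theta_hat, -1: 0.3 * theta_hat, 1: 0.5 * theta_hat, 2: 1 - theta_hat}
for x, p in sorted(fitted.items()):
    print(x, round(p, 2))
```

Note that the fitted probabilities sum to $1$, as they must for any value of $\theta$ in this model.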
 
