Help with a statistical inference question regarding MLEs

  • Context: MHB
  • Thread starter: cmk1300
  • Tags: Statistical
SUMMARY

The discussion focuses on deriving maximum likelihood estimators (MLEs) for parameters μ, σ, and τ from two independent normal distributions, N(μ, σ²) and N(μ, τ²), where σ² ≠ τ². Participants emphasize the importance of constructing the likelihood functions L(μ, σ²; x₁, ..., xₙ) and L(μ, τ²; y₁, ..., yₘ) and taking their logarithm to simplify the differentiation process. A practical example using the exponential distribution is provided to illustrate the steps involved in finding MLEs, including taking derivatives and setting them to zero to find parameter estimates.

PREREQUISITES
  • Understanding of maximum likelihood estimation (MLE)
  • Familiarity with probability density functions (PDFs) of normal and exponential distributions
  • Knowledge of calculus, particularly differentiation
  • Basic statistics concepts, including independent random samples
NEXT STEPS
  • Study the derivation of likelihood functions for different distributions
  • Learn about the properties of maximum likelihood estimators
  • Explore the use of log-likelihood functions in statistical inference
  • Investigate the implications of parameter estimation in real-world scenarios
USEFUL FOR

Students preparing for statistics exams, data scientists, and statisticians interested in understanding maximum likelihood estimation and its applications in statistical modeling.

cmk1300
Hello, I am wondering if anyone would be able to guide me through a practice question for an upcoming test. I am not feeling very confident at this point and would very much appreciate some help!

The question is as follows:

Let $X_1, X_2, \ldots, X_n$ and $Y_1, Y_2, \ldots, Y_m$ be two independent random samples from $N(\mu, \sigma^2)$ and $N(\mu, \tau^2)$ respectively, where the parameters $\mu$, $\sigma$, $\tau$ are unknown with $-\infty < \mu < \infty$, $\sigma > 0$ and $\tau > 0$. Assume that $\sigma^2 \neq \tau^2$ and both are unknown. Find the maximum likelihood estimators of $\mu$, $\sigma$ and $\tau$. (Hint: the likelihood function is the product of $L(\mu, \sigma^2; x_1, x_2, \ldots, x_n)$ and $L(\mu, \tau^2; y_1, y_2, \ldots, y_m)$.)

Thank you in advance!
 
As instructed by the hint, you need the likelihood functions $\mathcal{L}(\mu, \sigma^2; x_1,\ldots,x_n)$ and $\mathcal{L}(\mu, \tau^2; y_1,\ldots,y_m)$. Can you derive those functions? For a normal distribution $\mathcal{N}(\mu,\sigma^2)$ the pdf is given by
$$f(x) = \frac{1}{\sigma \sqrt{2\pi}} \exp\left\{ -\frac{1}{2} \left(\frac{x-\mu}{\sigma}\right)^2 \right\}$$
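To make that hint concrete, here is a sketch of the joint likelihood for the two independent samples (simply the product the hint describes) and its logarithm, which is what gets differentiated:

$$\mathcal{L}(\mu,\sigma^2,\tau^2) = \prod_{i=1}^{n}\frac{1}{\sigma\sqrt{2\pi}}\exp\left\{-\frac{(x_i-\mu)^2}{2\sigma^2}\right\} \;\cdot\; \prod_{j=1}^{m}\frac{1}{\tau\sqrt{2\pi}}\exp\left\{-\frac{(y_j-\mu)^2}{2\tau^2}\right\}$$

$$\ell(\mu,\sigma^2,\tau^2) = -\frac{n}{2}\ln(2\pi\sigma^2) - \frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i-\mu)^2 - \frac{m}{2}\ln(2\pi\tau^2) - \frac{1}{2\tau^2}\sum_{j=1}^{m}(y_j-\mu)^2$$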
 
The derivation steps are the parts I struggle with the most... My professor does not exactly take the time to explain them either. Could you possibly guide me through this as well, please?
 
Maybe it would be better to see how a simpler example works.
Try the exponential distribution, whose pdf is $f(x) = \frac{1}{a}e^{-x/a}$ for $x > 0$.
Say you have two observations, $x_1 = 1$ and $x_2 = 5$.
MLE is about estimating the parameter $a$ under the assumption that the observations you saw actually happened.
What's the probability (density) of those observations occurring together? It is $f(x_1)\,f(x_2)$, which is called the likelihood function $L$:

$$L(a) = \left[\tfrac{1}{a}e^{-1/a}\right]\left[\tfrac{1}{a}e^{-5/a}\right] = \frac{1}{a^2}\,e^{-(1+5)/a}$$

The next step is taking the log of the expression.
We do this because sums and differences are much easier to differentiate than products and quotients, and because the logarithm is increasing, so maximizing $\ell$ gives the same $a$ as maximizing $L$:

$$\ell(a) = \ln\!\left[\frac{1}{a^2}\,e^{-6/a}\right] = -2\ln a - \frac{6}{a}$$

Then we take the derivative of $\ell(a)$ with respect to $a$ (the parameter we are estimating) and set it equal to zero, which locates the maximum of the function, just as in first-year calculus:

$$\ell'(a) = -\frac{2}{a} + \frac{6}{a^2} = 0 \quad\Longrightarrow\quad \hat{a} = 3$$

Try to see how $L(a)$ would change if there were 3 observations, or 10, or $n$; a sketch of the general case follows below.
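For reference, here is a sketch of where that exercise leads with a general sample $x_1,\ldots,x_n$ from the same exponential model:

$$\ell(a) = \ln\prod_{i=1}^{n}\frac{1}{a}e^{-x_i/a} = -n\ln a - \frac{1}{a}\sum_{i=1}^{n}x_i, \qquad \ell'(a) = -\frac{n}{a} + \frac{1}{a^2}\sum_{i=1}^{n}x_i = 0 \;\Longrightarrow\; \hat{a} = \bar{x}$$

With the two observations above, $\bar{x} = (1+5)/2 = 3$, matching the value found directly.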

Siron gave the pdf of a normal.
First figure out what $L$ looks like for the $n$ observations $x_1,\ldots,x_n$ from $N(\mu,\sigma^2)$.
Then multiply that by the likelihood of the $m$ observations $y_1,\ldots,y_m$ from $N(\mu,\tau^2)$, which shares the mean $\mu$ but has a different variance parameter.

You'll estimate each of $\mu$, $\sigma$ and $\tau$, so once you have the log-likelihood function, take the partial derivative with respect to each of these parameters and set it to zero.
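If you want to sanity-check whatever estimators you derive by hand, one option is a quick numerical maximization of the joint log-likelihood. This is only a sketch, assuming numpy and scipy are available; the made-up data, the optimizer, and the log-parameterization are choices of convenience, not part of the original problem:

Code:
import numpy as np
from scipy.optimize import minimize

# Made-up example data; replace with your own two samples.
rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.0, size=50)  # sample from N(mu, sigma^2)
y = rng.normal(loc=2.0, scale=3.0, size=40)  # sample from N(mu, tau^2)

def neg_log_likelihood(params):
    """Negative joint log-likelihood in (mu, log sigma, log tau).

    Working with log sigma and log tau keeps both scale parameters positive.
    """
    mu, log_sigma, log_tau = params
    sigma, tau = np.exp(log_sigma), np.exp(log_tau)
    ll_x = -0.5 * x.size * np.log(2 * np.pi * sigma**2) - np.sum((x - mu) ** 2) / (2 * sigma**2)
    ll_y = -0.5 * y.size * np.log(2 * np.pi * tau**2) - np.sum((y - mu) ** 2) / (2 * tau**2)
    return -(ll_x + ll_y)

result = minimize(neg_log_likelihood, x0=np.zeros(3))
mu_hat = result.x[0]
sigma_hat, tau_hat = np.exp(result.x[1]), np.exp(result.x[2])
print(mu_hat, sigma_hat, tau_hat)  # compare with the hand-derived estimates

Comparing against the stationarity equations you derive by hand is a useful check here, because the equations for $\mu$, $\sigma^2$ and $\tau^2$ turn out to be coupled: the estimate of $\mu$ weights each sample by its variance estimate, which in turn depends on $\hat{\mu}$.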

Hope this helps
 
