Maximizing Likelihood Estimator of β

  • Context: MHB
  • Thread starter: jmorgan
  • Tags: Likelihood
SUMMARY

The discussion focuses on deriving the maximum likelihood estimator (MLE) of the parameter β in a given probability distribution, assuming α is known. For observations x₁, …, xₙ, the likelihood function is the product L(β) = Π [1/(α! β^(α+1))] xᵢ^α e^(-xᵢ/β). The standard approach takes the logarithm of the likelihood, giving log L = -n log(α!) - n(α+1) log(β) + α Σ log(xᵢ) - Σ xᵢ/β. Differentiating with respect to β and setting the derivative to zero then yields the MLE.

PREREQUISITES
  • Understanding of maximum likelihood estimation (MLE)
  • Familiarity with probability distributions and likelihood functions
  • Knowledge of calculus, particularly differentiation
  • Basic statistics concepts, including parameters and estimators
NEXT STEPS
  • Study the derivation of maximum likelihood estimators for different distributions
  • Learn about the properties of likelihood functions and their applications
  • Explore the use of logarithmic transformations in statistical estimation
  • Investigate the implications of the MLE in real-world data analysis scenarios
USEFUL FOR

Statisticians, data scientists, and researchers involved in statistical modeling and estimation techniques will benefit from this discussion.

jmorgan
Assuming α is known, find the maximum likelihood estimator of β

$$f(x;\alpha,\beta) = \frac{1}{\alpha!\,\beta^{\alpha+1}}\, x^{\alpha} e^{-x/\beta}$$
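As a quick sanity check on the density (an editorial sketch, not part of the original post, using illustrative values α = 3, β = 2), note that f(x; α, β) is exactly the Gamma distribution with shape α+1 and scale β, so it integrates to 1 and matches `scipy.stats.gamma`:

```python
import math
from scipy.integrate import quad
from scipy.stats import gamma

alpha, beta = 3, 2.0  # illustrative values, not from the thread

def f(x):
    # f(x; alpha, beta) = x^alpha * exp(-x/beta) / (alpha! * beta^(alpha+1))
    return x**alpha * math.exp(-x / beta) / (math.factorial(alpha) * beta**(alpha + 1))

total, _ = quad(f, 0, math.inf)  # should integrate to 1 over (0, inf)
# The density coincides with Gamma(shape=alpha+1, scale=beta) at any point:
diff = f(1.7) - gamma.pdf(1.7, a=alpha + 1, scale=beta)
print(total, diff)
```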

I know that first you must form the likelihood L(β), but I am unsure whether I have done it correctly. My answer is below; please can someone tell me where/if I have gone wrong.

$$L(\beta)= (\alpha!\,\beta^{\alpha+1})^{-n}\cdot \sum x_i^{\alpha}\cdot e^{\sum x_i/\beta^{n}}$$
 
I don't understand your question. The "maximum Likelihood" estimator for a parameter is the value of the parameter that makes a given outcome most likely. But you have not given an "outcome" here.
 
I think that you're going in the right direction. However, your calculation is not entirely correct. Suppose that we are given observations $x_1,\ldots,x_n$ from the given distribution. The likelihood is then given by
$$\mathcal{L}(x_1,\ldots,x_n,\alpha,\beta) = \prod_{i=1}^{n} \frac{1}{\alpha ! \beta^{\alpha+1}} x_i^{\alpha}e^{-x_i/\beta}.$$
We wish to find the value of $\beta$ that maximizes the likelihood. Since it is quite common to work with the logarithm, let us first take the log of both sides:
$$\log \mathcal{L}(x_1,\ldots,x_n,\alpha,\beta) = -n \log(\alpha!) - n (\alpha+1) \log(\beta)+ \alpha \sum_{i=1}^{n} \log(x_i) - \frac{\sum_{i=1}^{n} x_i}{\beta}.$$
Taking the derivative with respect to $\beta$, we obtain
$$\frac{\partial \log \mathcal{L}(x_1,\ldots,x_n,\alpha,\beta)}{\partial\beta} = -n(\alpha+1)\frac{1}{\beta} + \frac{1}{\beta^2} \sum_{i=1}^{n} x_i.$$
To proceed, set this derivative equal to $0$ and solve for $\beta$. The solution is the required MLE.
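Solving the corrected first-order condition gives the closed form $\hat{\beta} = \sum x_i / (n(\alpha+1)) = \bar{x}/(\alpha+1)$. A short numerical sketch (editorial addition; the sample parameters and the use of `scipy.optimize` are illustrative assumptions) simulates data from the distribution, which is Gamma(α+1, scale β), and checks the closed form against a direct minimization of the negative log-likelihood:

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.special import gammaln

rng = np.random.default_rng(0)
alpha, beta_true, n = 3, 2.0, 10_000  # illustrative values

# f(x; alpha, beta) is the Gamma(shape=alpha+1, scale=beta) density.
x = rng.gamma(shape=alpha + 1, scale=beta_true, size=n)

# Closed-form MLE from setting the derivative of log L to zero:
# beta_hat = sum(x_i) / (n * (alpha + 1)) = mean(x) / (alpha + 1)
beta_hat = x.mean() / (alpha + 1)

# Cross-check by numerically minimizing the negative log-likelihood
# (gammaln(alpha + 1) = log(alpha!) for integer alpha).
def neg_log_lik(beta):
    return (n * gammaln(alpha + 1) + n * (alpha + 1) * np.log(beta)
            - alpha * np.log(x).sum() + x.sum() / beta)

res = minimize_scalar(neg_log_lik, bounds=(0.1, 10.0), method="bounded")
print(beta_hat, res.x)  # both should be close to beta_true
```

The numerical minimizer and the closed-form estimate agree, and both are close to the true β used to simulate the data.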
 
