Likelihood function of the gamma distribution


Discussion Overview

The discussion revolves around formulating the likelihood function for a gamma distribution based on a random sample of size n, with a known parameter r. Participants explore the derivation of the likelihood function, its logarithmic transformation, and the subsequent steps to find the maximum likelihood estimator (MLE) for the parameter λ.

Discussion Character

  • Technical explanation
  • Mathematical reasoning
  • Debate/contested

Main Points Raised

  • One participant presents the density function of the gamma distribution and attempts to formulate the likelihood function as a product of individual density functions, expressing uncertainty about handling the product of terms.
  • Another participant suggests taking the logarithm of the likelihood function, noting that maximizing the log likelihood is equivalent to maximizing the likelihood itself.
  • There is a correction regarding the formulation of the likelihood function, emphasizing that it should be a product over the sample size rather than raised to the power of n.
  • A participant attempts to derive the MLE for λ and expresses uncertainty about whether their result is correct, specifically questioning the assumption that the sample average equals the distribution mean.
  • Another participant cautions against assuming the sample average equals the distribution mean, highlighting that the sample average is a random variable.
  • Further clarification is provided on the MLE derivation, with a suggestion to stop at a specific form of the estimator without confirming its correctness.

Areas of Agreement / Disagreement

Participants generally agree on the need to derive the likelihood function and its logarithmic form but express differing views on the assumptions regarding the sample average and its relationship to the distribution mean. The discussion remains unresolved regarding the correctness of the final MLE expression.

Contextual Notes

There are limitations regarding assumptions made about the sample average and its relation to the distribution mean, as well as the dependence on the specific formulation of the likelihood function.

safina
There is a random sample of size n from a gamma distribution, with known r. Please help me formulate the likelihood function of the gamma distribution.

I understand that the density function is the following:
f\left(y;r,\lambda\right)=\frac{\lambda}{\Gamma\left(r\right)}\left(\lambda y\right)^{r-1}e^{-\lambda y}

I also understand that the likelihood function is the product of the individual density functions.
Assuming independence, I write it as:
L\left(\underline{y};r, \lambda\right)=\left[f\left(y;r,\lambda\right)\right]^{n}
=\left[\frac{\lambda^{r}y^{r-1}e^{-\lambda y}}{\Gamma\left(r\right)}\right]^{n}

I am now stuck with the product of the y^{r-1} and \Gamma\left(r\right).

Please help me figure out what to do, since I need this form to find the maximum likelihood estimator of \lambda.
 
Take the log of both sides. The log function is monotonic so "\lambda maximizes log L" iff "\lambda maximizes L."
 
EnumaElish said:
Take the log of both sides. The log function is monotonic so "\lambda maximizes log L" iff "\lambda maximizes L."

Okay, thank you for that. Can you help me further with the exact form of the likelihood function, so that I can then take the log of both sides?
 
safina said:
I also understand that the likelihood function is the product of the individual density functions.
Assuming independence, I write it as:
L\left(\underline{y};r, \lambda\right)=\left[f\left(y;r,\lambda\right)\right]^{n}

Not quite - the likelihood function is
L\left(\underline{y};r, \lambda\right)=\prod_{i=1}^n f\left(y_i;r,\lambda\right)
since it's for a sample of size n. After taking the log and differentiating with respect to \lambda you'll find that terms like \Gamma(r) disappear.
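
Written out for a sample y_1, \ldots, y_n using the density above (a sketch, not part of the original exchange):

L\left(\underline{y};r,\lambda\right)=\prod_{i=1}^n \frac{\lambda^{r}y_i^{r-1}e^{-\lambda y_i}}{\Gamma\left(r\right)}=\frac{\lambda^{nr}}{\Gamma\left(r\right)^{n}}\left(\prod_{i=1}^n y_i\right)^{r-1}e^{-\lambda\sum_{i=1}^n y_i}

\log L\left(\underline{y};r,\lambda\right)=nr\log\lambda-n\log\Gamma\left(r\right)+\left(r-1\right)\sum_{i=1}^n\log y_i-\lambda\sum_{i=1}^n y_i

\frac{d}{d\lambda}\log L\left(\underline{y};r,\lambda\right)=\frac{nr}{\lambda}-\sum_{i=1}^n y_i

The -n\log\Gamma\left(r\right) and \left(r-1\right)\sum\log y_i terms do not involve \lambda, so they drop out when differentiating.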
 
bpet said:
Not quite - the likelihood function is
L\left(\underline{y};r, \lambda\right)=\prod_{i=1}^n f\left(y_i;r,\lambda\right)
since it's for a sample of size n. After taking the log and differentiating with respect to \lambda you'll find that terms like \Gamma(r) disappear.

Alright, thank you for all your replies. I've tried figuring them out. Here are the outcomes. Kindly check if these are right.

\frac{d}{d\lambda} log L\left(\underline{y}; r, \lambda\right) = \frac{nr}{\lambda} - \sum y
Setting the derivative above equal to zero gives:
\frac{nr}{\hat{\lambda}}= \sum y
Solving for \hat{\lambda}, I replaced \sum y with n\bar{y} and ended up with the equation \hat{\lambda} = \lambda.

Is this the result I am supposed to have?

If this really is it, is this MLE unbiased?
 
You may not assume sample average = distribution mean. The sample average is just a random variable, like y itself; it does not have a constant value.
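
A small numerical illustration of that point (a sketch with illustrative, assumed values of r, \lambda, and n, not taken from the thread): repeated samples of the same size give different sample means, while the distribution mean r/\lambda stays fixed.

```python
import numpy as np

# Assumed illustrative values (not from the thread): shape r = 3, rate lambda = 2.
r, lam, n = 3.0, 2.0, 50
rng = np.random.default_rng(0)

dist_mean = r / lam  # the distribution mean is a fixed constant
for _ in range(3):
    y = rng.gamma(shape=r, scale=1.0 / lam, size=n)  # one sample of size n
    print(f"sample mean = {y.mean():.3f}, distribution mean = {dist_mean:.3f}")
# The sample mean varies from sample to sample, so it cannot be set
# equal to r/lambda when solving for the MLE.
```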
 
EnumaElish said:
You may not assume sample average = distribution mean. The sample average is just a random variable, like y itself; it does not have a constant value.

Oh, here's what I've done.
\frac{nr}{\hat{\lambda}}= \sum y
Solving for \hat{\lambda}:
\hat{\lambda} = \frac{rn}{\sum y} = \frac{rn}{n \bar{y}} = \frac{rn}{n \frac{r}{\lambda}} = \lambda

Is this not right?
 
I haven't checked your math, but assuming you haven't made a mistake, you should stop at \hat{\lambda} = \frac{rn}{\sum y}. That's your MLE of \lambda.
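
As a quick check of that estimator (a sketch with assumed illustrative values for r, \lambda, and n), simulating a gamma sample with known shape r and computing \hat{\lambda} = rn/\sum y recovers something close to the true rate:

```python
import numpy as np

# Assumed illustrative values: shape r = 3, true rate lambda = 2, sample size n = 10_000.
r, lam, n = 3.0, 2.0, 10_000
rng = np.random.default_rng(1)

y = rng.gamma(shape=r, scale=1.0 / lam, size=n)  # gamma sample with known shape r
lam_hat = r * n / y.sum()                        # MLE: lambda_hat = r * n / sum(y)
print(f"lambda_hat = {lam_hat:.4f} (true lambda = {lam})")
```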
 
