MHB Maximum Likelihood Estimator of β

AI Thread Summary
To find the maximum likelihood estimator (MLE) of β, start with the likelihood function L(β) based on the given distribution. The correct likelihood expression incorporates the observations and is given by the product of individual probabilities. Taking the logarithm of the likelihood simplifies the calculations, leading to the log-likelihood function. By differentiating this log-likelihood with respect to β and setting the derivative equal to zero, you can solve for β to find the MLE. This process ensures that the value of β maximizes the likelihood of observing the given data.
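As a quick illustration of the recipe outlined in the summary (form the likelihood, take logs, differentiate, set to zero), here is a minimal numerical sketch in Python. The sample values, the choice of α, and the use of SciPy's optimizer are assumptions made purely for illustration; they are not part of the original problem.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.special import gammaln

# Illustrative assumptions: alpha is known, x is a small made-up sample.
alpha = 2
x = np.array([1.3, 0.7, 2.1, 1.8, 0.9])

def neg_log_likelihood(beta):
    # -log L(beta) for f(x; alpha, beta) = x^alpha * exp(-x/beta) / (alpha! * beta^(alpha+1)),
    # using gammaln(alpha + 1) = log(alpha!).
    return -np.sum(alpha * np.log(x) - x / beta
                   - gammaln(alpha + 1) - (alpha + 1) * np.log(beta))

# Maximize the likelihood by minimizing its negative log over beta > 0.
result = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 100.0), method="bounded")

# Closed-form answer obtained by setting the derivative of log L to zero (see the replies below).
beta_closed_form = x.sum() / (len(x) * (alpha + 1))

print("numerical MLE of beta:   ", result.x)
print("closed-form MLE of beta: ", beta_closed_form)
```

The two printed values should agree up to the optimizer's tolerance.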
jmorgan
Assuming α is known, find the maximum likelihood estimator of β

$$f(x;\alpha,\beta) = \frac{1}{\alpha!\,\beta^{\alpha+1}}\, x^{\alpha} e^{-x/\beta}$$

I know that first you must write down the likelihood L(β), but I'm unsure whether I have done it correctly. I came out with the answer below; please can someone tell me where/if I have gone wrong.

$$L(\beta)= \left(\alpha!\,\beta^{\alpha+1}\right)^{-n}\cdot \Sigma x_i^{\alpha}\cdot e^{\Sigma x_i/\beta^{n}}$$
 
I don't understand your question. The "maximum likelihood" estimator for a parameter is the value of the parameter that makes a given outcome most likely. But you have not given an "outcome" here.
 
I think that you're going in the right direction. However, your calculation is not entirely correct. Suppose that we are given observations $x_1,\ldots,x_n$ from the given distribution. The likelihood is then given by
$$\mathcal{L}(x_1,\ldots,x_n,\alpha,\beta) = \prod_{i=1}^{n} \frac{1}{\alpha ! \beta^{\alpha+1}} x_i^{\alpha}e^{-x_i/\beta}.$$
We wish to find the value of $\beta$ that maximizes the likelihood. Since it is quite common to work with the logarithm, let us first take the log of both sides:
$$\log \mathcal{L}(x_1,\ldots,x_n,\alpha,\beta) = -n \log(\alpha!) - n (\alpha+1) \log(\beta)+ \alpha \sum_{i=1}^{n} \log(x_i) - \frac{\sum_{i=1}^{n} x_i}{\beta}.$$
Taking the derivative w.r.t $\beta$, we obtain
$$\frac{\partial \log \mathcal{L}(x_1,\ldots,x_n,\alpha,\beta)}{\partial\beta} = -\frac{n(\alpha+1)}{\beta} + \frac{1}{\beta^2} \sum_{i=1}^{n} x_i.$$
To proceed, set the RHS equal to $0$ and solve for $\beta$; the resulting value is the required MLE.
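For completeness, here is that last step spelled out (just the algebra the reply above leaves to the reader): setting the derivative to zero and multiplying through by $\beta^2$ gives
$$-n(\alpha+1)\beta + \sum_{i=1}^{n} x_i = 0 \quad\Longrightarrow\quad \hat{\beta} = \frac{1}{n(\alpha+1)}\sum_{i=1}^{n} x_i = \frac{\bar{x}}{\alpha+1}.$$
A quick second-derivative check (or the observation that the log-likelihood tends to $-\infty$ as $\beta \to 0^+$ or $\beta \to \infty$) confirms that this critical point is indeed a maximum.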
 