# Maximum likelihood estimator (MLE)

1. Feb 16, 2005

### semidevil

so by definition, the likelihood function $$L(\theta)$$ is the product of the pdf $$f_W(w;\theta)$$ evaluated at the n data points.

but I don't know how they do those calculations...

so for example:

$$f_Y(y;\theta) = \frac{1}{\theta^2}\, y e^{-y/\theta}$$

$$L(\theta) = \theta^{-2n}\left(\prod y_i\right) e^{-\frac{1}{\theta}\sum y_i}$$

so first of all, I'm looking at this but I don't know how they went from this to that... I looked at another problem:

how did they go from $$e^{-(y-\theta)}$$ to $$\prod e^{-(y_i - \theta)}$$?

I don't see a pattern... I compared it with the definition, but I just don't get it.
I mean, when they did the L(theta), it seems that they added some n's and i's somewhere, and I don't know where these came from.

Last edited: Feb 16, 2005
2. Feb 16, 2005

### juvenal

It looks like there are some typos in your expression.

You are trying to estimate $$\theta$$ using n data points, labelled $$y_i$$. The likelihood for $$\theta$$ given a single data point $$y_i$$ is:

$$\frac{1}{\theta^2}\, y_i e^{-y_i/\theta}$$

In order to get the likelihood of $$\theta$$ for all n data points, you multiply the single likelihoods together. And that's just a simple matter of multiplying the terms in front and summing up what's in the exponent.

3. Feb 16, 2005

### hypermorphism

The indices are a shorthand for indeterminately long products. For example, given your definition of L, we evaluate the pdf at n values of y, {y1, y2, ..., yn} and take the product. So if
$$f(y) = \frac{1}{\theta^2} ye^{-\frac{y}{\theta}}$$
we get
$$f(y_1)f(y_2)\cdots f(y_n) = \prod_{i=1}^n f(y_i)$$

$$= \prod_{i=1}^n \frac{1}{\theta^2} y_i e^{-\frac{y_i}{\theta}}$$

$$= \frac{1}{\theta^{2n}} \prod_{i=1}^n y_i e^{-\frac{y_i}{\theta}}$$

$$= \frac{1}{\theta^{2n}} \left(y_1 e^{-\frac{y_1}{\theta}}...y_n e^{-\frac{y_n}{\theta}}\right)$$

$$= \frac{1}{\theta^{2n}} \left(y_1...y_n e^{-\frac{y_1+y_2+...+y_n}{\theta}}\right)$$

$$= \frac{1}{\theta^{2n}} \left(\prod_{i=1}^n y_i\right) \left(e^{-\frac{1}{\theta}\sum_{i=1}^n y_i}\right)$$
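The chain of equalities above can also be sanity-checked numerically. A minimal Python sketch (the data `ys` and the value of `theta` are made up for illustration, not from this thread):

```python
import math

# Hypothetical sample and trial parameter value (arbitrary illustration data).
ys = [1.3, 0.7, 2.4, 1.1, 0.9]
theta = 1.5
n = len(ys)

# The pdf from the thread: f(y; theta) = (1/theta^2) * y * exp(-y/theta)
def f(y, theta):
    return (1 / theta**2) * y * math.exp(-y / theta)

# Brute-force product of the n single likelihoods.
brute = math.prod(f(y, theta) for y in ys)

# Factored closed form: theta^(-2n) * (prod y_i) * exp(-(sum y_i)/theta)
closed = theta**(-2 * n) * math.prod(ys) * math.exp(-sum(ys) / theta)

print(brute, closed)  # the two agree up to floating-point rounding
```

The same check works for any positive data and any positive theta, since the rearrangement uses only commutativity and the rule $$e^a e^b = e^{a+b}$$.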

4. Feb 19, 2005

### semidevil

ok, thanks, now that makes a little more sense, but I'll think about it some more... still a bit confusing.

what about getting ln L(theta)? the book does some weird stuff and I don't know what it did.

ln L(theta) = $$-2n \ln\theta + \ln\prod y_i - \frac{1}{\theta}\sum y_i$$

how did that happen?

and also, I'm not understanding where they put the n's. Maybe I'm having trouble with the definition. Like, in the first problem, why did it become $$\theta^{-2n}$$?

and for this problem,

we have fy(y, theta) = $$e^{-(y-\theta)},$$ so L(theta) = $$e^{-\sum y_i + n\theta}$$
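For the record, that product can be worked out with the same index bookkeeping as in post 3 (a quick derivation, assuming each $$y_i$$ lies in the pdf's support; note the minus sign carried through on the sum):

$$L(\theta) = \prod_{i=1}^n e^{-(y_i-\theta)} = e^{-\sum_{i=1}^n (y_i-\theta)} = e^{-\sum_{i=1}^n y_i + n\theta}$$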

Last edited: Feb 19, 2005
5. Feb 19, 2005

### hypermorphism

Because $$\theta^{-2}$$ multiplied by itself n times is $$\theta^{-2n}$$. They just took it out of the n-product by associativity.

What happens when you multiply $$e^a$$ with $$e^b$$? That's all that's going on here. :)
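Putting posts 3 and 5 together, a short Python sketch (again with made-up data) can check both steps at once: that $$\ln L(\theta) = -2n\ln\theta + \ln\prod y_i - \frac{1}{\theta}\sum y_i$$ agrees with taking the log of the product directly, and that the stationary point $$\hat\theta = \frac{1}{2n}\sum y_i$$ (from setting $$\frac{d}{d\theta}\ln L = -\frac{2n}{\theta} + \frac{\sum y_i}{\theta^2} = 0$$) really does beat nearby values of theta:

```python
import math

ys = [1.3, 0.7, 2.4, 1.1, 0.9]  # made-up sample for illustration
n = len(ys)

def log_lik(theta):
    # ln L(theta) = -2n ln(theta) + ln(prod y_i) - (1/theta) sum y_i
    return (-2 * n * math.log(theta)
            + math.log(math.prod(ys))
            - sum(ys) / theta)

def log_lik_direct(theta):
    # ln of the product of pdfs, taken term by term
    return sum(math.log((1 / theta**2) * y * math.exp(-y / theta)) for y in ys)

# The closed-form log-likelihood matches the direct computation:
assert abs(log_lik(1.5) - log_lik_direct(1.5)) < 1e-9

# The stationary point theta_hat = (sum y_i) / (2n) maximizes ln L,
# so it gives a larger value than nearby thetas:
theta_hat = sum(ys) / (2 * n)
eps = 1e-3
assert log_lik(theta_hat) > log_lik(theta_hat + eps)
assert log_lik(theta_hat) > log_lik(theta_hat - eps)
print(theta_hat)
```

This is just a numerical illustration of the algebra in the thread, not a substitute for checking the second derivative (which is $$\frac{2n}{\theta^2} - \frac{2\sum y_i}{\theta^3} < 0$$ at $$\hat\theta$$, confirming a maximum).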