Maximum likelihood of a statistical model

In summary, the problem is to find the maximum likelihood estimators for μ and φ in a model where the observations are ##Y_i = \mu + (1 + \phi x_i) + \epsilon_i##, with ##-1 < \phi < 1## and ##-1 < x_i < 1##. The x's are known numbers and the ε's are independent and normally distributed with mean 0 and variance 1. The likelihood function cannot simply be the product of the ##Y_i## variables; it must be built from the normal densities of the error terms, which is why a later reply advises: "Solve the equation ##Y_i = \mu + (1 + \phi x_i) + \alpha_i## for ##\alpha_i## to express that likelihood in terms of ##Y_i##."
  • #1
the_dane

Homework Statement


I look at the random vector ##(Y_1, Y_2, \ldots, Y_n)##
where
##Y_i = \mu + (1 + \phi x_i) + \epsilon_i##, with ##-1 < \phi < 1## and ##-1 < x_i < 1##. The x's are known numbers, and the ε's are independent and normally distributed with mean 0 and variance 1.

I need to find the maximum likelihood estimators for μ and φ.

Homework Equations

The Attempt at a Solution


I get to the log-likelihood function: ##L(Y_1, Y_2, \ldots, Y_n; \mu, \phi) = \sum \mu + (1 + \phi x_i) + \epsilon_i##.
When I differentiate I get:
##dL/d\mu = \sum 1/(\mu + (1 + \phi x_i) + \epsilon_i)##, ##dL/d\phi = \sum x_i/(\mu + (1 + \phi x_i) + \epsilon_i)##. Right?
These equations don't allow me to set ##dL/d\mu = 0## and ##dL/d\phi = 0## because you can't divide by zero. What am I doing wrong?
 
  • #2
the_dane said:
I get to the log-likelihood function: ##L(Y_1, Y_2, \ldots, Y_n; \mu, \phi) = \sum \mu + (1 + \phi x_i) + \epsilon_i##.

Before you get to the log-likelihood function, what function are you using for the likelihood function?
If the ##Y_i## represent the sample data, the expression for the likelihood function should have the variables ##Y_i## in it. Otherwise, you'd be doing a computation that ignores the data.
 
  • #3
the_dane said:

##Y_i = \mu + (1 + \phi x_i) + \epsilon_i## where ##-1 < \phi < 1## and ##-1 < x_i < 1## ... I get to the log-likelihood function: ##L(Y_1, Y_2, \ldots, Y_n; \mu, \phi) = \sum \mu + (1 + \phi x_i) + \epsilon_i## ...
These equations don't allow me to set ##dL/d\mu = 0## and ##dL/d\phi = 0## because you can't divide by zero. What am I doing wrong?

This does not look like any likelihood function I have ever seen.

Start with the basics: what is the probability density ##f_i(y)## of the random variable ##Y_i##, that is, what is the function ##f_i(y)## in the statement ##P(y < Y_i < y + dy) = f_i(y)\, dy ?## In terms of the density functions ##f_1(y_1), f_2(y_2), \ldots, f_n(y_n)##, what is the likelihood of an observed event ##\{ Y_1 = y_1, Y_2 =y_2, \ldots, Y_n = y_n\}?## For given ##\{y_i\}## you want to maximize that likelihood function.
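
A sketch of where this question leads, assuming only the model as stated: since ##Y_i## is ##\epsilon_i## shifted by the constant ##\mu + (1 + \phi x_i)##, it is normal with that mean and variance 1, so
$$f_i(y) = \frac{1}{\sqrt{2\pi}} \exp\!\left(-\tfrac{1}{2}\bigl(y - \mu - (1 + \phi x_i)\bigr)^2\right),$$
and the likelihood of the observed sample ##\{y_i\}## is the product ##\prod_{i=1}^n f_i(y_i)##.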
 
  • #4
Stephen Tashi said:
Before you get to the log-likelihood function, what function are you using for the likelihood function?
If the ##Y_i## represent the sample data, the expression for the likelihood function should have the variables ##Y_i## in it. Otherwise, you'd be doing a computation that ignores the data.
I use ##L(Y_1, \ldots, Y_n; \mu, \phi) = Y_1 \cdot Y_2 \cdots Y_n##
 
  • #5
the_dane said:
I use ##L(Y_1, \ldots, Y_n; \mu, \phi) = Y_1 \cdot Y_2 \cdots Y_n##

That isn't correct. The problem does not say that ##Y_i## is a random variable with an exponential distribution. The problem says that ##\epsilon_i## is a random variable with a normal distribution.

The only random variable with a known distribution in the equation ##Y_i = \mu + (1 + \phi x_i) + \epsilon_i## is the variable ##\epsilon_i##.

The likelihood that ##\epsilon_i = \alpha_i## is ##\frac{1}{\sqrt{2\pi}} e^{- \frac{\alpha_i^2}{2}}##

Solve the equation ##Y_i = \mu + (1 + \phi x_i) + \alpha_i## for ##\alpha_i## to express that likelihood in terms of ##Y_i##.
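
A minimal worked version of that step: solving gives ##\alpha_i = Y_i - \mu - (1 + \phi x_i)##, so the likelihood of the whole sample is
$$L(\mu, \phi) = \prod_{i=1}^n \frac{1}{\sqrt{2\pi}} \exp\!\left(-\tfrac{1}{2}\bigl(Y_i - \mu - (1 + \phi x_i)\bigr)^2\right),$$
and the log-likelihood is ##-\tfrac{n}{2}\log(2\pi) - \tfrac{1}{2}\sum_i \bigl(Y_i - \mu - (1 + \phi x_i)\bigr)^2##, which is maximized by minimizing the sum of squared residuals.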
 
  • #6
the_dane said:
I use ##L(Y_1, \ldots, Y_n; \mu, \phi) = Y_1 \cdot Y_2 \cdots Y_n##

No: you do not multiply the random variables together; if anything, you multiply their probability distributions.

So, to return to my question in #3: what is (a formula for) the probability density function ##f_i(y)## of the random variable ##Y_i##? Until you can answer that question you will get absolutely nowhere with this problem!
 
  • #7
Ray Vickson said:
No: you do not multiply the random variables together; if anything, you multiply their probability distributions.

So, to return to my question in #3: what is (a formula for) the probability density function ##f_i(y)## of the random variable ##Y_i##? Until you can answer that question you will get absolutely nowhere with this problem!
Can I get the probability distribution from the formula for ##Y_i##?
 
  • #8
the_dane said:
Can I get the probability distribution from the formula for ##Y_i##?

Yes, that is exactly what you need to do.
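
To make the whole recipe concrete, here is a minimal numerical sketch (not from the thread; the simulated values of ##\mu##, ##\phi##, and the ##x_i## are illustrative assumptions). It generates data from the model as stated and maximizes the log-likelihood derived above over ##\mu## and ##\phi## with scipy.optimize.minimize:

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative assumptions: true parameter values and known x_i in (-1, 1).
rng = np.random.default_rng(0)
n = 200
x = rng.uniform(-1, 1, size=n)          # known constants x_i
mu_true, phi_true = 0.5, 0.3
y = mu_true + (1 + phi_true * x) + rng.standard_normal(n)  # Y_i = mu + (1 + phi*x_i) + eps_i

def neg_log_likelihood(params):
    mu, phi = params
    resid = y - mu - (1 + phi * x)       # alpha_i = Y_i - mu - (1 + phi*x_i)
    # Negative log-likelihood of independent N(0, 1) errors.
    return 0.5 * n * np.log(2 * np.pi) + 0.5 * np.sum(resid**2)

result = minimize(neg_log_likelihood, x0=[0.0, 0.0])
mu_hat, phi_hat = result.x
print(f"mu_hat = {mu_hat:.3f}, phi_hat = {phi_hat:.3f}")
```

Setting the derivatives of the log-likelihood to zero gives the same answer in closed form: minimizing ##\sum_i (Y_i - 1 - \mu - \phi x_i)^2## is ordinary least squares of ##Y_i - 1## on ##x_i##.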
 

What is maximum likelihood?

Maximum likelihood is a statistical method used to estimate the parameters of a statistical model. It is based on the idea that the parameters of the model should be chosen in a way that maximizes the probability of obtaining the observed data.
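
A standard one-line example (not from the thread above): if ##k## heads are observed in ##n## independent coin flips with unknown head probability ##p##, the likelihood is ##p^k (1-p)^{n-k}##, and the value of ##p## that maximizes it is ##\hat{p} = k/n##, the sample proportion.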

How does maximum likelihood work?

The maximum likelihood method involves finding the set of parameters that maximizes the likelihood function, which measures how probable the observed data is under the model for a given choice of parameters. In simple models this can be done analytically, by setting the derivatives of the log-likelihood to zero; otherwise it is done through an iterative numerical procedure, such as gradient descent or Newton's method, until the optimal parameters are found.
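
As a toy illustration of the iterative route (an assumed example, not tied to the thread's model): gradient ascent on the log-likelihood of a sample from ##N(\theta, 1)##, whose maximizer is known to be the sample mean.

```python
import numpy as np

# Toy data from N(theta_true, 1); the MLE of theta is the sample mean.
rng = np.random.default_rng(1)
data = rng.normal(loc=2.0, scale=1.0, size=100)

theta = 0.0            # starting guess
learning_rate = 0.005
for _ in range(200):
    # Gradient of the log-likelihood sum_i -(x_i - theta)^2/2 (plus constants) is sum_i (x_i - theta).
    gradient = np.sum(data - theta)
    theta += learning_rate * gradient

print(theta, data.mean())  # the two values should agree closely
```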

What are the assumptions of maximum likelihood?

The maximum likelihood method assumes that the data follows a specific probability distribution, such as normal or exponential. In the most common setting it also assumes that the observations are independent (and often identically distributed), so that the joint likelihood factors into a product of the individual densities and each data point is not influenced by the others.

What are the advantages of using maximum likelihood?

Maximum likelihood provides a way to estimate the parameters of a statistical model in a systematic and rigorous manner. It also has good large-sample properties: consistency (as the sample size increases, the estimates converge to the true parameters) and, under regularity conditions, asymptotic efficiency.

Are there any limitations to maximum likelihood?

While maximum likelihood is a widely-used and powerful method, it does have some limitations. One limitation is that it relies on the correct specification of the underlying probability distribution, which may not always be known. Additionally, it may not perform well with small sample sizes or when the data violates the assumptions of the model.
