Maximum likelihood estimator of mean difference

The discussion focuses on finding the maximum likelihood estimator (MLE) for the difference in means, θ = μ1 - μ2, from two normal populations with unknown parameters. Participants suggest starting with the likelihood function based on the samples from both populations and differentiating it to derive the MLE. There is a consideration of how to allocate the total sample size n to minimize the variance of the MLE, emphasizing the independence of the samples. The conversation also touches on the unknown nature of all parameters involved, which complicates the estimation process. Overall, the goal is to establish a clear method for deriving the MLE and optimizing sample allocation.
safina

Homework Statement


A sample of size ##n_1## is to be drawn from a normal population with mean ##\mu_1## and variance ##\sigma_1^2##. A second sample of size ##n_2## is to be drawn from a normal population with mean ##\mu_2## and variance ##\sigma_2^2##. What is the maximum likelihood estimator of ##\theta = \mu_1 - \mu_2##?

If we assume that the total sample size ##n = n_1 + n_2## is fixed, how should the ##n## observations be divided between the two populations in order to minimize the variance of the maximum likelihood estimator of ##\theta##?


Homework Equations





The Attempt at a Solution

 
So if we denote the samples from distribution 1 by ##x_i## and the samples from distribution 2 by ##y_j##, then let's call the samples
##\mathbf{x} = (x_1, \dots, x_{n_1})##
##\mathbf{y} = (y_1, \dots, y_{n_2})##

I'd start by trying to find the likelihood of the data given the parameters

##L(\mathbf{x}, \mathbf{y} \mid \mu_1, \sigma_1, \mu_2, \sigma_2)##

then consider differentiating ##\ln L## w.r.t. ##\theta##.

(I haven't tried this, but it's how I'd try to get started.)
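To make that concrete, here is a sketch of the likelihood under the independence assumption; since the two samples are independent i.i.d. normal draws, it factorizes as a product of the two sample likelihoods:

```latex
L(\mu_1,\sigma_1,\mu_2,\sigma_2 \mid \mathbf{x},\mathbf{y})
= \prod_{i=1}^{n_1} \frac{1}{\sqrt{2\pi\sigma_1^2}}
    \exp\!\left(-\frac{(x_i-\mu_1)^2}{2\sigma_1^2}\right)
  \prod_{j=1}^{n_2} \frac{1}{\sqrt{2\pi\sigma_2^2}}
    \exp\!\left(-\frac{(y_j-\mu_2)^2}{2\sigma_2^2}\right)
```

Setting ##\partial \ln L / \partial \mu_1 = 0## gives ##\hat\mu_1 = \bar{x}##, and similarly ##\hat\mu_2 = \bar{y}##; by the invariance property of MLEs, the MLE of any function of the parameters is that function of the MLEs.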
 
Also, are any of the parameters known beforehand? If all are unknown, then since the parameters are not independent you could consider the set ##\{\theta, \sigma_1, \mu_2, \sigma_2\}##.
 
And one more: let ##t##, ##m_1##, ##m_2## be the estimators for ##\theta##, ##\mu_1## and ##\mu_2##.

Due to the independence of the samples, it shouldn't be too hard to convince yourself the MLE for ##\theta## is
##t = m_1 - m_2##

Now you should be able to come up with the sampling distributions of ##m_1## and ##m_2##,
##P(m_1, m_2 \mid \mu_1, \mu_2, \sigma_1, \sigma_2)##, and use them to find the sampling distribution of ##t##, from which the variance should be apparent. Then minimize with respect to the constraint ##n_1 + n_2 = n##.
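Sketching that last step (using only the independence of the two sample means and the constraint ##n_2 = n - n_1##):

```latex
\operatorname{Var}(t) = \frac{\sigma_1^2}{n_1} + \frac{\sigma_2^2}{n - n_1},
\qquad
\frac{d}{dn_1}\operatorname{Var}(t)
= -\frac{\sigma_1^2}{n_1^2} + \frac{\sigma_2^2}{(n - n_1)^2} = 0
\quad\Rightarrow\quad
\frac{n_1}{n_2} = \frac{\sigma_1}{\sigma_2}
```

so the observations should be allocated in proportion to the population standard deviations.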
 
lanedance said:
Also, are any of the parameters known beforehand? If all are unknown, then since the parameters are not independent you could consider the set ##\{\theta, \sigma_1, \mu_2, \sigma_2\}##.
None of the parameters are known beforehand.

I have a feeling that the MLE for ##\theta## is ##m_1 - m_2##, but could you help me further with what the form of the likelihood function will be?
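If it helps intuition, here is a quick numerical sanity check (a sketch, not part of the original problem; all concrete numbers below are made-up assumptions) that the difference of sample means recovers ##\theta## and that the variance-minimizing split follows ##n_1/n_2 = \sigma_1/\sigma_2##:

```python
import random
import statistics

# Illustrative parameter choices (assumptions, not from the thread)
random.seed(0)
mu1, sig1 = 5.0, 2.0
mu2, sig2 = 3.0, 1.0

# MLE of theta = mu1 - mu2: by invariance it is xbar - ybar
x = [random.gauss(mu1, sig1) for _ in range(400)]
y = [random.gauss(mu2, sig2) for _ in range(400)]
theta_hat = statistics.fmean(x) - statistics.fmean(y)

# Optimal allocation of a fixed total n: brute-force the variance
n = 100

def var_t(n1):
    # Var(t) = sig1^2/n1 + sig2^2/n2 by independence of the samples
    return sig1**2 / n1 + sig2**2 / (n - n1)

best_n1 = min(range(1, n), key=var_t)
# Theory predicts n1/n2 = sig1/sig2, i.e. n1 ~= n*sig1/(sig1+sig2)
predicted_n1 = n * sig1 / (sig1 + sig2)
```

The brute-force minimizer of the variance should land on (the integer nearest to) the theoretical allocation.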
 
