Help with Statistical Inference

  • Context: MHB
  • Thread starter: trousmate
  • Tags: Statistical

Discussion Overview

The discussion revolves around statistical inference, specifically likelihood functions and maximum likelihood estimators. The thread starter asks for help preparing a presentation on these concepts, in the context of a problem involving statistical data analysis.

Discussion Character

  • Homework-related
  • Technical explanation
  • Debate/contested

Main Points Raised

  • TM expresses uncertainty about the topic due to missed lectures and requests help with the presentation.
  • One participant provides a likelihood function for a random variable and derives the maximum likelihood estimator, suggesting that the estimator can be expressed as the sample mean.
  • Another participant presents a different likelihood function and its log-likelihood, leading to a similar conclusion about the maximum likelihood estimator.
  • A later reply critiques an earlier contribution, arguing that it solves the wrong problem despite arriving at the same estimator, and emphasises that the data should be interpreted as the number of successes from a fixed number of trials.

Areas of Agreement / Disagreement

Participants do not appear to reach a consensus, as there are competing interpretations of the problem and differing approaches to the likelihood functions presented.

Contextual Notes

There is a lack of clarity regarding the specific problem TM is facing, which may affect the applicability of the solutions provided. Additionally, the assumptions underlying the likelihood functions and the definitions of the variables involved are not fully explored.

trousmate
View attachment 3490

See image.

I have to present this on Monday and don't really know what I'm doing having missed a couple of lectures through illness. Any help or hints would be very much appreciated.

Thanks,
TM
 

Attachments

  • Screen Shot 2014-11-07 at 12.38.28.png (6.8 KB)
trousmate said:
https://www.physicsforums.com/attachments/3490

See image.

I have to present this on Monday and don't really know what I'm doing having missed a couple of lectures through illness. Any help or hints would be very much appreciated.

Thanks,
TM

Welcome to MHB, trousmate!

If you have $r$ samples of a r.v. $Y$, then the likelihood function is

$\displaystyle f(y_{1}, y_{2}, ..., y_{r} | p) = p^{\sum y_{i}}\ (1-p)^{r - \sum y_{i}}\ (1)$

so that

$\displaystyle \ln f = \sum_{i} y_{i}\ \ln p + \left(r - \sum_{i} y_{i}\right) \ln (1 - p) \implies \frac{d \ln f}{d p} = \frac{\sum_{i} y_{i}}{p} - \frac{r - \sum_{i} y_{i}}{1 - p}\ (2)$

and imposing $\displaystyle \frac{d \ln f}{d p} = 0$ you arrive at

$\displaystyle \overline{p} = \frac{\sum_{i} y_{i}}{r}\ (3)$

Kind regards

$\chi$ $\sigma$
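As an aside, the derivation above can be checked numerically: the maximiser of the Bernoulli log-likelihood should coincide with the sample mean. A minimal Python sketch, where the sample size, seed, and true $p$ are made-up values for illustration:

```python
import math
import random

def log_likelihood(p, ys):
    """Bernoulli log-likelihood: sum(y) ln p + (r - sum(y)) ln(1 - p)."""
    s, r = sum(ys), len(ys)
    return s * math.log(p) + (r - s) * math.log(1 - p)

random.seed(0)
true_p = 0.3  # hypothetical true success probability
ys = [1 if random.random() < true_p else 0 for _ in range(1000)]

# Closed-form MLE from equation (3): the sample mean.
p_hat = sum(ys) / len(ys)

# Numeric check: maximise the log-likelihood over a fine grid of candidates.
grid = [k / 10000 for k in range(1, 10000)]
p_grid = max(grid, key=lambda p: log_likelihood(p, ys))

print(p_hat, p_grid)  # the two estimates agree to grid resolution
```

The grid search is deliberately crude; it is only there to confirm that the calculus gives the right answer, not as a practical estimation method.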
 
trousmate said:
https://www.physicsforums.com/attachments/3490

See image.

I have to present this on Monday and don't really know what I'm doing having missed a couple of lectures through illness. Any help or hints would be very much appreciated.

Thanks,
TM

For this problem you have the likelihood:

$$L(\theta|Y) = b(Y,r,\theta)=\frac{r!}{Y!(r-Y)!}\theta^Y(1-\theta)^{r-Y}$$

Then the log-likelihood is:

$$LL(\theta|Y) =\log(r!) - \log(Y!) -\log((r-Y)!)+Y\log(\theta) +(r-Y)\log(1-\theta)$$

Then to find the value of $\theta$ that maximises the log-likelihood we take the partial derivative with respect to $\theta$ and equate that to zero:

$$\frac{\partial}{\partial \theta}LL(\theta|Y)=\frac{Y}{\theta}-\frac{r-Y}{1-\theta}$$
So for the maximum (log-)likelihood estimator $\hat{\theta}$ we have:
$$\frac{Y}{\hat{\theta}}-\frac{r-Y}{1-\hat{\theta}}=0$$

which can be rearranged to give:$$\hat{\theta}(r-Y)=(1-\hat{\theta})Y$$ or $$\hat{\theta}=\frac{Y}{r}$$

and, as you should be aware, the maximiser of the log-likelihood is also the maximum likelihood estimator for this sort of problem.

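The closed-form result $\hat{\theta} = Y/r$ can likewise be sanity-checked against a brute-force maximisation of the binomial log-likelihood above. A short Python sketch, with made-up values for $Y$ and $r$:

```python
import math

def binom_log_likelihood(theta, y, r):
    """LL(theta|Y) = log C(r,Y) + Y log(theta) + (r-Y) log(1-theta)."""
    return (math.lgamma(r + 1) - math.lgamma(y + 1) - math.lgamma(r - y + 1)
            + y * math.log(theta) + (r - y) * math.log(1 - theta))

y, r = 37, 100           # hypothetical: 37 successes observed in 100 trials
theta_hat = y / r        # closed-form MLE

# Brute-force check over a fine grid of candidate theta values.
grid = [k / 10000 for k in range(1, 10000)]
theta_grid = max(grid, key=lambda t: binom_log_likelihood(t, y, r))

print(theta_hat, theta_grid)  # → 0.37 0.37
```

Note that the $\log\binom{r}{Y}$ term is a constant in $\theta$, so it has no effect on where the maximum sits; it is included only so the function is the full log-likelihood.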
 
chisigma said:
Welcome to MHB, trousmate!

If you have $r$ samples of a r.v. $Y$, then the likelihood function is

$\displaystyle f(y_{1}, y_{2}, ..., y_{r} | p) = p^{\sum y_{i}}\ (1-p)^{r - \sum y_{i}}\ (1)$

so that

$\displaystyle \ln f = \sum_{i} y_{i}\ \ln p + \left(r - \sum_{i} y_{i}\right) \ln (1 - p) \implies \frac{d \ln f}{d p} = \frac{\sum_{i} y_{i}}{p} - \frac{r - \sum_{i} y_{i}}{1 - p}\ (2)$

and imposing $\displaystyle \frac{d \ln f}{d p} = 0$ you arrive at

$\displaystyle \overline{p} = \frac{\sum_{i} y_{i}}{r}\ (3)$

Kind regards

$\chi$ $\sigma$

Very nice, but this is the solution to a different problem; it just happens to yield the same estimator as the problem actually asked. The data here is the number of successes from r trials, not a vector of individual results.
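The reason the two formulations agree can be made concrete: the binomial likelihood differs from the product-of-Bernoullis likelihood (1) only by the factor $\binom{r}{Y}$, which does not depend on $p$, so both are maximised at the same value. A small Python check, using a hypothetical data vector:

```python
import math

ys = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]   # hypothetical vector of Bernoulli results
y, r = sum(ys), len(ys)               # the binomial data: successes out of r trials

def vector_ll(p):
    # log of equation (1): p^sum(y) * (1-p)^(r - sum(y))
    return y * math.log(p) + (r - y) * math.log(1 - p)

def binomial_ll(p):
    # adds log C(r, y), which does not depend on p
    return math.log(math.comb(r, y)) + vector_ll(p)

# The two log-likelihoods differ by a p-independent constant, so their
# derivatives, and hence their maximisers, coincide.
diffs = [binomial_ll(p) - vector_ll(p) for p in (0.2, 0.5, 0.8)]
print(diffs)  # the same constant, log C(10, 6), at every p
```

So both derivations land on $\sum_i y_i / r = Y/r$, even though only the binomial version models the data as actually given.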

 
