MHB Help with Statistical Inference

  • Thread starter: trousmate
  • Tags: Statistical
AI Thread Summary
The discussion revolves around a request for help with a statistical inference exercise on likelihood functions and maximum likelihood estimators. Respondents write out the likelihood and log-likelihood, differentiate with respect to the parameter, and set the derivative to zero to obtain the estimator. A correction notes that the data is the number of successes from $r$ trials rather than a vector of individual results, although both formulations give the same estimator. Overall, the thread is a collaborative effort to clarify statistical inference methods ahead of the poster's presentation.
trousmate
View attachment 3490

See image.

I have to present this on Monday and don't really know what I'm doing, having missed a couple of lectures through illness. Any help or hints would be very much appreciated.

Thanks,
TM
 

Attachments

  • Screen Shot 2014-11-07 at 12.38.28.png
trousmate said:
https://www.physicsforums.com/attachments/3490

See image.

I have to present this on Monday and don't really know what I'm doing, having missed a couple of lectures through illness. Any help or hints would be very much appreciated.

Thanks,
TM

Welcome to MHB, trousmate!

If you have $r$ samples of a r.v. $Y$, then the likelihood function is

$$\displaystyle f(y_{1}, y_{2}, \ldots, y_{r} \mid p) = p^{\sum_{i} y_{i}}\ (1-p)^{r - \sum_{i} y_{i}} \qquad (1)$$

so that

$$\displaystyle \ln f = \sum_{i} y_{i}\ \ln p + \Big(r - \sum_{i} y_{i}\Big)\ \ln (1 - p) \implies \frac{d \ln f}{d p} = \frac{\sum_{i} y_{i}}{p} - \frac{r - \sum_{i} y_{i}}{1 - p} \qquad (2)$$

and imposing $\displaystyle \frac{d \ln f}{d p} = 0$ you arrive at

$$\displaystyle \hat{p} = \frac{\sum_{i} y_{i}}{r} \qquad (3)$$

Kind regards

$\chi$ $\sigma$
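As a quick numerical sanity check of (2) and (3), here is a minimal sketch (with an invented sample; the thread itself contains no code) that solves the score equation $\frac{d \ln f}{d p} = 0$ with SciPy and compares the root against the closed form $\sum_{i} y_{i} / r$:

```python
import numpy as np
from scipy.optimize import brentq

# Hypothetical sample: r = 10 observations of a 0/1 (Bernoulli) r.v. Y.
y = np.array([1, 0, 1, 1, 0, 1, 0, 1, 1, 1])
r, s = len(y), y.sum()

def score(p):
    # Equation (2): d ln f / dp = (sum y_i)/p - (r - sum y_i)/(1 - p)
    return s / p - (r - s) / (1.0 - p)

# Imposing d ln f / dp = 0 and solving for p reproduces equation (3).
p_hat = brentq(score, 1e-6, 1 - 1e-6)
print(p_hat, s / r)  # both are (approximately) 0.7 = sum(y_i)/r
```

For any sample with $0 < \sum_{i} y_{i} < r$, the score changes sign across $(0, 1)$, so the root finder converges to $\sum_{i} y_{i} / r$.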
 
trousmate said:
https://www.physicsforums.com/attachments/3490

See image.

I have to present this on Monday and don't really know what I'm doing, having missed a couple of lectures through illness. Any help or hints would be very much appreciated.

Thanks,
TM

For this problem you have the likelihood:

$$L(\theta|Y) = b(Y,r,\theta)=\frac{r!}{Y!(r-Y)!}\theta^Y(1-\theta)^{r-Y}$$

Then the log-likelihood is:

$$LL(\theta|Y) =\log(r!) - \log(Y!) -\log((r-Y)!)+Y\log(\theta) +(r-Y)\log(1-\theta)$$

Then to find the value of $\theta$ that maximises the log-likelihood we take the partial derivative with respect to $\theta$ and equate that to zero:

$$\frac{\partial}{\partial \theta}LL(\theta|Y)=\frac{Y}{\theta}-\frac{r-Y}{1-\theta}$$
So for the maximum (log-)likelihood estimator $\hat{\theta}$ we have:
$$\frac{Y}{\hat{\theta}}-\frac{r-Y}{1-\hat{\theta}}=0$$

which can be rearranged to give: $$\hat{\theta}(r-Y)=(1-\hat{\theta})Y$$ or $$\hat{\theta}=\frac{Y}{r}$$

and, as you should be aware, since the logarithm is strictly increasing, the value of $\theta$ that maximises the log-likelihood also maximises the likelihood, so $\hat{\theta}$ is the maximum likelihood estimator.
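To double-check the algebra, here is a minimal sketch (with made-up numbers for $Y$ and $r$; not part of the original reply) that maximises $LL(\theta|Y)$ numerically and compares the maximiser with $Y/r$:

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.special import gammaln

# Hypothetical data: Y = 7 successes out of r = 10 trials.
Y, r = 7, 10

def neg_log_likelihood(theta):
    # LL = log r! - log Y! - log (r-Y)! + Y log(theta) + (r-Y) log(1-theta);
    # gammaln(n + 1) = log n! keeps the factorials numerically stable.
    const = gammaln(r + 1) - gammaln(Y + 1) - gammaln(r - Y + 1)
    return -(const + Y * np.log(theta) + (r - Y) * np.log(1.0 - theta))

res = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 1 - 1e-6), method="bounded")
print(res.x, Y / r)  # numerical maximiser ~ 0.7, closed-form MLE = 0.7
```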

 
chisigma said:
Welcome to MHB, trousmate!

If you have $r$ samples of a r.v. $Y$, then the likelihood function is

$$\displaystyle f(y_{1}, y_{2}, \ldots, y_{r} \mid p) = p^{\sum_{i} y_{i}}\ (1-p)^{r - \sum_{i} y_{i}} \qquad (1)$$

so that

$$\displaystyle \ln f = \sum_{i} y_{i}\ \ln p + \Big(r - \sum_{i} y_{i}\Big)\ \ln (1 - p) \implies \frac{d \ln f}{d p} = \frac{\sum_{i} y_{i}}{p} - \frac{r - \sum_{i} y_{i}}{1 - p} \qquad (2)$$

and imposing $\displaystyle \frac{d \ln f}{d p} = 0$ you arrive at

$$\displaystyle \hat{p} = \frac{\sum_{i} y_{i}}{r} \qquad (3)$$

Kind regards

$\chi$ $\sigma$

Very nice, but this is the solution to the wrong problem. It happens to yield the same estimator as the problem as asked, but it is still the wrong problem: the data is the number of successes from $r$ trials, not a vector of individual results.
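One can see concretely why the two formulations agree: the binomial log-likelihood differs from the Bernoulli-vector one only by the $\theta$-free constant $\log\binom{r}{Y}$, so both peak at the same $\hat{\theta}$. A minimal sketch with invented numbers:

```python
import numpy as np
from scipy.special import gammaln

# Hypothetical example: Y = 7 successes in r = 10 trials.
Y, r = 7, 10
thetas = np.linspace(0.05, 0.95, 19)

# Log-likelihood for a vector of r individual Bernoulli results (no coefficient).
ll_vector = Y * np.log(thetas) + (r - Y) * np.log(1 - thetas)

# Binomial log-likelihood for the number of successes: adds log C(r, Y).
log_coeff = gammaln(r + 1) - gammaln(Y + 1) - gammaln(r - Y + 1)
ll_binomial = log_coeff + ll_vector

# The gap is constant in theta, so both curves peak at the same theta-hat.
print(np.allclose(ll_binomial - ll_vector, log_coeff))  # True
```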

 