MHB Is My Application of the Neyman-Pearson Lemma Correct?

  • Thread starter: mathjam0990
View attachment 5426

What I have done so far...

View attachment 5427

Is this correct so far? If not, could someone explain how to solve this? I am not sure whether I am going in the right direction. Thank you!
 

Attachments

  • (4).jpg (34.6 KB)
  • (5).jpg (26.1 KB)
Your work is fine so far! However, the question states that we only consider a single observation, so you just have to consider the quotient of the likelihoods for one observation $x$. By the Neyman-Pearson lemma, the most powerful test rejects $H_0$ where the likelihood ratio is small, so the critical region $C$ is given by
$$\frac{L(\theta_0 \ | \ x)}{L(\theta_1 \ | \ x)} \leq k.$$
An easy calculation gives
$$\frac{L(\theta_0 \ | \ x)}{L(\theta_1 \ | \ x)} = \frac{\theta_0 (1-\theta_0)^{x-1}}{\theta_1 (1-\theta_1)^{x-1}} = \left(\frac{\theta_0}{\theta_1}\right)\left(\frac{1-\theta_0}{1-\theta_1}\right)^{x-1} \leq k.$$
Since $\theta_1 > \theta_0$, we have $\frac{1-\theta_0}{1-\theta_1} > 1$, so its logarithm is positive and taking logarithms preserves the direction of the inequality:
$$x \leq 1 + \frac{\ln\left(\frac{k \theta_1}{\theta_0}\right)}{\ln\left(\frac{1-\theta_0}{1-\theta_1}\right)} =: k^{*}.$$
Hence, by the Neyman-Pearson lemma, the rejection region for the most powerful test of $H_0: \theta = \theta_0$ against $H_A: \theta = \theta_1$, where $\theta_1 > \theta_0$, is given by $x \leq k^{*}$. Intuitively this makes sense: a larger success probability $\theta_1$ makes small values of $x$ more likely. Note that since the geometric distribution is discrete, this critical region is $C = \{1, 2, \ldots, \lfloor k^{*} \rfloor\}$. We still need to compute $k^{*}$. This can be done by looking at the type I error, since $\mathbb{P}(\text{reject } H_0 \ | \ H_0 \ \text{true}) = \alpha$. Now we reject $H_0$ if $x \leq k^{*}$, and hence the type I error satisfies
\begin{align}
\mathbb{P}(X \leq k^{*} \ | \ \theta = \theta_0) = \sum_{x = 1}^{k^{*}} (1-\theta_0)^{x-1} \theta_0 = 1 - (1-\theta_0)^{k^{*}} = \alpha,
\end{align}
from which you can extract $k^{*}$. (Because $X$ is discrete, you may only be able to achieve size at most $\alpha$ rather than exactly $\alpha$.) You can also generalize this to multiple observations $x_1,\ldots,x_n$: the likelihood ratio then depends on the data only through $\sum_i x_i$, whose distribution (negative binomial) determines the cutoff, which can be a little messier.
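As a quick numeric sanity check (a sketch of my own, not part of the original problem; the parameter values $\theta_0 = 0.01$, $\theta_1 = 0.05$, $\alpha = 0.05$ are purely illustrative), here is how one could find the discrete cutoff and the exact size and power of the resulting test. Since $\theta_1 > \theta_0$ makes small $x$ more likely under the alternative, we look for the largest integer $c$ with $\mathbb{P}(X \leq c \ | \ \theta_0) \leq \alpha$:

```python
def geom_cdf(c, theta):
    """P(X <= c) for the geometric distribution on {1, 2, ...}
    with pmf P(X = x) = theta * (1 - theta)**(x - 1)."""
    return 1.0 - (1.0 - theta) ** c

def np_cutoff(theta0, alpha):
    """Largest integer c with P(X <= c | theta0) <= alpha,
    i.e. the cutoff of the most powerful test that rejects for small x."""
    c = 0
    while geom_cdf(c + 1, theta0) <= alpha:
        c += 1
    return c

# Illustrative parameters (hypothetical, chosen so the region is nonempty:
# rejecting small x requires theta0 < alpha, since P(X = 1) = theta0).
theta0, theta1, alpha = 0.01, 0.05, 0.05

c = np_cutoff(theta0, alpha)          # cutoff: reject H0 when x <= c
size = geom_cdf(c, theta0)            # exact type I error (<= alpha)
power = geom_cdf(c, theta1)           # power against theta1
print(c, size, power)                 # -> 5 0.0490099501 0.22621906...
```

Note how the discreteness shows up: the attainable size is $1-(1-\theta_0)^c \approx 0.049$, strictly below $\alpha = 0.05$, exactly as mentioned above.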
 