This is the question:

Suppose that X1, ..., Xn form a random sample from the Bernoulli distribution with unknown parameter p. Let p0 and p1 be specified values such that 0 < p1 < p0 < 1, and suppose it is desired to test the following simple hypotheses: H0: p = p0, H1: p = p1.

A. Show that a test procedure δ for which α(δ) + β(δ) is a minimum rejects H0 when Xbar < c.

B. Find the value of c.

I know this problem is not that difficult; I just can't figure out where to start. I know the Bernoulli distribution, but I can't figure out how to compute α(δ) and β(δ). I haven't seen any problems like this, so I'm kind of lost. Any help would be much appreciated. Thanks!
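One possible starting point (a sketch, not necessarily the route your course intends): the standard likelihood-ratio argument. The sum α(δ) + β(δ) is minimized by the test that rejects H0 exactly when the likelihood under H1 exceeds the likelihood under H0, i.e. when f(x | p1) > f(x | p0). For a Bernoulli sample with S = ΣXi, taking logs and using p1 < p0 reduces that inequality to Xbar < c with

c = log((1 - p1)/(1 - p0)) / log( p0(1 - p1) / (p1(1 - p0)) ).

The snippet below sanity-checks this numerically with made-up values n = 10, p0 = 0.6, p1 = 0.3 (these numbers are illustrative, not from the original problem): it evaluates α + β for every integer cutoff and confirms the minimum lands where the closed-form c predicts.

```python
import math

def binom_pmf(n, s, p):
    # P(S = s) for S ~ Binomial(n, p)
    return math.comb(n, s) * p**s * (1 - p) ** (n - s)

# Hypothetical values for illustration only (not from the original post)
n, p0, p1 = 10, 0.6, 0.3

def alpha_plus_beta(k):
    # Test "reject H0 when S < k":
    #   alpha(k) = P(S < k | p0),  beta(k) = P(S >= k | p1)
    alpha = sum(binom_pmf(n, s, p0) for s in range(k))
    beta = 1 - sum(binom_pmf(n, s, p1) for s in range(k))
    return alpha + beta

# Search all integer cutoffs for the one minimizing alpha + beta
best_k = min(range(n + 1), key=alpha_plus_beta)

# Closed-form threshold from the likelihood-ratio argument
c = math.log((1 - p1) / (1 - p0)) / math.log(p0 * (1 - p1) / (p1 * (1 - p0)))

# The minimizing cutoff should agree with "reject when S < n*c",
# i.e. best_k equals n*c rounded up when n*c is not an integer.
print(best_k, n * c)
```

With these numbers the brute-force minimum occurs at the cutoff predicted by n·c, which is exactly the claim in part A, and the formula for c answers part B.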

**Physics Forums | Science Articles, Homework Help, Discussion**


# Testing Hypotheses with Bernoulli Distribution
