# Confidence and prediction interval


#### bizzy

I need help with this question.

Suppose we are making predictions of the dependent variable y for specific values of the independent variable x using a simple linear regression model holding the confidence level constant. Let C.I = the width of the confidence interval for the average value y for a given value of x, and P.I = the width of the prediction interval for a single value y for a given value of x.

I need to know whether C.I > P.I, C.I < P.I, C.I = P.I, or C.I = 0.5 P.I.

The width of the confidence interval will always be less than the width of the prediction interval, because the prediction interval's margin of error contains an extra term accounting for the variability of a single observation about the regression line.

The length of the confidence interval estimate is twice this:

$$s\, t \sqrt{\frac 1 n + \frac{(x_0 - \bar x)^2}{\sum (x-\bar x)^2}}$$

the length of the prediction interval estimate is twice this:

$$s\, t \sqrt{1 + \frac 1 n + \frac{(x_0 - \bar x)^2}{\sum (x-\bar x)^2}}$$

The difference between the two margins of error varies, depending on the sample size, $$s$$, and the magnitude of the $$x$$ values. Note that the squared half-widths always differ by exactly $$(st)^2$$, but the raw difference itself has no single fixed value across problems.
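To make the comparison concrete, here is a small sketch in Python that fits a simple linear regression by hand and computes both half-widths at a chosen $$x_0$$. The data, the value $$x_0 = 4.5$$, and the 95% t critical value for 6 degrees of freedom are all assumed for illustration.

```python
import math

# Hypothetical sample data (assumed for illustration only)
x = [1, 2, 3, 4, 5, 6, 7, 8]
y = [2.1, 2.9, 3.8, 5.2, 5.9, 7.1, 8.2, 8.8]

n = len(x)
xbar = sum(x) / n
ybar = sum(y) / n

Sxx = sum((xi - xbar) ** 2 for xi in x)
Sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
b1 = Sxy / Sxx            # least-squares slope
b0 = ybar - b1 * xbar     # least-squares intercept

# Residual standard error s, with n - 2 degrees of freedom
sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
s = math.sqrt(sse / (n - 2))

t = 2.447   # two-sided 95% t critical value for df = n - 2 = 6 (assumed)
x0 = 4.5    # the x value at which we estimate/predict (assumed)

common = 1 / n + (x0 - xbar) ** 2 / Sxx
ci_half = t * s * math.sqrt(common)        # half-width of the confidence interval
pi_half = t * s * math.sqrt(1 + common)    # half-width of the prediction interval

print(f"CI half-width: {ci_half:.4f}")
print(f"PI half-width: {pi_half:.4f}")
```

Because the prediction interval's square root contains the extra 1, `pi_half` exceeds `ci_half` for every data set, and their squared difference equals $(st)^2$ exactly.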