# Probability over an interval in a Normal Distribution?

1. May 10, 2011

I've been given the question:
In a photographic process, the developing time of prints may be looked upon as
a random variable which is normally distributed with a mean of 16.28 seconds
and a standard deviation of 0.12 second. Find the probability that it will take
anywhere from 16.00 to 16.50 seconds to develop one of the prints.

I *think* I know what I need to do, I just don't know how to do it:

So I have the Gaussian distribution formula (with the given mean and standard deviation plugged in) as my probability density function.
I need to find the area under the PDF over the interval 16.00 to 16.50.
Because the CDF F(x) is the integral of the PDF from -inf to x, I need to subtract F(16.00) from F(16.50), and that will be my answer.

But I don't know how to (A) get the CDF, or (B) evaluate it at those two points.

I've tried reading up on the net, but I'm not following the theory. Can someone please show me how to do this? Thank you!

EDIT =======
Apologies, I realised I posted this on the wrong board! Feel free to ignore / delete. Please see my post over in the correct homework calculus thread. Thank you!

Last edited: May 10, 2011
2. May 10, 2011

### mathman

The CDF is usually given in tabular form, since it has no closed-form antiderivative. The table (standard normal) will be for mean = 0 and standard deviation = 1, so standardize with z = (x - 16.28)/0.12: your interval then corresponds to the range -.28/.12 to .22/.12, i.e. about -2.33 to 1.83.
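The table lookup above can also be checked numerically. A minimal sketch in Python, using the identity Φ(z) = (1 + erf(z/√2))/2 to evaluate the standard-normal CDF — the mean, standard deviation, and interval come from the thread; the function name and the rest are illustrative:

```python
from math import erf, sqrt

def normal_cdf(x: float, mu: float, sigma: float) -> float:
    """CDF of N(mu, sigma^2), via Phi(z) = (1 + erf(z / sqrt(2))) / 2."""
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

mu, sigma = 16.28, 0.12          # mean and std dev from the problem
z_lo = (16.00 - mu) / sigma      # -.28/.12, about -2.33
z_hi = (16.50 - mu) / sigma      #  .22/.12, about  1.83

# P(16.00 < X < 16.50) = F(16.50) - F(16.00)
p = normal_cdf(16.50, mu, sigma) - normal_cdf(16.00, mu, sigma)
print(f"z interval: [{z_lo:.2f}, {z_hi:.2f}], probability: {p:.4f}")
```

This comes out around 0.957, which should agree (to table precision) with looking up Φ(1.83) − Φ(−2.33) in a standard-normal table.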