embphysics
Homework Statement
We are given that a light bulb radiates 0.1 W, and that the wavelength of the emitted light is 530 nm. The human eye requires around 6 photons per second for the brain to register a light signal. I am asked: what is the largest possible distance between the observer and the light source?
Homework Equations
The Attempt at a Solution
Well, each photon of 530 nm light carries energy E = \frac{hc}{530 \cdot 10^{-9}~m} = 3.747 \cdot 10^{-19}~J, which gives the conversion factor \frac{1~photon}{3.747 \cdot 10^{-19}~J}. The number of photons emitted each second in all directions is therefore \frac{0.1~J}{1~s} \cdot \frac{1~photon}{3.747 \cdot 10^{-19}~J} = 2.6688 \cdot 10^{17}~\frac{photons}{s}. Let's assume the light source has five emitting sides (4 sides and 1 top). All 2.6688 \cdot 10^{17}~\frac{photons}{s} are emitted from these sides, but we can't view every side at once, so the most an observer could possibly receive is \frac{1}{5} (2.6688 \cdot 10^{17}~\frac{photons}{s}) = 5.3376 \cdot 10^{16}~\frac{photons}{s}. The diameter of an average eye is 25 mm.
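The photon-energy and emission-rate arithmetic above can be checked with a short script (my own sketch; the constants h and c are standard values, everything else comes from the problem statement):

```python
# Check: energy per photon at 530 nm, and photons emitted per second by a 0.1 W source.
h = 6.626e-34        # Planck constant, J·s
c = 2.998e8          # speed of light, m/s
wavelength = 530e-9  # m
power = 0.1          # W

E_photon = h * c / wavelength   # energy of one photon, J  (~3.75e-19 J)
rate = power / E_photon         # photons emitted per second (~2.67e17)

print(E_photon, rate)
```

This reproduces the numbers in the attempt to within rounding of h and c.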
The distance between the eye and the light source is variable; we need to find the distance L at which only 6 photons reach the eye each second. (I wonder if I could use the calculus of variations somehow...)
I am not quite certain of what to do next. I have heard of the inverse-square law, but am not sure how to apply it here.
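For what it's worth, here is a sketch of how the inverse-square idea is usually applied (this is my assumption of the intended method, not a confirmed solution): treat the source as emitting isotropically, so the eye at distance L intercepts the fraction of photons given by (pupil area) / (area of a sphere of radius L). I use the 25 mm figure from above as the collecting diameter, though the problem may intend the pupil diameter, which is only a few mm, and I ignore the 1/5 factor:

```python
import math

# Inverse-square sketch: rate * (pi r^2) / (4 pi L^2) = needed, solved for L.
h, c = 6.626e-34, 2.998e8        # J·s, m/s
rate = 0.1 / (h * c / 530e-9)    # total photons emitted per second, ~2.67e17
r_eye = 25e-3 / 2                # collecting radius, m (the 25 mm figure above)
needed = 6.0                     # photons/s for the brain to register a signal

L = r_eye * math.sqrt(rate / (4 * needed))
print(L)  # on the order of 10^6 m under these assumptions
```

The geometric fraction \frac{\pi r^2}{4 \pi L^2} is where the inverse square of the distance enters; using a realistic pupil radius instead of 12.5 mm shrinks L proportionally.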