1. The problem statement, all variables and given/known data

Suppose that a 100 W source radiates light of wavelength 600 nm in all directions, and that the eye can detect this light if only 20 photons per second enter a dark-adapted eye having a 7 mm diameter pupil. How far from the source can the light be detected under these conditions?

2. Relevant equations

[tex]E = hf = hc/\lambda[/tex]

3. The attempt at a solution

I am having a hard time finding a relationship between the 100 W source and a distance. At what rate do the photons "die off"? I can calculate the energy per photon:

[tex]E = \frac{hc}{\lambda} = \frac{hc}{600\,\mathrm{nm}}[/tex]

but with that information I still don't see a way to factor distance in. Does anyone have hints on how I can figure out the rate at which the light dissipates over distance?
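One way to sketch the missing relationship numerically (this is my assumption about the intended approach, not from the problem statement): treat the source as emitting photons uniformly in all directions, so at distance r they are spread over a sphere of area 4πr², and the pupil of area a intercepts the fraction a/(4πr²) of them. Setting that intercepted rate equal to 20 photons/s gives an equation to solve for r.

```python
import math

# Physical constants (standard values, not given in the problem)
h = 6.626e-34    # Planck's constant, J*s
c = 2.998e8      # speed of light, m/s

# Given data from the problem
P = 100.0            # source power, W
wavelength = 600e-9  # wavelength, m
rate_needed = 20.0   # photons/s that must enter the eye
pupil_d = 7e-3       # pupil diameter, m

E_photon = h * c / wavelength          # energy per photon, J
N = P / E_photon                       # photons emitted per second by the source
a = math.pi * (pupil_d / 2) ** 2       # pupil area, m^2

# At distance r the pupil catches the fraction a / (4*pi*r^2) of all photons.
# Solve N * a / (4*pi*r^2) = rate_needed for r:
r = math.sqrt(N * a / (4 * math.pi * rate_needed))

print(f"energy per photon: {E_photon:.3e} J")
print(f"detection distance: {r:.2e} m")
```

With these numbers the distance comes out on the order of a few thousand kilometers, which is the usual surprise in this classic problem.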