The eye is not an ideal measure of surface brightness, especially if you try to use it as a standard for peripheral viewing. Consider a lens (such as your eye, a magnifying glass, or a camera lens) being used to put an image onto a screen (in the case of a camera lens, onto the pixels that form the image): light from every point in the scene should be incident on the same projected area of the lens. The projected area of the lens decreases only by ## \cos(\theta) ##, so for angles not too far off axis it remains relatively constant. For a faithful mapping of the scene onto the screen (or retina), the image brightness on the screen at any location should be proportional to the brightness of the corresponding part of the scene. For angles not too far off axis, this will be the case with most lenses, including the eye. Far off axis, it generally will not be, either for your eye or for any other lens system. ## \\ ## To quantify the previous discussion: for irradiance ## E_{screen} ## from the lens onto a perfectly diffuse white projector screen, the image brightness ## L_i ## will be such that ## E_{screen}=L_i \pi ##. (The incident irradiance is perfectly reflected in a Lambertian pattern, which has an effective solid angle of ## \pi ## steradians.) ## \\ ## A couple of additional calculations: if the scene is in the far field, has brightness ## L_s ##, and occupies a solid angle ## \Omega_s ## as measured from the lens, it will image with an area ## A_i=f^2 \Omega_s ##, and the total power collected by the lens will be ## P=L_s \Omega_s A_{Lens} ##. This gives ## E_{screen}=\frac{P}{A_i}=\frac{L_s A_{Lens}}{f^2} ## and image brightness (on a perfectly diffuse white screen) ## L_i=\frac{E_{screen}}{\pi} ##, which in general will be proportional to, but less than, the scene brightness ## L_s ##.
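The on-axis calculation above can be sketched numerically. This is a minimal illustration, not a full radiometric model: the function name and the sample numbers (a hypothetical f/2 lens with a 25 mm aperture and 50 mm focal length) are assumptions for the example, and it uses only the far-field, small-angle relations ## E_{screen}=L_s A_{Lens}/f^2 ## and ## L_i=E_{screen}/\pi ## from the text.

```python
import math

def image_brightness(L_s, lens_diameter, focal_length):
    """Far-field, on-axis estimate of screen irradiance and image brightness.

    L_s           : scene radiance (brightness), e.g. W/(m^2 sr)
    lens_diameter : clear aperture diameter of the lens (m)
    focal_length  : focal length of the lens (m)

    Returns (E_screen, L_i) for a perfectly diffuse white screen,
    using E_screen = L_s * A_lens / f^2 and L_i = E_screen / pi.
    """
    A_lens = math.pi * (lens_diameter / 2.0) ** 2   # lens collecting area
    E_screen = L_s * A_lens / focal_length ** 2     # irradiance on the screen
    L_i = E_screen / math.pi                        # Lambertian reflection: pi sr effective
    return E_screen, L_i

# Hypothetical f/2 lens: 25 mm aperture, 50 mm focal length, scene radiance 100
E, L_i = image_brightness(L_s=100.0, lens_diameter=0.025, focal_length=0.05)
# L_i / L_s = A_lens / (pi f^2) = 1 / (4 N^2) with f-number N = f / D,
# so here L_i / L_s = 1/16 and the image is dimmer than the scene, as stated.
```

Note the ratio ## L_i/L_s = 1/(4N^2) ## depends only on the f-number ## N ##, so for any lens slower than f/0.5 the image on the diffuse screen is less bright than the scene.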