lalligagger
Hi,
I am a physics undergrad designing a solar spectrometer as part of a senior-design-type course. If we have light incident from the sun onto some sort of telescope objective (say a single lens for simplicity, with focal length "f" and aperture radius "R") and then a pinhole at the focal point (radius "r"), how do we model the power loss at the pinhole? I was thinking it'd be
P = I_sun * (pi * R^2) * [r^2 / (0.01 * f / 2)^2]
In other words, intensity times the area of the entrance aperture gives you the power in, and the ratio of the pinhole area to the image area gives you the fraction of that power that gets through (assuming the pinhole is smaller than the image). The 0.01 comes from the sun subtending approximately 0.01 radians in the sky, so the solar image at the focal plane has a radius of about 0.01 * f / 2. This made sense to me, but it seems like we would barely get any power through with a 1 meter telescope into a 20-50 micron pinhole. Any help would be greatly appreciated.
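To sanity-check the numbers, here is a minimal sketch of the throughput estimate described above. All the specific values (solar irradiance, aperture radius, pinhole radius) are illustrative assumptions, not values from the post:

```python
import math

# Assumed illustrative values:
I_sun = 1000.0   # W/m^2, rough solar irradiance at the ground (assumption)
f = 1.0          # focal length, m
R = 0.05         # aperture radius, m (assumed)
r = 25e-6        # pinhole radius, m (middle of the 20-50 micron range)
theta = 0.01     # angular diameter of the sun, rad

P_in = I_sun * math.pi * R**2       # power collected by the objective
image_radius = theta * f / 2        # radius of the solar image at the focal plane
fraction = (r / image_radius)**2    # pinhole area / image area (pinhole smaller than image)
P_through = P_in * fraction

print(P_in, image_radius, fraction, P_through)
```

With these numbers the image radius is 5 mm, the area ratio is 2.5e-5, and only a fraction of a milliwatt makes it through, which is consistent with the worry that "we would barely get any power through."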