blizzardof96
Assume you have the following scenario:

Light begins traveling through a gel of index of refraction n = 1.34 in a straight line along the x axis. It is then incident on a solid sphere (n = 1.36) of radius R in 3-space. Upon transmission, the light again travels through the gel (n = 1.34) and finally back into air (across a planar boundary), where it is incident on a detector.

Is it safe to assume that the further light is from the centre of the sphere, the longer it will take to hit the detector (due to spherical aberration)? My assumption is that, because of spherical aberration, refraction is greater at the edge of the sphere, so the light travels a longer distance. By d = vt, that would also imply a greater amount of time before hitting the detector. My intuition suggests that light incident on the centre of the sphere will not refract (theta = 0) and therefore takes the path of least time (a straight line). Are these assumptions correct?
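One way to check the timing intuition is to trace a ray through the two spherical interfaces with the vector form of Snell's law and compare optical path times. Below is a minimal 2-D sketch under assumed numbers (R = 1 cm, launch plane at x = -5R, detector plane at x = +5R, sphere centred at the origin); the final gel-to-air planar boundary is left out, since it mainly adds a common offset and the question is about the relative delay:

```python
import math

C = 299_792_458.0  # speed of light in vacuum (m/s)

def refract(d, n_hat, n1, n2):
    """Vector form of Snell's law: refract unit direction d at a surface
    whose unit normal n_hat points against d, going from index n1 into n2."""
    cos_i = -(d[0] * n_hat[0] + d[1] * n_hat[1])
    eta = n1 / n2
    k = 1.0 - eta * eta * (1.0 - cos_i * cos_i)
    if k < 0.0:
        raise ValueError("total internal reflection")
    cos_t = math.sqrt(k)
    return (eta * d[0] + (eta * cos_i - cos_t) * n_hat[0],
            eta * d[1] + (eta * cos_i - cos_t) * n_hat[1])

def travel_time(h, R=0.01, n_gel=1.34, n_sph=1.36,
                x_start=-0.05, x_det=0.05):
    """Time for a ray launched parallel to the x axis at height h to reach
    the plane x = x_det. Sphere of radius R centred at the origin; the
    gel->air planar boundary is ignored (assumed geometry)."""
    x_e = -math.sqrt(R * R - h * h)        # entry point on the sphere
    p, d = (x_e, h), (1.0, 0.0)
    t = n_gel * (x_e - x_start) / C        # gel leg before the sphere

    # Refract into the sphere; the outward normal p/R points against d here
    d = refract(d, (p[0] / R, p[1] / R), n_gel, n_sph)

    # Chord inside the sphere: |p + s*d| = R with |p| = R gives s = -2 p.d
    s = -2.0 * (p[0] * d[0] + p[1] * d[1])
    p = (p[0] + s * d[0], p[1] + s * d[1])
    t += n_sph * s / C

    # Refract back into the gel; the inward normal -p/R points against d
    d = refract(d, (-p[0] / R, -p[1] / R), n_sph, n_gel)

    # Straight run to the detector plane
    t += n_gel * ((x_det - p[0]) / d[0]) / C
    return t

t_axis = travel_time(0.0)
t_edge = travel_time(0.009)                # 0.9 R off axis
print(f"axial ray:    {t_axis * 1e12:.6f} ps")
print(f"off-axis ray: {t_edge * 1e12:.6f} ps")
print(f"difference:   {(t_axis - t_edge) * 1e15:.3f} fs")
```

With these assumed numbers the off-axis ray actually arrives slightly *earlier*: the bending produced by the small index step (1.34 to 1.36) is tiny, while the off-axis chord crosses far less of the slower n = 1.36 sphere than the 2R axial path, so "more refraction" does not automatically mean "longer time" to a flat detector plane.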