I guess I am just bored, but I was wondering why we assume light travels in a straight line rather than treating it as a waveform, the way I was taught in school. As far as I can remember, each light wave has a frequency. Why don't we take into account that it actually travels over a longer path than we give it credit for? I do understand that in the end this has very little bearing on where the light ends up, but it just seems a bit odd to me for some reason.

Take a light wave with an amplitude of 500 nm and, say, a wavelength of 650 nm (since I like red). To simplify the calculation, suppose we model one cycle of the wave as an ellipse with semi-axes of half the amplitude (250 nm) and half the wavelength (325 nm), and use an approximation for its circumference:

Circ = pi * sqrt(2 * (0.000000250^2 + 0.000000325^2))
Circ = 0.00000182171745359 m

Then take the speed of light and work out how many wave cycles fit into one second (Google will do the math for you):

299792458 (m/s) / 0.000000650 (m) = X cycles per second
X * Circ = 840211005 m/s

That is how far light would actually move per second if it has an amplitude of 500 nm and a wavelength of 650 nm. Even though the sinusoidal movement constantly cancels itself out in the end, shouldn't this be taken into consideration when looking at the speed of light and our limitations concerning the speed of light?

To explain why I chose the formula for an ellipse: if the path is a waveform, be it sin or cos, it seems logical that an ellipse is a reasonable model from which to infer a circumference. I also don't mean this to be against the TOS as a "crackpot theory"; this just came to me, and I couldn't think of a reasonable explanation as to why we do not take this into account.
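For what it's worth, here is a short Python sketch of the arithmetic above, under the same (admittedly hypothetical) assumption that each cycle of the wave traces an ellipse with semi-axes of 250 nm and 325 nm:

```python
import math

# Hypothetical model from the question: one wave cycle traced out as an
# ellipse with semi-axes a = amplitude/2 and b = wavelength/2.
a = 250e-9   # semi-axis from the 500 nm amplitude, in metres
b = 325e-9   # semi-axis from the 650 nm wavelength, in metres

# Rough ellipse-circumference approximation used above:
# C ~ pi * sqrt(2 * (a^2 + b^2))
circ = math.pi * math.sqrt(2 * (a**2 + b**2))

c = 299_792_458       # speed of light in vacuum, m/s
wavelength = 650e-9   # red light, in metres
cycles_per_second = c / wavelength  # wave cycles traversed each second

# Implied path length per second if the light literally traced each ellipse
path_speed = cycles_per_second * circ

print(f"circumference per cycle: {circ:.11e} m")   # ~1.82171745359e-06 m
print(f"implied path speed:      {path_speed:.0f} m/s")  # ~840211005 m/s
```

The numbers come out the same as in the hand calculation, so at least the arithmetic is consistent with the model, whatever one thinks of the model itself.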