Hello all,

If you have been watching the Olympic Games lately, you may have seen that in the past week it happened twice that an athlete won (and another lost) a gold medal by a time difference of a few thousandths of a second. Both occurrences were in speed skating (the men's 500 m and 1500 m events).

My question is: is it possible with state-of-the-art electronics to measure times this accurately? What equipment would be necessary, and what would be the error bar on such a measurement?

Generally it is assumed that a high-speed camera at the finish line, which can record (and presumably time-stamp) well over 1000 frames per second, ensures such accurate timing. But I would say there are many other factors to consider.

Also take into account that in some sports (e.g. the 100 m sprint final) contestants are compared within one race: they hear the same start signal, and the photo finish observes all runners in the same heat, so it measures relative time. In other sports, however, such as speed skating or bobsleigh, athletes' times are set in different runs, so the timings of separate runs must be compared.

Then also try to take into account all other possible influences. For a start, I could think of:
- the position of the starter (holding the starter gun);
- differences in the speed of sound with air temperature and pressure;
- the non-ideal shape of the ice skating rink (there is also an inner and an outer lane);
- the error in the time stamp of the start signal for different runs.

There is probably lots of other stuff...

Omega (the company taking care of timing at the Olympics) claims that it can measure with an accuracy of one millionth of a second and that the measurements are flawless (no error bar). I'm not doubting that they can measure that accurately, but maybe they make an error in what they are actually measuring.

What do you think?
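To get a feel for the magnitudes involved, here is a rough back-of-envelope sketch in Python. The skater speed, temperature, and the 1 m distance offset are my own illustrative assumptions, not official figures:

```python
def speed_of_sound(temp_c):
    """Approximate speed of sound in dry air (m/s), linearized around 0 deg C."""
    return 331.3 + 0.606 * temp_c

# Assumption: a 500 m skater finishing in about 34 s
skater_speed = 500.0 / 34.0          # ~14.7 m/s

# Distance covered in one frame of a 1000 fps camera (1 ms)
dist_per_frame = skater_speed * 1e-3  # ~15 mm

# Assumption: one skater stands 1 m farther from the starter's gun,
# in ~15 deg C indoor air
start_delay = 1.0 / speed_of_sound(15.0)  # ~2.9 ms

print(f"distance per 1 ms frame: {dist_per_frame * 1000:.1f} mm")
print(f"start-signal delay over 1 m: {start_delay * 1000:.1f} ms")
```

If these numbers are roughly right, a single metre of extra distance to the starter already costs about 3 ms, i.e. the acoustic start-signal geometry alone dwarfs a microsecond-level clock accuracy.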