
The definition of the second

  1. Oct 9, 2005 #1
    The definition of the second is the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the cesium-133 atom.

    Why 9,192,631,770 periods? Is it because that interval matches, as closely as possible, the 1 second of the old definition?

    Also, the cesium clock is claimed to be accurate to within a single second over 3 million years. Couldn't we just take this error into account, so that there would be no error at all?
  3. Oct 9, 2005 #2


    Staff Emeritus
    Science Advisor


    That error figure is representative. Basically, one builds not one, but a whole bunch of clocks, and sees how well they agree with each other. Individual clocks will exhibit both systematic and random errors. If you built a whole bunch of clocks, after 3 million years you'd expect any one of them to be about a second away from the average.
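
    The "build many clocks and compare them" idea above can be sketched with a toy simulation. All the numbers here (clock count, step count, error size) are made up for illustration and have nothing to do with real cesium specs; the point is only that each clock's random errors accumulate, and the spread of the ensemble tells you the typical error of any one clock.

```python
import random

random.seed(42)

N_CLOCKS = 100
STEPS = 1000       # simulated ticking intervals
SIGMA = 1e-3       # random timing error per interval (arbitrary units)

# each clock accumulates its own independent random errors
clocks = [0.0] * N_CLOCKS
for _ in range(STEPS):
    for i in range(N_CLOCKS):
        clocks[i] += 1.0 + random.gauss(0.0, SIGMA)  # ideal tick = 1.0

mean_time = sum(clocks) / N_CLOCKS
# spread of individual clocks around the ensemble average:
# roughly SIGMA * sqrt(STEPS) for a random walk like this
spread = (sum((c - mean_time) ** 2 for c in clocks) / N_CLOCKS) ** 0.5
print(f"ensemble mean: {mean_time:.3f}, typical deviation: {spread:.4f}")
```

    The spread grows like the square root of the number of intervals, which is why the "1 second in 3 million years" figure is a statement about typical drift from the ensemble average, not a hard bound on any single clock.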

    I don't know all of the error sources in cesium clocks, some of the ones that I've heard of are:

    1) The phase-locked loop losing synchronization. In this design, an external microwave source is tuned to match the resonance frequency of the atoms, and a phase-locked loop keeps the source tuned to the right frequency. The atoms may be keeping good time, but the electronics has to keep up, and that depends on the design of the electronics and the measurement circuitry. A key factor is the observation time - the time over which an error of 1 count can be measured by the electronics. In our best clocks this is as high as 1 second, which means the electronics might add or subtract a count from the actual count kept by the atoms over a period of about a second.

    2) The motion of the cesium atoms themselves, which causes relativistic time dilation effects. This is kept to a minimum by cooling the cesium atoms to ultra-cold temperatures.

    Currently the main US cesium fountain clock is expected to have an uncertainty of 1 second in 60 million years, about 20x better than the figure you quoted. This has been improving as more and more sources of error are tracked down and eliminated.
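
    To put those "1 second in N years" figures in perspective, they convert to a fractional frequency uncertainty like this (a back-of-the-envelope conversion using the two figures from this thread, not an official spec):

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # Julian year

def fractional_uncertainty(years):
    """Fractional error implied by 1 second drift over `years` years."""
    return 1.0 / (years * SECONDS_PER_YEAR)

print(f"1 s / 3 Myr  -> {fractional_uncertainty(3e6):.1e}")   # figure in post #1
print(f"1 s / 60 Myr -> {fractional_uncertainty(60e6):.1e}")  # fountain clock
```

    So "1 second in 60 million years" means the clock's frequency is known to a few parts in 10^16.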


    is the source for much of this information.
  4. Oct 15, 2005 #3
    I got some other questions.

    Well, if we define 1 second using the atomic clock, how can there be any uncertainty in its accuracy, since we are using it as the reference?

    Also, after going through the website, it mentions that gravity plays a part in the atomic clock's mechanism. So if two places have different g, then 1 second at one place is not the same as 1 second at the other! How do they determine 1 second then?

  5. Oct 16, 2005 #4


    Staff Emeritus
    Science Advisor

    The standard is the vibration of the cesium atom under ideal conditions. Real clocks are limited both in the accuracy with which they measure the vibration of the atoms, and in how closely the conditions under which the vibrations are measured approach those ideal conditions. Hence real clocks have some uncertainty.

    As far as gravity goes, all clocks on the geoid (which basically means at sea level) tick at the same rate. The contributions of clocks that are significantly above sea level (like those in Denver) have to be, and are, adjusted for the fact that they run at a different rate. Note that it is not the force of gravity that matters, but the gravitational potential.
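
    The size of that altitude correction can be estimated from the weak-field redshift formula Δf/f ≈ gh/c². Using Denver as the example from above, with a rough elevation of 1600 m (both g and h here are approximate, illustrative values):

```python
g = 9.81          # m/s^2, surface gravity (approximate)
c = 299_792_458   # m/s, speed of light
h = 1600.0        # m, rough elevation of Denver above sea level

# fractional frequency offset of a clock at height h relative to sea level
shift = g * h / c**2
print(f"fractional shift: {shift:.2e}")

# accumulated time difference over one year
seconds_gained = shift * 365.25 * 24 * 3600
print(f"offset accumulated per year: {seconds_gained * 1e6:.1f} microseconds")
```

    A few microseconds per year is tiny by everyday standards, but it is enormous compared to the parts-in-10^16 uncertainty of a fountain clock, which is why the correction has to be applied.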
  6. Oct 16, 2005 #5



    Staff: Mentor

    The reason we know there is error is that if you compare two cesium clocks, they won't say exactly the same thing. Compare a lot of cesium clocks and you'll get a good idea of what the error is.