Understanding the Accuracy and Precision of the Definition of Second

In summary: the second is defined as the time it takes for a cesium-133 atom to perform 9,192,631,770 complete oscillations of its hyperfine transition. This figure was chosen because it matches the old, astronomically based definition of the second. While cesium clocks are claimed to be accurate to a single second over 3 million years, there are still sources of error to account for, such as the phase-locked loop and the motion of the cesium atoms themselves. Gravity also affects the rate of cesium clocks, but adjustments can be made for this. Overall, the current standard for the second is based on the vibration of cesium atoms under ideal conditions, but real clocks will always have some level of uncertainty.
  • #1
darkar
The definition of a second is the time needed for a cesium-133 atom to perform 9,192,631,770 complete oscillations.

Why 9,192,631,770 complete oscillations? Is that because this interval is about the same as one second under the old definition?

And the cesium clock is claimed to be accurate to a single second over 3 million years. Couldn't we just take this error into account, so that there would be no error?
 
  • #2
darkar said:
The definition of a second is the time needed for a cesium-133 atom to perform 9,192,631,770 complete oscillations.

Why 9,192,631,770 complete oscillations? Is that because this interval is about the same as one second under the old definition?

Yes

And the cesium clock is claimed to be accurate to a single second over 3 million years. Couldn't we just take this error into account, so that there would be no error?

That error figure is representative. Basically, one builds not one but a whole bunch of clocks and sees how well they agree with each other. Individual clocks will exhibit both systematic and random errors. If you built a whole bunch of clocks, after 3 million years you'd expect any one of them to be about a second away from the average.

I don't know all of the error sources in cesium clocks, but some of the ones I've heard of are:

1) The phase-locked loop losing synchronization. The design of the clock is that there is an external microwave source which is "tuned" to match the resonance frequency of the atoms. A phase-locked loop keeps the microwave source tuned to the right frequency. The atoms may be keeping good time, but the electronics has to "keep up". This depends on the design of the electronics and measurement circuitry. A key factor is the observation time - the time over which an error of 1 count can be measured by the electronics. In our best clocks this is as high as 1 second, which means that the electronics might add or subtract a count from the actual count kept by the atoms over a period of about a second.

2) The motion of the cesium atoms themselves, which causes relativistic time dilation effects. This is kept to a minimum by cooling the cesium atoms to ultra-cold temperatures. (A rough sketch of the size of both of these effects follows below.)
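Here is a rough back-of-the-envelope sketch of the size of both effects in Python (the speeds are illustrative round numbers, not measured values for any particular clock):

```python
# Rough magnitudes of the two error sources above. Illustrative numbers only.

CS_HYPERFINE_HZ = 9_192_631_770   # cycles of the Cs-133 transition per SI second
C = 299_792_458.0                 # speed of light, m/s

# 1) Electronics: if the counter gains or loses one cycle over a 1-second
#    observation time, the inferred rate is wrong by one part in 9.19e9.
one_count_error = 1 / CS_HYPERFINE_HZ
print(f"1-count slip over 1 s: {one_count_error:.1e}")        # ~1.1e-10

# 2) Atomic motion: second-order Doppler (time dilation) shift ~ v^2 / (2 c^2).
def time_dilation(v_mps: float) -> float:
    return v_mps ** 2 / (2 * C ** 2)

print(f"thermal beam, ~250 m/s: {time_dilation(250.0):.1e}")  # ~3.5e-13
print(f"cold fountain, ~4 m/s:  {time_dilation(4.0):.1e}")    # ~9e-17
```

Cooling the atoms buys several orders of magnitude on the motional shift, which is one reason fountain clocks outperform the older thermal-beam designs.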

Currently the main US cesium fountain clock is expected to have an uncertainty of 1 second in 60 million years, about 20x better than the figure you quoted. This has been improving as more and more sources of error are tracked down and eliminated.
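For a sense of scale, "1 second in 60 million years" corresponds to a fractional uncertainty of roughly 5 parts in 10^16:

```python
# Convert "1 second in 60 million years" into a fractional uncertainty.
SECONDS_PER_YEAR = 365.25 * 24 * 3600        # ~3.16e7 s
fractional_uncertainty = 1.0 / (60e6 * SECONDS_PER_YEAR)
print(f"{fractional_uncertainty:.1e}")       # ~5e-16
```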

http://tf.nist.gov/cesium/fountain.htm

is the source for much of this information.
 
  • #3
I got some other questions.

Well, if we define 1 second using the atomic clock, then how can there be uncertainty in its accuracy, since we are using it as the reference?

And, after going through the website, it mentions that gravity plays a part in how an atomic clock runs. So, if two places have different gravity, then 1 second at one place is not the same as 1 second at the other! How do they determine the 1 second then?

Thanks
 
  • #4
darkar said:
I got some other questions.

Well, if we define 1 second using the atomic clock, then how can there be uncertainty in its accuracy, since we are using it as the reference?

And, after going through the website, it mentions that gravity plays a part in how an atomic clock runs. So, if two places have different gravity, then 1 second at one place is not the same as 1 second at the other! How do they determine the 1 second then?

Thanks

The standard is the vibration of the cesium atom under ideal conditions. Real clocks are limited both in the accuracy with which they measure the vibration of the atoms, and in the ideality of the conditions under which the vibrations are measured. Hence real clocks have some uncertainty.

As far as gravity goes, all clocks on the geoid (which basically means at sea level) tick at the same rate. The contributions of clocks that are significantly above sea level (like those in Denver) have to be, and are, adjusted for the fact that they run at a different rate. Note that it is not the force of gravity that matters, but the gravitational potential.
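As a rough sketch (using ~1600 m as a ballpark elevation for Denver), the fractional rate difference between a clock at height h and one on the geoid is about g*h/c^2 in the weak-field approximation:

```python
# Approximate gravitational rate offset of a clock at height h above the geoid:
# fractional offset ~ g*h / c^2 (weak-field approximation).

G_SURFACE = 9.81            # m/s^2, approximate surface gravity
C = 299_792_458.0           # m/s

def rate_offset(height_m: float) -> float:
    return G_SURFACE * height_m / C ** 2

denver_m = 1600.0           # ballpark elevation of Denver, m
offset = rate_offset(denver_m)
seconds_per_year = 365.25 * 24 * 3600
print(f"fractional offset: {offset:.1e}")                          # ~1.7e-13
print(f"accumulated per year: {offset * seconds_per_year:.1e} s")  # ~5.5e-6 s
```

A few microseconds per year is enormous compared with the clock's own uncertainty, which is why the correction back to the geoid is essential.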
 
  • #5
The reason we know there is error is that if you compare two cesium clocks, they won't say exactly the same thing. Compare a lot of cesium clocks and you'll get a good idea of what the error is.
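A toy simulation of that idea (hypothetical noise level, just to show the principle): give each clock a small independent random error every day and watch how the ensemble spreads out.

```python
# Toy simulation: several clocks each accumulate small independent random
# errors; comparing them to the ensemble average reveals the typical error.
import random

random.seed(0)
n_clocks, n_days = 10, 365
daily_jitter_s = 1e-9            # hypothetical 1 ns of random error per day

errors = [0.0] * n_clocks        # accumulated error of each clock
for _ in range(n_days):
    for i in range(n_clocks):
        errors[i] += random.gauss(0.0, daily_jitter_s)

mean = sum(errors) / n_clocks
spread = (sum((e - mean) ** 2 for e in errors) / (n_clocks - 1)) ** 0.5
print(f"typical drift from the ensemble after a year: {spread:.1e} s")
# No single clock can tell you it is wrong; the disagreement between clocks can.
```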
 

What is the definition of a second?

The second is the base unit of time in the International System of Units (SI). It is defined as the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the cesium-133 atom. In everyday terms, one second is 1/60 of a minute, or 1/3,600 of an hour.

How is the definition of a second determined?

The definition of a second is realized using atomic clocks, which lock an oscillator to the frequency of the radiation from the cesium-133 hyperfine transition; one second elapses when 9,192,631,770 periods of that radiation have been counted. The International Bureau of Weights and Measures (BIPM) combines readings from more than 400 atomic clocks around the world into a weighted average to form the international timescale.
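As a toy illustration (made-up readings and weights, nothing like the real BIPM algorithm), a weighted average of several clock readings looks like this:

```python
# Toy illustration of combining several clocks into one timescale via a
# weighted average (made-up readings and weights -- not the BIPM algorithm).

# Each clock's reading is expressed as its offset, in nanoseconds, from some
# reference, with a weight reflecting how stable that clock has been.
readings_ns = [12.0, -5.0, 3.0, 8.0, -1.0]   # hypothetical clock offsets
weights     = [1.0,  2.0, 1.5, 0.5,  3.0]    # hypothetical stability weights

ensemble_ns = sum(w * r for w, r in zip(weights, readings_ns)) / sum(weights)
print(f"ensemble time offset: {ensemble_ns:.2f} ns")
```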

Why is the definition of a second important in science?

The definition of a second is important in science because it is a fundamental unit of measurement that is used in many scientific calculations and experiments. It is also used in various fields such as physics, chemistry, and engineering to measure time intervals accurately and precisely.

Has the definition of a second always been the same?

No, the definition of a second has not always been the same. For a long time it was defined as 1/86,400 of a mean solar day, based on the rotation of the Earth. Because the Earth's rotation is not perfectly uniform, the second was redefined in 1956 as a fraction of the tropical year 1900 (the ephemeris second), and in 1967 the definition was changed again to its current form based on the cesium atom.

Are there any proposed changes to the definition of a second?

Yes, there have been proposals to redefine the second in terms of an optical atomic transition, since optical clocks have become more stable and accurate than the best cesium clocks. However, no change has been adopted yet, and the current cesium-based definition is still used by the scientific community.
