How do we measure time more accurately?

So I'm beginning to study photonics, and the need for extremely accurate reference times for clock signals is mentioned.

I understand that natural frequencies are often compared to determine the accuracy of a time measure (in olden times it was against astronomical events; nowadays it's against photon/matter interactions, like counting the vibrations of an atom using lasers). However, how does one exactly determine the range of error in something like time? You can't measure a more accurate frequency with a less accurate frequency, as the measurement would just have the same range of error as the less accurate frequency. The only thing I can think of is this: since we have set a standard of some number of cesium atom vibrations equaling a second, we have a counter synchronized to count some relevant phenomenon against something that counts the cesium vibrations at very low temperatures, and you do that for several seconds and average it out. So in a way it's arbitrarily set by comparing to the standard. Is that how it's actually done in physics labs, or is there another method?
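The counting-and-averaging idea above can be sketched numerically. This is a toy model, not a real lab procedure: an unknown oscillator is gated against cesium-defined one-second intervals, each gate returns an integer cycle count with some start/stop quantization noise, and averaging many gates narrows the estimate. All numbers are illustrative.

```python
import random

CS_HZ = 9_192_631_770  # cycles of the Cs-133 hyperfine transition per SI second

def count_cycles(true_hz, seconds, jitter=0.5):
    # Each one-second gate yields an integer cycle count; the +/- jitter
    # models start/stop quantization of the counter (hypothetical noise model).
    return [round(true_hz + random.uniform(-jitter, jitter))
            for _ in range(seconds)]

random.seed(0)
true_hz = 10_000_000.003          # unknown oscillator, nominally 10 MHz
counts = count_cycles(true_hz, seconds=1000)
estimate = sum(counts) / len(counts)
print(f"estimated frequency: {estimate:.3f} Hz")
```

The single-gate error is about one count, but the mean over N gates tightens roughly as 1/sqrt(N), which is the sense in which "average it out" buys you accuracy against the standard.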

Hm... I think that ^ might be it. I swear, typing questions into forums helps you think things out.

f95toli
Gold Member
You use two copies of the oscillator that is generating the frequency. As long as you can consider them to be independent, you can beat the signals coming from the two oscillators, and that beat signal can in turn be measured using a reference less stable than the original signal, because its frequency will be several orders of magnitude lower.

Note that this is a common problem. Optical clocks are about three orders of magnitude more stable than cesium clocks, so there is no way to compare them with any "official" time signal.
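The beat method can be illustrated with a small simulation. The frequencies here are scaled down and made up for the sketch: two oscillators near 1 kHz differ by 0.5 Hz, and mixing (multiplying) their signals produces a difference term at 0.5 Hz, orders of magnitude lower than either original, which a far less stable counter could resolve.

```python
import numpy as np

f1, f2 = 1000.0, 1000.5          # Hz; the 0.5 Hz difference is what we want
fs = 8000.0                      # sample rate, Hz
t = np.arange(0, 10.0, 1 / fs)   # 10 s record -> 0.1 Hz frequency resolution

# sin(a)*sin(b) = 0.5*cos(2*pi*(f2-f1)*t) - 0.5*cos(2*pi*(f1+f2)*t):
# the product contains a slow difference term and a fast sum term.
mixed = np.sin(2 * np.pi * f1 * t) * np.sin(2 * np.pi * f2 * t)

spectrum = np.abs(np.fft.rfft(mixed))
freqs = np.fft.rfftfreq(len(t), 1 / fs)
low = (freqs > 0) & (freqs < 10)           # crude low-pass: keep only the slow band
beat = freqs[low][np.argmax(spectrum[low])]
print(f"beat frequency: {beat:.1f} Hz")    # peak lands at the 0.5 Hz difference
```

The point is the scaling: a fractional frequency difference that is invisible at the carrier becomes a whole, easily counted signal at the beat.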

Hmm - three clocks....

I know in mirror grinding (the old way), the question arises: how do you grind a flat surface? What you do is take two glass blanks, A and B, and grind them together with the grinding medium. This gets all the small rough edges off (except grinding scratches), and you wind up with a concave and a convex spherical blank of unknown radius of curvature. Now you take a third blank, call it C, and grind it with A. Then you grind C and B. Then you start over. If you continue this process, the three blanks will progressively get flatter and flatter until their "unflatness" is just that due to the scratches of the grinding medium. This is based on the geometrical fact that the three surfaces cannot match at every point unless all three are flat.

So you have created a very flat surface without a reference flat surface. I have no experience with clocks, but I wonder if this concept is used to create a very "flat" (i.e. linear, or "correct") clock?

f95toli
Gold Member
but I wonder if this concept is used to create a very "flat" (i.e. linear, or "correct") clock?

Not really, but so-called three-cornered hat methods are sometimes used to characterize oscillators:

http://www.wriley.com/3-CornHat.htm

However, as long as you can use the "beat" method this should not be needed.
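The arithmetic behind the three-cornered hat can be sketched in a few lines. Under the assumption that the three clocks are independent, each pairwise comparison variance is the sum of the two individual variances, so the individual variances can be solved for from the three pairwise measurements. The numbers below are made up for illustration.

```python
def three_cornered_hat(var_ab, var_ac, var_bc):
    # For independent clocks, var_ij = var_i + var_j, so three pairwise
    # comparisons determine each clock's own variance.
    var_a = (var_ab + var_ac - var_bc) / 2
    var_b = (var_ab + var_bc - var_ac) / 2
    var_c = (var_ac + var_bc - var_ab) / 2
    return var_a, var_b, var_c

# Suppose the true (unknown) variances are 1.0, 4.0, and 9.0 in arbitrary
# units; the pairwise comparisons would then read 5.0, 10.0, and 13.0.
va, vb, vc = three_cornered_hat(5.0, 10.0, 13.0)
print(va, vb, vc)  # 1.0 4.0 9.0
```

Like the three-blank grinding trick, no single reference is assumed flat (stable); the individual stabilities fall out of mutual comparison.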
