So I'm beginning to study photonics, and the issue of extremely accurate reference times for clock signals is mentioned. I understand that natural frequencies are often compared to determine the accuracy of a time measure (in olden times it was against astronomical events; nowadays it's against photon/matter interactions, like counting the vibrations of an atom using lasers). However, how does one exactly determine the range of error in something like time? You can't measure a more accurate frequency with a less accurate one, as the measurement would just inherit the error range of the less accurate frequency.

The only thing I can think of is this: since we have set a standard of some number of cesium-atom vibrations equaling a second, we have a counter counting some relevant phenomenon, synchronized against something that counts the cesium vibrations at very low temperatures, and you run that for several seconds and average it out. So in a way it's arbitrarily set by comparing to the standard. Is that how it's actually done in physics labs, or is there another method?

Hm... I think that ^ might be it. I swear typing questions into forums helps you think things out.
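To make that "count against the standard and average" idea concrete for myself, here's a minimal toy sketch in Python. Everything in it is a made-up assumption (a hypothetical 10 MHz oscillator with a hypothetical frequency error, a gate interval defined by the reference clock, a crude noise model); it's just meant to show why a single gated count has limited resolution and why averaging many gates helps:

```python
import random

# Toy sketch of a gated frequency counter (all numbers hypothetical).
# The reference clock defines the gate interval; we count cycles of the
# device under test (DUT) during that gate and compare the count against
# the DUT's nominal frequency.

F_NOMINAL = 10_000_000.0   # DUT nominal frequency in Hz (assumed)
TRUE_OFFSET = 1.5e-7       # DUT's actual fractional frequency error (assumed)
GATE_SECONDS = 1.0         # gate time, as defined by the reference clock
N_GATES = 100              # number of repeated measurements to average

def one_gate_measurement():
    """Count DUT cycles over one reference-defined gate interval and
    return the measured fractional frequency offset."""
    true_freq = F_NOMINAL * (1.0 + TRUE_OFFSET)
    expected_cycles = true_freq * GATE_SECONDS
    # Rounding models the +/-1-count quantization of a real counter;
    # the Gaussian term is a crude stand-in for random jitter.
    counted = round(expected_cycles + random.gauss(0.0, 0.5))
    measured_freq = counted / GATE_SECONDS
    return (measured_freq - F_NOMINAL) / F_NOMINAL

offsets = [one_gate_measurement() for _ in range(N_GATES)]
mean_offset = sum(offsets) / N_GATES

# A single 1 s gate can only resolve ~1 count out of F_NOMINAL counts.
print(f"single-gate resolution ~ {1.0 / (F_NOMINAL * GATE_SECONDS):.1e}")
print(f"estimated fractional offset after averaging: {mean_offset:.2e}")
print(f"true fractional offset:                      {TRUE_OFFSET:.2e}")
```

With a 1 s gate, the single-shot resolution here is about 1 part in 10^7, but averaging 100 gates pulls the estimate down near the true offset. That seems to be the point of running the comparison for a long time: the longer you count against the standard, the smaller the fractional uncertainty of the comparison.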