Standards Definitions are weird

In summary, the definition of a second has changed over time to become more precise, but consistency with the earlier definitions is what keeps our measurements reliable.
  • #1
Not_a_dork
Hello,

While riding home today, I started to wonder what the definition of a second actually is. I looked it up and found that the definition has changed somewhat over the years; it is now defined as (paraphrased) 9,192,631,770 periods of the radiation from a cesium atom.

A meter is defined as the distance light travels in a vacuum in 1/299,792,458 of a second.
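
Out of curiosity, I tied the two definitions together with a quick back-of-the-envelope in Python (my own sketch, nothing official):

```python
# The SI second and meter are each pinned to an exact integer.
CS133_PERIODS_PER_SECOND = 9_192_631_770  # cesium hyperfine periods per second
SPEED_OF_LIGHT = 299_792_458              # m/s, exact by definition

# Distance light travels during one cesium period: about 3.26 cm.
distance_per_period = SPEED_OF_LIGHT / CS133_PERIODS_PER_SECOND
print(f"light travels {distance_per_period * 100:.4f} cm per cesium period")
```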

Since we are arbitrarily defining our units, why not make them a rounded value? We have changed the definitions a few times before, and I imagine the new definitions are more exact approximations of what our old definitions "meant". When the meter was proposed, they wanted the pole-to-equator distance to be 10 million meters, and while they got close, it was ultimately wrong. We now have the capability to map the surface in rather fine detail, so why not go back to the original definition? Or perhaps use 1/300,000,000 and call it close enough?

I notice that there are no zeros in the light definition and only the least significant place in the time definition is a zero. Is this significant?

I guess my problem is that 9,192,631,770 periods seems so random. Perhaps at the scale of atoms everything seems random but we could have easily forced some "order" into these definitions.

The period of cesium seems to be a great source of something that won't change over time, but we have tied that to something that will change over time: our day.

Anyway, any thoughts and perspectives would be appreciated.

Jack
 
  • #2
It is our accepted units of measurement, "meters", "seconds" and so on, that are arguably random, not the values that fundamental constants of nature happen to take within our system of units.
 
  • #3
Not_a_dork said:
Hello,

While riding home today, I started to wonder what the definition of a second actually is. I looked it up and found that the definition has changed somewhat over the years; it is now defined as (paraphrased) 9,192,631,770 periods of the radiation from a cesium atom.

A meter is defined as the distance light travels in a vacuum in 1/299,792,458 of a second.

Those constants are chosen so that the meter and second do not change size relative to their previous definitions.
 
  • #4
Not_a_dork said:
Since we are arbitrarily defining our units, why not make them a rounded value?
We aren't arbitrarily defining our units. We are refining them. You are missing something important: consistency. There's a whole lot of machinery out there that depends on those definitions.
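
To put a rough number on it (my own arithmetic, not from any standards document): redefining the meter around a rounded 300,000,000 m/s would rescale every meter in existence.

```python
# If the meter were redefined as the distance light travels in
# 1/300,000,000 s, the unit itself would shrink.
old_divisor = 299_792_458   # current exact definition
new_divisor = 300_000_000   # the proposed "close enough" value
new_meter_in_old_meters = old_divisor / new_divisor
print(f"new meter = {new_meter_in_old_meters:.9f} old meters")
print(f"shrinkage: {(1 - new_meter_in_old_meters) * 1000:.3f} mm per meter")
```

Nearly 0.7 mm per meter is an enormous error by machining standards.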

Consider the meter. The initial definition of the meter was 1/10,000th of the distance between the equator and the poles. Notice the nice, round number. This didn't work because the ability to measure that distance precisely didn't exist in the late 18th century. The first working definition was a prototype meter bar that was supposed to represent this distance. That didn't work perfectly because materials change length as they heat and cool, and because errors are always introduced in making a copy.

The next definition (1960) was based on the wavelength of a specific frequency of light. This worked quite nicely; in fact, it was used as a de facto replacement for the prototype meter bar for a long time before it became standard. For consistency's sake, this 1960 redefinition was made such that the prototype meter bar remained exactly one meter long. No engineering drawings had to be changed, and no machinery had to be retooled as a result of this change.

The next definition (1983) was based on the distance light travels during one second. This new definition recognized the fact that the speed of light is constant. Once again, no engineering drawings were harmed by this redefinition.

Keeping the meaning of these metrology definitions consistent is extremely important. Changing the meaning of a meter would have drastic economic impacts. The impact of a change on the existing manufacturing base is one of the key reasons why the US has not switched to the metric system.
 
  • #5
"The impact of a change on the existing manufacturing base is one of the key reasons why the US has not switched to the metric system"

The Brits did, though. They tend to be... (no, I dare not say it!) than Americans.
:smile:
 
  • #6
The Brits lost a good chunk of their English units manufacturing base because of WWII. They had to rebuild anyhow; rebuilding to metric standards only made sense because the bulk of their exports went to continental Europe. The Americans built up a huge proportion of their English units manufacturing base because of WWII. Tear all that down just to switch to metric, when most of their product stayed within the country? It didn't make sense. Now it does, and a switch to metric is happening under the hood. That WWII-era machinery is now old, inefficient, and decrepit, and there's a whole lot more international trade now than there was 67 years ago.
 
  • #7
You have your history incorrect.
It was the UK's entry into the EEC (now the European Union) in 1973 that obliged the UK to change its system, NOT WWII.
--
Without offending ukies or usans alike, it isn't really that difficult to see why the US did NOT, and has NOT, implemented something like the metric system:

The USA is the Western country with the least active central authority to drive through reforms on this matter or that, and in that sense, the LEAST authoritarian Western society.
Development of structures, standards, etc. in European countries is, by far, more government-driven than in the US.
--------------------------------------------
As long as a sufficient critical mass of USANs saw more benefit in keeping the system of standards than changing it, the system stayed, rather than being abolished by governmental decree.
 
  • #8
I appreciate the thoughtful responses. I am glad this thread took a detour into why the US has not accepted the metric system. It is one of my favorite topics to discuss with the previous generation. I am constantly amazed at how much emotion is involved in keeping the current system. My dad is of the opinion that our being the last hold-out is a badge of honor. I personally think we should switch just because everything is based on 10. It makes everything so much easier when I don't have to constantly change bases when doing conversions.

How many cups in a gallon again? Does an ounce of liquid refer to its volume or mass? Does an ounce of solid refer to its volume or mass? I can never keep those straight. haha
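
For the record, here is the chain I can never keep straight, written out (US definitions, as far as I know):

```python
# 1 gallon = 4 quarts, 1 quart = 2 pints, 1 pint = 2 cups, 1 cup = 8 fl oz.
CUPS_PER_GALLON = 4 * 2 * 2
FLOZ_PER_GALLON = CUPS_PER_GALLON * 8
print(CUPS_PER_GALLON, FLOZ_PER_GALLON)  # 16 128
# A fluid ounce is always a volume; a plain (avoirdupois) ounce is a mass.
```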
 
  • #9
Metrication in the UK began long before 1973, arildno, and the movement was driven largely by industry. Prior to metrication, British industry needed two production lines, one for continental Europe and another for internal consumption. Forcing the country to switch to metric was a win for British industry.

Metrication in the US, on the other hand, has largely been opposed by industry. Most production was aimed at the huge American market, and for you guys in Europe: you often bought our products even though they were tooled / sized in customary units. This trend has changed as of late. The US automobile industry, for example, has pretty much switched to metric. Many consumer products are now sized in metric units, with the customary units being secondary and having goofy, non-rounded values.

Metrication support or opposition has a lot more to do with the golden rule than having an active central authority. The golden rule: He who has the gold makes the rules.
 
  • #10
Jack, you are definitely "Not_a_dork".

From Roman Legions to Imperial Stormtroopers... This thread seems to be about the American Rhetoric System more than anything.

If someone wishes to introduce a new unit of measure, there are means and ways to go about it that, in my limited personal experience, seem to revolve foremost around a special capacity for hard work and dedication.
 
  • #11
It is perhaps worth mentioning that the US "switched" to metric, or to be more specific the SI, a very long time ago (1893).
While it is obviously true that inches etc. are still by far more common in everyday life, it is also true that all calibrations in the US are ultimately referred back to the metric system, which in the US is maintained by NIST.
Whenever they need to calibrate something in e.g. inches, they still start out using the meter, and then convert by multiplying with a constant. These constants were defined in an international agreement some 60 years ago (e.g. a yard is 0.9144 meters, exactly).
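
So a customary calibration is really just a metric one multiplied by an exact rational constant. A minimal sketch of the idea (illustrative only, not NIST's actual procedure):

```python
from fractions import Fraction

# Exact factors from the 1959 international yard and pound agreement.
YARD_IN_METERS = Fraction(9144, 10000)  # 0.9144 m, exact
INCH_IN_METERS = YARD_IN_METERS / 36    # 0.0254 m, exact

def inches_to_meters(inches):
    # Refer a customary measurement back to the metric standard.
    return float(Fraction(inches) * INCH_IN_METERS)

print(inches_to_meters(12))  # one foot = 0.3048 m, exactly
```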
 
  • #12
Not_a_dork said:
I looked it up and found that the definition has changed somewhat over the years; it is now defined as (paraphrased) 9,192,631,770 periods of the radiation from a cesium atom... Since we are arbitrarily defining our units, why not make them a rounded value?
The periods are not arbitrary; I think they are a more precise and reliable realization of 1/86,400th of the day. Is that so?
 
  • #13
D H said:
The initial definition of the meter was 1/10,000th of the distance between the equator and the poles.
I wonder what the rationale for this was. Maybe something like: a meter sample could be stolen or destroyed, but the Earth? If that is gone, we have bigger worries than units.
 
  • #14
f95toli said:
It is perhaps worth mentioning that the US "switched" to metric, or to be more specific the SI, a very long time ago (1893).

While US law has recognized the metric system as a legal system of units since 1866, SI itself has only been around since 1960. SI was not formally adopted in US law until P.L. 110-69 was enacted in 2007.

http://www.nist.gov/pml/wmd/metric/metric-policy.cfm

The UK did not convert to the metric system immediately after WWII. As in the US, the metric system had long been legal in the UK (since at least 1896) but was not widely adopted outside of the scientific community. In the 1960s, with the prospect of joining the Common Market, British industry lobbied for the creation of the Metrication Board in 1969 (abolished 1980), which was to implement a 10-year plan to convert British industry to the metric system. EU directives from 1989 have required compulsory metrication by member states, but many products in the UK still retain dual labelling in Imperial and metric units.

http://en.wikipedia.org/wiki/Metrication#United_Kingdom
 
  • #15
bobie said:
The periods are not arbitrary; I think they are a more precise and reliable realization of 1/86,400th of the day. Is that so?
Correct: 24 hours × 60 minutes × 60 seconds = 86,400 seconds.

Interestingly, the second was not a part of the metric system until about 100 years ago. The original French metric system did not have a unit of time. Even more interesting, the French initially toyed with making the day the unit of time. A day was to be divided into ten hours, each hour into 100 decimal minutes, and each minute into 100 decimal seconds. A week would have been ten days rather than seven. Time would have been metric!
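
A sketch of how that scheme would have worked (the conversion function is my own illustration, not a historical artifact):

```python
# French decimal time: 10 hours/day, 100 minutes/hour, 100 seconds/minute,
# i.e. 100,000 decimal seconds per day versus our 86,400 standard seconds.
def to_decimal_time(h, m, s):
    fraction_of_day = (h * 3600 + m * 60 + s) / 86_400
    total = round(fraction_of_day * 100_000)  # elapsed decimal seconds
    dh, rem = divmod(total, 10_000)
    dm, ds = divmod(rem, 100)
    return dh, dm, ds

print(to_decimal_time(12, 0, 0))   # noon -> (5, 0, 0)
print(to_decimal_time(18, 30, 0))  # (7, 70, 83); a decimal second is 0.864 s
```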

That didn't take hold for a number of reasons. The French concept of decimal time was tossed at about the same time that the metric system was first proposed. One reason is that metric time does not scale up well. Months and years are what they are, and they are much more important concepts than tens, hundreds, and thousands of days. Months and years were even more important than they are now to the agrarian late 18th century society. Another is that the division of a day into hours, then minutes, then seconds was already very standardized. Decimal time offered little, if any, added value to this already widely-accepted concept of measuring time.

Compare that already standardized concept of time with units for length and mass. The situation there was terrible. It wasn't so much a matter of inconsistent units between country A and country B. It was a matter of inconsistent units between town A and town B, and oftentimes even within one town.

A.T. said:
I wonder what the rationale for this was. Maybe something like: A meter sample could be stolen or destroyed. But the Earth, if that is gone we have bigger worries than units.
Erratum: I made an error in post #4. That should have been 1/10,000,000th of the distance from the equator to the North Pole, not 1/10,000th.

The meter prototype was based on a measurement over a short baseline, about 10 km. This prototype was supposed to be a provisional stand-in until a meter based on this concept of 1/10,000,000th of the distance from the equator to the North Pole could be better refined. It turns out that the meter prototype is not 1/10,000,000th of that distance: one ten-millionth of the distance between the equator and the North Pole is 1.00019657 meters. It was too late to change the definition of the meter by the time this error was discovered. The meter prototype and its successors lasted as the standard until 1960.
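
A quick check of what that figure implies (my own arithmetic, using the numbers quoted above):

```python
# If 1/10,000,000th of the quadrant is 1.00019657 m, the quadrant itself is:
quadrant_m = 1.00019657 * 10_000_000
print(f"{quadrant_m:,.0f} m")  # ~10,001,966 m, not the intended 10,000,000 m
# The prototype meter came up short by roughly 0.2 mm per meter.
```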

Side note: Several of the scientists wanted to make the meter the length of a pendulum with a two-second period (a one-second half-period). This is the concept of a seconds pendulum. One problem: one of the key members of the French Academy was adamantly opposed to the second; he was one of the architects of decimal time. Another problem is that it wasn't egalitarian. That seconds pendulum would have been calibrated in Paris. Other places would have had to calibrate their clocks to something other than a one-second half-period to account for variations in gravitational acceleration.
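
That calibration problem is easy to quantify with the small-angle formula T = 2π√(L/g); here is a sketch (the values of g are approximate):

```python
import math

# Length of a seconds pendulum (two-second period) for a given local g:
# T = 2*pi*sqrt(L/g)  =>  L = g * T**2 / (4 * pi**2)
def seconds_pendulum_length(g, T=2.0):
    return g * T**2 / (4 * math.pi**2)

print(f"Paris   (g ~ 9.809): {seconds_pendulum_length(9.809):.4f} m")
print(f"equator (g ~ 9.780): {seconds_pendulum_length(9.780):.4f} m")
```

It comes out within a percent of a meter, which is part of why the idea was attractive, but the latitude dependence is easily measurable.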
 
  • #16
The French faced more problems than the practicality of measuring a quadrant of the Earth's surface. At the time the meter definition was proposed, the shape of the Earth itself was in doubt. The British scientists posited that, due to the rotation of the Earth, the diameter of the globe measured in a plane coincident with the equator would be greater by several miles than the diameter measured between the poles. The French, if nothing more than to be contradictory, claimed that the reverse was true.

All of the mishaps encountered by the French surveyors and scientists charged with defining the meter are recounted in a book entitled 'The Measure of All Things', by Ken Alder.

https://www.amazon.com/dp/0743216768/?tag=pfamazon01-20

The attempt to define the meter based on measuring the Earth was just as arbitrary as the yard, which was supposedly defined as the distance between the nose and the outstretched fingertips of this or that English king. But when push came to shove, the French encountered great difficulty in undertaking their survey enterprise and wound up, IIRC from the book, fudging the final result.
 
  • #17
Basing the unit of time on 1/86,400th of the length of a mean solar day is arbitrary as well. So is the kilogram. Every base unit in the metric system is arbitrary. For a non-arbitrary system you'd have to go to Planck units or something similar -- and even then there's still a bit of arbitrariness present. You can't have a conversion factor of one for everything. Moreover, Planck units are completely impractical for everyday use.
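
For a sense of scale, here are the Planck length and time computed from the usual formulas (the constant values below are rounded CODATA figures):

```python
import math

hbar = 1.054_571_817e-34  # reduced Planck constant, J*s
G    = 6.674_30e-11       # gravitational constant, m^3 kg^-1 s^-2
c    = 299_792_458.0      # speed of light, m/s

planck_length = math.sqrt(hbar * G / c**3)  # ~1.6e-35 m
planck_time   = math.sqrt(hbar * G / c**5)  # ~5.4e-44 s
print(f"Planck length: {planck_length:.3e} m")
print(f"Planck time:   {planck_time:.3e} s")
```

Hardly units you would put on a tape measure.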

The chief problem with a meter based on the distance between the equator and a pole is that it is not a very realizable unit. There are many other problems with this definition. That's why it was tossed so quickly.
 
  • #18
SteamKing said:
EU directives from 1989 have required compulsory metrication by member states, but many products in the UK still retain dual labelling in Imperial and metric units.

http://en.wikipedia.org/wiki/Metrication#United_Kingdom

Don't believe everything you read on wikipedia. Except for beer (pints) and road signs and speed limits (miles), that statement is years out of date. AFAIK it is now illegal to offer goods for sale in the UK that are not described in metric units.

Historically, the UK non-decimal units at least had some commercial logic in their correspondence with the non-decimal currency, but I assume the US broke that link in 1776 and it remains broken.
 
  • #19
AlephZero said:
Don't believe everything you read on wikipedia. Except for beer (pints) and road signs and speed limits (miles), that statement is years out of date. AFAIK it is now illegal to offer goods for sale in the UK that are not described in metric units.

Historically, the UK non-decimal units at least had some commercial logic in their correspondence with the non-decimal currency, but I assume the US broke that link in 1776 and it remains broken.

The key phrase about the products on sale in the UK is 'dual labelling', i.e., both metric and imperial units are used. I didn't take this picture, but it speaks for itself:

[Image: UKproducts-metricusage.JPG]


After the revolution, various currencies were used by the colonies. The 'dollar' actually originated as a coin minted by the kingdom of Spain. The etymology of the word 'dollar' dates it back to 16th century central Europe. An act of Congress in 1792 specified the precious metal content of a dollar coin and further specified that coins of lesser denomination would be decimal fractions of a one dollar coin, which is where decimalization of the currency originates in US coinage.

http://en.wikipedia.org/wiki/Dollar
 
  • #20
SteamKing said:
The key phrase about the products on sale in the UK is 'dual labelling', i.e., both metric and imperial units are used. I didn't take this picture, but it speaks for itself:

The picture doesn't "speak for itself", because it doesn't state when it was taken.

I just checked my latest weekly supermarket shopping in the UK (admittedly, not from Tesco but one of the other major chains). The number of "dual labeled" items in it was what I expected: zero.

Fresh milk is one of the very few exceptions, and even then some brands are sold in metric quantities. The other products in the picture are apparently not dual labeled, unless you count selling something in a pack of 12 as "non-metric".
 
  • #21
Well, if you can't be bothered to check out the link, here is the 411:

http://en.wikipedia.org/wiki/File:UKproducts-metricusage.JPG

The picture was taken Nov. 17, 2005 according to the wiki info.

In the accompanying article, it says the sausages are labeled '12 oz 340 g', much like you would find on many products produced in the U.S. of A.
 
  • #22
AlephZero said:
I just checked my latest weekly supermarket shopping in the UK (admittedly, not from Tesco but one of the other major chains). The number of "dual labeled" items in it was what I expected: zero.

Circular, shmircular, check actual products.
 

What are standards definitions?

Standards definitions are a set of guidelines or specifications that are used to ensure consistency and compatibility in a particular field or industry. They provide a common language and criteria for products, processes, and services.

Why are standards definitions important?

Standards definitions are important because they help to ensure quality, safety, and efficiency in products and services. They also promote interoperability, which allows different systems and products to work together seamlessly.

Who creates standards definitions?

Standards definitions are typically created by organizations or committees made up of experts in a particular field or industry. These organizations can be national, international, or industry-specific.

How often are standards definitions updated?

Standards definitions are regularly reviewed and updated to keep pace with advancements in technology, changes in regulations, and improvements in best practices. The frequency of updates varies depending on the specific standard and the organization responsible for it.

Do all countries use the same standards definitions?

No, standards definitions can vary from country to country. However, there are often efforts to align or harmonize standards between different countries to facilitate international trade and cooperation.
