So, I will try to explain this as best as possible. I am literally on the first chapter of a classical mechanics book. The chapter is discussing all the basic SI units...the meter, the second, etc. It's these units that I am having such a hard time understanding. They seem so easy, but my mind keeps wandering and it's making it harder to understand them. I will list my difficulties below:

I understand that the meter was created by using the earth as the basis of its measurement. This was eventually inscribed on a metal bar with two fine scratches. So, the meter was set...more or less (because the scratches gave a +/- error--i.e., do you measure at the valley of the scratch, the beginning of the slope, or the top end of the slope?), and there were several other variances that made it not a reliable measurement.

The second was created based upon our solar day--1/86,400 of a solar day is a second. And that leads me into the question of the sidereal day versus the solar day, and how they actually measured something like 1/86,400 of a day hundreds of years ago and knew exactly when the earth had turned through a full solar day. That's a whole other question I don't understand.

But anyway...from my understanding...since the meter was set with those scratches, we eventually had to accommodate that measurement to be the length light travels in 1/299,792,458 of a second. We had to fit that fraction of a second to the meter that was already made. But how do they know it's exactly 1/299,792,458 of a second of "light length"? My guess is that the difference between 1/299,792,458 and 1/299,792,459 of a second of "light length" would be the difference between measuring the scratch mark in the valley vs. at the top edge of the valley. I hope you understand what I'm getting at there. And I know I skipped a standard there with the krypton, but it's still the same question. Well, let's say I understand that it's 1/299,792,458 of a second of "light length". That is set. Well, that comes from the second.
Which is based on a Cs atom. Well, that definition has it at 0 K, which I thought was impossible, so I don't even know how they got that. And then on top of that, I was reading that the meter actually has an absolute error of something! I can't find that article again, but I believe the error was bigger than something we actually know...like the radius of a proton. So, if the meter potentially has an error bigger than that, then how do we know the radius of a proton?! Ahhhh!! Lol

So, what I'm trying to say is that I think the meter and the second are, in my opinion, the two most important SI units (yeah, kg I guess too). My understanding is that we have come up with standards that don't differ from one place in the universe to another, and we have closely matched those standards to the previously known standards. My confusion is that I think they all depend on each other...so if there is error in one, it is only magnified in the next SI unit that depends on it. Also, I understand that this doesn't have big implications for the day-to-day world...like it's not going to matter that a bolt on a car is a picometer off, but at the atomic level, I do feel like it makes a difference. I hope my questions are obvious enough...and if not, I hope you understand where I am going with this, because sometimes I don't even understand what I am trying to ask. Lol

PS: Could anyone tell me what 1/300,000,000 of a second of "light length" is? I'm just curious. I know it's obviously smaller than a meter since it's a shorter amount of time, but what is it? 9/10 of a meter? My brain is fried right now and I can't even figure that out lol. Thank you.
Let me elaborate on this so I can better explain what is bothering me. The meter is based on the second. Does the second have error in it? And if it does, how does that affect the meter at extremely small distances? I would assume that if there is error in the second, then the length of a picometer could vary. Also, is the meter as accurate as it can get? Why 1/299,792,458 of a second of "light length"? Could it be even more precise, e.g. 1/299,792,458.0003 seconds of "light length"? And what implications does that have for extremely small distances? Also, why 1/299,792,458 seconds of "light length" and not 1/299,792,459 seconds of "light length"? Someone had to decide this while moving from the krypton standard to this one, since this one is "more accurate". Also, since perfect vacuums are not possible and other factors exist, is there error in our measurement of light, and what implications does that have on extremely small distances? Ah...those are my questions...it just took me a few moments to collect and organize my thoughts. :) I guess what's bothering me is the extremely small distances. Like, how do we know a femtometer is a femtometer if we have error in the second and the meter...and I don't even want to begin thinking about the Planck length.
The definitions are exact (by definition). In using them in any experiment there will be an experimental error.
Let me start off by saying that it is good to think about what you are reading, but if you spend too much time and energy investigating every detail that occurs to you, you risk the following:

- learning lots of detail which is not very important, and not having time to learn concepts that are really important
- receiving information that you are unable to understand, because you haven't yet learned what you need to put it in context
- getting frustrated that you don't know / don't understand anything / aren't making any progress, because you are drowning in a sea of detailed information rather than following the well-charted passage that a more structured approach would provide

I would say that at this stage it is probably only relevant to know that we have defined the metre, second and kilogram so that all scientists agree what they mean. Also, because the speed of light in a vacuum is such an important quantity, we have defined the metre and the second so that the speed of light is exactly 299,792,458 ms^{-1}.

Having said that, I will answer your questions:

- Does the second have error in it? There will always be error in measurement, but there is no error in the definition of a second.
- How does that affect the meter at extremely small distances? There is no error in the definition, so this is not a problem.
- Is the meter as accurate as it can get? Yes, it is defined exactly.
- Why do we use the distance light travels in a vacuum in a certain time? Because that gives a fixed definition that is constant in time and space.
- Why do we use that particular number? Because it gives a length which is almost exactly the length of the previous definition of the metre.
- Could it be even more precise, e.g. 1/299,792,458.0003 s? No, we have defined the metre as exactly the distance light travels in a vacuum in 1/299,792,458 s.
- What implications does that have for extremely small distances? None, as the definition is exact.
- Is this definition "more accurate" than the krypton standard? It is not any "more accurate" than the previous definition. Both definitions define exactly a certain length. The fact that the lengths are slightly different does not cause any problems to scientists.
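As for your PS question, the arithmetic is a one-liner using only the defined value of c:

```python
# Light travels exactly 299,792,458 m in one second (by definition),
# so in 1/300,000,000 s it covers slightly less than a meter.
c = 299_792_458  # m/s, exact by definition
d = c / 300_000_000
print(d)  # ≈ 0.99931 m
```

So it's about 99.93% of a meter, not 9/10 of one.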
There are many problems to overcome when measuring very small distances (one in particular which cannot ever be overcome), and we don't actually need very high-precision measurement of small distances anyway, so again this doesn't have any material implications.
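To put a rough number on the error-propagation worry: since c is exact, a length realized as L = c·t inherits exactly the fractional uncertainty of the time measurement. A minimal sketch (the 1e-15 fractional clock uncertainty below is an illustrative ballpark for a good caesium clock, not a quoted specification):

```python
# With c exact by definition, L = c * t means the fractional error
# in the realized length L equals the fractional error in t.
c = 299_792_458          # m/s, exact by definition
frac_err_clock = 1e-15   # illustrative fractional uncertainty of a good Cs clock

L = 1.0                  # realizing one meter
delta_L = L * frac_err_clock
print(delta_L)           # 1e-15 m, i.e. about a femtometer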
Wow. Thank you for your lengthy response. I have a little better understanding of my questions from your answers, but the text I quoted was probably the most valuable advice you provided. That's exactly my problem. I was on page 5 of the book for over 3 hours trying to research all this stuff. I will try to just read the information as I go, hoping that it will all fit together eventually. But yes, you did indeed help me out with that. Thanks!
Also, it is important to realize that these measurements were not set 'hundreds of years ago' and then left alone, but new measurements are currently being made all the time. As new instruments and measurement techniques are developed, existing standards are checked and refined as necessary, and sometimes re-defined, as has occurred with the speed of light: http://en.wikipedia.org/wiki/Speed_of_light
The definition of a meter cannot be made more accurate, because the meter is what it's defined to be. In other words, the meter is only what we say it is. There is no other concept of a "meter" besides what we've defined a meter to be, and so there is no other concept to compare it to in order for a notion of accuracy to even be meaningful. Suppose I had a magical rod that was exactly the distance that light travels in 1/299,792,458th of a second (i.e., it is exactly one meter by definition). Exactly what object am I supposed to compare it to in order to gauge its accuracy? Itself?
The meter is currently defined as a small fraction of the distance light travels in one second; it was not always defined thus. Who's to say that in fifty or one hundred years a new definition of the meter will not arise? The important thing to remember about a system of units is that it is a convention of arbitrary standards agreed upon in order to facilitate commerce, science, whatever. The French, who first conceived of the meter, thought their definition of it was particularly clever: one ten-millionth of the length of the quadrant from the equator to the North Pole. In practical terms, this definition was less than ideal: no one at that time had ever visited the North Pole, nor could anyone say precisely where it was; no one knew the precise shape of the earth, though there were indications it was not perfectly spherical; and no one had experience trying to measure such a large distance with any accuracy. Eventually, a series of compromises had to be made to the original definition so that a practical standard measurement could be produced. http://en.wikipedia.org/wiki/Metre The current definition of the meter was only established in 1983.
Ultimately, metrologists (measurement scientists) would like to define all units in terms of fundamental physical quantities - avoiding the need for an arbitrary, "artificial" standard. Currently, the second and the metre are defined as such, but other quantities such as the kilogram are not. Claude.
Right, if things go as planned, in 2018 the following fundamental constants will be defined to have exact values, as is the case now with the speed of light in vacuum:

Planck's constant: h = 6.626[...] x 10^{-34} J·s
Electron charge: e = 1.602[...] x 10^{-19} C
Boltzmann's constant: k = 1.380[...] x 10^{-23} J/K
Avogadro's number: N_{A} = 6.022[...] x 10^{23} mole^{-1}

with [...] representing whatever additional digits are included in the final definitions.
But note that everything (except the mole) will still depend on the arbitrary definition of the second.
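To make that dependence on the second concrete: the SI second is defined as 9,192,631,770 periods of the caesium-133 hyperfine transition radiation, and the metre as the distance light travels in 1/299,792,458 s, so one metre corresponds to a fixed (if odd-looking) number of caesium periods of light travel. A quick sketch:

```python
# Both numbers below are exact by definition in the SI.
cs_periods_per_second = 9_192_631_770  # Cs-133 hyperfine periods per second
c = 299_792_458                        # m/s

# Light travel time for one metre, expressed in caesium periods:
periods_per_meter = cs_periods_per_second / c
print(periods_per_meter)  # ≈ 30.66 Cs periods per metre of light travel
```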