Calibrating measurements from scratch

In summary: the first question is what accuracy and precision you require. Googling metrology provides an answer for how to measure time, weight/mass, and volume when there are no calibrated devices.
  • #1
atracious
TL;DR Summary
I'm wondering how a person would determine accurate measurement of time, weight/mass, and volume if there were no calibrated equipment.
Assuming you are in the field and don't have any device to weigh, measure volume, or measure time. How would one establish accurate measurements? Where does one start?
 
  • #2
Well, go back in history: you take the size of the king's foot as a length standard and a nice big stone for weight. Twenty centuries later you still measure feet and stones.

Google metrology
 
  • #3
The first question is what accuracy and precision you require, surely. Why do you ask, OP?
 
  • #4
atracious said:
Summary: I'm wondering how a person would determine accurate measurement of time, weight/mass, and volume if there were no calibrated equipment.

Where does one start?
That's actually a pretty good question...
BvU said:
Google metrology
And a good reply... :smile:

https://www.azom.com/article.aspx?ArticleID=12035
 
  • #5
Ibix said:
The first question is what accuracy and precision you require, surely. Why do you ask, OP?

I am curious about the scientific method as applied to defining units of measure, and how easy or hard it would be to progress in accuracy up to, say, high-school level: mm, cm, L, mL, kg, lbs, feet, inches, etc. Or whether it would be more prudent to just define new units of measure and rebuild from there.
 
  • #6
Are you asking about actual history, or are you writing a fictional story?

I have to imagine that the balance scale was a really ancient invention. Wikipedia says 2400 BC. The accuracy and precision of a balance scale can be very good.

Something as simple as a fingernail clipping could be used to measure small distances: one fingernail, two fingernails -- 1 mm, 2 mm. Wikipedia says the oldest measuring rod dates to 2650 BC.

Or are you not asking about measurements but rather our system of units? Can you clarify your question for us please?
 
  • #7
atracious said:
Or if it would be more prudent to just define new units of measure and rebuild from there.
We do redefine our units of measure, actually somewhat routinely (if any change requiring decades of thought and thousands of man-years can ever be called “routine”) as measuring technology improves. In 1983 the meter was redefined to be the distance that light travels in 1/299792458 seconds. Before then it had been defined to be 1650763.73 wavelengths of the radiation associated with a particular atomic transition of krypton-86, and before that it was the length of a reference bar kept in a vault in a laboratory in Paris.
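Just to make the compatibility concrete, here is a quick arithmetic check (a minimal sketch in Python; the krypton-86 wavelength used is the published value implied by the 1960 definition):

```python
# Cross-check the successive meter definitions against each other.
c = 299_792_458  # speed of light in m/s, exact by definition since 1983

# 1983 definition: the distance light travels in 1/299792458 s.
meter_1983 = c * (1 / 299_792_458)  # exactly 1.0 m

# 1960 definition: 1650763.73 wavelengths of the krypton-86 orange line
# (wavelength ~605.78 nm, the value implied by that definition).
meter_1960 = 1_650_763.73 * 6.0578021e-7

print(meter_1983)  # 1.0
print(meter_1960)  # ~1.0000000, agreeing with the new definition to ~1e-9 m
```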

In each case, the new standard was something that could, because of technological advances, be measured more precisely than the old standard. Calibrations made with the new standard are better than with the old standard; the worst-case difference between measurements of the same thing made with two different but ostensibly properly calibrated instruments is smaller.

However, the new standard is deliberately kept compatible with the old one, in the sense that the length of the new meter is always within the range of error of measurements of the old one. Thus, anything calibrated against the old standard still works as well as it ever did; we aren’t obsoleting hundreds of billions of dollars of worldwide investment in machine tools, measuring devices, and manufacturing equipment.

And this last is why we keep improving the meter instead of defining something new and rebuilding from there. Sure, we could create a new unit of length... call it the “greeple” and define it to be the distance that light travels in ##10^{-8}## seconds. But why bother? The downside is the enormous economic cost of retooling the entire world economy, and the upside is... what? Whether we use greeples or meters, our ability to precisely specify distances (share specifications among manufacturers, repeatable manufacturing processes, spare parts that fit, ...) is still determined by our technology for measuring fractions of a second.
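Converting between the hypothetical greeple and the meter would be a one-liner, which is the point: the new unit buys us nothing. A throwaway sketch (the greeple is, of course, made up):

```python
c = 299_792_458     # speed of light in m/s, exact
greeple = c * 1e-8  # distance light travels in 1e-8 s: 2.99792458 m

def greeples_to_meters(g):
    """Convert a length in (hypothetical) greeples to meters."""
    return g * greeple

print(greeple)                  # 2.99792458
print(greeples_to_meters(100))  # 299.792458
```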

You should not be surprised to hear that the 1983 definition of the meter was adopted precisely because we can measure time more precisely than distance. Maybe in a few decades that will change, and then we will take advantage of the new technology to further improve the meter standard.
 

1. What is the purpose of calibrating measurements from scratch?

The purpose of calibrating measurements from scratch is to ensure accuracy and precision in scientific experiments and measurements. By calibrating from scratch, you establish a baseline for your measurements and can account for any potential errors or variations in your instruments.
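As a rough illustration of what establishing a baseline means in practice, one can take repeated readings of a known reference and separate the systematic offset (accuracy) from the random scatter (precision). A sketch with made-up numbers:

```python
from statistics import mean, stdev

reference = 100.0                               # known reference value (e.g. grams)
readings = [100.4, 100.3, 100.5, 100.2, 100.4]  # hypothetical repeated readings

bias = mean(readings) - reference  # systematic error -> accuracy
spread = stdev(readings)           # random scatter   -> precision

print(f"bias   = {bias:+.2f}")   # ~ +0.36, instrument reads high
print(f"spread = {spread:.2f}")  # ~ 0.11, repeatability of the readings
```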

2. How often should measurements be calibrated from scratch?

The frequency of calibration depends on the specific instrument and its intended use. Generally, it is recommended to calibrate at least once a year, but more frequent calibrations may be necessary for instruments that are used frequently or in critical experiments.

3. What is the difference between calibration and recalibration?

Calibration refers to the initial process of establishing a baseline for measurements, while recalibration involves periodically checking and adjusting the instrument to maintain accuracy over time. Recalibration is important because instruments can drift or become less accurate over time.
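In its simplest form, a recalibration check just re-measures a known reference and compares the drift against a tolerance. A sketch with hypothetical values:

```python
reference = 100.0  # known reference value
tolerance = 0.5    # acceptable drift, same units
reading = 100.7    # hypothetical current reading of the reference

drift = reading - reference
if abs(drift) > tolerance:
    print(f"drift {drift:+.2f} exceeds tolerance; recalibrate")
else:
    print(f"drift {drift:+.2f} within tolerance")
```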

4. Can measurements be calibrated without specialized equipment?

In some cases, yes. For example, using a known weight to calibrate a scale does not require specialized equipment. However, for more precise and complex instruments, specialized equipment such as calibration standards and reference materials may be necessary for accurate calibration.
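For example, a two-point calibration against two known weights is enough to correct both offset and gain on a simple scale. A sketch with made-up numbers (real procedures use more points and certified reference masses):

```python
# Two known reference weights and what the uncalibrated scale reads for them.
w1, r1 = 100.0, 102.1  # true value, raw reading (hypothetical)
w2, r2 = 500.0, 509.8

# Fit reading = gain * true + offset, then invert it to correct raw readings.
gain = (r2 - r1) / (w2 - w1)
offset = r1 - gain * w1

def corrected(raw):
    """Map a raw scale reading back to the true weight."""
    return (raw - offset) / gain

print(corrected(102.1))  # ~100.0, recovers the first reference
print(corrected(305.0))  # corrected value for an in-between reading
```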

5. What are some potential sources of error in the calibration process?

There are several potential sources of error in the calibration process, including human error, environmental factors (such as temperature and humidity), and equipment malfunctions. It is important to carefully follow calibration procedures and perform regular maintenance on instruments to minimize these errors.
