Calibrating Equipment to Standard: Can A Volt Ever Be Accurate?

In summary, the most accurate voltage standard is the Josephson Array Voltage Standard, which can generate voltages between zero and 10 volts and calibrate Zener voltage standards with an accuracy of better than ± 0.02 ppm at 10 volts. Even that standard is not perfect, however, and every practical meter is only "accurate" to the limit of its own resolution; more demanding measurements require higher-resolution meters, each checked against a still better one.
  • #1
bpatyk2
I've worked in calibration for a few years and I know that my equipment has to be calibrated by a standard that is a certain amount more accurate than the piece of equipment I'm calibrating, but where exactly does it end where the most accurate piece of equipment is calibrated by the absolute standard? Can a piece of equipment ever actually accurately measure 1 volt?
 
  • #2
Now there's a question that opens up a whole world of its own, metrology.

http://www.sandia.gov/psl/Dc_fact%20Sheet2008_Final.pdf


DC voltage measurements start with the Josephson Array Voltage Standard that can generate voltages between zero and 10 volts and calibrate Zener voltage standards with an accuracy of better than ± 0.02 ppm at 10 volts.
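To put that spec in absolute terms, here is a quick back-of-the-envelope check; it is just ppm arithmetic on the numbers quoted above:

```python
# Convert the quoted relative (ppm) spec into an absolute voltage uncertainty.
nominal_voltage = 10.0   # volts, the top of the Josephson array's range
spec_ppm = 0.02          # +/- 0.02 ppm, per the fact sheet quoted above

uncertainty_volts = nominal_voltage * spec_ppm * 1e-6
print(f"+/- {uncertainty_volts * 1e6:.2f} uV at {nominal_voltage:.0f} V")
# -> +/- 0.20 uV at 10 V
```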
 
  • #3
I've got to say that's pretty damn f*ing accurate, but does that mean we are taking measurements directly from this Josephson array standard? That's still not exactly accurate, right?
 
  • #4
bpatyk2 said:
I've got to say that's pretty damn f*ing accurate, but does that mean we are taking measurements directly from this Josephson array standard? That's still not exactly accurate, right?

Yes it is, sort of. If the Josephson voltage were a "true" standard, it would, by definition, be exact.

In reality it is a bit more complicated than that, since the volt is not actually a base unit (the ampere is). Hence, what we are actually doing is using the accepted CODATA value for the Josephson constant (about 483.6 THz/V). The Josephson constant can be used to relate frequency and voltage, and since we can get the frequency from our hydrogen masers ("atomic clocks"), the precision of that is extremely high (probably at least one part in 10^14 if you integrate for long enough).
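As a rough illustration of the frequency-to-voltage relation described above, here is a minimal sketch. The Josephson constant is the conventional K_J-90 value; the drive frequency and step counts are assumed for illustration, not taken from any particular lab's setup:

```python
# Sketch of the relation described above: V = N * f / K_J, where f is the
# microwave drive frequency and N is the total number of quantized steps the
# array is biased onto.  K_J is the Josephson constant (K_J-90 conventional
# value); the drive frequency and target voltage are illustrative.

K_J = 483_597.9e9        # Josephson constant in Hz per volt (K_J-90)
f_drive = 75e9           # assumed microwave drive frequency in Hz

def array_voltage(total_steps: int) -> float:
    """Voltage produced when the array sits on `total_steps` quantized steps."""
    return total_steps * f_drive / K_J

# How many steps get us close to 10 V?
target = 10.0
n_steps = round(target * K_J / f_drive)
print(n_steps, array_voltage(n_steps))
# ~64480 steps -> ~10.000 V, tied to a frequency we can measure against an
# atomic clock to parts in 1e14.
```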

This is how voltage metrology has been done for the past 25 years or so. From a practical point of view it means that we can calibrate voltage standards (in this case Zener diodes) with an accuracy much, much higher than anything we need in practical applications.

Btw, the 0.02 ppm is VERY conservative. Most standards labs can calibrate their secondary standards to an accuracy at least one order of magnitude better than advertised; this is to leave some tolerance for unexpected aging etc.

Edit: I forgot to say that it is possible to calibrate, say, a multimeter directly from a Josephson array setup. This is done routinely in some demanding applications: some high-end multimeters have calibration inputs at the back, and by re-calibrating them, say, once a day (which can be done automatically) one can avoid problems with drift etc.
 
  • #5
There's accuracy and there's resolution.

A meter is accurate when it reports correctly to the limit of its ability.

A three digit meter is "accurate" if all three digits are true.
I once had access to a six digit multimeter, but that much resolution was useless; we had nothing that'd stand still enough that the last two digits weren't all over the place. Well, except a meter calibrator.

They're talking about eight digits... boggles my mind. That takes a lab environment.

When you need to resolve microvolts, you struggle with effects like the temperature gradient along your measuring wires, dissimilar metals in your test prods, and the like. These guys talk of the ability to resolve, indeed measure, down to a tenth of a microvolt.
100X that, around ten microvolts, is a fraction of a degree F to a thermocouple. In forty years of practical industrial work I never had any need to resolve DC voltage any closer than that. Four digits were enough. But you need a five digit meter to check your four digit one. And so on...

Look ma, no participles!

old jim
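Here is a rough sketch of why each step in the chain jim describes (a five digit meter to check a four digit one, and so on) wants a reference several times better than the unit under test. The 4:1 rule of thumb and the example numbers below are illustrative assumptions, not figures from the thread:

```python
# Why each calibration step wants a reference several times better than the
# unit under test.  With the reference's uncertainty added in quadrature, a
# common 4:1 test uncertainty ratio (TUR) inflates the total only slightly.
import math

def combined_uncertainty(uut_spec: float, tur: float) -> float:
    """Root-sum-square of the unit-under-test spec and a reference `tur` times better."""
    return math.hypot(uut_spec, uut_spec / tur)

uut = 100e-6   # e.g. a meter specced to +/- 100 uV (illustrative)
for tur in (2, 4, 10):
    total = combined_uncertainty(uut, tur)
    print(f"TUR {tur:2d}:1 -> +/- {total * 1e6:6.1f} uV "
          f"({(total / uut - 1) * 100:.1f}% added by the reference)")
# TUR 2:1 adds ~11.8%, 4:1 adds ~3.1%, 10:1 adds ~0.5%
```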
 

1. How is a volt measured and calibrated?

A volt is measured with a voltmeter, which is calibrated against a known standard voltage source. The voltmeter is adjusted so that its reading matches the standard voltage, ensuring accurate measurements.
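As a minimal sketch of what "adjusted to match the standard" can look like in practice, here is an assumed two-point (zero/span) correction; the function and the example readings are made up for illustration:

```python
# Minimal two-point (zero/span) adjustment: apply two known standard voltages,
# note what the meter reads, then correct later readings with the resulting
# gain and offset.  The readings below are made up for illustration.

def two_point_correction(ref_lo, read_lo, ref_hi, read_hi):
    """Return a function that maps raw meter readings to corrected values."""
    gain = (ref_hi - ref_lo) / (read_hi - read_lo)
    offset = ref_lo - gain * read_lo
    return lambda reading: gain * reading + offset

# Meter reads 0.0021 V with the standard at 0 V and 9.9987 V at 10 V:
correct = two_point_correction(0.0, 0.0021, 10.0, 9.9987)
print(correct(5.0012))   # raw 5.0012 V reading corrected to ~5.0008 V
```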

2. What is a standard voltage source?

A standard voltage source is a device that produces a known and stable voltage, typically with very low uncertainty. This can include batteries, power supplies, or specialized equipment designed for calibration purposes.

3. Can a volt ever be 100% accurate?

In theory, yes, a voltage measurement could be exact if it were calibrated against a perfectly accurate standard voltage source. In practice, however, there will always be some level of uncertainty and error, even with the most precise calibration methods.

4. How often should equipment be calibrated?

The frequency of calibration depends on the specific equipment and its intended use. Generally, equipment used for critical measurements should be calibrated more frequently, while equipment used for less critical purposes may have longer calibration intervals.

5. What are the consequences of not calibrating equipment to standard?

Not calibrating equipment to standard can result in inaccurate measurements, leading to errors in experiments, processes, or products. This can have serious consequences in fields such as healthcare, manufacturing, and research. Regular calibration is essential for maintaining reliable and accurate measurements.
