A method to improve the vertical accuracy of an oscilloscope?

In summary: someone has proposed using a DMM to "calibrate" the oscilloscope, claiming this will impart the DMM's much better accuracy of 0.025% to the scope measurements, so that future scope measurements (of varying AC signals) would effectively have the DMM's accuracy and improve our test quality. I have doubts about the validity of this proposal.
  • #1
jephthah
Hello

we have a number of oscilloscopes with 1.5% vertical accuracy (DPO-4032)

we would like to improve (reduce) this number, to improve the quality of our tests.

someone has proposed using a DMM (Fluke 45) to "calibrate" the oscilloscope and impart the DMM's much better accuracy of 0.025% to the scope measurements.

they suggest:

- measure a constant DC value with the DMM, treat the result as "true"
- measure the same DC value with the oscilloscope.
- subtract the one from the other, and apply the difference as a "correction factor" to the scope

it is claimed that this correction will allow future scope measurements (of varying AC signals) to use the DMM's accuracy, thus improving our test quality.
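in code terms, the proposal boils down to something like this (just a sketch; the readings and numbers below are made up, not from any real instrument):

```python
# Rough sketch of the proposed "correction". All readings below are made up.

dmm_dc_reading = 0.5000    # Fluke 45 reading of the DC level, treated as "true" (volts)
scope_dc_reading = 0.5060  # scope reading of the same DC level (volts)

# the proposed correction, derived once from the DC comparison
correction = dmm_dc_reading - scope_dc_reading   # -6 mV in this made-up case

# every later scope reading would then be "corrected" like this
def corrected(scope_reading):
    return scope_reading + correction

print(corrected(1.506))   # hoped-for result: ~1.500 V, with DMM-like accuracy
```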

Brilliant, right? So why do i feel like this is wrong? is the basic premise even valid?

Thank you for considering, any and all responses will be greatly appreciated.
 
  • #2
It is invalid because the scope is not off by a constant 1.5%. If the case were that simple, it could easily just be calibrated better. The 1.5% accuracy means it will be up to 1.5% off the 'true' value. The error will not be a constant.

Sometimes the output will be exactly the true value, sometimes it will be 101.5% of the true value, sometimes it will be 98.5% of the true value - depending on the signal and circumstances.

Trust me, if it really were 'that easy' to make the scopes more accurate, why wouldn't the company just do it in their production line?
 
  • #3
yes i agree. of course the scope is not always off by 1.5% ... but wouldn't one particular scope be off by some arbitrary amount, and remain fairly consistently off by that amount for at least a short period of time?

for instance, say we're measuring a 1.0 Vpp signal offset at 0.5 VDC: take one unique scope, measure 0.5 VDC with a calibrated Fluke 45 and then the same 0.5 VDC with the scope, calculate the difference as the scope's "adjustment" and then *immediately* measure our 1.0 Vpp + 0.5 VDC signal.

why wouldn't that particular scope measurement, adjusted to the calibrated Fluke 45's measurement taken just a fraction of a second prior, have the same accuracy as the Fluke 45? Or at least something very close to it?
 
  • #4
This is where digital oscilloscopes have an advantage: their accuracy is determined by the ADCs used rather than by deflection or analog circuitry.
 
  • #5
uh, thanks, but that's not an answer. of course the DPO-4032 is a digital scope. i don't even know who uses analog scopes these days.

how can you use a 5-1/2 digit DMM (0.025% accuracy) to "calibrate" an oscilloscope with 1.5% accuracy?
 
  • #6
"how can you use a 5-1/2 digit DMM (0.025% accuracy) to "calibrate" an oscilloscope with 1.5% accuracy?"

You probably can't, or the manufacturer would have done that to increase the accuracy of their oscopes. I would bet $100 that the 1.5% accuracy is not a linear error characterisable with a DC voltage that you could simply tune the system to remove. If it were a linear error, the designers of the oscope would have calibrated it out. Whenever you see accuracies like that for test equipment, it means there are non-linear errors over the range of voltage or frequency inputs the device was built to accept that can't be removed without greatly increasing the cost of the device.

But... If you want to try to remove it for an AC signal, the best idea would be to use a function generator to put in an approximation of the signal you are going to be measuring (approximate amplitude and frequency range) and then observe what the differences are. Putting in a DC signal to try to tune for an AC signal isn't going to get you very far.
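As a rough sketch of what that characterization might look like (all readings here are hypothetical; in practice they might be logged by hand or pulled over GPIB):

```python
# Sketch of characterizing scope error against a reference over the amplitude and
# frequency range of interest. All readings are hypothetical.

# (frequency in Hz, reference amplitude in V, scope amplitude in V)
readings = [
    (1e3, 1.000, 1.004),
    (1e4, 1.000, 1.007),
    (1e5, 1.000, 1.012),
    (1e3, 0.100, 0.1006),
]

for freq, ref, scope in readings:
    error_pct = 100.0 * (scope - ref) / ref
    print(f"{freq:8.0f} Hz, {ref:5.3f} V ref: scope error {error_pct:+.2f} %")
```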
 
  • #7
Accuracy of oscilloscopes is a very difficult thing to determine, and you can't just magically beat the 1.5% figure by calibrating against something more accurate. You're essentially at the mercy of your ADCs. If you want more accuracy you have to look into "tricks" such as large amounts of averaging and optimizing the sampling rate.
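A toy illustration of what averaging can and cannot do (the numbers are invented):

```python
# Toy illustration: averaging N acquisitions shrinks random noise roughly as 1/sqrt(N),
# but does nothing for a systematic gain or offset error. All numbers are made up.
import random

true_value = 1.000   # volts
gain_error = 1.010   # a fixed +1% systematic error; averaging cannot remove this
noise_rms = 0.010    # 10 mV RMS of random noise per acquisition

def one_acquisition():
    return true_value * gain_error + random.gauss(0.0, noise_rms)

for n in (1, 16, 256):
    avg = sum(one_acquisition() for _ in range(n)) / n
    print(f"N = {n:3d}: {avg:.4f} V")   # scatter shrinks with N; the +1% bias stays
```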
 
  • #8
Accuracy and repeatability are not the same thing. If you can provide the scope with a signal of the same form as the signal you want to measure and with a known amplitude then, if the scope is reliable / consistent, you can do better than the published figure for accuracy. It may be necessary to wait until the temperature has settled down.
This applies to both analogue and digital equipment.

Another possible trick might be to subtract a known amplitude of signal from the test signal and examine the difference signal. The scope's 1.5% accuracy would then apply only to this small, expanded difference signal rather than to the full amplitude. It would need some fancy circuit design though.
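Some back-of-envelope arithmetic on why that helps (hypothetical numbers):

```python
# Back-of-envelope arithmetic for the "measure only the difference" idea.
# All values are hypothetical.

signal = 1.000        # unknown signal amplitude, volts
reference = 0.990     # known reference subtracted from it, volts
difference = signal - reference          # 10 mV residual actually seen by the scope

scope_accuracy = 0.015                   # 1.5% of what the scope reads
error_on_difference = scope_accuracy * difference   # about 0.15 mV

# the result is reconstructed as reference + measured difference, so the scope's
# 1.5% applies only to the small residual, not to the full signal
relative_error = error_on_difference / signal
print(f"worst-case scope contribution: {relative_error * 100:.3f} % of the signal")
```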
 
  • #9
Get a sig genny and a ratio transformer and compare the measurand with that on screen.
I think mine is seven-decade, which should be enough accuracy for you.
 
  • #10
The scope error will almost certainly be frequency dependent, so measuring the error at DC will only help for low frequency signals.

Any signal that is not a pure sine wave will contain a spectrum of different frequencies, and the different errors in each frequency component will distort the shape of the "graph" of the signal as well as changing its amplitude.

Even for a DC signal, the errors may vary with the amplitude of the input. So if you measure the error accurately as say 1.23% at full scale, there is no guarantee it will be 1.23% for any other input level.
 
  • #11
Sophie pretty much nailed it. Repeatability and accuracy are not the same. I've worked for a test equipment manufacturer and we did this sort of thing to 'characterize' RF generators from a couple of MHz to a gig. The Fluke signal generator allowed direct control over the RF attenuator on its output through GPIB. We had a sensitive measuring receiver that stepped through the whole spectrum at about 7 different amplitude levels. About half a dozen generators received this characterization fairly regularly. Results were tracked from one characterization to the next. Nothing ever changed very much, if at all.
 
  • #12
Scopes are full of time related distortion. Leading edges don't have the same value as the settling and settled values, and subtle things like the amplifiers changing temperature over time, due to self heating, throw the readings off.
The A/Ds are optimized more for response than accuracy, and the circuits suffer from recovery effects when overdriven.
For all this, they do a pretty good job.
If you can synthesize the waveform in phase with the test signal, then you may be able to use a differential probe (or transformer) to detect the error. Square waves are easy to synthesize using CMOS chips because they drive the signal to the two power supply rails.

Given two or more of these "references," you can compare them for consistency.

Best of luck,

Mike in Plano
 
  • #13
When I was using scopes seriously, they were calibrated using a square wave and probes were tuned for optimum. Scopes have never, afaik, been used for high accuracy. Other methods are normally devised specially for really high accuracy of measurement.
 
  • #14
Maybe I was too nebulous. If it's a digital scope, what ADC is it using (how many bits)? You can only achieve accuracy as good as what's defined by the ADC bits, and probably not even that good (resolution != accuracy). The DMM's 5-1/2 digits (about 200,000 counts) is roughly 18 bits. Most oscilloscopes are 8-12 bits only, or about 3 to 3-1/2 digits. On a good day you might find a high-end scope with a 14- or 16-bit ADC, about 4 to 4-1/2 digits (again, this is resolution, not accuracy).
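A quick sanity check on the digits-to-bits comparison (counts assumed from the usual full-scale display of each instrument):

```python
# Rough digits-to-bits comparison: resolution in bits = log2(number of counts).
import math

instruments = {
    "5-1/2 digit DMM (199,999 counts)": 199_999,
    "8-bit scope ADC": 2**8,
    "12-bit scope ADC": 2**12,
}

for name, counts in instruments.items():
    print(f"{name}: {math.log2(counts):.1f} bits")
```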

If your waveform is repetitive, you can integrate over multiple cycles to get better vertical accuracy (most high-end digital scopes made over the last 30 years have this feature built in). But this does not (and can never) work for single-shot waveforms.

This is the inherent issue with all oscilloscopes: if you need very high vertical accuracy, especially single-shot, it's often the wrong instrument to use.

Better to use other measurement instruments or techniques, such as a sampling DMM (up to a certain frequency) like an Agilent/HP 3458A, or some waveform-integration technique.

And in some cases you have to get completely out of the box: one time I got a call from a professor who was sure he needed a 50 GHz oscilloscope (which didn't exist back then). Looking at what he was really trying to do, it turned out all he really needed was a microwave frequency counter (he was thinking in terms of the "idiot" undergrad-taught technique of measuring frequency with an oscilloscope).
 
  • #15
You can only achieve accuracy as good as what's defined by the ADC bits

Actually you can do better if you use the scope as an indicator in a nulling method.
Or simply compare with a ratio transformer, as I said earlier. Then you could be up against the basic accuracy of the range switch, rather than the ADC.
 

1. What is the purpose of improving the vertical accuracy of an oscilloscope?

The purpose of improving the vertical accuracy of an oscilloscope is to ensure that the measurements and readings displayed on the screen are as precise and accurate as possible. This is particularly important in scientific and engineering applications where even small errors can have significant impacts.

2. How does the method work to improve the vertical accuracy of an oscilloscope?

The method typically involves calibrating and adjusting the oscilloscope's internal components, such as the amplifiers and voltage dividers, to ensure that they are functioning correctly and providing accurate measurements. This may also involve using external calibration tools and techniques to fine-tune the instrument.

3. What are the benefits of improving the vertical accuracy of an oscilloscope?

The benefits of improving the vertical accuracy of an oscilloscope include more precise and reliable measurements, which can lead to better understanding and analysis of electronic signals. This can also help to identify and troubleshoot any issues or anomalies in a circuit or system.

4. Are there any limitations to this method?

While this method can significantly improve the vertical accuracy of an oscilloscope, it may not be able to completely eliminate all sources of error. External factors, such as noise and interference, can still impact the accuracy of the readings. Additionally, the quality and capabilities of the oscilloscope itself may also play a role in the overall accuracy.

5. How often should an oscilloscope's vertical accuracy be checked and improved?

The frequency of checking and improving the vertical accuracy of an oscilloscope will depend on the specific instrument and its usage. In general, it is recommended to perform regular calibrations and adjustments, especially before and after critical measurements or when significant changes in the instrument's performance are observed.
