Accuracy of measurements from DAQ board

In summary: Good quality test equipment should have a very high accuracy. An accuracy and stability of ±20 ppm is usually much less than the DAQ's own A/D accuracy, so it can often be ignored, depending on the measurement precision required.
  • #1
HyperSniper
Hello, I am confused as to how to determine the accuracy of a measurement of an AC signal from a Data Acquisition Board.

I went to NI's website to look at the specifications of the board I am using (a USB-6009), http://sine.ni.com/ds/app/doc/p/id/ds-218/lang/en#header0, and I don't know whether the "accuracy at full scale" refers to the range setting of the board or to the scale of the signal the board is actually receiving.

If someone could clear this up for me, I would really appreciate it. Thanks!
 
  • #2
Quinzio
It depends on the range you select. For a measurement referenced to GND, the range is ±10 V, i.e. a 20 V span. The resolution is that 20 V span divided by 2^n, where n is the number of bits in the converter's output register.
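As a rough sketch of that calculation in Python (the 20 V span corresponds to the ±10 V range; 12 and 14 bits are the resolutions the USB-6009 data sheet lists):

```python
# Minimal sketch of the "span / 2^n" idea above. The 20 V span (±10 V) is
# from this thread; 12 and 14 bits are the USB-6009's resolutions.

def lsb_size(span_volts: float, bits: int) -> float:
    """Smallest voltage step an ideal n-bit converter can resolve."""
    return span_volts / (2 ** bits)

for bits in (12, 14):
    step = lsb_size(20.0, bits)
    print(f"{bits}-bit over a 20 V span: {step * 1e3:.3f} mV per code")
```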
 
  • #3
HyperSniper said:
Hello, I am confused as to how to determine the accuracy of a measurement of an AC signal from a Data Acquisition Board.

Not sure what you mean by "AC signal from..." You are using an input, correct?

If so, Quinzio's equation gives the accuracy of the A/D, but you also need to consider your input sensing device. There will be error there that will propagate through your DAQ board.
 
  • #4
dlgoff said:
Not sure what you mean by "AC signal from..." You are using an input, correct?

If so, Quinzio's equation gives the accuracy of the A/D, but you also need to consider your input sensing device. There will be error there that will propagate through your DAQ board.

Yes, that's right. I am measuring a signal that is input into the DAQ from a function generator.

So what I am asking is how can I determine the error in the measurement I am taking?

I'm pretty ignorant on how to use these things, so if you could avoid using acronyms that would be helpful. I don't know what an A/D is for instance...
 
  • #5
HyperSniper said:
Yes, that's right. I am measuring a signal that is input into the DAQ from a function generator.

So what I am asking is how can I determine the error in the measurement I am taking?

I'm pretty ignorant on how to use these things, so if you could avoid using acronyms that would be helpful. I don't know what an A/D is for instance...

Okay, sorry about the acronyms. A/D is common notation for an Analog-to-Digital converter, i.e. the electronics that, in your case, takes a sample of the analog signal from the function generator and converts it to a digital value of 12 or 14 bits, at a maximum sample rate of 48 kilosamples per second. (From the data sheet: 8 analog inputs at 12 or 14 bits, up to 48 kS/s.)

In order to find the error you need to know the error/uncertainty of the function generator for a particular setting (see your function generator's specifications). For example, with a sine wave frequency setting of 100 hertz and an amplitude setting of 1 volt, there will be an uncertainty in the frequency and an uncertainty in the amplitude. Then there's the error/uncertainty of the A/D. The error of the function generator gets propagated through the A/D, so it's not necessarily as simple as adding the errors to get a final accuracy. This is called error propagation: http://en.wikipedia.org/wiki/Error_propagation
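If the two error sources can be treated as independent, one common (though not the only) way to combine them is in quadrature. A rough Python sketch, with placeholder numbers rather than values from either data sheet:

```python
import math

# Rough sketch of combining independent uncertainties in quadrature, the
# simplest case of the error propagation linked above. The two values are
# placeholders, not figures from any data sheet.

def combine_independent(*uncertainties: float) -> float:
    """Root-sum-of-squares of independent uncertainties (same units)."""
    return math.sqrt(sum(u ** 2 for u in uncertainties))

u_generator = 0.5e-3   # hypothetical generator amplitude uncertainty, volts
u_daq = 7.0e-3         # hypothetical DAQ (A/D) uncertainty, volts

print(f"combined uncertainty = {combine_independent(u_generator, u_daq) * 1e3:.2f} mV")
```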
 
  • #6
All right, I think I have got it figured out now.

The resolution of the Analog to Digital conversion that the DAQ makes can be calculated using the formulas here:

http://en.wikipedia.org/wiki/Analog-to-digital_converter#Resolution

Given 14 bits, the resolution the board can produce on the ±10 V range would be 20 V / 2^14 ≈ 1.221 mV.

However, since there are a number of other factors that could affect the measurement, it is probably better to use the values on NI's website for the range the measurement is made at. I'm assuming the table titled "Absolute accuracy at full scale, differential" is what I need to use, so for ±10 V the actual inaccuracy is 7.73 mV.
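A quick sanity check of those two numbers in Python (both figures are taken from this thread, not re-derived from the NI table):

```python
# Quick check of the two numbers above (both copied from this thread, not
# re-derived from the NI specification table).
span_volts = 20.0                          # ±10 V range
ideal_resolution = span_volts / 2 ** 14    # ideal 14-bit step
absolute_accuracy = 7.73e-3                # NI "absolute accuracy at full scale"

print(f"ideal 14-bit resolution:  {ideal_resolution * 1e3:.3f} mV")
print(f"absolute accuracy (spec): {absolute_accuracy * 1e3:.2f} mV")
print(f"the spec is about {absolute_accuracy / ideal_resolution:.1f}x the ideal step")
```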

However, now I can't figure out what the uncertainty is with respect to the signal produced by the function generator. Its specifications (http://www.gwinstek.com/en/product/productdetail.aspx?pid=5&mid=72&id=92) state that the "accuracy" and "stability" are both ±20 ppm, but 10^-6 of a hertz or a volt seems awfully small. Does it actually have something to do with the resolution? I can't find any guides that explain what all these numbers really mean...
 
  • #7
HyperSniper said:
However, now I can't figure out what the uncertainty is with respect to the signal produced by the function generator. Its specifications (http://www.gwinstek.com/en/product/productdetail.aspx?pid=5&mid=72&id=92) state that the "accuracy" and "stability" are both ±20 ppm, but 10^-6 of a hertz or a volt seems awfully small. Does it actually have something to do with the resolution? I can't find any guides that explain what all these numbers really mean...

Good quality test equipment should have a very high accuracy. In my opinion, an accuracy and stability of ±20 ppm is good enough that you can treat it as much less (<<) than the A/D accuracy; hence you can probably ignore it, depending on the measurement precision you are looking for.
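To put that in numbers (assuming, as an arbitrary example, a 1 V amplitude setting; the 20 ppm and 7.73 mV figures are the ones already quoted in this thread):

```python
# Putting the "much less than" remark into numbers. The 1 V amplitude is an
# arbitrary example setting; 20 ppm and 7.73 mV are the figures quoted above.
amplitude_setting = 1.0                          # volts
generator_error = 20e-6 * amplitude_setting      # ±20 ppm of the setting = 20 µV
daq_error = 7.73e-3                              # DAQ absolute accuracy, volts

print(f"generator: ±{generator_error * 1e6:.0f} µV, DAQ: ±{daq_error * 1e3:.2f} mV")
print(f"the DAQ term is roughly {daq_error / generator_error:.0f}x larger")
```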
 

1. What is a DAQ board and how does it work?

A DAQ (Data Acquisition) board is a hardware device used to collect and measure data from sensors and other electronic devices. It typically consists of analog-to-digital converters (ADCs), digital-to-analog converters (DACs), and various input/output channels. The board connects to a computer through USB, Ethernet, or another interface, and the data can be transferred and analyzed using specialized software.
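For NI hardware like the USB-6009, the board is typically read from a program. A minimal sketch using NI's nidaqmx Python package (the device/channel name "Dev1/ai0" is an assumption; substitute your own):

```python
# Illustrative sketch of reading one voltage sample with NI's nidaqmx Python
# package. "Dev1/ai0" is an assumed device/channel name; check NI MAX (or
# nidaqmx.system.System.local().devices) for the name of your own hardware.
import nidaqmx

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0", min_val=-10.0, max_val=10.0)
    reading = task.read()          # one sample, in volts
    print(f"measured {reading:.4f} V")
```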

2. How accurate are the measurements from a DAQ board?

The accuracy of the measurements from a DAQ board depends on various factors, such as the resolution and precision of the ADCs and the noise level in the system. Many modern DAQ boards have a resolution of 16 bits or higher, which allows measurements with a resolution of 1 part in 65,536. However, external factors such as signal interference and temperature variations can also affect the accuracy of the measurements.

3. Can the accuracy of a DAQ board be improved?

Yes, the accuracy of a DAQ board can be improved by using high-quality sensors and cables, minimizing external noise sources, and properly calibrating the system. It is also important to choose a DAQ board with a higher resolution and sampling rate if a higher level of accuracy is required for the specific application.

4. How can I validate the accuracy of my measurements from a DAQ board?

To validate the accuracy of a DAQ board, you can apply a known input signal and compare the measurements from the board to the expected values. This can be done in various ways, for example with a calibration reference signal or a precision voltage source. Performing repeated measurements and calculating the standard deviation also gives an indication of the repeatability of the measurements.
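A minimal sketch of that check in Python, with placeholder readings standing in for data captured from the board:

```python
import statistics

# Sketch of the repeated-measurement check described above. The readings
# would normally come from the DAQ board; the list here is placeholder data,
# and reference_volts is the known value of the applied calibration source.
reference_volts = 5.000
readings = [4.998, 5.003, 5.001, 4.997, 5.002]   # placeholder measurements

mean = statistics.mean(readings)
spread = statistics.stdev(readings)              # sample standard deviation

print(f"mean error:    {(mean - reference_volts) * 1e3:+.2f} mV (systematic offset)")
print(f"std deviation: {spread * 1e3:.2f} mV (random noise / repeatability)")
```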

5. Are there any limitations to the accuracy of measurements from a DAQ board?

While modern DAQ boards can provide highly accurate measurements, there are certain limitations that can affect the accuracy. These include nonlinearity in the ADC, noise in the system, and limitations in the sensors and cables used. Additionally, external factors such as temperature and humidity can also impact the accuracy of the measurements. It is important to consider these limitations and properly calibrate the system to ensure accurate measurements.
