# Multimeter accuracy

1. Mar 9, 2016

### nothing909

1. The problem statement, all variables and given/known data
Which multimeter is more accurate?

2. Relevant equations

3. The attempt at a solution
Which multimeter is more accurate: one that is ±1% + 2, or one that is ±2.5% + 2?

I don't know how to read it. Is it the higher-percentage one that's more accurate, or the lower-percentage one?
I think it's the lower-percentage one, but I'm not too sure.

Also, what does the "+2" mean? Does that make much of a difference?

2. Mar 9, 2016

### Staff: Mentor

Lower would be more accurate.
I don't know what the +2 means, either. Where did you see this? Can you post a link or a picture?

3. Mar 9, 2016

Here

[attached image]
4. Mar 9, 2016

5. Mar 9, 2016

### Staff: Mentor

The explanation isn't very clear, and I think they have a typo. If the actual voltage is 100 V, an error of ±1% alone would give indicated values between 99.0 V and 101.0 V. The "+2" means the least significant displayed digit can be off by up to 2 counts; with 0.1 V resolution, that adds another ±0.2 V. They go on to say the reading could be as low as 99.8 V or as high as 101.2 V, but I believe the low value should be 98.8 V, not 99.8 V.
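The arithmetic above can be sketched as a small helper. This is just an illustration of the "% of reading + counts" convention, not any manufacturer's formula; the function name and the 0.1 V resolution are assumptions for the example.

```python
def reading_range(true_value, pct, counts, resolution):
    """Worst-case range of displayed readings for a '±pct% + counts' spec.

    pct        -- percentage-of-reading error (1 for ±1%)
    counts     -- allowed error in the least significant digit (e.g. 2)
    resolution -- value of one count on the chosen range (e.g. 0.1 V)
    """
    # Total error = percentage term plus the digit (counts) term.
    err = true_value * pct / 100.0 + counts * resolution
    return true_value - err, true_value + err

# The example from this thread: 100 V on a ±1% + 2 meter, 0.1 V resolution.
low, high = reading_range(100.0, 1, 2, 0.1)
print(f"{low:.1f} V to {high:.1f} V")  # the 98.8 V .. 101.2 V range discussed above
```

Running the same spec with ±2.5% + 2 widens the range to 97.3 V to 102.7 V, which is why the lower-percentage meter is the more accurate one.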

6. Mar 9, 2016

### nothing909

Yeah, I noticed that now; that's why I was so confused. I couldn't find a clear explanation online, so I came here. Thanks for clearing it up.

7. Mar 9, 2016

### Staff: Mentor

My comment is consistent with the link that nsaspook provided. (Look on the Accuracy tab.)