Was this enough to damage Multimeter?

  • Thread starter: Flyingwing12
  • Tags: Damage, Multimeter
AI Thread Summary
The discussion centers on concerns about potential damage to a multimeter after attempting to measure resistance on a live circuit powered by 1.5 volts. Users generally agree that such low voltage is unlikely to cause damage, and readings like 0.3 ohms are considered normal due to contact resistance and probe variability. It is suggested to test the meter's functionality by measuring resistance with the leads swapped to confirm proper operation. Additionally, the difference in readings when using various voltage settings is explained as normal, with battery tester functions applying a load that can affect measurements. Overall, the multimeter appears to be functioning correctly despite initial concerns.
Flyingwing12
This is the MM I am using: http://www.sears.com/shc/s/p_10153_12605_03482139000P

I made a little electric motor powered by 1.5 volts. However, like the newbie I am, I thought you could measure R on a live setup.

Of course the reading was probably wrong, but could this have damaged my meter?

Just curious.
 
With such small voltages, your meter should recover.
 


You could check the meter against a known resistor to make sure, but 1.5 V is unlikely to damage anything but a really sensitive old-fashioned galvanometer. No need to panic, I think.
To get an idea of the resistance of the motor when it is running, you should measure the volts across it and the current through it, either with two meters simultaneously or using the same meter connected first as a voltmeter and then re-connected as an ammeter, with the motor running on each occasion.
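As a rough sketch of that calculation (the voltage and current values below are invented, just to illustrate Ohm's law R = V/I):
Python:
# Estimate the motor's effective resistance while running from separate
# voltage and current measurements (Ohm's law: R = V / I).
# The readings below are invented example numbers, not real measurements.
v_across_motor = 1.42    # volts, measured across the motor terminals
i_through_motor = 0.35   # amperes, measured with the meter wired in series
resistance = v_across_motor / i_through_motor
print(f"Effective running resistance: {resistance:.2f} ohm")  # ~4.06 ohm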
 
I have had my share of sticking the meter on a live circuit to measure resistance. I have yet to burn one out with low voltages below 24 V.
 
So with a digital read-out, there wouldn't be anything showing up if it were busted, right?

I switched my meter to the 0-200 ohm range. The meter used to show 0.0 resistance between the probes. It now shows 0.3.

What's up with this?

I could probably take it back to Sears to get a replacement.
 
0.3 ohm sounds about right. A normal handheld multimeter can't really be used to measure resistances this small (anything below 1 ohm or so), so what it shows depends on which range it is on, the contact resistance of the probes, the materials involved, the temperature, etc.

In other words, lots of things; any reading below about 1 ohm can usually just be interpreted as a short.
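As a rough sketch of why such small readings tell you little about the part itself (all resistance values are invented for illustration):
Python:
# A 2-wire handheld meter lumps everything in the loop into one reading:
# the part you care about, both test leads, and the (variable) probe
# contact resistance. All values are invented for illustration.
r_device = 0.05    # ohm, what we actually want to measure
r_leads = 0.25     # ohm, both test leads in series
r_contact = 0.10   # ohm, varies with probe pressure, oxidation, etc.
r_displayed = r_device + r_leads + r_contact
print(f"Meter shows about {r_displayed:.1f} ohm")  # ~0.4 ohm, mostly not the device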
 
Flyingwing12 said:
So with a digital read-out, there wouldn't be anything showing up if it were busted, right?

I switched my meter to the 0-200 ohm range. The meter used to show 0.0 resistance between the probes. It now shows 0.3.

What's up with this?

I could probably take it back to Sears to get a replacement.

No, this is normal operation.
 
0.3 is probably about the resistance of both leads in series.

One lead should read less than two leads, but the meter can only resolve about 0.1 ohm.

Try this simple test:
Unplug the red lead and stick the black lead's probe tip into the jack from which you unplugged the red lead. If you read 0.1 or 0.2, she's just fine.
Then swap, do same thing using only red lead.
old jim
 
If the ohmmeter circuit were blown, would I not get a reading when I switched over to it?

My electronics teacher said that it would "damage" it. What is actually getting damaged, and how does it affect my meter?

I also found out that with a low 9 V battery, setting it to 20 VDC, getting a reading, and then clicking over to the 1.5 V battery tester gives different readings. Why is this?
 
  • #10
Flyingwing12 said:
I also found out that with a low 9 V battery
I wondered if what you did might have discharged the battery, but the meter should have a low battery indicator on the display.
Analog meters used to have a zero adjustment to compensate for the state of the battery, but digital meters don't (or at least cheap digital meters don't).

Actually, if you look at the specification in the owner's manual, it says the accuracy on the lowest resistance range is 1.2% + 4 digits, so if a short circuit reads 0.3 ohms, that's still inside the specification.
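As a rough worked example of what that spec means on the 200 ohm range (the resolution and spec figures here are assumed for illustration, not quoted from any particular manual):
Python:
# "1.2% of reading + 4 digits" on the 200-ohm range, where the least
# significant digit is 0.1 ohm. Spec figures are assumed for illustration;
# check the manual of your own meter for the real numbers.
reading = 0.3           # ohm, displayed with the probes shorted
pct_of_reading = 0.012  # 1.2 %
lsd = 0.1               # ohm, one count on the 200-ohm range
counts = 4
tolerance = pct_of_reading * reading + counts * lsd
print(f"Allowed error: +/- {tolerance:.2f} ohm")  # about +/- 0.40 ohm
# So a dead short legitimately reading 0.3 ohm is still within spec.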

setting it to 20 VDC, getting a reading, and then clicking over to the 1.5 V battery tester gives different readings. Why is this?
A special-purpose "battery tester" function will probably draw more current from the battery than just measuring its voltage, to give a more realistic idea of how good the battery is.
 
  • #11
Flyingwing12 said:
I also found out that with a low 9 V battery, setting it to 20 VDC, getting a reading, and then clicking over to the 1.5 V battery tester gives different readings. Why is this?

Again that is completely normal operation. "Battery Tester" ranges apply a load to the battery while measuring the voltage. The typical load that would be applied to a 1.5 volt battery would definitely "load down" a small 9V battery. In general you should not be using the 1.5 volt battery tester setting to test a 9 volt battery.
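A rough sketch of why the loaded reading comes out lower (the EMF, internal resistance, and load values below are all invented for illustration):
Python:
# A voltmeter range draws almost no current, so it reads close to the
# battery's open-circuit EMF. A battery-test range connects a load, and
# the battery's internal resistance drops voltage under that load.
# All numbers are invented for illustration.
emf = 7.4          # volts, open-circuit EMF of a tired 9 V battery
r_internal = 20.0  # ohm, internal resistance grows as the battery ages
r_load = 30.0      # ohm, a test load sized for a 1.5 V cell
v_unloaded = emf                                 # roughly what the 20 VDC range shows
v_loaded = emf * r_load / (r_load + r_internal)  # what the battery-test range shows
print(f"Voltmeter reading:    {v_unloaded:.2f} V")
print(f"Battery-test reading: {v_loaded:.2f} V")  # noticeably lower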
 
  • #12
Since there is no such thing as 100% accuracy or zero tolerance for multimeters:

What (to your knowledge) is the most accurate, rock-solid multimeter out there?
 
  • #13
Flyingwing12 said:
Since there is no such thing as 100% accuracy or zero tolerance for multimeters:

What (to your knowledge) is the most accurate, rock-solid multimeter out there?

In my opinion, Fluke.

Edit: While working for an FDA-regulated pharmaceutical company, having "pharmaceutical grade" meters was mandatory.
 
  • #14
Fluke, Beckman are workhorses.

You can buy preposterously accurate meters with five or six digits,

but ~1% is fine for 99% of all practical work.
 
  • #15
Flyingwing12 said:
Since there is no such thing as 100% accuracy or zero tolerance for multimeters:
It's not just multimeters; nothing has 100% accuracy or zero tolerance.

The 0.3 ohms you're seeing on the 200 ohm scale is within the typical variability that you'd normally see depending on how tightly you push the probes to contact, or even if you "twiddle" the connections where the probes plug into the meter etc.

When measuring resistances up around 50 ohms or more we normally just ignore such a small offset, since it's less than one percent anyway.

When measuring very low resistances of just a few ohms, the standard procedure is to short the probes together, take a reading (e.g. 0.3 ohms), and then subtract this amount from the subsequent measurement.
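As a minimal sketch of that zeroing procedure (both readings are made-up examples):
Python:
# Short the probes, note the offset, then subtract it from the reading
# taken across the part. Both readings below are made-up examples.
offset = 0.3       # ohm, display with the probes shorted together
raw_reading = 4.1  # ohm, display with the probes across the part
corrected = raw_reading - offset
print(f"Estimated resistance: {corrected:.1f} ohm")  # 3.8 ohm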
 
  • #16
Flyingwing12 said:
Since there is no such thing as 100% accuracy or zero tolerance for multimeters:

What (to your knowledge) is the most accurate, rock-solid multimeter out there?

An old HP multimeter you can no longer buy, the 3458A. Agilent still sells it, but from what I understand newly produced models are not as good as the ones with the HP logo (although this might just be metrology mythology). This is still the workhorse in high-precision metrology (we have a few of them where I work).

But other than that, there is virtually no difference in accuracy between multimeters for "normal" applications. That said, a benchtop model is always preferable to a handheld model, since you can change the integration time and do 4-point measurements (a handheld multimeter can never be used to do really accurate resistance measurements, since there is no way to do a proper 4-point measurement).
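A rough sketch of why the 4-point (Kelvin) connection matters for low resistances (all values are invented for illustration):
Python:
# 2-wire: the test current and the voltage sensing share the same leads,
# so the lead resistance ends up in the result.
# 4-wire: separate sense leads carry essentially no current, so only the
# device's own voltage drop is measured. All values are invented.
r_device = 0.50  # ohm, device under test
r_lead = 0.15    # ohm, per test lead
i_test = 0.10    # ampere, test current forced by the meter
r_2wire = (i_test * (r_device + 2 * r_lead)) / i_test  # 0.80 ohm, 60% high
r_4wire = (i_test * r_device) / i_test                 # 0.50 ohm, correct
print(f"2-wire result: {r_2wire:.2f} ohm, 4-wire result: {r_4wire:.2f} ohm")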

But if you want something you can carry around it will come down to how robust etc. the multimeter is. Don't worry about the accuracy.
 