# Multimeter suggestion

## Main Question or Discussion Point

Now I am on edge today because I somehow busted my grandfather's multimeter. When he died, his son (my uncle) took it, and I borrowed it for my home projects, to measure voltage drops etc.

Today I went outside to test how much voltage my solar cell can produce, and somewhere in the process of measuring I broke it. After testing on a previously checked, stable 24 V DC adapter, I found that my (analog) multimeter now reads 3-4 volts high, and on a 9 V battery it reads 0.5-0.9 V high, so the error scales with the voltage.

My idea is that I somehow burnt out a resistor inside that makes the readings go bad.
That multimeter was REALLY good. I think it cost around 50 euros at the time, and that was probably 20 years ago.

I really, REALLY feel bad for ruining this good piece of technology but in the end, I will have to buy a new multimeter...

Now, I don't think I will measure anything above 220 V AC. Usually it will be DC circuits, just for calculations.

Can any of you help me choose a new multimeter, and advise how much I should spend on it?

The brand most available to me is Voltcraft, but I could probably order online.

Fluke is known for being the best, but also the most expensive.

All in all, it depends on what you plan on measuring. Some major things that change as you go up or down in price:

• Build quality - IMPORTANT
• Maximum current measured
• Maximum voltage measured
• Maximum and minimum measurable capacitances
• Maximum frequency for measuring AC
• Ability to measure frequency at all
• Accuracy tolerances
• Triggering capability
• Graphing capability

Generally speaking, go with the cheapest one that satisfies your requirements. Even the bottom-of-the-barrel unbranded DMMs off eBay can do basic measurements. I would suggest getting something with a rubberized coating and fuses that are easy to access.

We went through this question at great length recently.

By the way, it should be possible to fix an analog multimeter.
Did you try to measure voltage on the current or resistance range, and have you swapped the leads back to the proper sockets?
No, I am well acquainted with using a multimeter. And I do use the 30 V range to measure 24 V, and it now reads above the 24 V mark. Before that, probably two weeks ago, it measured exactly 24 V.

No matter how good you are, it's still possible to make a bone-headed mistake.

Averagesupernova
How do you know your uncle didn't unknowingly damage it and it was that way when you got it? Or has he not had access to it since you had correct readings from it?

vk6kro
When you overheat a resistor, it usually goes high in resistance.

In an analog meter, the voltmeter range resistors are in series with the meter, so the meter would tend to read low if you had somehow cooked the resistors.

If you overheated the meter movement and caused some of the turns of wire to short together, then the movement would become less sensitive and again you would get a lower reading.

These meters usually have a magnetic shunt across the magnet in the meter movement. This is a small piece of metal across the magnet poles and held in place by screws.

To adjust it you need to send exactly the right current through the meter movement (usually 50 µA ) and adjust this shunt until the meter reads exactly full scale.
For this, of course, you will need to borrow another meter.

This is delicate stuff and it is easy to bend the needle of the meter if you are not careful.

The main point, though, is that it probably wasn't your fault. You would know if you dropped it (and that could cause this fault), but anything electrical you did would tend to give a lower reading. It is probably just due to the age of the instrument.

I would discuss it with your uncle and see if he feels it is worth getting this meter "calibrated" professionally because it has sentimental value.
He will probably say "no" when he finds out how much this will cost.
50 Euros would now buy a much better meter than it did 20 years ago.
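A quick numeric sketch of the point above, that a cooked series resistor drags an analog reading down rather than up. The 50 µA full-scale sensitivity comes from the post; the 2 kΩ coil resistance is a hypothetical value chosen purely for illustration:

```python
FULL_SCALE_CURRENT = 50e-6   # 50 µA movement, as in the post above
MOVEMENT_RESISTANCE = 2_000  # ohms; hypothetical coil resistance, assumed

def multiplier_resistor(full_scale_volts):
    """Series resistor needed so full_scale_volts drives exactly full-scale current."""
    return full_scale_volts / FULL_SCALE_CURRENT - MOVEMENT_RESISTANCE

def indicated_volts(true_volts, series_r, full_scale_volts):
    """Reading shown on the scale for a given true voltage and series resistor."""
    current = true_volts / (series_r + MOVEMENT_RESISTANCE)
    return (current / FULL_SCALE_CURRENT) * full_scale_volts

r_30v = multiplier_resistor(30)   # nominal multiplier for a 30 V range, ~598 kOhm

# With the nominal resistor, 24 V reads as 24 V.
# With a resistor that has drifted 10% HIGH, less current flows,
# so the needle reads LOW -- never high:
print(indicated_volts(24, r_30v, 30))
print(indicated_volts(24, r_30v * 1.10, 30))  # below 24 V
```

This is why a reading that is several volts *high* points away from an overheated range resistor and toward the movement or its magnetic shunt.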

> How do you know your uncle didn't unknowingly damage it and it was that way when you got it? Or has he not had access to it since you had correct readings from it?
I got it from him and used it for a while. I have a 24 V DC adapter and I measured its output voltage; it read 24 V. I did the measurement because I wasn't sure whether the adapter still worked.

I had tested the meter even before that on a 9 V battery, and it read 9 V. It has been at rest at my place ever since. But yesterday... what happened is in the original post.

> When you overheat a resistor, it usually goes high in resistance.
>
> In an analog meter, the voltmeter range resistors are in series with the meter, so the meter would tend to read low if you had somehow cooked the resistors.
>
> If you overheated the meter movement and caused some of the turns of wire to short together, then the movement would become less sensitive and again you would get a lower reading.
>
> These meters usually have a magnetic shunt across the magnet in the meter movement. This is a small piece of metal across the magnet poles and held in place by screws.
>
> To adjust it you need to send exactly the right current through the meter movement (usually 50 µA) and adjust this shunt until the meter reads exactly full scale. For this, of course, you will need to borrow another meter.
>
> This is delicate stuff and it is easy to bend the needle of the meter if you are not careful.
>
> The main point, though, is that it probably wasn't your fault. You would know if you dropped it (and that could cause this fault), but anything electrical you did would tend to give a lower reading. It is probably just due to the age of the instrument.
>
> I would discuss it with your uncle and see if he feels it is worth getting this meter "calibrated" professionally because it has sentimental value. He will probably say "no" when he finds out how much this will cost. 50 Euros would now buy a much better meter than it did 20 years ago.

Just going by the feature list, that looks more than good enough. It's even got no-contact AC voltage testing. It's the first multimeter I've seen with that.

> Just going by the feature list, that looks more than good enough. It's even got no-contact AC voltage testing. It's the first multimeter I've seen with that.

What's that?

Hello,

I just bought my new multimeter. It turned out that my old one wasn't malfunctioning at all...

But nevertheless, I am happy to have a new multimeter and to give the old one a rest.

Now another question:

Is it possible that my 9 V battery measures 9.7 V, and that my DC adapter measures 31.7 V instead of the 24 V stated on it?

And when I test my adapter, I charge the capacitor inside (I think), and I measure 30 V even after I unplug the adapter. The question here is: how can I safely discharge that capacitor? I have been zapped a few times.

Unregulated DC adapters measure 25-30% higher off load, or even 50% higher with cheap ones.

In the UK, every electrical appliance has to have a 'rating plate' fixed to it.
This plate must state the working current and voltage.

So an adapter rated to provide 12 volts at 750 milliamps will measure 15 or 16 volts at zero load, 12 volts at full load (750 mA), and somewhere in between at partial load.

> Hello,
>
> I just bought my new multimeter. It turned out that my old one wasn't malfunctioning at all...
>
> But nevertheless, I am happy to have a new multimeter and to give the old one a rest.
>
> Now another question:
>
> Is it possible that my 9 V battery measures 9.7 V, and that my DC adapter measures 31.7 V instead of the 24 V stated on it?
>
> And when I test my adapter, I charge the capacitor inside (I think), and I measure 30 V even after I unplug the adapter. The question here is: how can I safely discharge that capacitor? I have been zapped a few times.
Just hook a small resistor across it and let it discharge through that.
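As a rough sketch of how fast that bleed resistor works: the voltage decays exponentially with time constant τ = RC. The 1000 µF capacitance and 1 kΩ resistor below are purely hypothetical example values, since the adapter's actual filter capacitor is unknown:

```python
import math

def cap_voltage(v0, r_ohms, c_farads, t_seconds):
    """Capacitor voltage after t seconds of discharge through R: v0 * e^(-t/RC)."""
    return v0 * math.exp(-t_seconds / (r_ohms * c_farads))

def time_to_reach(v0, v_target, r_ohms, c_farads):
    """Seconds until the capacitor voltage falls to v_target."""
    return r_ohms * c_farads * math.log(v0 / v_target)

# Hypothetical: 30 V across 1000 µF, bled through 1 kOhm (tau = 1 s)
print(cap_voltage(30, 1_000, 1_000e-6, 5.0))       # ~0.2 V after five time constants
print(time_to_reach(30, 1.0, 1_000, 1_000e-6))     # ~3.4 s to get below 1 V
```

One caution with the example values: at 30 V a 1 kΩ resistor initially dissipates almost a watt (30²/1000 = 0.9 W), so pick a resistor with an adequate power rating, or a larger resistance and a little more patience.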
> Unregulated DC adapters measure 25-30% higher off load, or even 50% higher with cheap ones.
>
> In the UK, every electrical appliance has to have a 'rating plate' fixed to it. This plate must state the working current and voltage.
>
> So an adapter rated to provide 12 volts at 750 milliamps will measure 15 or 16 volts at zero load, 12 volts at full load (750 mA), and somewhere in between at partial load.
An unregulated adapter will behave that way. If it's regulated, the voltage will be stable no matter what the load is. Regulated adapters are more expensive, though.

> An unregulated adapter will behave that way. If it's regulated, the voltage will be stable no matter what the load is. Regulated adapters are more expensive, though.
Have a care here. All power supplies behave this way. It's just that the regulation % for 'regulated' ones is rather better.

Regulation has a specific definition.

$$\%\ \text{regulation} = \frac{\text{no load voltage} - \text{voltage at specified load}}{\text{no load voltage}} \times 100$$

Edit Note: This also applies to transformers.

Studiot, I hope you don't mind me asking, but what is your current occupation? What college did you finish, and what is your job now?

No, I don't mind.

College was a long, long time ago. My first degree was in applied maths, and I have spent the last 40 years applying it in various fields.

Currently I own and run a small company providing support in industrial instrumentation and general IT.

Wow! 40 years of applying math. I admire you, sir. I am not that friendly with math, mainly because at my college you have to learn A LOT in a very short time.
But anyway, thank you for answering.

> Have a care here. All power supplies behave this way. It's just that the regulation % for 'regulated' ones is rather better.
>
> Regulation has a specific definition.
>
> $$\%\ \text{regulation} = \frac{\text{no load voltage} - \text{voltage at specified load}}{\text{no load voltage}} \times 100$$
>
> Edit note: This also applies to transformers.
True, I do need to be careful with my exact words. But you aren't going to have 4 volts of swing on a regulated 12 V line. The ripple on a regulated line is usually millivolts.

There's a different definition for line regulation in my textbook, so it's really not as specific as you claim.

Thank you for pointing that out, JN. I was too hasty with the copy 'n' paste in MathType.

It should, of course, be

$$\%\ \text{regulation} = \frac{\text{no load voltage} - \text{voltage at specified load}}{\text{voltage at specified load}} \times 100$$

My definition, in %, is more fully known as the load regulation, which is more useful for describing the overall performance of a device, and is, I think, the one applicable to Bassilisk's multimeter measurement.

Line regulation tells us what happens when the input (the mains, in this case) varies, and is a per-unit measure, not a %.
It would tell us how much the output changes per volt of change in the mains input, or whatever the input is.
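Plugging the thread's numbers into the corrected load-regulation formula (using the adapter's measured 31.7 V off load and its 24 V nominal rating as the specified-load voltage, purely as an illustration):

```python
def load_regulation_pct(no_load_v, loaded_v):
    """Percent load regulation: rise above the specified-load voltage."""
    return (no_load_v - loaded_v) / loaded_v * 100

# Bassilisk's adapter: 31.7 V off load vs. 24 V nominal at rated load
print(round(load_regulation_pct(31.7, 24.0), 1))  # -> 32.1
```

A result around 32% sits squarely in the 25-50% range quoted earlier for unregulated adapters, which supports the conclusion that nothing is wrong with either meter.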

Actually, I was thinking of line regulation, for which my book gives two slightly different equations that spec sheets might use. But that measures a different parameter than your equation.

Any appearance of competence on my part, real or imagined, is purely coincidental.