Power supply and Multimeter questions

In summary, when measuring voltage or current, make sure the power supply's settings are not the limiting factor: the current limit should be set at least as high as the current your circuit is expected to draw, so the supply can deliver enough power without folding back its output voltage.
  • #1
caljuice
When measuring voltage or current, why are we supposed to crank up the power supply? For example, when measuring voltage we are supposed to make sure the current is set high enough so it won't limit the voltage, and vice versa for measuring current. I thought that if you crank up the voltage in a circuit, that would already create current. I also notice my circuit won't work if the current setting is too low, even though the voltage is high. Why do you need to set a current on the power supply if voltage creates current?
 
  • #2
The voltage and current settings on the power supply dictate how much power the power supply can provide.
Those need to be adjusted so the power is greater than the power draw of your circuit. Assuming a fixed voltage, if the current setting is too low on the power supply, it will not be able to provide adequate power to the circuit and will usually go into an overcurrent mode (depending on the power supply). The current setting needs to be at least as large as the expected current draw of the circuit - if you set it too high though, the power supply will not go into overcurrent mode if something is wrong (bad part, bad wiring, etc.).

An overly simple way to think of this when prototyping is a circuit breaker - you set the current setting on the power supply low enough that it will prevent severe damage if something is wrong, but high enough that everything can operate correctly.

In normal circumstances, setting the current limit at maximum won't cause any issues (just because it's set at 1 A doesn't mean it's necessarily going to dump 1 A into the circuit).

Make sense?
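
To put rough numbers on the explanation above, here is a minimal Python sketch (the 5 V supply, 2.5 W draw, and 20% margin are made-up example values, not figures from the thread) of picking a current-limit setting that covers the expected draw with some headroom while still acting like a breaker:

```python
# Made-up example values, not figures from the thread.

def suggest_current_limit(supply_voltage, expected_load_power, margin=1.2):
    """Expected draw from P = V * I, plus some headroom for the limit setting."""
    expected_current = expected_load_power / supply_voltage
    return expected_current, expected_current * margin

# Hypothetical circuit: 5 V supply, expected to draw about 2.5 W.
expected_current, limit_setting = suggest_current_limit(5.0, 2.5)
print(f"Expected draw:      {expected_current:.2f} A")   # 0.50 A
print(f"Set the limit near: {limit_setting:.2f} A")      # 0.60 A -- above the
# expected draw, but low enough that a short or wiring fault trips the limit
# instead of getting the supply's full output.
```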
 
  • #3
Yep, makes sense. Thanks, I appreciate it.
 
  • #4
Power supplies with current limiting reduce the voltage output, if necessary, to stop the external circuit drawing more current than is allowed.

So, Ohm's Law still applies. The external load just gets less voltage and so draws less current than it would if the current limit were higher.
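
A quick hypothetical example of that foldback behaviour, sketched in Python (the 12 V setting, 0.1 A limit, and 60 ohm load are invented numbers):

```python
# Hypothetical numbers for the foldback behaviour described above:
# an ideal supply set to 12 V with a 0.1 A current limit, driving a 60 ohm load.

V_SET = 12.0      # volts (front-panel voltage setting)
I_LIMIT = 0.1     # amps (front-panel current-limit setting)
R_LOAD = 60.0     # ohms (external load)

unlimited_current = V_SET / R_LOAD        # 0.2 A -- more than the limit allows
if unlimited_current > I_LIMIT:
    # The supply reduces its output voltage until the load draws only I_LIMIT.
    v_out = I_LIMIT * R_LOAD              # Ohm's Law still holds: 6.0 V
else:
    v_out = V_SET
i_out = v_out / R_LOAD
print(f"Output settles at {v_out:.1f} V, {i_out:.2f} A")
```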
 
  • #5
mdjensen22 said:
The voltage and current settings on the power supply dictate how much power the power supply can provide. [...]

What a great way to explain this! After I read this question I was working with a DC power supply (working just as the OP'er stated) and had no idea how to explain why it worked that way. Nice Job!
 

1. How do I choose the right power supply for my project?

Choosing the right power supply depends on the voltage and current requirements of your project. Consider the output voltage range, the maximum current the supply can deliver, and the type of supply (linear or switching) to make sure it can comfortably cover what your circuit needs.
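
As a rough back-of-the-envelope sketch (the part names, currents, and the 0-30 V / 3 A candidate supply are all hypothetical), you can tally the expected draw and check it against a candidate supply:

```python
# Hypothetical parts list and candidate supply -- a rough feasibility check only.

parts = {                      # loads sharing one 5 V rail, currents in amps
    "microcontroller": 0.05,
    "sensor board":    0.02,
    "LED strip":       0.80,
}

required_voltage = 5.0
required_current = sum(parts.values())   # total expected draw: 0.87 A
headroom = 1.5                           # rule of thumb: leave roughly 50% margin

candidate = {"v_min": 0.0, "v_max": 30.0, "i_max": 3.0}   # e.g. a 0-30 V / 3 A bench supply

ok = (candidate["v_min"] <= required_voltage <= candidate["v_max"]
      and candidate["i_max"] >= required_current * headroom)
print(f"Need {required_voltage} V at about {required_current:.2f} A "
      f"({required_current * headroom:.2f} A with margin): "
      f"{'supply is adequate' if ok else 'supply is not enough'}")
```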

2. What is the difference between AC and DC power supply?

An AC (alternating current) power supply delivers current that periodically reverses direction, while a DC (direct current) power supply delivers current that flows in one direction. AC power is typically used for household appliances and mains distribution, while DC power is what most electronic devices run on.
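
A toy Python comparison of the two, assuming 120 V RMS / 60 Hz mains-style AC and a 5 V DC rail (illustrative values only):

```python
import math

# Illustrative values only: 120 V RMS / 60 Hz mains-style AC vs. a 5 V DC rail.
V_RMS, FREQ = 120.0, 60.0
V_PEAK = V_RMS * math.sqrt(2)          # about 170 V peak for a sine wave
V_DC = 5.0                             # a DC rail is constant in time

for ms in range(0, 20, 4):             # roughly one 60 Hz cycle (about 16.7 ms)
    t = ms / 1000.0
    v_ac = V_PEAK * math.sin(2 * math.pi * FREQ * t)   # reverses polarity each half-cycle
    print(f"t = {ms:2d} ms   AC: {v_ac:7.1f} V   DC: {V_DC:4.1f} V")
```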

3. How do I measure voltage with a multimeter?

To measure voltage with a multimeter, set the dial to the DC voltage range (or AC, if that is what you are measuring) and connect the probes in parallel across the two points of interest: the red probe to the more positive point and the black probe to the more negative point (or to ground/common). The multimeter will display the reading in volts (V).

4. Can a multimeter measure current?

Yes, a multimeter can measure current. Set the dial to the DC current range, plug the red probe into the current (A or mA) input jack if your meter has a separate one, and connect the meter in series with the circuit so that the current you want to measure flows through it, entering at the red probe and leaving at the black probe. The multimeter will display the reading in amps (A).

5. What is the difference between a digital and analog multimeter?

A digital multimeter displays readings on a digital screen, while an analog multimeter uses a needle to indicate the readings on a scale. Digital multimeters are more accurate and easier to read, while analog multimeters are better for measuring rapidly changing values.
