Power supply and Multimeter questions


Discussion Overview

The discussion revolves around the operation of power supplies and multimeters, specifically focusing on the relationship between voltage and current settings during measurements. Participants explore the implications of adjusting these settings for circuit functionality and safety, touching on concepts such as current limiting and Ohm's Law.

Discussion Character

  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • One participant questions why it is necessary to increase the power supply settings for voltage and current measurements, noting that high voltage should inherently create current.
  • Another participant explains that the voltage and current settings on a power supply determine the maximum power available, emphasizing that the current setting must be adequate to prevent the power supply from entering overcurrent mode.
  • A later reply highlights that power supplies with current limiting will reduce voltage output to prevent excessive current draw, reinforcing the application of Ohm's Law in this context.
  • One participant expresses appreciation for the explanation provided, indicating that it clarified their understanding of the power supply's operation.

Areas of Agreement / Disagreement

While participants agree on the importance of adjusting power supply settings for circuit operation, the underlying confusion about the relationship between voltage and current, and about why a circuit needs sufficient available current even when the voltage is high, is only partially resolved.

Contextual Notes

Participants do not fully unpack the assumptions about the relationship between voltage and current, nor do they spell out exactly when a supply holds its set voltage versus folding back into current limiting.

caljuice
When measuring voltage or current, why are we supposed to crank up the power supply? For example, when measuring voltage we are supposed to make sure the current is set high enough so it won't limit the voltage, and vice versa for measuring current. I thought cranking up the voltage in a circuit would already create current. I also notice my circuit won't work if the current setting is too low, even though the voltage is high. But why do you need to put current into a circuit if voltage creates current?
 
The voltage and current settings on the power supply dictate how much power the supply can provide. They need to be adjusted so that the available power is greater than the power draw of your circuit. Assuming a fixed voltage, if the current setting is too low, the supply will not be able to provide adequate power to the circuit and will usually go into an overcurrent (current-limiting) mode, depending on the supply. The current setting needs to be at least as large as the expected current draw of the circuit; if you set it too high, though, the supply will not go into overcurrent mode if something is wrong (bad part, bad wiring, etc.).

An overly simple way to think of this when prototyping is as a circuit breaker: you set the current limit low enough that it will prevent severe damage if something is wrong, but high enough that everything can operate correctly.

In normal circumstances, setting the current limit at maximum won't cause any issues; just because it's set at 1 A doesn't mean the supply is necessarily going to dump 1 A into the circuit.
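
To make that interplay concrete, here is a minimal sketch (not from the thread; the function and parameter names are invented for illustration) of an idealized bench supply driving a purely resistive load:

```python
# Hypothetical model of an ideal bench supply in constant-voltage (CV) /
# constant-current (CC) terms, assuming a purely resistive load.
# Names (supply_output, v_set, i_limit, r_load) are illustrative only.

def supply_output(v_set, i_limit, r_load):
    """Return (voltage, current, mode) for a resistive load r_load."""
    i_cv = v_set / r_load          # current the load would draw at the full set voltage
    if i_cv <= i_limit:
        return v_set, i_cv, "CV"   # limit not reached: supply holds the set voltage
    # Limit reached: supply folds the voltage back so that I = i_limit,
    # hence V = I * R by Ohm's law.
    return i_limit * r_load, i_limit, "CC"

# 5 V setting, 1 A limit, 10 ohm load -> draws only 0.5 A, stays in CV
print(supply_output(5.0, 1.0, 10.0))   # (5.0, 0.5, 'CV')
# Same settings, 2 ohm load "wants" 2.5 A -> supply drops to 2 V at 1 A
print(supply_output(5.0, 1.0, 2.0))    # (2.0, 1.0, 'CC')
```

The point the sketch makes is the one above: the settings are ceilings, not commands. The circuit decides how much current actually flows, up to the limit.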

Make sense?
 
Yep, makes sense. Thanks, I appreciate it.
 
Power supplies with current limiting reduce the voltage output, if necessary, to stop the external circuit from drawing more current than is allowed.

So, Ohm's Law still applies. The external load just sees less voltage and therefore draws less current than it would if the current limit were higher.
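
As a worked illustration (the numbers here are invented, not from the thread): suppose the supply is set to $V_\text{set} = 12\,\text{V}$ with a current limit $I_\text{lim} = 0.5\,\text{A}$, and the load is $R = 10\,\Omega$. Unlimited, the load would draw $12/10 = 1.2\,\text{A}$, which exceeds the limit, so the supply folds the output back to

$$V_\text{out} = I_\text{lim} R = 0.5 \times 10 = 5\,\text{V},$$

and the load draws exactly $0.5\,\text{A}$, in agreement with Ohm's Law. In general, for a resistive load, $V_\text{out} = \min(V_\text{set},\, I_\text{lim} R)$.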
 
mdjensen22 said:
The voltage and current settings on the power supply dictate how much power the supply can provide. […]

What a great way to explain this! After I read this question I was working with a DC power supply (behaving just as the OP described) and had no idea how to explain why it worked that way. Nice job!
 
