Power supply and Multimeter questions

  Sep 22, 2011 #1
    When measuring voltage or current, why are we supposed to turn up the power supply? For example, when measuring voltage we are supposed to make sure the current limit is set high enough that it won't limit the voltage, and vice versa when measuring current. I thought that turning up the voltage in a circuit would already create current. I also notice my circuit won't work if the current limit is too low, even when the voltage is high. Why do you need to put current into a circuit if voltage creates current?
     
  Sep 23, 2011 #2
    The voltage and current settings on the power supply set limits on how much power the supply can provide.
    Those need to be adjusted so that the available power is greater than the power draw of your circuit. Assuming a fixed voltage, if the current setting is too low, the supply will not be able to provide adequate power to the circuit and will usually go into an overcurrent (current-limiting) mode, depending on the supply. The current setting needs to be at least as large as the expected current draw of the circuit; if you set it too high, though, the supply will not go into overcurrent mode when something is wrong (bad part, bad wiring, etc.).

    An overly simple way to think of this when prototyping is as a circuit breaker: you set the current limit on the power supply low enough that it will prevent severe damage if something is wrong, but high enough that everything can operate correctly.
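
    A minimal sketch of that breaker-style sizing, in Python; the function name and the 1.25x margin are illustrative assumptions, not anything from a particular supply's manual:

        def pick_current_limit(expected_draw_amps, margin=1.25):
            """Size the supply's current limit like a breaker: enough
            headroom for normal operation, but low enough to fold back
            on a fault such as a short."""
            return expected_draw_amps * margin

        # A circuit expected to draw 200 mA gets a ~250 mA limit; a short
        # would demand far more than that, so the supply limits the current
        # instead of dumping its full output into the fault.
        print(pick_current_limit(0.2))  # 0.25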

    In normal circumstances, setting the current at maximum won't cause any issues (just because it's set at 1 A doesn't mean it's necessarily going to dump 1 A into the circuit).

    Make sense?
     
  Sep 23, 2011 #3
    Yep, makes sense. Thanks, I appreciate it.
     
  Sep 24, 2011 #4

    vk6kro (Science Advisor)

    Power supplies with current limiting reduce the output voltage, if necessary, to stop the external circuit from drawing more current than is allowed.

    So Ohm's law still applies: the external load just gets less voltage, and so draws less current than it would if the current limit were higher.
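
    A tiny Python sketch of that fold-back behavior for a purely resistive load; the function and the CV/CC mode labels here are illustrative assumptions, and a real supply's transition between modes is messier:

        def operating_point(v_set, i_limit, r_load):
            """Operating point of an idealized bench supply driving a
            resistor, per Ohm's law."""
            i_cv = v_set / r_load           # current the load would draw at full voltage
            if i_cv <= i_limit:
                return v_set, i_cv, "CV"    # constant voltage: limit not reached
            # Current limiting: the voltage folds back so that I = i_limit
            return i_limit * r_load, i_limit, "CC"

        # 12 V setting, 0.5 A limit, 10 ohm load: the load "wants" 1.2 A,
        # so the supply folds back to 5 V and holds 0.5 A.
        print(operating_point(12.0, 0.5, 10.0))  # (5.0, 0.5, 'CC')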
     
  Sep 24, 2011 #5
    What a great way to explain this! After I read this question I was working with a DC power supply (behaving just as the OP described), and I had no idea how to explain why it worked that way. Nice job!
     