
Current provided / forced or drawn?

  1. Mar 14, 2013 #1
    Current.. forced or drawn?

    Hi all

    I'm a bit confused by DC current (possibly AC too, if it's the same).

    On many DC power supplies it will state the output voltage and current. Now from my basic understanding of electronics, when you increase a voltage, you increase the current flow proportionally, according to whatever resistance (Ω) there is in the circuit.

    So can someone then tell me how some devices / circuits are said to 'draw' x amount of current while other devices seem to forcefully output a fixed current?

    How is this affected when using batteries? - Things still seem to draw a current rather than the batteries outputting a fixed current and I find that slightly mind boggling.

    I'm looking to power a CCTV camera in a remote location from a 12V car battery which is 65Ah. The camera 'draws' 1amp so I should get roughly 2.5 days if I'm lucky though probably less in reality. I'm just curious whether I would need to provide any resistance to it (eg would the battery pump the full 65A into the camera) or will the camera 'draw' only what it needs?
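
    [Editor's note: a quick sanity check on the runtime estimate above, as a Python sketch. It assumes the full nameplate 65 Ah is usable at a steady 1 A draw, which it won't be in practice, so treat the result as an upper bound.]

    ```python
    # Upper-bound runtime for a 65 Ah battery feeding a 1 A load
    # (figures from the post above; real usable capacity is lower,
    # since car batteries shouldn't be deep-discharged).
    capacity_ah = 65.0  # nameplate capacity, amp-hours
    load_a = 1.0        # camera draw, amps

    hours = capacity_ah / load_a
    days = hours / 24
    print(f"{hours:.0f} h ≈ {days:.1f} days")  # 65 h ≈ 2.7 days
    ```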

    Also when things 'draw' the current they need then what is this current limiting resistor business all about? - Or is that how they fix what is drawn in the first place internally?

    Too many questions and not enough explanations out there!

    Thanks for your help,

    Dixo
     
    Last edited: Mar 14, 2013
  3. Mar 15, 2013 #2
    No voltage source will output a fixed current - the connected load draws a current determined by its impedance.

    When you see some voltage/current specification on a device, that's usually its rating for continuous use. The DC voltage supply for my laptop is rated at 20 V/4.5 A, which means it's designed for a maximum continuous output of 90 W, for a load that should thus draw, at most, 4.5 amps continuously.
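
    [Editor's note: a small Python sketch of that rating arithmetic. It also computes the smallest load resistance that stays within the rating, which makes the "load draws current determined by its impedance" point concrete; the numbers are the laptop-supply figures above.]

    ```python
    # 20 V / 4.5 A continuous rating, as quoted above.
    voltage = 20.0       # rated output voltage, volts
    max_current = 4.5    # rated continuous current, amps

    max_power = voltage * max_current   # continuous power rating, watts
    r_min = voltage / max_current       # smallest load resistance within rating, ohms

    print(f"{max_power:.0f} W; load impedance must be >= {r_min:.2f} ohm")
    ```

    Any load whose impedance is at least about 4.44 Ω will draw no more than the rated 4.5 A from this supply.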

    Batteries are voltage sources and the rules are no different. Their ratings can be more involved, though, since the output voltage varies with state of charge and you have to make an effort to regulate them right.

    You just need to supply the camera with the voltage source it's rated for. It will draw a current determined by its impedance. Do _not_ insert any resistance.

    When driving an LED, you need some way to regulate the voltage across it due to its nonlinear V/I characteristics. A 'current limiting resistor' is a common way to fix this voltage.
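
    [Editor's note: the standard resistor-sizing arithmetic for that LED case, as a Python sketch. The supply voltage, forward drop, and target current are illustrative values, not from the thread.]

    ```python
    # Size a current-limiting resistor for an LED:
    # the resistor absorbs the difference between the supply voltage
    # and the LED's forward drop, at the chosen current.
    v_supply = 5.0     # supply voltage, volts (illustrative)
    v_forward = 2.0    # LED forward voltage drop, volts (illustrative)
    i_target = 0.020   # desired LED current, amps (20 mA, illustrative)

    r = (v_supply - v_forward) / i_target
    print(f"R = {r:.0f} ohm")  # R = 150 ohm
    ```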

    Edit: Cleared up a bit of poor wording.
     
    Last edited: Mar 15, 2013
  4. Mar 15, 2013 #3
    Hi milesyoung,

    I must be having a blonde day or something because I'm still seeing conflicting information regarding this online. For instance on this site:
    http://www.electronics-tutorials.ws/dccircuits/dcp_1.html

    Now clearly if I connect + to - on a battery it's going to overload and overheat / blow itself to bits due to the sheer amount of current going through it.

    So.. without meaning to sound thick, how is it that a device 'draws' current? Is it more a case that it resists it but lets through what it needs?

    Also, according to Ohm's law, as voltage increases across a resistance, the current goes up. So presumably that means that the current IS pushed? Otherwise how would a voltage increase make a device 'draw' more current? Oh, and power supplies where you can select how much current to output... is that just limiting the max current, or actually setting the current that will be pumped out?

    Sorry if I'm being thick, I just prefer to ask questions and get it clear in my head rather than have a half witted understanding of it.
     
  5. Mar 15, 2013 #4

    f95toli
    Science Advisor · Gold Member

    That is just a figure of speech. The current through a device is given by the voltage divided by the resistance (or more generally, the impedance) across that device.

    Sort of, for simple circuits (without semiconductors) it is always true that if you increase the voltage the current will also increase.
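
    [Editor's note: a Python sketch of the point above, using Ohm's law for a fixed (illustrative) resistance. The load's resistance stays the same; raising the voltage is what makes it "draw" more current.]

    ```python
    # I = V / R for a fixed resistive load: double the voltage,
    # double the current drawn.
    r = 12.0  # fixed load resistance, ohms (illustrative)

    currents = {v: v / r for v in (6.0, 12.0, 24.0)}
    for v, i in currents.items():
        print(f"{v:g} V -> {i:g} A")
    ```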

    It depends. On most power supplies you can just set the maximum current (it is mostly used as a safety feature, to prevent lots of current flowing through a faulty circuit).
    However, there are also real current sources: with those you set the current, and the source then adjusts the output voltage accordingly (on these sources you can often set the maximum voltage instead, which is known as the compliance).
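
    [Editor's note: a toy Python model of the first case above, a constant-voltage bench supply with a settable current limit. The function and its parameters are hypothetical, just to illustrate the behavior: the supply holds its set voltage until the load would draw more than the limit, then drops the voltage to hold the limit instead.]

    ```python
    def bench_supply(v_set: float, i_limit: float, r_load: float) -> tuple[float, float]:
        """Return (output voltage, output current) for a simple
        constant-voltage supply with a current limit, driving a
        resistive load. Toy model, ideal components assumed."""
        i = v_set / r_load
        if i <= i_limit:
            return v_set, i               # CV mode: load draws what it draws
        return i_limit * r_load, i_limit  # CC mode: voltage folds back

    # Light load: supply stays at 12 V, load draws 0.5 A.
    print(bench_supply(12.0, 1.0, 24.0))  # (12.0, 0.5)
    # Heavy load: limit kicks in, voltage drops to 6 V at 1 A.
    print(bench_supply(12.0, 1.0, 6.0))   # (6.0, 1.0)
    ```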

    Better to ask one time too many....
     
  6. Mar 15, 2013 #5
    So it is really the resistance / impedance of the device that determines how much current will flow through / be drawn then.

    Ok, that clears that up nicely!

    Thank you very much :)
     