
Current operating vs voltage operating electronic devices

  1. Feb 11, 2013 #1
    Hello Forum,
    I have heard of current operating devices, like LEDs, and voltage operated devices...

    What is the difference?
    It seems to me that all electronic devices, once provided a certain voltage, draw the current they need....so what is this distinction between current operating and voltage operating?

  3. Feb 11, 2013 #2
    There isn't a very precise definition (you could probably find a rigorous one with a little Googling), but it seems to me to come down to functionality.

    If you want an LED to emit light, you have to drive a current through it. You naturally apply a voltage across the LED to produce that current, but I think it's reasonable to say that the current, not the voltage, is the direct cause of the LED emitting light.

    Other popular examples are BJTs and FETs, where a base current or a gate voltage, respectively, controls the V/I characteristics of the transistor.
    Last edited: Feb 11, 2013
  4. Feb 12, 2013 #3
    You need a voltage source to push current through an electronic device. It comes down to Ohm's law, V = IR. Electronic devices are rated according to the voltage and current required for proper operation. If you can represent a device as a resistive load, you can find the current it draws from Ohm's law, simply by dividing the operating voltage by the load resistance (in ohms).
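    The Ohm's-law reasoning above can be sketched in a few lines of Python (the supply voltage and load resistance below are hypothetical values, not from this thread):

```python
# Minimal sketch of the Ohm's-law calculation described above.
# For a purely resistive load, the current drawn follows from V = I * R.

def current_drawn(voltage_v: float, resistance_ohm: float) -> float:
    """Return the current (in amperes) a resistive load draws: I = V / R."""
    return voltage_v / resistance_ohm

# Example: a 12 V supply across a 60-ohm load draws 0.2 A.
print(current_drawn(12.0, 60.0))  # 0.2
```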

    As far as I know, there's no strict category of "current-operated" versus "voltage-operated" devices, because you can't get current through a device without a voltage source to push it. Any electronic device has a resistance, and according to that resistance it draws a certain level of current.

    LEDs, for example, don't have a well-defined constant resistance, so in practice we connect a resistor in series with the LED to limit the current through it. You can connect a 2–3 volt power supply (the usual operating voltage for an LED) directly to an LED and it will light up, but it will get hotter over time and may eventually burn out.
    Last edited: Feb 12, 2013
  5. Feb 12, 2013 #4
    Some devices are better driven with constant current, with the voltage being incidental: LEDs, BJTs, SCRs, triacs, Hall-effect devices/sensors, etc. Others are better suited to constant-voltage drive, with the current being incidental: incandescent light bulbs (which can also be current driven, but voltage drive is usually preferred for reasons I won't go into), FETs, integrated op amps (voltage-feedback type), vacuum tubes, etc.

    An LED cannot safely be driven by a voltage source; please do not attempt it. If an LED is driven with a 10 mA current source and the forward voltage is measured at 2.2 volts, one should not then attempt to drive the LED with a 2.2 volt constant-voltage source. The result is thermal instability: an LED is built from a semiconductor material whose conductivity increases with temperature (resistivity decreases with increasing temperature).
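    A rough numerical illustration of why fixed-voltage drive is risky, using the Shockley diode equation (the saturation current and emission coefficient below are made-up values chosen only so the forward voltage lands near the 2.2 V mentioned above; real LED parameters vary widely):

```python
import math

# Illustration (made-up parameters) of the exponential I-V relationship:
#   I = I_s * (exp(V / (n * V_T)) - 1)    (Shockley diode equation)
# A tiny change in forward voltage -- or in I_s, which rises with
# temperature -- produces a large change in current.

def diode_current(v_f: float, i_s: float = 1e-12, n: float = 2.0,
                  v_t: float = 0.02585) -> float:
    """Diode current for a given forward voltage (Shockley equation)."""
    return i_s * (math.exp(v_f / (n * v_t)) - 1.0)

i1 = diode_current(2.20)
i2 = diode_current(2.25)   # only 50 mV more
print(i2 / i1)             # current rises by roughly 2.6x for ~2% more voltage
```

    This exponential sensitivity is why a fixed "correct" voltage offers no control: as the junction heats up and its effective forward drop falls, the current climbs, heating it further.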

    A voltage source across an LED, even at the "right" value of voltage, will produce thermal runaway. An LED must be current driven. If no constant-current source is available, a voltage source can be used with an appropriately valued series resistor; the resistor limits the current and prevents thermal runaway.
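    The series-resistor sizing described above can be sketched as follows (the 5 V supply is a hypothetical value; the 2.2 V forward drop and 10 mA target are the figures used earlier in this post):

```python
# Sketch of sizing a series resistor for an LED: the resistor drops the
# difference between the supply voltage and the LED's forward voltage
# at the desired operating current.

def series_resistor(v_supply: float, v_forward: float, i_led: float) -> float:
    """Return the series resistance (ohms) for a target LED current:
    R = (V_supply - V_forward) / I."""
    return (v_supply - v_forward) / i_led

# Example: 5 V supply, 2.2 V forward drop, 10 mA target -> ~280 ohms.
print(series_resistor(5.0, 2.2, 0.010))
```

    In practice you would round up to the nearest standard resistor value, trading a little brightness for extra margin against runaway.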

    Linear devices, e.g. a linear resistive heating element, can be driven from constant voltage or constant current; there is no reason one would be better than the other.
