Electronic devices operating voltage and current

  1. Feb 26, 2012 #1
    Hello Forum,

    Some electronic devices specify a specific voltage that they need to work properly, while other devices specify a specific current...
    Why? Why don't they all specify a voltage, or all specify a current?

    If the current is specified, the voltage is automatically fixed by the internal resistance R: V = IR... At the end of the day, every device requires power, which is the product P = IV...

    A power supply usually has an output voltage and a current rating. What does the current rating represent? The max current the power supply can output?

    Why does a power supply provide a fixed voltage but varying currents? Does the current depend on the load?

    thanks,
    fisico30
     
  3. Feb 26, 2012 #2

    vk6kro

    Science Advisor

    A majority of electrical devices are designed to operate on a particular voltage.

    For example, a light bulb will operate below the voltage stamped on it, but the light will be at reduced intensity and may be orange tinted.
    At the right voltage it should operate correctly.

    So, if it receives this voltage, the lamp itself will cause a certain current to flow and a certain amount of power to be consumed.
    In this case, the voltage and power are usually given on the lamp, but the current is available from a simple calculation: power = voltage times current, so current = power divided by voltage.

    There are a few cases where the current is more important than the actual voltage.

    Electrolysis is one. Arc welding is another.

    In arc welding, there is very little voltage across the actual arc, but you could measure it. However, the current is controlled from within the welder case and the current may be shown on a switch setting or a meter. The current is important, so this is maintained even if the resistance of the load changes.

    Power supplies are designed to give a constant voltage out (although good ones can also be set to give a constant current) and they supply this voltage to a load which then takes current according to Ohm's Law.
    There is a limit to the current a power supply can give and this is usually printed on the power supply somewhere.
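
    Just to make that concrete, here is a small numerical sketch (Python; the 60 W / 230 V lamp and the 2 A supply rating are made-up example values, not figures from this thread):

        # Current drawn by a lamp from its rated power and voltage: I = P / V
        lamp_power = 60.0      # watts (assumed example value)
        lamp_voltage = 230.0   # volts (assumed example value)
        print(f"Lamp current: {lamp_power / lamp_voltage:.2f} A")

        # A constant-voltage supply: the load sets the current via Ohm's law, I = V / R,
        # but the supply can only source up to its rated maximum current.
        supply_voltage = 5.0   # volts (assumed)
        supply_rating = 2.0    # amps, the maximum the supply can deliver (assumed)
        for load_resistance in (10.0, 5.0, 2.0, 1.0):   # ohms
            load_current = supply_voltage / load_resistance
            status = "OK" if load_current <= supply_rating else "exceeds the rating"
            print(f"R = {load_resistance:4.1f} ohm -> I = {load_current:.2f} A ({status})")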
     
  4. Jan 31, 2013 #3
    Hello vk6kro,

    I was just reviewing this old thread and thinking about your answer.

    So, electronic devices function with a very specific optimal voltage. The current is automatically set once the correct voltage is provided.

    If we supply too much or too little power to an electronic device, things can go wrong. Can all electronic devices function with a lower than recommended input power? Energy is still being provided, just at a slower rate.

    I still don't see why, for some devices, it is more important to know the current than the voltage...
    Can you elaborate on that again? Both voltage and current matter in delivering the suitable power.

    Thanks
    fisico30
     
  5. Jan 31, 2013 #4

    NascentOxygen


    Staff: Mentor

    Generalizations are always dangerous. :smile: :smile:

    Generalizing to "ALL" is something no one would dare. Not even someone as bold and brave as vk6kro.
     
  6. Jan 31, 2013 #5
    I think you are right....

    Maybe I need a few more specific examples to see where the distinction is important...

    thanks
    fisico30
     
  7. Jan 31, 2013 #6

    vk6kro

    Science Advisor

    Yes, it is too difficult to give a general answer to this.

    You probably know what happens as the batteries in a transistor radio go flat.
    Output sound decreases and may be distorted if you try to increase the volume.
    When you remove the old batteries, you can still measure a voltage across them, but it wasn't enough to keep the radio working well.

    The important point, though, is that the radio will still work well if you put new batteries in it.

    If you had somehow tried to run the radio from 12 volts instead of its normal 6 or 9 volts, you may have damaged it and restoring the correct voltage would not fix it.

    With rare exceptions, electronic devices are designed to work with the correct voltage applied to them, provided they have adequate cooling.
    The current drawn is a consequence of the voltage and the circuit and is usually not controlled by the user.
     
  8. Feb 1, 2013 #7
    Thanks vk6kro.

    As far as voltage conversion goes, let's say my device needs 20 V but my source is only able to provide 10 V. What can I do?
    I can surely change the source and get one that outputs 20 V.

    Or I can find what I call a "voltage converter". What is this voltage converter? Is it a passive device? Is it a small electronic component?

    thanks,
    fisico30
     
  9. Feb 1, 2013 #8

    vk6kro

    Science Advisor

    A voltage converter could be a switched mode power supply.
    This is a small circuit on a printed circuit board which uses rapid switching of current through an inductor to produce a required voltage.

    Sometimes you need to do this, but it should be a last resort. You have to be careful not to spend more on converting the voltage than it would cost to replace the high voltage device with one that worked on the lower voltage.

    One example would be where you need to use a 24 volt device (say a GPS) in a 12 volt car.
    It won't work on 12 volts, so you have to produce 24 volts somehow.

    It isn't a pleasant situation, but it is time to get clever and get the job done.
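
    As a rough illustration of how the switching sets the output voltage: an ideal boost converter in continuous conduction gives V_out = V_in / (1 - D), where D is the switching duty cycle. A minimal sketch (Python, idealized, ignoring losses):

        # Ideal boost converter relation: V_out = V_in / (1 - D)
        def boost_duty_cycle(v_in, v_out):
            """Duty cycle D needed to step v_in up to v_out (lossless, continuous conduction)."""
            return 1.0 - v_in / v_out

        print(boost_duty_cycle(12.0, 24.0))   # 0.5   -> 12 V car supply feeding a 24 V device
        print(boost_duty_cycle(5.0, 15.0))    # ~0.67 -> a 5 V source feeding a 15 V device

    A real module also regulates: it adjusts D on the fly so the output stays at the set voltage while the input varies.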
     
  10. Feb 1, 2013 #9
    Thanks!

    This is my situation: I have two solar cells connected in series to give me a certain output voltage, say 5 V (in good illumination conditions).
    I cannot add more cells to get the right voltage because of space requirements. I am limited to a max output voltage of 5 V.

    I want to power an electronic device that requires 15 V. I feel like I need to have a voltage converter to change 5V to 15 V.
    Also, the 5 V could be a fluctuating voltage. How do I ensure that whatever the output voltage of my solar cell system is, the voltage to the device is 15 V?

    I need a voltage converter/regulator, right? It needs to take up little space. Can I buy some RadioShack components?

    thanks
    fisico30
     
  11. Feb 1, 2013 #10
    The 5 V you get, is that an open circuit voltage? Or is it measured when the solar cells have a load on?
     
  12. Feb 1, 2013 #11

    russ_watters


    Staff: Mentor

    Right -- do you know what the wattage of the device and solar cells is?
     
  13. Feb 1, 2013 #12
    Well, that is the open-circuit voltage of the solar cells.

    Of course, with a resistive load attached to the cells, the voltage would go down and the current go up.
    There is a value for V and I at which the cell outputs the most power....


    But I guess that 5 V cannot be considered the max voltage that can be provided to a connected device. The voltage will be smaller and will depend on the impedance of the connected device, correct?

    Surely, if 5 V is the open-circuit voltage and the electronic device needs 10 V, then we absolutely need a voltage converter...
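
    To make that concrete, a minimal sketch (Python, with made-up I-V numbers rather than real cell data): at open circuit the current, and therefore the power, is zero, and the maximum power point sits somewhere between short circuit and open circuit.

        # Made-up (voltage, current) points for a small cell string; not measured data.
        iv_points = [
            (0.0, 1.00), (1.0, 0.99), (2.0, 0.97), (3.0, 0.93),
            (4.0, 0.80), (4.5, 0.55), (4.8, 0.25), (5.0, 0.00),
        ]
        v_mp, i_mp = max(iv_points, key=lambda vi: vi[0] * vi[1])
        print(f"Max power point: {v_mp} V, {i_mp} A -> {v_mp * i_mp:.2f} W")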

    thanks
    fisico30
     
  14. Feb 1, 2013 #13
    You guess? Look at the I-V curve of your solar cell. What's the output power at the open-circuit voltage of 5V?
     
  16. Feb 1, 2013 #15

    vk6kro

    Science Advisor

    There are DC-to-DC voltage boost modules available on eBay and other places. They are quite cheap, sometimes under $10, and very efficient.

    If your 10 volt device used 10 watts, it would need 1 amp at 10 volts.
    It would need an input of at least 2 amps at 5 volts (also 10 watts), plus whatever losses the converter introduced.

    So you can see that there is much more current needed at 5 volts than you use at 10 volts.

    These converters produce a steady output with varying input, but they do have minimum input voltage requirements. Also, the current will need to increase if the input voltage drops.
    10 watts at 3 volts is 3.3 amps, for example.

    If you only wanted 10 volts for a short time, you could charge a battery while you had good sunlight, although this adds extra costs and reduces efficiency.
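
    A quick check of those figures (a minimal Python sketch, ignoring converter losses for the moment):

        # Input current needed to deliver 10 W from different input voltages: I = P / V
        output_power = 10.0                      # watts
        for input_voltage in (10.0, 5.0, 3.0):   # volts
            print(f"{output_power:.0f} W at {input_voltage:.0f} V -> "
                  f"{output_power / input_voltage:.1f} A")
        # -> 1.0 A, 2.0 A and 3.3 A, matching the figures above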
     
  18. Feb 2, 2013 #17
    Hi vk6kro,

    you mention that

    It would need an input of at least 2 amps at 5 volts (also 10 watts), plus whatever losses the converter introduced.

    Sure, let's say the device needs 10 W to function properly. These 10 W could be
    2 A at 5 V, or 1 A at 10 V, or even 10 A at 1 V.

    But didn't you say that we are constrained both by the 10 W figure and by the voltage figure? I.e., the device may specifically tell us that it needs 10 V. With 1 V it may not work, or the 10 A current may be too high and damage the electronics.

    So recommended power and voltage are the two numbers we need to match when connecting to a source....

    Some sources are constant-current or constant-voltage sources. A set of solar cells in series is neither: its output voltage and current depend on what load is attached to it.

    Best,
    fisico30
     
  19. Feb 3, 2013 #18

    vk6kro

    Science Advisor

    A fundamental fact is that you can't get more power out of a converter (or anything else) than you put into it.
    So, if you have 5 volts as your input voltage, as you said you did, then you need at least 2 amps in to get 10 watts out.
    And the converters have losses, so more than 10 watts input would be needed.
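
    Numerically (a sketch in Python; the 85 % efficiency is an assumed figure, not one from this thread):

        # With converter losses, the input power must exceed the output power:
        # I_in = P_out / (efficiency * V_in)
        p_out = 10.0        # watts required by the device
        v_in = 5.0          # volts available from the source
        efficiency = 0.85   # assumed converter efficiency
        print(f"Input current needed: {p_out / (efficiency * v_in):.2f} A")
        # about 2.35 A, more than the ideal 2 A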

    DC-to-DC converters are sold on eBay, and the sellers give very precise information about what input and output voltages and currents are possible with each converter.

    So, if you want to use some other voltages, you would read the specifications to make a decision about each converter.
     
  20. Feb 3, 2013 #19

    sophiecentaur

    Science Advisor
    Gold Member

    To return to the OP and answer this particular question: the reason that power supplies are usually sources of voltage with a low source (internal) resistance is that they dissipate very little power internally over a massive range of supply currents. This makes them a very efficient source. Batteries tend to behave like this inherently - at least, it's relatively easy to make a battery that will give a fairly steady voltage over a range of currents. The same goes for rotary dynamos and generators.
    Consider this: if all electrical appliances / devices were designed to work from a particular current, and power supplies were required to feed them with that current, HOW would you connect multiple devices to the same power supply? They would need to be connected in SERIES, for a start. You would need to specify that they all take the same amount of current - so the more you connected in series, the greater the voltage you would need to supply, and they would all have different voltage drops (depending upon the power of each one). Also, if you didn't want any power out of your supply, you would need to SHORT CIRCUIT it, or the volts would go up and up and up as the supply 'tried' to deliver its nominal current. Removing an appliance from its socket, you would need to replace it with a shorting link so the other devices could still be powered. A proverbial nightmare and a truly upside-down world.
    The highest power devices would need to have the highest resistances (having the highest voltage) - compared with voltage-driven devices, where the highest power device has the lowest resistance (taking the highest current).
    There are some components that need to be supplied with a particular current, of course, but they are exceptions and need specially designed circuitry.
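
    To put a number on the "low internal resistance" argument, here is a small sketch (Python, with assumed values): a stiff voltage source wastes very little of the power it delivers in its own internal resistance, even as the load current varies widely.

        # Power wasted inside a voltage source: P_loss = I**2 * R_internal
        emf = 12.0          # volts, the source's open-circuit voltage (assumed example value)
        r_internal = 0.05   # ohms, a "stiff" supply (assumed)
        for load_current in (0.1, 1.0, 5.0, 10.0):   # amps drawn by the load
            terminal_voltage = emf - load_current * r_internal
            p_delivered = terminal_voltage * load_current
            p_lost = load_current ** 2 * r_internal
            print(f"I = {load_current:5.1f} A: delivered {p_delivered:7.2f} W, "
                  f"lost inside the supply {p_lost:.3f} W")

    Note that this is an efficiency argument, not the maximum power transfer theorem: matching the load resistance to the source resistance maximises the power delivered to that particular load, but only at 50 % efficiency, which is not what a power supply is designed for.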
     
  21. Feb 4, 2013 #20
    Hello Sophiecentaur

    The reason that power supplies are usually sources of voltage with a low source (internal) resistance is that they dissipate very little power internally over a massive range of supply currents.

    OK, I see that: the power source does not waste much power internally if the internal resistance is close to zero. That does not ensure that the most power possible goes to the load (maximum power transfer theorem, correct?).


    In the case of a battery, the voltage is rather constant and the current can change depending on the load resistance. That is why a battery is a constant voltage source and not a constant current source.

    Electrical devices need electrical power to function. Power is simply energy per unit time.
    Power is delivered only if both a nonzero current and a nonzero voltage are present at the device. The current seems to be determined by the powered device, while the voltage is set by the power source... is that a correct general statement?

    For instance, in the case of a single solar cell at maximum illumination connected to a certain load, both the voltage and the current coming out of the solar cell depend on the cell's internal resistance and on the device's impedance, correct?

    A solar cell, under ideal illumination, reaches a maximum open-circuit voltage (just as a battery has a fixed voltage). The current that can be drawn depends on how much light is incident on the cell: the more light, the higher the maximum drawable current... correct?
    But it all depends on the load, correct?

    thanks
    fisico30
     