
Power supplies, voltage and current

  1. Jul 7, 2011 #1
    One power supply can go up to 30 volts or 1 amp, and the other can go up to 18 volts or 3 amps:

    https://www.amazon.com/Extech-Power...?s=industrial&ie=UTF8&qid=1310083681&sr=1-26

    https://www.amazon.com/Extech-Power...?s=industrial&ie=UTF8&qid=1310083681&sr=1-25

    Why does one put out a higher voltage but a lower current, while the other puts out a lower voltage but a higher current? Since V = IR, I would think the one that can put out the higher voltage in voltage mode would also be able to put out the higher current in current mode (since voltage and current are proportional)...

    PS Maybe I misunderstand how these power supplies work and my physics understanding is fine. The way I understand these power supplies to work is that one either picks voltage or current mode. In voltage mode it simply outputs the voltage you ask for. In current mode it senses the current with the given resistance and increases or decreases the voltage to achieve the current you asked for. If my understanding of the power supplies is wrong (rather than my understanding of the physics) then please correct me.

    Thanks!
     
    Last edited by a moderator: Apr 26, 2017
  3. Jul 7, 2011 #2

    rcgldr

    Homework Helper

    It's due to the design of the components used in the power supplies, as opposed to some law of physics.
     
  4. Jul 7, 2011 #3

    Drakkith


    Staff: Mentor

    The difference is in the power rating. If you have a power supply which can deliver a max of 100 watts, it could deliver, say, 10 volts and 10 amps. However, if you wanted more voltage you would have to drop the current, such as 20 volts and 5 amps. I have a high-voltage supply that delivers 40,000 volts but only about 10 mA, so the power is 400 watts. Decreasing the voltage to 20,000 would let me get 20 mA unless the current is limited.

    I'm assuming they are power limited so as not to burn out the power input or the outlet that the power supply gets its electricity from. If it weren't power limited, you could double the current drawn without realizing it and start a fire or destroy the power supply.
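
    To put rough numbers on that power budget, here is a minimal Python sketch (the 100 W figure is just the example above; real supplies also have separate voltage and current limits):

    Code (Python):
        # At a fixed power budget, raising the output voltage lowers the
        # current that can be delivered, since P = V * I.
        def max_current(power_limit_w, voltage_v):
            """Largest current (A) available at this voltage under the power cap."""
            return power_limit_w / voltage_v

        for volts in (10, 20, 40):
            amps = max_current(100.0, volts)      # 100 W budget, as in the example above
            print(f"{volts:>3} V -> at most {amps:.1f} A")
        # prints: 10 V -> 10.0 A, 20 V -> 5.0 A, 40 V -> 2.5 A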
     
  5. Jul 7, 2011 #4

    jambaugh

    Science Advisor
    Gold Member

    In designing a power supply, among the other issues of stability, efficiency, and so on, the three quantities power, voltage, and current are related by:

    Power = Voltage * Current.

    Thus at a given power level, the higher the voltage, the lower the current, and vice versa. Generally the power rating is a matter of size, thermal limits, and component quality.

    Transformers can trade off AC voltage against current at a given power level to match the application for which the power supply is to be used (the output is then rectified and regulated if DC is needed).

    So, to answer your question: typically a power supply is designed to deliver up to a certain amount of power at a given voltage matched to its application. That then dictates the maximum current by the relation above.
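
    To make that transformer trade-off concrete, here is a small Python sketch of an ideal (lossless) transformer; the 120 V input and turns-ratio numbers are just illustrative:

    Code (Python):
        # Ideal transformer: voltage scales with the turns ratio, current scales
        # inversely, so power out equals power in (P = V * I on both sides).
        def ideal_transformer(v_primary, i_primary, turns_ratio):
            """turns_ratio = N_secondary / N_primary."""
            return v_primary * turns_ratio, i_primary / turns_ratio

        v_s, i_s = ideal_transformer(120.0, 1.5, 0.15)   # step 120 V down to 18 V
        print(v_s, i_s, v_s * i_s)                       # 18.0 V, 10.0 A, still 180 W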
     
  6. Jul 8, 2011 #5
    Thanks for all the replies! So let's say I purchase both power supplies and a 10 ohm resistor. First I attach the 30 Volt 1 amp supply and turn the voltage up to 30. V=IR. Does this mean that the supply that can supposedly output only 1 amp is now outputting 3?

    Similarly, imagine that I disconnect that power supply from the resistor and attach the 18 V 3 A one to the resistor. Then I turn the current up to 3 A. V=IR. Does this mean that the supply that can supposedly output only 18 volts is now outputting 30?

    What am I missing here? Thanks for your time and help!
     
  7. Jul 8, 2011 #6

    jambaugh

    Science Advisor
    Gold Member

    No. The power supply is not an ideal voltage source (unlimited current). Rather, it has an internal resistance. Given the 1 amp rating at that voltage, it may be as much as 30 ohms. So connecting a 10 ohm resistor you'd have 40 ohms total circuit resistance and draw 0.75 amps. But a good supply has as little internal resistance as is feasible, to prevent the output voltage dropping under load.

    It is more likely that the internal resistance is small and the power supply will indeed output more than 1 amp for a short while (not quite 3, since there is still some internal resistance).

    If it is well designed then a circuit breaker will kick in to keep you from destroying the supply.

    If less well designed, a fuse will blow and you'll need to replace it but it will protect the supply.

    If even less well designed (unlikely but possible) it will die horribly in a cloud of acrid smoke with a slight possibility of flames as well.

    No, you won't be able to "turn the current up to 3 A"; you will only get 18 V / 10 ohms = 1.8 amps of output. That's the ideal case. In practice you'll likely get less:
    Current = 18 V / (10 + internal resistance) ohms = less than 1.8 amps.

    [EDIT] It looks like the power supply in your link may have a constant-current mode, which means you can set the current to any value up to the max current at max output voltage. What a constant-current supply does is basically lower the voltage from the max value until the current drops to the set value. While in constant-current mode, if you connect a load with too high a resistance and set the current too high, the supply will not be able to output enough voltage to meet the setting, and the actual current will be less than the setting. If you connect a smaller resistance, the output voltage will drop to meet the current value you set. You won't be able to set it higher than the max rated value. [End edit]

    In short, a (DC) power supply can be modeled by a voltage source in series with a low value resistor representing internal resistance along with a circuit breaker to prevent overload.
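
    As a sketch of that model in Python (the 0.5 ohm internal resistance is an assumed value, not a spec of either supply):

    Code (Python):
        # Toy model of a bench supply: an ideal source with a set voltage, a small
        # internal resistance, and a current limit that folds the output voltage
        # back (constant-current mode) when the load resistance is too small.
        def supply_output(v_set, i_limit, r_internal, r_load):
            i_cv = v_set / (r_internal + r_load)        # current if voltage mode held
            if i_cv <= i_limit:
                return v_set - i_cv * r_internal, i_cv  # constant-voltage mode
            return i_limit * r_load, i_limit            # constant-current mode

        print(supply_output(30.0, 1.0, 0.5, 10.0))  # 30 V / 1 A supply, 10 ohm load -> (10.0 V, 1.0 A)
        print(supply_output(18.0, 3.0, 0.5, 10.0))  # 18 V / 3 A supply, 10 ohm load -> (~17.1 V, ~1.7 A)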
     
    Last edited: Jul 8, 2011
  8. Jul 8, 2011 #7
    I have seen blenders that can draw 1000 watts. Assuming AC mains is 100 V for simplicity, that would mean the internal resistance should be 5 ohms, and the 1000 watts is attained when the internal resistance is matched by the blender.

    So why is the power supply's internal resistance as high as 30 ohms? Can't they do better and get it down to 5 ohms?

    Also, could you quadruple the resistance of a toaster and use a step-up transformer to double the mains voltage, or quarter the resistance of a toaster and use a step-down transformer to halve the mains voltage, and still get the same quality toast as leaving the toaster alone? I guess what I'm trying to get at is: why is mains voltage 120 V? Is that just a number they took out of a hat?
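
    For the toaster question, here is a quick Python check of that scaling, treating the toaster as a plain resistor (the 1000 W / 120 V figures are just examples):

    Code (Python):
        # Heating power of a resistive element: P = V^2 / R.
        def heating_power(voltage, resistance):
            return voltage ** 2 / resistance

        r = 120 ** 2 / 1000.0                    # 14.4 ohms: a 1000 W toaster at 120 V
        print(heating_power(120, r))             # 1000.0 W
        print(heating_power(240, 4 * r))         # double the voltage, 4x the resistance -> 1000.0 W
        print(heating_power(60, r / 4))          # halve the voltage, 1/4 the resistance -> 1000.0 W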
     
  9. Jul 8, 2011 #8

    Drakkith


    Staff: Mentor

    From here: http://en.wikipedia.org/wiki/Household_voltage

     
  10. Jul 8, 2011 #9
    That's interesting. It seems to me higher voltages would be better, to reduce losses in the wiring leading up to the load (this is the reason power is transmitted at high voltage and low current), which would let you use thinner wires to save on cost without fearing you'll overheat them. Higher voltages would, however, be less safe.

    I find the incandescent light bulb to be inherently unsafe. A finger can fit into the socket and short hot and neutral. In fact, the reason plugs in the U.S. are polarized is because of the incandescent light bulb! You want to make sure that the threads on the screw of the light bulb touch neutral (because that's the part you touch), and that the very bottom tip touches hot, so it's very important that the plug on your lamp knows which outlet hole is neutral and which is hot. But the weird thing is that some countries don't have polarized plugs! Do they have incandescent light bulbs? I would be scared to change a light bulb in those countries - I would need to make sure I wasn't grounded. Anyway, the reason plugs have polarization is historical! Also, this kind of historical preservation is the reason why a CPU is so unfathomably complex.
     
  11. Jul 8, 2011 #10

    Drakkith


    Staff: Mentor

    I don't really understand the hot and neutral wires in an AC circuit. Or rather, what people say about them. In an AC circuit, isn't the neutral wire just as "hot" as the hot wire? I understand the polarization and why it's important, but I hear a lot of people say that the neutral wire isn't hot.

    Edit: Neutral or Return line or whatever.
     
  12. Jul 8, 2011 #11
    You have two wires coming out of a generator, wire A and wire B. Wire B you split down the middle so that it looks like a Y: that is, wire B branches into two wires, B1 and B2. Now you connect B2 to ground, literally. That is, you drive a huge metal stake into the ground and attach wire B2 to it. Then you only have wire B1 and wire A to attach to the load.
    Now wire A is called hot, wire B1 is called neutral, and wire B2 is called the ground wire. It is safe to touch any one wire by itself: A, B1, or B2. However, if you touch A and B1 at the same time, you complete the circuit and you're shocked. If you touch A and B2 at the same time, you complete the circuit and you're shocked. The key is that B2 is not just the second branch of B, or the metal stake, but the entire earth! The earth is a fairly good conductor, so B2 is effectively the second branch of B, the metal rod in the earth, and the earth itself, since they are all touching and are all good conductors. So if you touch A and the ground at the same time, that is the same as touching A and B2 at the same time, and you get shocked. The only safe pair to touch at the same time is B1 and B2, since they are at the same potential, both being connected to B. So you can be touching the earth and holding the neutral wire at the same time. Basically, the terminology reflects the danger. Of course, you need to be careful, because someone might have gotten the wiring wrong, so that what you think is neutral is actually hot; if you then touch hot and a piece of plumbing that goes to the earth, you complete the circuit from the generator, through A, through you, to B2, to B, and back to the generator.

    I recommend this website to get a picture:

    http://www.allaboutcircuits.com/vol_1/chpt_3/3.html

    The neutral wire is the same as the hot wire; the only difference is that the neutral wire is also connected to ground. That makes a huge difference with regard to safety.
     
  13. Jul 8, 2011 #12

    Drakkith


    Staff: Mentor

    Alright, that's pretty much how I thought it worked.
     
  14. Jul 10, 2011 #13
    Anyone, how does the current from a small PV inverter get to the grid?
     
  15. Jul 10, 2011 #14

    Drakkith


    Staff: Mentor

    It converts power from the solar panels or batteries into AC at a frequency and phase matched to the mains power.
     
  16. Jul 10, 2011 #15
    Drakkith, any idea how the weak output current from a small PV inverter can get past the step-down transformer (7200 V to 240 V) outside the residence and onto the grid?
     
  17. Jul 10, 2011 #16
    How does it push past the incoming 7200 V?
     
  18. Jul 10, 2011 #17

    Drakkith


    Staff: Mentor

    As far as I know, which isn't much, the current doesn't matter. If there is 0.1 amps getting to the inverter, it will be converted to 7200 V as well.

    In AC mains there isn't "incoming" power; the flow alternates direction every half cycle. In my house the 60 Hz frequency means that the current comes in from one wire, reverses to the other one, and then goes back to the first one sixty times a second. Same thing for the transformer outside.
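
    Just to illustrate that alternation (this says nothing about how a real grid-tie inverter synchronises), here is a 240 V, 60 Hz mains waveform sampled in Python:

    Code (Python):
        import math

        # 60 Hz mains: the instantaneous voltage changes sign every half cycle,
        # completing 60 full back-and-forth cycles each second.
        V_PEAK = 240 * math.sqrt(2)              # peak of a 240 V RMS supply

        def mains_voltage(t_seconds, freq_hz=60.0):
            return V_PEAK * math.sin(2 * math.pi * freq_hz * t_seconds)

        for t in (0.000, 0.004, 0.008, 0.012, 0.016):   # samples across one 1/60 s cycle
            print(f"t = {t:.3f} s   v = {mains_voltage(t):+7.1f} V")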
     
  19. Jul 10, 2011 #18
    "sorry if this is a stupid question, I am writing a research paper and my knowledge of electricity is very limited, I have searched high and low for real answers even from doe and have been redirected everywhere. I understand that these inverters shut off if there is no incoming power"
     
  20. Jul 10, 2011 #19
    Thanks for your time!
     
  21. Jul 10, 2011 #20

    Drakkith


    Staff: Mentor

    I believe that is correct. If there is no incoming power from the mains lines, then it's just a waste of power to have the inverter on and doing nothing, as it still draws a small amount of power just to be turned on. What are you writing your paper on? Have you picked up a basic electronics book or something similar? Or are you interested less in the technical how and more in the results?
     