# Power supplies, voltage and current

Jaevko
One power supply can go up to 30 volts or 1 amp, and the other can go up to 18 volts or 3 amps:

https://www.amazon.com/dp/B004M3XMC4/?tag=pfamazon01-20

https://www.amazon.com/dp/B004M3ROVO/?tag=pfamazon01-20

Why does one output higher voltage or lower current and the other output lower voltage or higher current? Since V = IR I would think the one that could put out a higher voltage on voltage mode would be able to put out a higher current on current mode (since voltage and current are proportional)...

PS Maybe I misunderstand how these power supplies work and my physics understanding is fine. The way I understand these power supplies to work is that one either picks voltage or current mode. In voltage mode it simply outputs the voltage you ask for. In current mode it senses the current with the given resistance and increases or decreases the voltage to achieve the current you asked for. If my understanding of the power supplies is wrong (rather than my understanding of the physics) then please correct me.

Thanks!

Jaevko said:
Why does one output higher voltage or lower current and the other output lower voltage or higher current?
It's due to the design of the components used in the power supplies, as opposed to some law of physics.

The difference is in the power usage. If you have a power supply which can deliver a max of 100 watts, it could deliver say 10 volts and 10 amps. However if you wanted to get more voltage you would have to drop the current. Such as 20 volts and 5 amps. I have a high voltage one that delivers 40,000 volts but only about 10 mA, so the power is 400 watts. Decreasing the voltage to 20,000 would let me get 20 mA unless the current is limited.
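The power-budget tradeoff described above is easy to check numerically. This sketch just evaluates P = V × I for the figures quoted in the post (the 100 W and 400 W supplies):

```python
# At a fixed power limit, raising the voltage forces the available
# current down, since P = V * I.

def max_current(power_watts, volts):
    """Maximum current available at a given voltage under a power cap."""
    return power_watts / volts

# The 100 W example from the post:
print(max_current(100, 10))      # 10 A at 10 V
print(max_current(100, 20))      # 5 A at 20 V

# The high-voltage supply: 400 W total
print(max_current(400, 40_000))  # 0.01 A, i.e. 10 mA
print(max_current(400, 20_000))  # 0.02 A, i.e. 20 mA
```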

I'm assuming they are power limited due to not wanting to burn out your power input or the outlet that the power supply gets its electricity from. If it weren't power limited, you could double the current you pull without realizing it and start a fire or destroy the power supply.

In designing a power supply among the other issues of stability and efficiency and such the three quantities, power, voltage, and current are interrelated by the relationship:

Power = Voltage * Current.

Thus at a given power level, the higher the voltage, the lower the current, and vice versa. Generally the power rating is a matter of size, thermal issues, and component quality.

Transformers are able to trade off AC voltage vs current at a given power level to match the application for which the power supply is to be used (the output is then rectified and stabilized if a DC voltage is needed).

So to answer your question typically a power supply is designed to supply up to a certain amount of power at a given voltage matched to its application. This then dictates the current by the above relation.

Thanks for all the replies! So let's say I purchase both power supplies and a 10 ohm resistor. First I attach the 30 Volt 1 amp supply and turn the voltage up to 30. V=IR. Does this mean that the supply that can supposedly output only 1 amp is now outputting 3?

Similarly, imagine that I disconnect that power supply from the resistor and attach the 18V 3A one to the resistor. Then I turn the current up to 3A. V=IR. Does this mean that the supply that can supposedly output only 18 volts is now outputting 30?

What am I missing here? Thanks for your time and help!

Jaevko said:
Thanks for all the replies! So let's say I purchase both power supplies and a 10 ohm resistor. First I attach the 30 Volt 1 amp supply and turn the voltage up to 30. V=IR. Does this mean that the supply that can supposedly output only 1 amp is now outputting 3?

No. The power supply is not an ideal voltage source (unlimited current). Rather, it will have an internal resistance. Given the 1 amp rating at that voltage, it may be as much as 30 ohms. So connecting a 10 ohm resistor you'll have 40 ohms of total circuit resistance and draw 0.75 amps. But a good supply has as little internal resistance as is feasible, to prevent the output voltage dropping under load.

It is more likely that the internal resistance is small and the power supply will indeed output more than 1 amp for a short while (not quite 3, since there is still some internal resistance).

If it is well designed, then a circuit breaker will kick in to keep you from destroying the supply.

If less well designed, a fuse will blow and you'll need to replace it but it will protect the supply.

If even less well designed (unlikely but possible) it will die horribly in a cloud of acrid smoke with a slight possibility of flames as well.

Similarly, imagine that I disconnect that power supply from the resistor and attach the 18V 3A one to the resistor. Then I turn the current up to 3A. V=IR. Does this mean that the supply that can supposedly output only 18 volts is now outputting 30?
No, you won't be able to "turn the current up to 3A" but rather will only get the 18V/10ohms = 1.8 amp output. That's ideally. In fact you'll likely get less via:
Current = 18V/(10+internal resistance) ohms = less than 1.8 amps.
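The two cases above (a hypothetical 30-ohm internal resistance, and an ideal source) can be sketched as a quick calculation; the 30-ohm figure is the illustrative one from the post, not a real spec:

```python
# A real (non-ideal) voltage source is modeled as an ideal source V
# in series with an internal resistance R_int, so the load current is
# I = V / (R_int + R_load).

def load_current(v_source, r_internal, r_load):
    return v_source / (r_internal + r_load)

# 30 V source with a hypothetical 30-ohm internal resistance
# driving a 10-ohm load: 30 / 40 = 0.75 A, as in the post.
print(load_current(30, 30, 10))   # 0.75

# 18 V source treated as ideal (zero internal resistance)
# into 10 ohms: 1.8 A.
print(load_current(18, 0, 10))    # 1.8
```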

[EDIT] It looks like the power supply in your link may have a constant current mode which means you will be able to set the current to any value up to the max current at max output voltage. What a constant current supply does is basically lower the voltage from the max value until the current drops to the set value. While in constant current mode if you connect a load with too high a resistance and set the current setting too high then it will not be able to output enough voltage to meet the setting and the actual current will be less than the setting. If you connect a smaller resistance load then the output voltage will drop to meet the current value you set. You won't be able to set it higher than the max rated value.[End Edit]
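The constant-current behavior described in the edit can be modeled as follows: into a resistive load, the supply obeys whichever of the two limits (voltage setting or current setting) it hits first. This is only a sketch of that logic, not any particular supply's firmware:

```python
# Bench-supply model with a voltage setting and a current setting.
# With a resistive load, the supply runs in constant-voltage (CV) mode
# if the resulting current stays under the limit; otherwise it lowers
# the output voltage until the current equals the limit (CC mode).

def supply_output(v_set, i_set, r_load):
    """Return (voltage, current, mode) across a resistive load."""
    i_at_vset = v_set / r_load
    if i_at_vset <= i_set:
        return v_set, i_at_vset, "CV"   # voltage limit reached first
    return i_set * r_load, i_set, "CC"  # current limit reached first

# 18 V / 3 A supply into 10 ohms: 1.8 A < 3 A, so it stays in CV mode.
print(supply_output(18, 3, 10))   # (18, 1.8, 'CV')

# Same supply into 2 ohms: 9 A would exceed the limit, so the voltage
# folds back to 6 V and the supply holds 3 A (CC mode).
print(supply_output(18, 3, 2))    # (6, 3, 'CC')
```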

In short, a (DC) power supply can be modeled by a voltage source in series with a low value resistor representing internal resistance along with a circuit breaker to prevent overload.

jambaugh said:
No. The power supply is not an ideal voltage source (unlimited current). Rather, it will have an internal resistance. Given the 1 amp rating at that voltage, it may be as much as 30 ohms. So connecting a 10 ohm resistor you'll have 40 ohms of total circuit resistance and draw 0.75 amps. But a good supply has as little internal resistance as is feasible, to prevent the output voltage dropping under load.

It is more likely that the internal resistance is small and the power supply will indeed output more than 1 amp for a short while (not quite 3, since there is still some internal resistance).

I have seen blenders that can get 1000 Watts. Assuming AC mains is 100 V for simplicity, that would mean that the internal resistance should be 5 Ohms, and the 1000 Watts is attained when the internal resistance is matched by the blender.

So why is the power supply as high as 30 ohms? Can't they do better and get 5 ohms?

Also, could you quadruple the resistance of a toaster, and use a step up transformer to double the mains voltage, or quarter the resistance of a toaster and use a step down transformer to halve the mains voltage, and still get the same quality toast as leaving the toaster alone? I guess what I'm trying to get at is why is mains voltage 120 V. Is that just a number they took out of a hat?

RedX said:
I have seen blenders that can get 1000 Watts. Assuming AC mains is 100 V for simplicity, that would mean that the internal resistance should be 5 Ohms, and the 1000 Watts is attained when the internal resistance is matched by the blender.

So why is the power supply as high as 30 ohms? Can't they do better and get 5 ohms?

Also, could you quadruple the resistance of a toaster, and use a step up transformer to double the mains voltage, or quarter the resistance of a toaster and use a step down transformer to halve the mains voltage, and still get the same quality toast as leaving the toaster alone? I guess what I'm trying to get at is why is mains voltage 120 V. Is that just a number they took out of a hat?

From here: http://en.wikipedia.org/wiki/Household_voltage

The choice of utilization voltage is due more to historical reasons than optimization of the distribution system—once a voltage is in use and equipment using this voltage is widespread, changing voltage is a drastic and expensive undertaking.

Drakkith said:
From here: http://en.wikipedia.org/wiki/Household_voltage

The choice of utilization voltage is due more to historical reasons than optimization of the distribution system—once a voltage is in use and equipment using this voltage is widespread, changing voltage is a drastic and expensive undertaking.

That's interesting. It seems to me high voltages would be better, to reduce losses from wiring leading up to the load (this is the reason why power is transmitted at high voltages, low current) which would allow you to use thinner wires to save on costs without fearing you'll overheat the wires if they're too thin. Higher voltages would however be less safe.
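The "transmit at high voltage, low current" argument above comes from the fact that wiring loss is I²R. A quick numeric sketch (the 0.5-ohm wire resistance and 1200 W load are made-up values for illustration):

```python
# Line loss in the wiring is P_loss = I^2 * R_wire. Delivering the
# same power at a higher voltage means a lower current, and the loss
# falls with the square of that current.

def line_loss(power_delivered, volts, r_wire):
    i = power_delivered / volts
    return i ** 2 * r_wire

# Deliver 1200 W through a hypothetical 0.5-ohm run of wire:
print(line_loss(1200, 120, 0.5))  # 50.0 W lost at 120 V (10 A)
print(line_loss(1200, 240, 0.5))  # 12.5 W lost at 240 V (5 A)
```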

I find the incandescent light bulb to be inherently unsafe. A finger can fit into the socket and short hot and neutral. In fact, the reason plugs in the U.S. are polarized is the incandescent light bulb! You want the threaded screw shell of the bulb to touch neutral (because that's the part you touch), and the very bottom tip to touch hot, so it's important that the plug to your lamp knows which outlet hole is neutral and which is hot. But the weird thing is that some countries don't have polarized plugs! Do they have incandescent light bulbs? I would be scared to change a light bulb in those countries; I would need to make sure I wasn't grounded. Anyway, the reason plugs are polarized is historical! This kind of historical preservation is also the reason a modern CPU is so unfathomable.

I don't really understand the hot and neutral wires in an AC circuit. Or rather, what people say about them. In an AC circuit, isn't the neutral wire just as "hot" as the hot wire? I understand the polarization and why it's important, but I hear a lot of people say that the neutral wire isn't hot.

Edit: Neutral or Return line or whatever.

Drakkith said:
I don't really understand the hot and neutral wires in an AC circuit. Or rather, what people say about them. In an AC circuit, isn't the neutral wire just as "hot" as the hot wire? I understand the polarization and why it's important, but I hear a lot of people say that the neutral wire isn't hot.

Edit: Neutral or Return line or whatever.

You have two wires coming out of a generator, wire A and wire B. With wire B, you split it down the middle so that it looks like a Y: that is, wire B branches into two wires, B1 and B2. Now you connect B2 to ground, literally: you drive a huge metallic stake into the ground and attach wire B2 to it. Then you only have wire B1 and wire A to attach to the load.

Now wire A is called hot, wire B1 is called neutral, and wire B2 is called the ground wire. It is safe to touch any one of A, B1, or B2 alone. However, if you touch A and B1 at the same time, you complete the circuit and you're shocked; the same happens if you touch A and B2 at the same time.

The key is that B2 is not just the second branch of B, or the metal stake, but the entire earth. The Earth is a fairly good conductor, so B2 is effectively the second branch of B, the metal rod, and the earth itself, since they are all touching and are all decent conductors. So if you touch A and the ground at the same time, that is the same as touching A and B2, and you get shocked. The only safe pair to touch together is B1 and B2, since they are at the same potential (both connected to B). So you can be touching the earth and hold the neutral wire at the same time. The terminology reflects the danger.

Of course, you need to be careful, because someone might have gotten the wiring wrong, so what you think is neutral is actually hot; if you touch hot and a piece of plumbing that goes to the earth, you complete the circuit from the generator, through A, through the earth to B2, to B, and back to the generator.

I recommend this website to get a picture:

The neutral wire is the same as the hot wire, the only difference being that the neutral wire is also connected to the ground. That makes a huge difference with regards to safety.

Alright, that's pretty much how I thought it worked.

Anyone, how does the current from a small pv inverter get to the grid?

David Morrow said:
Anyone, how does the current from a small pv inverter get to the grid?

It converts power from the solar panels or batteries into AC at the mains frequency, in phase with the mains power.

Drakkith, any idea how the weak output current from a small pv inverter can get past the step down transformer(7200V to 240V) outside the residence and onto the grid?

Drakkith said:
It converts power from the solar panels or batteries into AC at the mains frequency, in phase with the mains power.

How does it push past the incoming 7200V

David Morrow said:
Drakkith, any idea how the weak output current from a small pv inverter can get past the step down transformer(7200V to 240V) outside the residence and onto the grid?

As far as I know, which isn't much, the current doesn't matter. If there is 0.1 amps flowing from the inverter into the transformer, it will be stepped up to the 7200 V side all the same.

David Morrow said:
How does it push past the incoming 7200V

In AC mains there isn't "incoming" power; the flow alternates direction every half cycle. In my house the 60 Hz frequency means that the current comes in from one wire, reverses to the other one, and then goes back to the first one sixty times a second. Same thing for the transformer outside.

David Morrow said:
Drakkith, any idea how the weak output current from a small pv inverter can get past the step down transformer(7200V to 240V) outside the residence and onto the grid?
"sorry if this is a stupid question, I am writing a research paper and my knowledge of electricity is very limited, I have searched high and low for real answers even from doe and have been redirected everywhere. I understand that these inverters shut off if there is no incoming power"

Drakkith said:
As far as I know, which isn't much, the current doesn't matter. If there is 0.1 amps flowing from the inverter into the transformer, it will be stepped up to the 7200 V side all the same.

In AC mains there isn't "incoming" power; the flow alternates direction every half cycle. In my house the 60 Hz frequency means that the current comes in from one wire, reverses to the other one, and then goes back to the first one sixty times a second. Same thing for the transformer outside.

David Morrow said:
"sorry if this is a stupid question, I am writing a research paper and my knowledge of electricity is very limited, I have searched high and low for real answers even from doe and have been redirected everywhere. I understand that these inverters shut off if there is no incoming power"

I believe that is correct. If there is no incoming power from the mains lines, then it's just a waste of power to have the inverter on and doing nothing, as it still draws a small amount of power just to be turned on. What are you writing your paper on? Have you picked up a basic electronics book or something similar? Or are you interested less in the technical how, and more in the results?

David Morrow said:
Drakkith, any idea how the weak output current from a small pv inverter can get past the step down transformer(7200V to 240V) outside the residence and onto the grid?

David Morrow said:
How does it push past the incoming 7200V

I would add that the only difference between a "step up" and a "step down" transformer is in how it is intended to be used. The same transformer will step up, or step down depending on which way the power is flowing, and with similar efficiency both ways.

Now resonant transformers (Tesla coils) and pulse transformers (such as the trigger coil in a xenon strobe) may be exceptions, but common power transformers are bidirectional.

In short, the same transformer that steps 7200V down to 240V for the household will, if fed from the house side, step 240V up to 7200V.
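For an ideal transformer, the voltage scales with the turns ratio and (ignoring losses) the current scales inversely, which is why the same device works in either direction. A minimal sketch, using the 7200 V to 240 V pole transformer from the thread (a 30:1 ratio):

```python
# Ideal transformer: V_sec / V_pri = N_sec / N_pri, and since power
# in equals power out (no losses), the current scales inversely.

def transform(v_in, i_in, turns_ratio):
    """turns_ratio = N_secondary / N_primary."""
    return v_in * turns_ratio, i_in / turns_ratio

# Stepping down: 7200 V at 1 A through a 1:30 ratio.
print(transform(7200, 1.0, 1 / 30))  # roughly (240.0, 30.0)

# Fed backwards from the house side, the same ratio steps 240 V up.
print(transform(240, 30.0, 30))      # (7200, 1.0)
```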

## 1. What is the difference between voltage and current?

Voltage is the measure of electrical potential difference between two points, while current is the flow of electric charge through a circuit. In simple terms, voltage is the force that pushes the current through a circuit.

## 2. How do I choose the right power supply for my device?

The first step is to determine the voltage and current requirements of your device. Then, choose a power supply with the same voltage and equal or higher current rating. It is important to also consider the power supply's efficiency, reliability, and safety certifications.

## 3. Can I use a power supply with a higher voltage or current rating?

Using a power supply with a higher voltage than your device requires is not recommended; it can damage the device and pose a safety hazard. A higher current rating, on the other hand, is fine: the rating is a maximum, and the device will only draw the current it needs.

## 4. What is the difference between AC and DC power supplies?

AC (alternating current) power supplies provide a constantly changing voltage and are used for household electricity. DC (direct current) power supplies provide a constant voltage and are used for electronic devices. Most electronic devices require a DC power supply.

## 5. How do I convert AC voltage to DC voltage?

This can be done using a rectifier, which is a circuit that converts AC voltage to DC voltage. It typically consists of diodes that allow current to flow in only one direction, resulting in a pulsating DC voltage. This can then be smoothed using a capacitor to provide a more constant DC voltage.
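The leftover ripple after the smoothing capacitor can be roughly estimated as ΔV ≈ I / (f·C), where f is the ripple frequency (twice the mains frequency for a full-wave rectifier). A sketch with made-up example values:

```python
# Rough ripple estimate for a full-wave rectifier with a smoothing
# capacitor: the cap discharges into the load between charging peaks,
# so delta_V ~= I_load / (f_ripple * C), with f_ripple twice the
# mains frequency for a full-wave bridge.

def ripple_voltage(i_load, mains_hz, cap_farads):
    f_ripple = 2 * mains_hz  # full-wave: two charging peaks per cycle
    return i_load / (f_ripple * cap_farads)

# Example (illustrative values): 1 A load, 60 Hz mains, 4700 uF cap.
print(ripple_voltage(1.0, 60, 4700e-6))  # ~1.77 V of ripple
```

A bigger capacitor or a lighter load reduces the ripple proportionally, which is why well-filtered supplies use large electrolytics or an active regulator after the filter.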
