Wall Adapter Question: How is 1A Output Possible?

  • Thread starter: Sir Physics
  • Tags: Adapter, Wall
AI Thread Summary
Wall adapters can carry low current ratings, such as 1 amp, because they only supply the current that the connected device actually demands, not the maximum capacity of the outlet. An outlet may be rated for 20 amps, but the current actually drawn depends on the load, which can vary widely. In AC-to-DC conversion, voltage can be lowered without requiring an increase in current, because the load dictates the current drawn. Power, voltage, and current are related by P = IV: for a given power, current and voltage trade off, but it is the device's needs that set the actual draw. Understanding this clarifies why devices like phone chargers can operate at current draws far below the outlet's rating.
Sir Physics
Hey guys and gals - I was constructing a circuit yesterday and a question came to mind. Power from an outlet is about 120 VAC at a 20 amp rating, as most of you probably know. So if current MUST increase when voltage decreases, how can there be wall adapters that only put out 1 amp, when it should be a lot more?

Thanks!
 
The outlet may be rated for 20 amps, but that's not what will actually be drawn. The rating is just the most the outlet can handle safely. If your house is wired correctly, the outlet is connected to a breaker rated at 20 amps or less, so the breaker will trip before more current than the outlet's rating can flow through it.


Sir Physics said:
So if current MUST increase when voltage decreases how are there wall adapters that only put out 1 amp when it should be a lot more?
Thanks!

What you are talking about are maximums. The wall adapter is rated for 1 amp, but whatever connects to it may only draw 200 mA. If you shorted the adapter, it would draw much more than 1 amp before it burned out. And if the fault drew up to 2400 W (120 VAC × 20 A = 2400 W), it would trip the breaker too.

AC-to-DC converters typically lower the voltage in addition to rectifying the AC, but that does not mean they must increase the current. The current depends on what the connected device draws.
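To make "the load dictates the current" concrete, here is a minimal sketch using Ohm's law for a simple resistive load on an idealized DC adapter. The 5 V and 25 Ω values are illustrative assumptions, not figures from the thread:

```python
# Sketch: the load, not the supply's rating, determines the current drawn.
# Assumes an ideal 5 V DC adapter and a purely resistive load
# (illustrative values only).

def current_drawn(v_supply, r_load):
    """Ohm's law: I = V / R. The supply fixes V; the load's resistance fixes I."""
    return v_supply / r_load

ADAPTER_RATING_A = 1.0  # the "1 A" on the label is a maximum, not a demand

# A 25-ohm load on a 5 V adapter draws only 0.2 A (200 mA),
# well under the adapter's 1 A rating.
i = current_drawn(5.0, 25.0)
print(f"Load draws {i:.1f} A (adapter rated for {ADAPTER_RATING_A:.0f} A)")
```

A smaller load resistance would draw more current; only if the demanded current exceeded the rating would the adapter be overloaded.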
 
Sir Physics said:
So if current MUST increase when voltage decreases..

That's only true if you want the same power delivered to the load. P=IV

In most (but not all!) situations, it's best to think of the power supply as determining the voltage, and the load as determining the current and power drawn from that supply.

Example: My mobile phone charger is rated at 5V 2A but my phone typically only draws 1A from the charger. If I switch my phone on when it's charging the current goes up (some used to charge the battery and some to run the phone). When I disconnect my phone the current falls to zero (obviously).
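The charger example above can be worked through with P = IV. As a simplified sketch, assume lossless conversion (real chargers are roughly 80–90% efficient, so the mains draw would be slightly higher):

```python
# Worked numbers for the phone-charger example, assuming ideal, lossless
# AC-to-DC conversion for simplicity.

def power(v, i):
    """P = I * V"""
    return v * i

p_out = power(5.0, 1.0)   # phone drawing 1 A at 5 V -> 5 W delivered
i_mains = p_out / 120.0   # the same 5 W pulled from the 120 VAC side

print(f"Output power: {p_out} W")
print(f"Mains current: {i_mains:.3f} A")  # only about 42 mA from the wall
```

This shows the trade-off in the other direction: stepping the voltage down from 120 V to 5 V means the mains side carries far *less* current than the output side, because the power (not the current) is what is conserved.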
 
Ahhhh - that makes sense. Thanks for all the help!
 