Electrical Unconnected wall wart power consumption

Summary
A plugged-in but unconnected wall wart (charger) typically consumes minimal power, especially in modern designs. Newer chargers built around switched-mode power supplies (SMPS) are far more efficient and often register essentially zero power draw when no device is connected. Older linear transformers, however, draw a small amount of power due to magnetizing current, which may amount to 0.25% to 5% of full-load current. Testing with a device like a Kill-A-Watt meter shows that many chargers consume negligible power when not in use. EU regulations now limit charger no-load consumption to 0.1 W, part of a broader push toward energy efficiency. While an individual charger may seem insignificant, the cumulative effect of many devices left plugged in can add up to substantial energy waste, which is what prompted both awareness campaigns and efficiency regulation.
  • #31
anorlunda said:
Really?

Edit: You mean when not cooking.
Yes, all of those were measured off or on standby except where noted. I think that's what the OP was after.
What does it draw when cooking?
1,665 W, give or take a couple.
 
  • #32
Measuring an unloaded wall wart will, I believe (and as mentioned already), be dominated by reactive power (power factor), and I dare say there is reactive power at odd harmonics too, so you would need a fairly clever meter to measure it reliably.

Not sure where you are in the world, but making stuff for the EU (in China, most probably!) means it is probably cheaper to make ALL units the same way. So it may be worth looking at some of the EU directives, or whatever accreditations are listed on the unit(s), to see what standards your wall warts might be designed to.

I believe there is one that says these have to be limited to less than 0.5 W when not loaded, but I can't find the specific detail.

https://en.wikipedia.org/wiki/European_Ecodesign_Directive
 
  • #33
cmb said:
I believe there is one that says these have to be limited to less than 0.5 W when not loaded, but I can't find the specific detail.
The current (haha) no-load requirement is 0.1W. This only came into force in April 2020 but most suppliers (haha again) have been compliant for some time.

See page 2 of https://ec.europa.eu/energy/sites/e...019-2126_en_annexe_acte_autonome_part1_v3.pdf, linked from https://www.eceee.org/ecodesign/products/battery-chargers/.

Edit: This is only in the EU of course (and the UK where the provisions of such regulations are still effective); in the US you are probably allowed to burn coal to power an iPhone :wink:
Further edit: it seems I was wrong about the US!
 
  • #34
pbuk said:
The current (haha) no-load requirement is 0.1W. This only came into force in April 2020 but most suppliers (haha again) have been compliant for some time.

Philosophically, I dislike government regulation. But this case seems to be a big exception.

In 2001, the power consumption of wall warts plus the standby power of things like televisions was at 6% of electricity consumption in the USA. Forecasts at the time said that it would grow to 15%. But government regulations turned all that around. The designs were changed drastically. Neither the manufacturers nor the consumers suffered much pain. On the contrary, compared to other technology changes, standby power efficiency was almost trivial, but the effect on electricity consumption was major. I must concede this as a triumph of regulation.
 
  • #35
anorlunda said:
...
In 2001, the power consumption of wall warts plus the standby power of things like televisions was at 6% of electricity consumption in the USA...
Is that actually true, I mean really true and not an urban myth?

Is there some solid, certified published reference for this?

With all the thousands of heavy-industry factories sucking up juice for aluminium smelting and the like, I do struggle to believe that, even at the scale of millions of little gadgets.

6% of domestic consumption? Hmmm .. I still struggle a little but that might not be quite unbelievable.

The other thing to consider is whether 'waste' power is actually wasted. Sure, it is if it just adds excess heat, but 'waste' heat in a house with electric heating offsets the heating load, so it isn't really adding to electricity consumption, if you see what I mean.
 
  • #36
cmb said:
Is there some solid, certified published reference for this?
No I can't find a good reference, so you may be right it could be a guess or a legend. Let's try to estimate it ourselves for the year 2001.
  1. 15 W loss per wall wart × 24 hours.
  2. 30 W standby power per TV or stereo × 24 hours.
  3. Assume 5 wall warts plus 3 TVs/stereos per household.
  4. 24 × (5×15 + 3×30) = 3,960 Wh, about 4 kWh/day.
  5. Say 100M households, but only 75M affluent enough for so many devices.
  6. 75M × 4 kWh/day = 300M kWh/day; 300M × 365 ≈ 109 billion kWh/year.
  7. Reliable source https://www.eia.gov/totalenergy/data/monthly/pdf/sec7_2.pdf shows total US generation in 2001 at about 3,500 billion kWh/year.
  8. 109/3500 ≈ 3%.
[Please check me. It's so easy to slip decimal points.]

So my estimate is only half of the 6% number. But it is in the same order of magnitude, and for a back-of-the-envelope guess, not bad.

[I used the strange units "billion kWh/year" to make it easy to compare with my source.]

Now, think of 2021. My wife and I have 5 digital devices at 0.1 W each, plus 1 TV and 2 laptops at 2 W each. That is 6.5 W wasted on standby power, compared to 165 W in 2001. The difference is major.
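The back-of-the-envelope estimate above is easy to slip a decimal point on, so here is a short script that redoes the arithmetic. All the numbers are the thread's own assumptions (15 W per wall wart, 30 W standby, 75M households, ~3,500 billion kWh generated), not measured data:

```python
# Check of the 2001 standby-power estimate from the post above.
# Every figure here is an assumption taken from the thread, not a measurement.

WALL_WART_W = 15       # assumed loss per wall wart (2001-era linear supply)
STANDBY_W = 30         # assumed TV/stereo standby draw
WARTS_PER_HOME = 5
TVS_PER_HOME = 3
HOUSEHOLDS = 75e6      # households assumed to own this many devices
US_GEN_BKWH = 3500     # approx. US generation in 2001, billion kWh/year (EIA)

# Daily waste per household, in kWh
per_home_kwh_day = 24 * (WARTS_PER_HOME * WALL_WART_W
                         + TVS_PER_HOME * STANDBY_W) / 1000

# National total, in billion kWh per year
total_bkwh_year = HOUSEHOLDS * per_home_kwh_day * 365 / 1e9

print(f"{per_home_kwh_day:.2f} kWh/day per household")                  # 3.96
print(f"{total_bkwh_year:.0f} billion kWh/year")                        # 108
print(f"{100 * total_bkwh_year / US_GEN_BKWH:.1f}% of US generation")   # 3.1
```

This confirms the post's figures: about 4 kWh/day per household and roughly 3% of national generation, half the quoted 6% but the same order of magnitude.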
 
  • #37
15 W loss per wall wart!? Phew. Those would be very hot wall warts (or very powerful ones)!

I think most are a few watts at full load! Assume 10% losses, maybe half a watt each?

Most modern LED TVs draw 50 to 100 W or so while on. Maybe an old CRT used tens of watts to keep things 'warm'? Again, maybe a watt or two to keep the remote-control circuits alive and a watt or so in the (unloaded) power converter.

If we're multiplying 'a number' by 100 million, the detail needs to be right, whichever way the estimate goes. Small changes, times 100 million, give big differences in that percentage conclusion.
 
  • #38
I always found the 6% figure hard to believe too. If you look around there are many things that seem a lot more wasteful than that. I find drying clothes in a dryer particularly bad. At my house most things get hung up to dry. In the winter the moisture is a bonus, and any extra heat needed at least doesn't go out the dryer vent.
 
  • #39
My old Motorola 6200 set top box says 40W on the back. When I first installed it, it fit neatly in the cabinet on top of my amp. Until I touched it, ouch, hot! I found this comment from 2009:
Since it's a STB, why not simply leave it on all the time. There's very little difference in power consumption between on/off on a STB (off is really standby).
 
  • #40
Keith_McClary said:
Old folks have big boxes of those, with various voltage, wattage and connectors (but never the combination they need).
Edit: Oh, and I forgot polarity.
You're right! I'm in my sixties and I have two boxes of them, plus a neat universal adapter that had attachments for different connector plugs and polarities. Unfortunately I moved recently and can't find the box with the attachments. A while back (10 years?) my electrical-engineer brother was grousing about the wasted electricity.

My question is: how much energy would be saved if you have solar panels and could wire the house with DC wall plugs? It seems a waste to use an inverter to turn the panels' DC into AC and then have to convert it back to DC again.
 
  • #41
Welcome to PF. :smile:

Lrodcepts said:
My question is: how much energy would be saved if you have solar panels and could wire the house with DC wall plugs? It seems a waste to use an inverter to turn the panels' DC into AC and then have to convert it back to DC again.
Well, for efficient moderate-size PV installations, you are already using switching power conversion for MPPT:

https://en.wikipedia.org/wiki/Maximum_power_point_tracking

So chopping it up to AC mains levels will not be that different in efficiency from converting to 12 V DC, IMO. Have you learned about MPPT converters yet, and why they are used with PV panels?
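For readers unfamiliar with MPPT: the most common flavour is "perturb and observe", which nudges the panel's operating voltage and keeps moving in whichever direction increases power. A toy sketch, with an entirely made-up linear-current panel model (not real panel data):

```python
# Toy illustration of perturb-and-observe MPPT.
# The PV model below is invented for illustration, not real panel data.

def pv_power(v):
    """Toy panel: current falls linearly with voltage, so P = V*I peaks mid-range."""
    i = max(0.0, 8.0 * (1 - v / 40.0))   # 8 A short-circuit, 40 V open-circuit
    return v * i

def perturb_and_observe(v=10.0, step=0.5, iters=200):
    """Nudge the operating voltage; reverse direction whenever power drops."""
    p = pv_power(v)
    direction = 1
    for _ in range(iters):
        v_new = v + direction * step
        p_new = pv_power(v_new)
        if p_new < p:            # power fell: the perturbation went the wrong way
            direction = -direction
        v, p = v_new, p_new
    return v, p

v_mpp, p_mpp = perturb_and_observe()
print(f"Converged near {v_mpp:.1f} V, {p_mpp:.1f} W")
```

For this toy model the true maximum power point is at 20 V (80 W), and the tracker ends up oscillating within one step of it, which is the characteristic behaviour (and the characteristic small residual loss) of perturb-and-observe.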
 
  • #42
Lrodcepts said:
My question is: how much energy would be saved if you have solar panels and could wire the house with DC wall plugs? It seems a waste to use an inverter to turn the panels' DC into AC and then have to convert it back to DC again.
Physical switches do not work well in inductive DC circuits; with no zero crossings, the arc doesn't extinguish, and they tend to set fire to the house. We need the AC so we can switch it off.
 
