Power consumption: Residential vs. Commercial


Discussion Overview

The discussion revolves around the comparison of power consumption between residential and commercial sectors, particularly in the context of data centers and server farms. Participants explore various factors influencing power usage, including regional differences and technological advancements.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • One participant poses a hypothetical question about whether IT companies or residential users consume more power overall.
  • Another participant cites that residential use accounts for 21% of US power generation while commercial use is at 17%, with estimates suggesting that 1-2% of US power is used by computers.
  • Regional variations in power consumption are noted, particularly in the southern US where air conditioning significantly increases energy use.
  • Concerns are raised about the efficiency of air conditioning systems, with claims that they require 2-3 watts of energy for every watt used by appliances.
  • A participant suggests that modern data centers can achieve lower overhead on power usage, especially when designed for cooler climates or utilizing water cooling systems.
  • Another participant expresses skepticism about the relevance of data centers in the overall power consumption discussion, referencing a source that indicates both residential and commercial sectors are increasing at similar rates.

Areas of Agreement / Disagreement

Participants express differing views on the significance of data centers in the context of overall power consumption, and there is no consensus on whether residential or commercial sectors consume more power.

Contextual Notes

Participants reference varying statistics and trends over time, indicating that the data may have changed since earlier years, but specific assumptions and definitions regarding power consumption remain unresolved.

AverageJoe
Just a hypothetical question here. It's just a thought that popped into my head that I thought was an interesting topic.

Seeing as many companies now have data centers and server farms, what would you say consumes more power overall?

All of the technologically superior countries' companies' IT, or the residents of those countries?
 
Residential use consumes 21% of US power generation, commercial use 17%.
Estimates are that 1-2% of US power is used by computers.

It varies a lot by region. In the south, most (electrical) power is used for AC.
Since AC is inefficient, for every watt of energy you use in an appliance you need 2-3 watts of AC to remove the heat.

It's a little better in a large purpose designed facility like a server farm - but they still use more energy for cooling than powering the machines.
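The overhead claim above is easy to put in numbers. Here is a minimal sketch, assuming the poster's figure of 2-3 W of cooling per watt of appliance load (the function name and parameters are illustrative, not from any source):

```python
def total_demand_w(appliance_w, cooling_w_per_w):
    """Total electrical demand: the appliance's own draw plus
    the AC power needed to remove its heat, per the 2-3 W/W
    figure claimed in the post above."""
    return appliance_w * (1 + cooling_w_per_w)

# A 100 W appliance with 2 W of cooling per watt of load:
print(total_demand_w(100, 2))  # 300 W total draw
```

So under that assumption, a southern household's effective demand can be 3-4x the nameplate load of its appliances.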
 
I'm sure those numbers have vastly changed since 2004.
 
If anything they will have gotten worse: lots of big-screen plasma TVs at home and a lot of cost-cutting at work.

Even a big data centre (think 10,000 machines at 250 W each) is only 2.5 MW; even with old-style AC that's only about 5 MW total, nothing compared to a cement kiln or a steelworks.

Modern data centres, especially built somewhere cold or with water cooling, get down to only 10-20% overhead on the power used to run the machines.
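The arithmetic in the two posts above can be sketched as a small calculation. This is just an illustration of the figures quoted (10,000 machines, 250 W each, ~100% cooling overhead for old-style AC vs. 10-20% for a modern facility); the function and its parameter names are assumptions for the example:

```python
def facility_power_mw(servers, watts_each, overhead_fraction):
    """Total facility draw in megawatts: IT load plus cooling
    and distribution overhead. overhead_fraction ~1.0 models the
    old-style AC case above; ~0.1-0.2 models a modern cold-climate
    or water-cooled data centre."""
    it_load_mw = servers * watts_each / 1e6
    return it_load_mw * (1 + overhead_fraction)

print(facility_power_mw(10_000, 250, 1.0))   # 5.0 MW, old-style AC
print(facility_power_mw(10_000, 250, 0.15))  # 2.875 MW, modern facility
```

The same ratio is what the industry reports as PUE (power usage effectiveness): total facility power divided by IT power, so 10-20% overhead corresponds to a PUE of 1.1-1.2.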
 
