IBM's Roadrunner Energy Consumption

  • Thread starter: peripatein
  • Tags: Energy
AI Thread Summary
The discussion revolves around calculating the annual energy consumption of IBM's Roadrunner supercomputer, which has a power consumption of 8.4402 GW. To find the yearly energy consumption, users confirm that multiplying the power by the total hours in a year (8.4402 GW multiplied by 8,760 hours) yields the correct energy consumption in gigawatt-hours (GWh). There is some confusion regarding terminology, particularly the phrase "hourly consumption a year," which is clarified to mean annual energy consumption instead. Users emphasize the importance of understanding the distinction between power and energy. The final consensus is to use the multiplication method to arrive at the annual energy consumption figure.
peripatein
Hello,

I am to find this supercomputer's energy consumption in GW/h a year.
I have found its consumption per hour to be 8.4402 GW/h.
How do I find the value per hour a year? Is it by simple multiplication by the number of hours in a year (which doesn't seem right to me)?
Please advise.
 
I have found its consumption per hour to be 8.4402 GW/h

In other words, its power consumption is 8.4402 GW.

So yes, just multiply 8.4402 GW by the number of hours in a year.
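
For example, taking the 8.4402 GW figure at face value (I haven't re-checked it against the Wikipedia data), the multiplication works out to roughly:

8.4402 GW * 24 hr/day * 365 days/yr = 8.4402 GW * 8,760 hr/yr ≈ 73,900 gigawatt*hr/yr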
 
But wouldn't that be its power consumption a year, and not 'per hour a year?'
 
Are you sure you didn't find the computer's power consumption in gigawatts, when what you are looking for is gigawatt*hours/year?
 
I am quite positive. That value was calculated based on data gleaned from the Wikipedia article. You may double-check; I could have made a mistake. But does the computer work all year round? Do any other variables change throughout the year? Simply multiplying by 24*365 seems like an oversimplification to me, yet I could be wrong. Any advice?
 
"A modern supercomputer usually consumes between 4 and 6 megawatts—enough electricity to supply something like 5000 homes."

http://spectrum.ieee.org/computing/hardware/nextgeneration-supercomputers/0

5 x 10^6 watts * 24 hr/day * 365 days/yr = 43.8 gigawatt*hr/yr
 
So yes, just multiply 8.4402 GW by the the number of hours in a year.

But wouldn't that be its power consumption a year...

No. Energy consumption per year...

Remember: Power = Energy/Time

so

Power (GW) * Time (h) = Energy (GWh)

or in SI units...

Power (in watts) * Time (in seconds) = Energy (in joules)
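
As a rough cross-check in SI units (again taking the 8.4402 GW figure as given):

8.4402 x 10^9 W * 3.15 x 10^7 s/yr ≈ 2.7 x 10^17 J/yr

which is the same annual energy, just expressed in joules rather than gigawatt-hours.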
 
I am slightly confused. Which should it be then? And how do I calculate the hourly consumption a year?
 
I am slightly confused. Which should it be then? And how do I calculate the hourly consumption a year?

Best to avoid the expression "hourly consumption a year". If you want to know how much energy a supercomputer uses per year in "gigawatt hours", then RTW69 has the right answer...

"A modern supercomputer usually consumes between 4 and 6 megawatts—enough electricity to supply something like 5000 homes."

http://spectrum.ieee.org/computing/hardware/nextgeneration-supercomputers/0

5 x 10^6 watts * 24 hr/day * 365 days/yr = 43.8 gigawatt*hr/yr
 
Thanks a lot!
 