I've been reading about old iceboxes, and I'm trying to get an intuition for how their energy consumption compares to modern fridges. My reasoning is based on secondary school physics from 20 years ago, so I'm hoping somebody can sharpen up my formulation of the problem.

The best numbers I can find so far (http://www.iceboxmemories.com/site/about.htm [Broken]) say that an icebox typically held 25-50 lbs of ice, which took 1-2 days to melt during the summer months. That's a fairly wide range, so suppose we take an icebox with a capacity of 25 lbs (call it 10 kg), and say that the block of ice takes a day to melt.

Now, if I understand correctly, the latent heat of fusion of ice is about 3.3355*10^2 J per gram, so melting the 10 kg block takes 3.3355*10^6 J, which is about 0.93 kWh, i.e. fairly close to a kilowatt hour. So an icebox that uses one 10 kg block of ice a day is consuming roughly 340 kWh a year (or 365 kWh if we round the daily figure up to 1 kWh). Some quick internet searching suggests figures of 200-400 kWh a year for a modern fridge.

Given how rough the numbers are, and the fact that we're not comparing like with like, I'm surprised that the figures for the modern fridge and the icebox are even within an order of magnitude of each other. Is that a coincidence, a mistake in my calculations, or is it because the energy consumption is determined more by the requirement (keeping the contents of a box cooler than its surroundings) than by the technology?
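In case it helps anyone check my arithmetic, here's the back-of-envelope calculation as a quick sketch (the latent heat and the 10 kg / one-block-per-day figures are just my estimates from above, not authoritative):

```python
# Rough icebox energy estimate -- all inputs are my own assumptions.
L_FUSION = 333.55        # latent heat of fusion of ice, J per gram

ice_mass_g = 10_000      # one 10 kg block, melted per day
energy_per_day_J = ice_mass_g * L_FUSION   # heat absorbed per day, ~3.34e6 J
kWh_per_day = energy_per_day_J / 3.6e6     # 1 kWh = 3.6e6 J, so ~0.93 kWh
kWh_per_year = kWh_per_day * 365           # ~340 kWh/year

print(f"Heat absorbed per day: {energy_per_day_J:,.0f} J")
print(f"Per day:  {kWh_per_day:.2f} kWh")
print(f"Per year: {kWh_per_year:.0f} kWh")
```

So the daily figure comes out a little under 1 kWh, and the annual total lands inside the 200-400 kWh range I'm seeing quoted for modern fridges, which is what prompted the question.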