Say you wanted to heat up your room. You could choose between an electric heater that outputs 500 W or a PC that draws 500 W. In both cases, your room would heat up by the same amount, but the second option is much more advantageous, because the computer can process information on its way to generating that heat. You could, for example, have it simulate protein folding and help search for cures for cancer and other diseases (a project that already exists, called Folding@Home) before the energy ends up as heat, and it would still be just as efficient as the electric heater (that is, 100% thermally efficient).

But why? In basically every other physical process I can think of, there is a trade-off you have to make. If you wanted to lift a car using just your physical strength, you could use pulleys to drastically lower the force required. But in that case, the trade-off is that you have to pull the rope for many meters for the car to rise a few centimeters. The reduced force required is "transformed" into increased distance required.

In the case of computers, however, there seems to be no trade-off, aside from the fact that the microprocessor is much more complex (and thus more costly) to manufacture than the heater's resistive element. If I didn't know better, I would have guessed that a computer that outputs 500 W of heat must draw more than 500 W of electricity, with the difference being energy "converted into" information processing. But I do know better. So what's the explanation for information processing's apparent lack of a trade-off?
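To make the contrast concrete, here is the idealized accounting I have in mind (a sketch assuming a frictionless pulley and a PC whose entire electrical draw ends up as heat in the room). For the pulley, work is conserved:

$$W = F_{\text{in}}\, d_{\text{in}} = F_{\text{out}}\, d_{\text{out}},$$

so a mechanical advantage of $n$ cuts the required force to $F_{\text{out}}/n$ but stretches the required pull to $n\, d_{\text{out}}$. For the computer in steady state, the first law seems to give simply

$$P_{\text{electrical}} = P_{\text{heat}} = 500\ \mathrm{W},$$

with no extra term on the right-hand side for the computation performed along the way.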