I have a question I've found asked, but never found answered. I'm an EE, but cars are my hobby. I changed from a belt-driven inlet air compressor (supercharger) to a turbocharger. People commonly say the turbo is "much more efficient" because it "runs for free on otherwise wasted exhaust heat." Yet I see almost the same fuel consumption per horsepower and almost the same horsepower per atmosphere of boost.

For example, if I normalize boost pressure so my engine makes ~800 HP with the supercharger and ~800 HP with the turbo at the same combustion air/fuel ratio, the turbo system's fuel consumption is about 7% lower.

Isn't this a reasonably accurate way of determining how much wasted heat energy is actually recovered? Wouldn't an efficiency change show up most directly as pounds per hour of fuel for a given power output? I thought about measuring temperatures and pressures, but it seems to me the real answer is just this very simple one: it looks like the turbo only recovers about 7% of the wasted energy. I'm just trying to decide whether that's accurate.
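To make the comparison concrete, here's a back-of-envelope sketch of the reasoning as a brake-specific fuel consumption (BSFC) calculation. The fuel flow numbers are hypothetical placeholders (only the ~800 HP and ~7% figures come from my observation), and the "roughly a third of fuel energy leaves in the exhaust" line is a common rule of thumb, not a measurement on my engine:

```python
def bsfc(fuel_lb_per_hr: float, power_hp: float) -> float:
    """Brake-specific fuel consumption in lb/(hp*h)."""
    return fuel_lb_per_hr / power_hp

power_hp = 800.0
fuel_sc = 400.0              # lb/h with the supercharger (hypothetical number)
fuel_turbo = fuel_sc * 0.93  # ~7% less fuel with the turbo, per the observation

bsfc_sc = bsfc(fuel_sc, power_hp)        # supercharged BSFC
bsfc_turbo = bsfc(fuel_turbo, power_hp)  # turbocharged BSFC

# At equal power, the relative BSFC change IS the relative fuel savings.
savings = 1 - bsfc_turbo / bsfc_sc
print(f"relative fuel savings: {savings:.1%}")

# Rule of thumb (assumption, not measured): roughly 1/3 of the fuel's
# energy leaves as exhaust heat. A 7% cut in total fuel then corresponds
# to recovering a larger fraction of that exhaust stream specifically.
exhaust_fraction = 1 / 3
fraction_of_exhaust_recovered = savings / exhaust_fraction
print(f"fraction of exhaust energy recovered: {fraction_of_exhaust_recovered:.0%}")
```

In other words, the 7% is measured against the total fuel energy, so whether it counts as "7% of the wasted energy recovered" depends on what fraction of the fuel energy you take the exhaust stream to carry.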