Hi, I'm just thinking over a few things and realised there's something pretty fundamental that doesn't seem to get explained often, or at least very well. Maybe I'm too much of a novice, who knows... anyway...

When a power supply states a power output of, say, 750 W, what exactly does that mean? 750 W when? Not when I'm supplying something with 12 V and 3 A, since that's only 36 W. So does it mean the maximum combined output the supply can handle before it burns out? For example, a 750 W PSU would happily supply several different components at different voltages and currents that add up to 750 W. But what if there were just one output? Does that mean it could output 375 V at 2 A? That's still 750 W, but it doesn't seem right.

Also, when it comes to circuitry, things like frequency generators, how do you know what current they can handle? For example, with something like an induction coil/heater (DC signal into a function generator, then to the coil; I know there's more to it than that), surely the high current drawn by the coil would be way too high for the function generator?
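To make the first question concrete, here's a minimal sketch of how I currently picture it: the wattage rating as a shared budget, with each output (rail) also having its own fixed voltage and a maximum current. All the rail voltages, current limits, and loads below are made-up illustrative numbers, not any real PSU's spec:

```python
# Assumption being tested: a PSU's wattage rating is a combined budget across
# its rails, AND each rail has its own fixed voltage and max-current limit
# (so "375 V at 2 A" is impossible even though it equals 750 W).

rails = {
    # rail voltage (V): max current (A) -- illustrative numbers only
    12.0: 60.0,
    5.0: 20.0,
    3.3: 20.0,
}

loads = {12.0: 30.0, 5.0: 10.0, 3.3: 5.0}  # current actually drawn per rail (A)

rating = 750.0  # combined wattage budget (W)

total = 0.0
for volts, amps in loads.items():
    assert amps <= rails[volts], f"{volts} V rail over its current limit"
    total += volts * amps  # P = V * I on each rail

assert total <= rating, "combined draw exceeds the wattage rating"
print(f"drawing {total:.1f} W of a {rating:.0f} W budget")  # 426.5 W
```

If that mental model is right, the rating is a "don't exceed" ceiling on the sum, not something the supply delivers all the time.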