Greetings,

I recently took an introductory circuits test, and the answer key was released today. I'm convinced it's wrong, and if my argument is flawed, I'd like to know before bringing it up with the instructor (no, not the professor, the BME guy who seems to apply equations without actually knowing what they mean).

The quiz (http://www.ece.umn.edu/class/ee2001/quiz_1a_solution.pdf [Broken]) basically had three idealized voltage sources and a current source. There are no resistors (no load, as far as I can tell). However, according to the answer key, power is dissipated in the circuit. As far as I can tell, this is not possible, since P = I²R and R = 0; hence no power is dissipated.
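To make the arithmetic I'm questioning concrete, here is a minimal sketch with hypothetical values (the actual quiz numbers are behind the dead link above, so these are just placeholders). It contrasts the resistor-only formula P = I²R with the general element power P = VI, which is what I suspect the answer key applied to the ideal sources:

```python
# Hypothetical values -- not the actual quiz numbers.
V = 5.0   # volts across an ideal source
I = 2.0   # amps forced through it by the current source
R = 0.0   # ideal sources and wires: zero resistance

p_general = V * I       # power for any two-terminal element: P = V * I
p_resistor = I**2 * R   # resistor-specific formula: P = I^2 * R

print(p_general)   # 10.0  -> nonzero even with R = 0
print(p_resistor)  # 0.0   -> the formula I used
```

So the two formulas disagree for an ideal source carrying current, which seems to be exactly where my argument and the answer key part ways.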

I have heard one counterargument: that the current source implies a resistance inside the voltage source, despite the source being ideal. Still, how can power be dissipated without a resistor? Am I in the wrong? Can one even have current without a load?

From a physics perspective, what I'd expect to happen is that the potential differences simply equalize (we'd get one big equipotential). Again, since this is an ideal circuit, no voltage would be dropped across the "wires" and no power would be dissipated. Is this reasoning sound?

Thanks,

John

**Physics Forums | Science Articles, Homework Help, Discussion**

# Power dissipated without a load?