
imgur: http://i.imgur.com/9VILpYL.jpg

Now, I can easily solve this by applying the formula [itex]P_R = \frac{(\Delta V_R)^2}{R}[/itex] and get all the correct answers.

However, the answers feel strange to me intuitively. According to the formula above, lowering the resistance increases the power that is dissipated.
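A quick numeric check of that formula makes the trend concrete (the 10 V source value here is made up for illustration, not taken from the problem):

```python
# P = V^2 / R at a fixed voltage: halving R doubles the dissipated power.
V = 10.0  # volts -- hypothetical fixed source voltage

for R in (1.0, 2.0, 4.0):  # ohms
    P = V**2 / R
    print(f"R = {R} ohm -> P = {P} W")
# R = 1.0 ohm -> P = 100.0 W
# R = 2.0 ohm -> P = 50.0 W
# R = 4.0 ohm -> P = 25.0 W
```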

Shouldn't a device with MORE resistance dissipate MORE energy per unit time? After all, isn't the idea behind something like a superconductor that it dissipates no energy while carrying current, and thus wouldn't convert any of the flowing current's energy into heat?

Or am I misunderstanding what it means to dissipate power? I thought it was just heat. Shouldn't something with more resistance get hotter, and thus dissipate more power?

Or is it a matter of power vs. energy? Is it that a higher resistance converts a **higher fraction** of every joule of electrical energy into heat, but doesn't allow as much current, and so makes the conversion happen more slowly?
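The two competing intuitions can be separated numerically. Power depends on what is held fixed: at fixed voltage [itex]P = \frac{V^2}{R}[/itex] falls as R rises, while at fixed current [itex]P = I^2 R[/itex] rises with R. The 10 V and 2 A figures below are made up for illustration:

```python
# Contrast the two regimes for the same set of resistances.
V = 10.0  # volts   -- hypothetical fixed-voltage source
I = 2.0   # amperes -- hypothetical fixed current

for R in (1.0, 2.0, 4.0):  # ohms
    P_fixed_V = V**2 / R   # falls as R rises
    P_fixed_I = I**2 * R   # rises as R rises
    print(f"R={R}: fixed-V power = {P_fixed_V} W, fixed-I power = {P_fixed_I} W")
```

At fixed voltage, a larger R throttles the current, so less charge flows per second and less total energy is delivered to convert in the first place.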