Hello everyone, I'm trying to teach myself basic circuit design from an EE textbook, but I'm starting to think myself in circles. I have no trouble with the algebra itself; it's the logic behind the formulas that's giving me trouble. What I'm most confused about right now is the relation of current to power:

1) [itex]P = I^2R[/itex]
2) [itex]P = VI[/itex]

From these, it's clear that if the current stays constant and the voltage increases, the power increases proportionally... but I just can't wrap my head around why that's so.

Say we have a 12V battery connected to a resistor sized so that the circuit draws 1A. Now take a second setup with a 24V battery and a larger resistor, chosen so that the current is again 1A. Both circuits draw the same current, yet the second one uses double the power. How can that be? Can someone please help me understand what's going on here?
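To make the numbers concrete, here's a quick sketch (my own working, not from the textbook) that runs through the two setups using Ohm's law and [itex]P = VI[/itex]:

```python
def power(voltage, current):
    """Power delivered to the load, P = V * I (watts)."""
    return voltage * current

# Setup 1: 12 V battery, resistor sized so the circuit draws 1 A.
r1 = 12 / 1.0          # Ohm's law: R = V / I -> 12 ohms
p1 = power(12, 1.0)    # P = 12 * 1 = 12 W

# Setup 2: 24 V battery, larger resistor, same 1 A current.
r2 = 24 / 1.0          # R = V / I -> 24 ohms
p2 = power(24, 1.0)    # P = 24 * 1 = 24 W

print(r1, p1)  # 12.0 ohms, 12.0 W
print(r2, p2)  # 24.0 ohms, 24.0 W
```

So the arithmetic checks out both ways: [itex]P = I^2R[/itex] gives [itex]1^2 \cdot 12 = 12[/itex] W and [itex]1^2 \cdot 24 = 24[/itex] W as well. My confusion is about the physical picture, not the math.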