I've never really understood the relationship between voltage, current, and resistance. The more I read about them, the less comfortable I feel discussing them. V = IR... mathematically, it doesn't get any simpler than Ohm's Law, yet when I try to predict the behavior of its variables in real-world situations, I couldn't be more confused.

Take the example of voltage drop in household circuits. Suppose we have two scenarios: (1) a power tool plugged into an outlet with a 10' extension cord, and (2) the same power tool plugged into the same outlet with a 100' extension cord of the same diameter. What effect does the increase in length have on the values in Ohm's Law? Everyone knows it's bad to use too long an extension cord, because the resulting voltage drop harms electric motors. But doesn't the increased wire length also increase the resistance as well as drop the voltage? And if so, wouldn't you need a concomitant decrease in current (I) to offset both the drop in voltage (V) and the increase in resistance (R) in order to satisfy Ohm's Law, V = IR?

I'm not looking for quantitative examples involving amps, volts, and ohms that we then "plug in" (grin) to the equation, but rather a relative understanding of how and why changing one parameter affects the others. I've read all the water analogies; some aspects make sense and some don't.

This brings up my biggest sticking point with Ohm's Law. Everywhere I read that adding resistance (R) to a circuit decreases voltage (V). But I was under the impression that resistors resist the flow of electrons, and the "flow of electrons" is exactly what current (I) is, right? So V = IR, but an increase in R results in a decrease in V while I stays the same. Huh?
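To show where my mental model breaks down (not asking anyone to check my arithmetic), here is a minimal sketch of how I currently picture the two cord scenarios. It treats the tool as a fixed resistance in series with the cord, which may itself be my mistake, and every number in it is made up purely for illustration:

```python
V_SOURCE = 120.0   # outlet voltage (volts) -- assumed
R_LOAD = 12.0      # pretend the tool behaves like a fixed 12-ohm resistor

def tool_voltage(r_cord):
    """Voltage across the tool when the cord adds r_cord ohms in series."""
    current = V_SOURCE / (R_LOAD + r_cord)  # Ohm's Law for the whole loop
    return current * R_LOAD                 # voltage left over at the tool

short_cord = tool_voltage(0.1)  # stand-in for the 10' cord
long_cord = tool_voltage(1.0)   # stand-in for the 100' cord (more resistance)

# In this picture, the longer cord lowers BOTH the current and the
# tool's share of the voltage at once -- nothing has to "offset" anything.
print(short_cord, long_cord)
```

If this series-resistance picture is right, then my "concomitant decrease in current" question answers itself, but I don't trust the picture, which is why I'm asking.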