I need some help understanding the maximum power transfer theorem. From what I understand, it states that to get maximum power from a source, you have to match the resistance of the load to the resistance of the source.

What I don't understand is how adding resistance to a load could deliver more power to it. I thought electric current took the path of least resistance. And if the resistance of the source is high, let's say a kilo-ohm or a mega-ohm, wouldn't the power from the source dissipate to almost nothing driving a load of equally high resistance? Especially if the voltage from the source is low (let's say under 3 volts).
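To make the question concrete, here is a quick Python sketch I put together, assuming the usual model of an ideal voltage source in series with an internal resistance, and made-up numbers of 3 V and 1 kΩ for the case I described:

```python
# Power delivered to a load from a source modelled as an ideal
# voltage source V in series with an internal resistance R_s.
V = 3.0        # source voltage in volts (the "under 3 volts" case)
R_s = 1000.0   # source resistance in ohms (the kilo-ohm case)

for R_load in [10, 100, 500, 1000, 2000, 10000, 100000]:
    current = V / (R_s + R_load)      # series circuit: I = V / (R_s + R_load)
    p_load = current**2 * R_load      # power dissipated in the load
    print(f"R_load = {R_load:>7.0f} ohm -> P_load = {p_load * 1000:.4f} mW")
```

Running this, the load power does peak at R_load = 1 kΩ (2.25 mW, which is V²/4R_s), exactly as the theorem says, so the math checks out. But I still don't have an intuition for why it works that way.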