This has been racking my brain for the last week. It's about current. I keep trying to think of a circuit as a water hose, with voltage being the pressure, current being the flow, resistance being a kink in the hose, etc.

Ohm's Law:

Current = Voltage / Resistance

This is common sense: the higher the voltage, the higher the current. Extremely low voltage ---> extremely small current. That I can live with.

Now, Watt's Law:

Watts = Voltage * Current, or Current = Watts / Voltage

In this case, current is inversely proportional to the voltage. WHY is it that when a power load is added to a circuit, a higher voltage will cause LESS current to be drawn?? To me that doesn't make sense when I try to picture it in my mind. I took this to the extreme:

Current = 100 W / 0.01 V = 10,000 A

Wow, 10,000 amps is quite a bit of current considering I hardly have any voltage. We could take this more and more extreme: hook up a small 2 V cell to a 100,000 W power load, and there's 50,000 AMPS... from almost nothing.

I thought about this for a while, and my only reasoning is that amps aren't necessarily the AMOUNT of current, but the SPEED of it. If I remember correctly, amperage is the number of electrons passing a point in a certain amount of time. Is this right? Even so, you're telling me a 10000000000000000000000000000000000 WATT power load will suck electrons from a 0.00000000000000000000000000001 volt source at 999999999999 times the speed of light? I just don't see that happening. LOL.

Even Ohm's Law seems a little weird. Once you have 0.000001 ohms of resistance, your amperage is in the millions.

There has to be a simple explanation to these outrageous numbers!
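Just to make the arithmetic in the question concrete, here's a quick sketch of the same formulas in Python. The function names are my own, purely for illustration; the numbers are the ones from the post:

```python
def current_from_ohm(voltage, resistance):
    """Ohm's law rearranged: I = V / R."""
    return voltage / resistance

def current_from_watts(watts, voltage):
    """Watt's law rearranged: I = P / V."""
    return watts / voltage

# The extreme examples from the post:
print(current_from_watts(100, 0.01))     # 100 W drawn at 0.01 V  -> ~10,000 A
print(current_from_watts(100_000, 2))    # 100 kW drawn at 2 V    -> 50,000 A
print(current_from_ohm(1, 0.000001))     # 1 V across a micro-ohm -> ~1,000,000 A
```

The formulas really do produce these huge numbers; the question is whether a real source and load would ever actually operate at those combinations of power and voltage.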