# Decreasing voltage increases current

hey fellas,

OK... so here's my question.

Shouldn't increasing voltage increase current?

Why does it seem to work the opposite way?

Even a small beginner's lecture would be much appreciated.

Unless you're talking about a Gunn diode or some other relatively esoteric device, an increase in voltage is seen with an increase in current.

Volts = Current * Resistance (V = I * R), so voltage is proportional to current, with resistance as the constant of proportionality.

Ex: We have 5 volts across a resistance of 0.5 ohms. What will the current be?

5 = I * 0.5

I = 5 / 0.5 = 10, so 10 amps.

Double the voltage and see what you get.
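The worked example above can be checked with a few lines of plain Python (the function name `current_from` is my own, just for illustration):

```python
def current_from(voltage, resistance):
    """Ohm's law rearranged: I = V / R."""
    return voltage / resistance

# The example from the post: 5 V across 0.5 ohms.
i1 = current_from(5.0, 0.5)   # 10 A

# Double the voltage with the same resistance: current doubles too.
i2 = current_from(10.0, 0.5)  # 20 A

print(i1, i2)
```

Doubling the voltage while holding resistance fixed doubles the current, which is the "directly proportional" part of Ohm's law.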

Why does it seem to work the opposite way?

What circumstance are you referring to?

Let's say you have 500 watts into an 8 ohm load; that works out to roughly 63 volts RMS.

Now say you change that to 1500 watts into a 2.67 ohm load: still roughly the same voltage, but obviously with an increase in current.
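Those numbers can be verified from P = V^2 / R (so V = sqrt(P * R)) and I = V / R. A small sketch, with helper names of my own choosing:

```python
import math

def rms_voltage(power, resistance):
    """From P = V^2 / R, the RMS voltage is V = sqrt(P * R)."""
    return math.sqrt(power * resistance)

# 500 W into 8 ohms vs. 1500 W into 2.67 ohms.
v1 = rms_voltage(500, 8)      # ~63.2 V RMS
v2 = rms_voltage(1500, 2.67)  # ~63.3 V RMS -- essentially the same voltage

# The current, however, roughly triples (I = V / R).
i1 = v1 / 8                   # ~7.9 A
i2 = v2 / 2.67                # ~23.7 A

print(round(v1, 1), round(v2, 1), round(i1, 1), round(i2, 1))
```

Same voltage, three times the power, three times the current: the extra power comes entirely from the lower load impedance drawing more current.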

But the confusing thing to me is this: if you do increase the voltage, then you do in fact increase the current. That has been pointed out to me (thanks, by the way!). But all the research I've done says a voltage drop increases current... why?

but all the research I've done says a voltage drop increases current... why?

A decrease in voltage decreases current if all other factors remain constant. Note that "voltage drop" in the material you read most likely means the voltage *across* a component, not a decrease in voltage over time; a larger voltage drop across a fixed resistance does mean more current through it.

In an AC circuit, a change in current can lead or lag a change in voltage. Could that be what is confusing you?

What other factors would those be?

In this case, resistance.

...so would increasing resistance decrease current, if voltage is held constant?

Exactly: I = V/R, so current is directly proportional to voltage and inversely proportional to resistance.
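Both halves of that statement can be seen side by side in a short sketch (the example values are hypothetical, picked just to show the trend):

```python
def current(v, r):
    """I = V / R."""
    return v / r

# Hold voltage constant at 12 V and increase resistance: current falls.
falling = [current(12, r) for r in (2, 4, 6)]   # 6.0, 3.0, 2.0 A

# Hold resistance constant at 4 ohms and increase voltage: current rises.
rising = [current(v, 4) for v in (4, 8, 12)]    # 1.0, 2.0, 3.0 A

print(falling, rising)
```

The first list shows the inverse proportionality to resistance; the second shows the direct proportionality to voltage.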

Awesome, thanks guys!!

Averagesupernova