Decreasing voltage increases current

  • Thread starter loadedmike
  • #1
hey fellas,

ok... so here's my question.

with increasing voltage, shouldn't current increase too?

why does it seem to work the opposite way...

even a small beginner's lecture would be much appreciated
 

Answers and Replies

  • #2
Unless you're talking about a Gunn diode or some other relatively esoteric device, an increase in voltage is seen with an increase in current.

Voltage = current * resistance (Ohm's law), so voltage is proportional to current, with resistance as the constant of proportionality.

Ex: We have 5 volts across a resistance of 0.5 ohms; what will the current be?

5 = I * 0.5

5/0.5 = 10, so 10 amps.

Double the voltage and see what you get.
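(If it helps, here's a quick Python sketch of that same example, assuming the ideal Ohm's law relation I = V/R; the numbers are the ones from the post above.)

    # Minimal sketch: I = V / R for an ideal resistor
    def current(volts, ohms):
        """Return current in amps for a given voltage and resistance."""
        return volts / ohms

    print(current(5, 0.5))    # 10.0 A, as in the worked example
    print(current(10, 0.5))   # 20.0 A -- doubling the voltage doubles the current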
 
  • #4
let's say you have 500 watts into an 8 ohm load... it works out to roughly 63 volts RMS...

let's say you change that to 1500 watts into a 2.67 ohm load... still the same voltage, but obviously with an increase in current...
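(A quick Python check of those numbers, assuming P = V²/R so that V_rms = sqrt(P * R) and I_rms = V_rms / R:)

    # Rough check of the two amplifier loads mentioned above
    from math import sqrt

    for power, load in [(500, 8.0), (1500, 2.67)]:
        v_rms = sqrt(power * load)   # RMS voltage across the load
        i_rms = v_rms / load         # RMS current through the load
        print(f"{power} W into {load} ohm: {v_rms:.1f} V rms, {i_rms:.1f} A rms")
    # Both loads sit near ~63 V rms, but the 2.67 ohm load draws roughly three times the current.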

but the confusing thing to me is that if you increase the voltage, you do in fact increase the current... that has been pointed out to me (thanks, by the way!!), but all the research I've done always says a voltage drop increases current... why?
 
  • #5
"but all the research I've done always says a voltage drop increases current... why?"
A decrease in voltage decreases current if all other factors remain constant.

In an AC circuit a change in current can lead or lag a change in voltage. Could that be what is confusing you?
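(A rough Python sketch of the lead/lag idea, using complex impedances; the frequency and component values here are just made-up examples, not anything from the thread:)

    # Current leading/lagging voltage for a capacitor vs. an inductor
    import cmath, math

    f = 60.0                        # assumed frequency in Hz
    w = 2 * math.pi * f
    V = 10 + 0j                     # take the source voltage as the phase reference

    Z_cap = 1 / (1j * w * 100e-6)   # 100 uF capacitor
    Z_ind = 1j * w * 0.1            # 0.1 H inductor

    for name, Z in [("capacitor", Z_cap), ("inductor", Z_ind)]:
        I = V / Z
        print(f"{name}: current phase = {math.degrees(cmath.phase(I)):+.0f} degrees vs. voltage")
    # prints roughly +90 for the capacitor (current leads) and -90 for the inductor (current lags)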
 
  • #6
what other factors would those be?
 
  • #7
In this case, resistance.
 
  • #8
...so would increasing resistance decrease current... if voltage is held constant?
 
  • #9
Exactly: I = V/R; current is directly proportional to voltage and inversely proportional to resistance.
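(One more quick Python illustration of the inverse part: at a fixed voltage, doubling the resistance halves the current. The 12 V figure is just an example value.)

    # I = V / R at constant voltage
    V = 12.0
    for R in (1.0, 2.0, 4.0):
        print(f"R = {R} ohm -> I = {V / R} A")
    # R = 1.0 ohm -> I = 12.0 A
    # R = 2.0 ohm -> I = 6.0 A
    # R = 4.0 ohm -> I = 3.0 A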
 
  • #10
awesome, thanks guys!!!!!!
 
  • #11
Averagesupernova
Science Advisor
Gold Member
Let me guess, another car audio guy is set straight....
 