Decreasing voltage increases current

  • Thread starter: loadedmike
  • Tags: Current, Voltage
AI Thread Summary
Increasing voltage typically results in an increase in current, as described by Ohm's Law (V = I * R). However, in specific scenarios like AC circuits, current can lead or lag behind voltage changes, which may cause confusion. A decrease in voltage will lead to a decrease in current if resistance remains constant. The relationship between voltage, current, and resistance is crucial for understanding circuit behavior. Overall, the discussion clarifies that while voltage and current are directly related, other factors like resistance and circuit type can influence their relationship.
loadedmike
hey fellas,

ok... so here's my question.

With increasing voltage I should increase current?

Why does it work the opposite...

Even a small beginner's lecture would be much appreciated.
 
Unless you're talking about a Gunn diode or some other relatively esoteric device, an increase in voltage is seen with an increase in current.

Volts = Current * Resistance (V = I * R), so voltage is proportional to current, with resistance as the constant of proportionality.

Ex: We have 5 volts across a resistance of 0.5 ohms; what will the current be?

5 = I * 0.5

I = 5 / 0.5 = 10, so 10 amps.

Double the voltage and see what you get.
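A minimal sketch of the arithmetic above (the `current` helper is just an illustrative name, not anything from the thread):

```python
# Ohm's law: V = I * R, so I = V / R.
def current(volts: float, ohms: float) -> float:
    """Return the current in amps for a given voltage and resistance."""
    return volts / ohms

print(current(5, 0.5))   # 5 V across 0.5 ohm -> 10.0 A
print(current(10, 0.5))  # double the voltage -> 20.0 A
```

Doubling the voltage at fixed resistance doubles the current, as the second call shows.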
 
loadedmike said:
why does it work to the opposite...

What circumstance are you referring to?
 
Let's say you have 500 watts into an 8 ohm load... it works out to roughly 63 volts RMS...

Now change that to 1500 watts into a 2.67 ohm load: still the same voltage, but obviously with an increase in current... but the same voltage...

The confusing thing to me is: if you do increase the voltage, then you do in fact increase current... that fact has been pointed out to me (thanks, by the way!), but all the research I've done always says voltage drop increases current... why?
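The speaker-load numbers above can be checked from the power formulas P = V^2 / R and P = I^2 * R; a quick sketch (helper names are illustrative):

```python
import math

def rms_voltage(watts: float, ohms: float) -> float:
    """P = V^2 / R, so V = sqrt(P * R)."""
    return math.sqrt(watts * ohms)

def rms_current(watts: float, ohms: float) -> float:
    """P = I^2 * R, so I = sqrt(P / R)."""
    return math.sqrt(watts / ohms)

print(rms_voltage(500, 8))      # ~63.2 V
print(rms_current(500, 8))      # ~7.9 A
print(rms_voltage(1500, 2.67))  # ~63.3 V -- nearly the same voltage
print(rms_current(1500, 2.67))  # ~23.7 A -- much more current
```

So the two loads really do sit at nearly the same RMS voltage, and the extra power shows up almost entirely as extra current into the lower impedance.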
 
loadedmike said:
but all the research I've done always says voltage drop increases current... why?

A decrease in voltage decreases current if all other factors remain constant.

In an AC circuit a change in current can lead or lag a change in voltage. Could that be what is confusing you?
 
what other factors would those be?
 
In this case resistance.
 
...so increasing resistance would decrease current... if voltage is held constant?
 
Exactly: I = V/R, so current is directly proportional to voltage and inversely proportional to resistance.
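The inverse relationship is easy to see by sweeping resistance at a fixed voltage; a minimal sketch (the 12 V supply and the `sweep` helper are hypothetical, chosen only for illustration):

```python
def sweep(volts: float, resistances) -> list:
    """Return the current for each resistance at a fixed voltage (I = V / R)."""
    return [volts / r for r in resistances]

# Hypothetical 12 V supply: doubling R halves I.
resistances = (1.0, 2.0, 4.0, 8.0)
for r, i in zip(resistances, sweep(12.0, resistances)):
    print(f"R = {r:>4} ohm -> I = {i:.2f} A")
```

Each doubling of resistance halves the current, which is exactly what "inversely proportional" means.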
 
awesome, thanks guys!
 
Let me guess, another car audio guy is set straight...
 
