Decreasing voltage increases current

  • Thread starter: loadedmike
  • Tags: Current, Voltage
SUMMARY

In electrical circuits, current is directly proportional to voltage and inversely proportional to resistance, as described by Ohm's Law (I = V/R). An increase in voltage leads to an increase in current, provided resistance remains constant. The discussion highlights that in AC circuits, the relationship between voltage and current can be influenced by factors such as resistance and load changes. A specific example illustrates that with a 5-volt supply and a resistance of 0.5 ohms, the resulting current is 10 amps.

PREREQUISITES
  • Understanding of Ohm's Law (I = V/R)
  • Basic knowledge of electrical circuits
  • Familiarity with AC circuit behavior
  • Concept of resistance and its impact on current
NEXT STEPS
  • Research the implications of resistance in electrical circuits
  • Learn about AC circuit analysis and phase relationships
  • Explore the characteristics of different load types (e.g., resistive, inductive, capacitive)
  • Study the effects of voltage drop in various circuit configurations
USEFUL FOR

Electronics enthusiasts, electrical engineering students, and professionals working with AC circuits or power systems will benefit from this discussion.

loadedmike
hey fellas,

ok... so here's my question.

with increasing voltage, shouldn't I get increasing current?

why does it seem to work the opposite way...

even a small beginner's lecture would be much appreciated
 
Unless you're talking about a Gunn diode or some other relatively esoteric device, an increase in voltage is seen with an increase in current.

V = I * R, so voltage is proportional to current, with resistance as the constant of proportionality.

Ex: We have 5 volts across a resistance of 0.5 ohms; what will the current be?

5 = I * 0.5

I = 5/0.5 = 10, so 10 amps.

Double the voltage and see what you get.
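A minimal Python sketch of that arithmetic, added here only as an illustration (the function name is made up):

```python
# Ohm's law: I = V / R
def current(voltage, resistance):
    return voltage / resistance

print(current(5.0, 0.5))   # 10.0 A
print(current(10.0, 0.5))  # 20.0 A: doubling the voltage doubles the current
```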
 
loadedmike said:
why does it seem to work the opposite way...

What circumstance are you referring to?
 
let's say you have 500 watts at an 8-ohm load... it works out to roughly 63 volts rms...

let's say you change that to 1500 watts at a 2.67-ohm load... still the same voltage, but obviously with an increase in current... but the same voltage...

but the confusing thing to me is that if you do increase the voltage, you do in fact increase current... that fact has been pointed out to me (thanks, by the way!), but all the research I've done always says a voltage drop increases current... why?
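As an illustrative aside (not part of the original post): for a purely resistive load, P = V^2 / R, so V_rms = sqrt(P * R). A small Python sketch of the two scenarios above, assuming ideal resistive loads:

```python
import math

# For a resistive load: V_rms = sqrt(P * R), I_rms = V_rms / R
for power, load in [(500.0, 8.0), (1500.0, 2.67)]:
    v_rms = math.sqrt(power * load)
    i_rms = v_rms / load
    print(f"{power:.0f} W into {load} ohm: {v_rms:.1f} V rms, {i_rms:.1f} A rms")
# roughly 63 V rms in both cases, but about 7.9 A vs 23.7 A
```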
 
loadedmike said:
but all the research I've done always says a voltage drop increases current... why?

A decrease in voltage decreases current if all other factors remain constant.

In an AC circuit a change in current can lead or lag a change in voltage. Could that be what is confusing you?
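One common case of lag is a series RL load, where the current lags the voltage by arctan(omega * L / R). A hypothetical sketch with made-up component values, just to illustrate the idea:

```python
import math

# Phase angle (degrees) by which current lags voltage in a series RL circuit
def lag_angle_deg(resistance_ohm, inductance_h, frequency_hz):
    omega = 2 * math.pi * frequency_hz
    return math.degrees(math.atan2(omega * inductance_h, resistance_ohm))

print(lag_angle_deg(8.0, 1e-3, 60.0))   # ~2.7 degrees: mostly resistive
print(lag_angle_deg(8.0, 50e-3, 60.0))  # ~67 degrees: inductance dominates
```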
 
what other factors would those be?
 
In this case resistance.
 
...so would increasing resistance decrease current... if voltage is held constant?
 
Exactly, I = V/R, current is directly proportional to voltage and inversely proportional to resistance.
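A quick illustrative sweep of that relationship: hold the voltage fixed, increase R, and the current falls.

```python
# Constant voltage, increasing resistance: I = V / R shrinks as R grows
V = 5.0
for R in (0.5, 1.0, 2.0, 4.0):
    print(f"R = {R:.1f} ohm -> I = {V / R:.2f} A")
```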
 
awesome, thanks guys!
 
Let me guess, another car audio guy is set straight...
 
