## Effect on current, if a higher voltage is applied

I'm confused over this because I'm generally told that if the voltage increases then so does the current, and this fits the equation V=IR: since I=V/R, increasing the top of the fraction should mean the current increases. But this seems to contradict the equation P=IV, because there I=P/V, and if you increase the voltage the current should go down, as is the rule for the bottom of a fraction. These two equations just don't seem consistent with each other: one says that increasing the voltage increases the current, and the other says that increasing the voltage decreases the current.

Mentor
 Quote by Dalek1099 ...one equation says that increasing the voltage increases the current and the other says that increasing the voltage decreases the current.
Given a constant power, then increasing the voltage requires a drop in the current to maintain that constant power. It just depends on what you are trying to hold constant.

If you increase the voltage across a resistor, then both the current and power increase. V=IR and P=VI.
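As a quick numerical sketch of that point (the resistance value here is just a hypothetical example), both I and P rise when V rises across a fixed resistor:

```python
# Fixed resistor: increasing V increases both I and P.
R = 100.0  # ohms (hypothetical value)

for V in (5.0, 10.0):
    I = V / R   # Ohm's law: I = V/R
    P = V * I   # power: P = V*I
    print(f"V={V:g} V -> I={I:g} A, P={P:g} W")
```

Doubling V from 5 V to 10 V doubles the current and quadruples the power.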

 Quote by Dalek1099 ...one equation says that increasing the voltage increases the current and the other says that increasing the voltage decreases the current.
If you increase V, then I and P both increase. You can't hold P fixed when I and V are changing and R is being held constant.

Mentor


As one concrete example, consider a high-efficiency DC-DC converter. The output voltage is fixed, and the output current through a resistor would be fairly constant. So the power drawn from the input terminals in this situation is pretty much a constant power Pin.

Now, as you raise the input voltage Vin, the input current Iin drops. It's a very common characteristic of DC-DC converters.
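A constant-power load like that converter input can be sketched as follows (the Pin value is a hypothetical example):

```python
# Constant-power load: Iin = Pin / Vin, so raising Vin lowers Iin.
P_in = 24.0  # watts drawn by the converter (hypothetical value)

for V_in in (12.0, 24.0, 48.0):
    I_in = P_in / V_in
    print(f"Vin={V_in:g} V -> Iin={I_in:g} A")
```

Here doubling the input voltage halves the input current, because Pin is the quantity being held constant.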

 Quote by berkeman ...It just depends on what you are trying to hold constant.
I think I get it now: you can't change one quantity and keep the other two constant at the same time.

 this time I=P/V and if you increase the voltage then the current should go down as is the rule with the bottom of the fraction.
What you have said is only true if P is constant. All three terms (P, I and V) are normally variables.

In most but not all circuits if you increase V then both I and P will also increase.

For a resistive circuit:

I = V/R
and
P = V²/R

By inspection, if you increase V then P increases faster than I: I scales linearly with V, while P scales with V².
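A small numerical check of those two scalings (the resistance value is a hypothetical example):

```python
# For a resistive circuit: I = V/R is linear in V, P = V^2/R is quadratic.
R = 50.0  # ohms (hypothetical value)

def current(V):
    return V / R       # I = V/R

def power(V):
    return V ** 2 / R  # P = V^2/R

V = 10.0
assert current(2 * V) == 2 * current(V)  # doubling V doubles I
assert power(2 * V) == 4 * power(V)      # doubling V quadruples P
```

So for a fixed R, doubling the voltage doubles the current but quadruples the power.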
