Does increased voltage increase or decrease current?

  • Thread starter: tpodany88
  • Tags: Current, Voltage
AI Thread Summary
An increase in voltage does not necessarily increase current; it depends on the load's resistance or impedance. In the example provided, a 1000 W fixture at 120 VAC draws 8.33 amps, while at 240 VAC it draws only 4.17 amps, illustrating that power remains constant despite the voltage change. The discussion highlights the importance of understanding the relationship between voltage, current, and resistance, particularly when using different types of ballasts. Magnetic ballasts behave differently from electronic ones, affecting how current is drawn at varying voltages. Ultimately, testing the fixture at both voltages can clarify the current draw and the underlying principles.
tpodany88
I am a little confused on how an increase in voltage would increase current.

If I have a 1000 W fixture supplied with 120 VAC, the current will be 8.33 amps.
After increasing the voltage to 240 VAC, the same 1000 W fixture would draw 4.17 amps.

Please help me understand what I am missing here.
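
For reference, here is the arithmetic above as a minimal Python sketch, treating the fixture as a constant-power load (the only input is the 1000 W rating from the question; whether the fixture actually behaves this way is the point of the thread):

```python
# Constant-power load: the fixture always converts 1000 W,
# so the supply current is I = P / V.
POWER_W = 1000.0

def supply_current(voltage_v: float) -> float:
    return POWER_W / voltage_v

for v in (120.0, 240.0):
    print(f"{v:5.1f} V -> {supply_current(v):.2f} A")
# 120.0 V -> 8.33 A
# 240.0 V -> 4.17 A
```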
 
Well, tell us what you think should happen.
 
You seem to have a poor understanding of the relationship among impedance, voltage, current, and power. Try looking up the definitions.
 
Exactly what happens in my example. But I have read elsewhere that the current is directly proportional to the increase/decrease of voltage.
 
tpodany88 said:
Exactly what happens in my example. But I have read elsewhere that the current is directly proportional to the increase/decrease of voltage.

read more
 
phinds said:
read more

Obviously that is what I am doing. A little direction would be helpful.
 
The lamp will not be the same lamp on 240 volts as you used on 120 volts.

So, if your lamp was 14.4 ohms, the current in it would be 120 volts / 14.4 ohms = 8.33 amps and the power would be 120 volts * 8.3333 amps or 1000 watts, as you said.

But if you now put a different lamp of 57.6 ohms in, and apply 240 volts, the current will be 240 / 57.6 = 4.17 amps.
So, the power will be 240 volts * 4.17 amps, or 1000 watts.
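
A short sketch of the same arithmetic, assuming purely resistive lamps (so R = V²/P), including what would happen if the 14.4-ohm lamp were left in place on 240 volts:

```python
# For a resistive lamp rated P watts at V volts: R = V^2 / P.
def lamp_resistance(power_w: float, voltage_v: float) -> float:
    return voltage_v ** 2 / power_w

r120 = lamp_resistance(1000.0, 120.0)   # 14.4 ohms
r240 = lamp_resistance(1000.0, 240.0)   # 57.6 ohms
print(f"120 V lamp: {r120:.1f} ohm, draws {120.0 / r120:.2f} A")  # 8.33 A
print(f"240 V lamp: {r240:.1f} ohm, draws {240.0 / r240:.2f} A")  # 4.17 A

# Leaving the 14.4-ohm (120 V) lamp connected to 240 V instead:
i = 240.0 / r120
print(f"14.4-ohm lamp on 240 V: {i:.2f} A, {240.0 * i:.0f} W")    # 16.67 A, 4000 W
```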
 
vk6kro said:
But if you now put a different lamp of 57.6 ohms in, and apply 240 volts, the current will be 240 / 57.6 = 4.17 amps.
So, the power will be 240 volts * 4.17 amps, or 1000 watts.

Thanks! I forgot that the bulb must be replaced... although the ballast remains the same.
 
tpodany88 said:
Thanks! I forgot that the bulb must be replaced... although the ballast remains the same.

But now I realize the bulb doesn't need to be replaced and can run on either 120 V or 240 V.
 
  • #10
You didn't say anything about a ballast.

If it was an electronic ballast, you can't really apply Ohm's Law to it. It would just give the lamp the current it requires even if the input voltage changed.
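
To make the contrast concrete, a minimal sketch comparing an idealized constant-power load (a lossless electronic ballast, which is a simplifying assumption) with a plain fixed resistance on the same two supply voltages:

```python
# Two idealized load models (both lossless simplifications):
def i_constant_power(v: float, p: float = 1000.0) -> float:
    return p / v    # regulated ballast: power fixed, current falls as V rises

def i_fixed_resistance(v: float, r: float = 14.4) -> float:
    return v / r    # bare resistor: Ohm's law, current rises with V

for v in (120.0, 240.0):
    print(f"{v:3.0f} V: constant-power {i_constant_power(v):5.2f} A, "
          f"fixed-R {i_fixed_resistance(v):5.2f} A")
# 120 V: constant-power  8.33 A, fixed-R  8.33 A
# 240 V: constant-power  4.17 A, fixed-R 16.67 A
```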
 
  • #11
vk6kro said:
You didn't say anything about a ballast.

If it was an electronic ballast, you can't really apply Ohm's Law to it. It would just give the lamp the current it requires even if the input voltage changed.

Is that the same for a magnetic ballast?
 
  • #12
No, I don't think so.

If you tried to use an iron-cored inductor intended for 120 volts, on 240 volts, I expect it would destroy the lamp and possibly itself with the extra current.

It definitely would not supply less current to the lamp at the higher voltage.
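
A rough linear estimate of that behavior, assuming the ballast acts as a simple inductor (real iron cores saturate at the higher flux, so the actual overcurrent would be even worse than this; the inductance value is hypothetical, sized to pass about 8.3 A at 120 V, 60 Hz):

```python
import math

# Current through an inductive ballast: I = V / X_L, with X_L = 2*pi*f*L.
FREQ_HZ = 60.0
L_HENRY = 0.0382   # hypothetical value: gives X_L of about 14.4 ohm at 60 Hz

def ballast_current(v_rms: float) -> float:
    x_l = 2.0 * math.pi * FREQ_HZ * L_HENRY
    return v_rms / x_l

for v in (120.0, 240.0):
    print(f"{v:3.0f} V -> {ballast_current(v):.2f} A")
# Current scales linearly with voltage: 240 V draws twice the 120 V current.
```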
 
  • #13
vk6kro said:
No, I don't think so.

If you tried to use an iron-cored inductor intended for 120 volts, on 240 volts, I expect it would destroy the lamp and possibly itself with the extra current.

It definitely would not supply less current to the lamp at the higher voltage.

It is a magnetic ballast that can be used with either 120v or 240v. I don't believe the bulb needs to be changed with a change in supply voltage.

Maybe the best way to figure it out would be to just wire it on 120 V and check the amp draw, then change the supply to 240 V and check the amp draw again, because I feel like I'm only confusing myself more.

Thanks for your help, though.
 
  • #14
tpodany88 said:
I am a little confused on how an increase in voltage would increase current.

If I have a 1000 W fixture supplied with 120 VAC, the current will be 8.33 amps.
After increasing the voltage to 240 VAC, the same 1000 W fixture would draw 4.17 amps.

Please help me understand what I am missing here.


You are exactly correct if the load is constant power. Most modern ballasts will do exactly what you describe.
 
  • #15
Some ballasts you have to open and change a wire (I assume it selects the number of turns that are energized?) to go from 120 to 240 (208?)...
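
If that guess is right, the idea would be that each tap keeps the volts-per-turn (and therefore the core flux) at its design value. A toy sketch, with the volts-per-turn figure purely hypothetical:

```python
# Multi-tap ballast primary: each supply voltage gets enough turns that
# the volts-per-turn, and hence the core flux, stays at the design value.
VOLTS_PER_TURN = 0.5   # hypothetical design figure

def turns_for_tap(supply_v: float) -> int:
    return round(supply_v / VOLTS_PER_TURN)

for v in (120.0, 208.0, 240.0):
    print(f"{v:3.0f} V tap -> {turns_for_tap(v)} primary turns energized")
```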
 
  • #16
tpodany88 said:
It is a magnetic ballast that can be used with either 120v or 240v.
In which case I expect you will find that you need to connect it up differently for the different potentials.

Maybe the best way to figure it out would be to just wire it on 120 V and check the amp draw, then change the supply to 240 V and check the amp draw again, because I feel like I'm only confusing myself more.
That is sure to give you some answers. Stand well back from it when you apply the 240v.
 
  • #17
tpodany88 said:
I am a little confused on how an increase in voltage would increase current.

If I have a 1000 W fixture supplied with 120 VAC, the current will be 8.33 amps.
After increasing the voltage to 240 VAC, the same 1000 W fixture would draw 4.17 amps.

Please help me understand what I am missing here.

Are you talking about resistors or transformers?
 
