Increased voltage increases or decreases current?

  • Thread starter: tpodany88
  • Tags: Current, Voltage

Discussion Overview

The discussion revolves around the relationship between voltage and current in electrical circuits, particularly in the context of a 1000W fixture operating at different voltages (120V and 240V). Participants explore concepts related to power, impedance, and the role of ballasts in determining current draw.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • One participant expresses confusion about how increasing voltage leads to a decrease in current for a constant power load, citing specific current values for 120V and 240V.
  • Another participant questions the understanding of the relationship among impedance, voltage, current, and power.
  • Some participants note that the same lamp will not behave identically at different voltages, suggesting that different lamps or ballasts may be involved.
  • There is mention of electronic versus magnetic ballasts, with a participant suggesting that Ohm's Law may not apply to electronic ballasts.
  • Concerns are raised about using a magnetic ballast designed for 120V at 240V, with one participant suggesting it could damage the lamp.
  • Participants discuss the need to potentially change wiring or connections when switching between voltages for certain ballasts.
  • One participant suggests practical experimentation by measuring current draw at both voltages to clarify the confusion.

Areas of Agreement / Disagreement

Participants do not reach a consensus on the implications of changing voltage on current draw, with multiple competing views regarding the behavior of different types of ballasts and the nature of the load.

Contextual Notes

There are unresolved assumptions regarding the type of ballast and load being discussed, as well as the specific electrical characteristics of the fixtures involved.

tpodany88
I am a little confused about how an increase in voltage would increase current.

If I have a 1000 W fixture supplied with 120 VAC, the current will be 8.33 A.
After increasing the voltage to 240 VAC, the same 1000 W fixture would draw 4.17 A.

Please help me understand what I am missing here.
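The figures in the post follow directly from I = P / V, assuming an ideal constant-power load; a minimal Python sketch of that arithmetic:

```python
# Minimal check of the figures in the post, assuming an ideal
# constant-power load: the current drawn is I = P / V.

def current_for_constant_power(power_w, voltage_v):
    """Current drawn by an ideal constant-power load."""
    return power_w / voltage_v

print(round(current_for_constant_power(1000, 120), 2))  # 8.33 (A at 120 V)
print(round(current_for_constant_power(1000, 240), 2))  # 4.17 (A at 240 V)
```

Doubling the voltage halves the current for the same 1000 W, which is exactly what the post describes.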
 
Well, tell us what you think should happen.
 
You seem to have a poor understanding of the relationship among impedance, voltage, current, and power. Try looking up the definitions.
 
Exactly what happens in my example. But I have read elsewhere that current is directly proportional to voltage, increasing or decreasing with it.
 
tpodany88 said:
Exactly what happens in my example. But I have read elsewhere that current is directly proportional to voltage, increasing or decreasing with it.

read more
 
phinds said:
read more

Obviously that is what I am doing. A little direction would be helpful.
 
The lamp will not be the same lamp on 240 volts as you used on 120 volts.

So, if your lamp was 14.4 ohms, the current in it would be 120 volts / 14.4 ohms = 8.33 amps, and the power would be 120 volts * 8.33 amps, or 1000 watts, as you said.

But if you now put in a different lamp of 57.6 ohms and apply 240 volts, the current will be 240 volts / 57.6 ohms = 4.17 amps.
So the power will be 240 volts * 4.17 amps, or 1000 watts.
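The two-lamp example above can be sketched in Python, assuming purely resistive lamps whose design resistance is R = V² / P:

```python
# Sketch of the two-lamp example, assuming purely resistive lamps:
# design resistance R = V**2 / P, then I = V / R and P = V * I.

def lamp_resistance(voltage_v, power_w):
    """Resistance of an ideal resistive lamp rated power_w at voltage_v."""
    return voltage_v ** 2 / power_w

for v in (120, 240):
    r = lamp_resistance(v, 1000)  # 14.4 ohms at 120 V, 57.6 ohms at 240 V
    i = v / r                     # 8.33 A and 4.17 A respectively
    print(f"{v} V lamp: {r:.1f} ohms, {i:.2f} A, {v * i:.0f} W")
```

Either lamp dissipates the same 1000 W; it is the resistance that must differ between the two supply voltages.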
 
vk6kro said:
But if you now put in a different lamp of 57.6 ohms and apply 240 volts, the current will be 240 volts / 57.6 ohms = 4.17 amps.
So the power will be 240 volts * 4.17 amps, or 1000 watts.

Thanks! I forgot about the fact that the bulb must be replaced... although the ballast remains the same.
 
tpodany88 said:
Thanks! I forgot about the fact that the bulb must be replaced... although the ballast remains the same.

But now I realize the bulb doesn't need to be replaced and can run on 120 V or 240 V.
 
  • #10
You didn't say anything about a ballast.

If it was an electronic ballast, you can't really apply Ohm's Law to it. It would just give the lamp the current it requires even if the input voltage changed.
 
  • #11
vk6kro said:
You didn't say anything about a ballast.

If it was an electronic ballast, you can't really apply Ohm's Law to it. It would just give the lamp the current it requires even if the input voltage changed.

Is that the same for a magnetic ballast?
 
  • #12
No, I don't think so.

If you tried to use an iron-cored inductor intended for 120 volts, on 240 volts, I expect it would destroy the lamp and possibly itself with the extra current.

It definitely would not supply less current to the lamp at the higher voltage.
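A rough sketch of why the current rises rather than falls, treating the ballast as an ideal linear inductor with impedance 2πfL (the inductance value below is purely illustrative, and real iron cores saturate at overvoltage, which makes the actual current rise worse than this linear estimate):

```python
import math

# Hedged sketch: modelling a magnetic ballast as a fixed linear inductor,
# so I = V / (2 * pi * f * L). Doubling V doubles I in this model; core
# saturation in a real iron-cored ballast would make it worse still.

def inductor_current(voltage_v, freq_hz, inductance_h):
    """RMS current through an ideal inductor on an AC supply."""
    impedance = 2 * math.pi * freq_hz * inductance_h
    return voltage_v / impedance

L = 0.038  # henries -- illustrative value, not from the thread
i_120 = inductor_current(120, 60, L)
i_240 = inductor_current(240, 60, L)
print(f"{i_120:.2f} A at 120 V -> {i_240:.2f} A at 240 V (current doubles)")
```

This is the opposite of the constant-power behaviour: a fixed magnetic ballast on double the voltage passes double the current, not half.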
 
  • #13
vk6kro said:
No, I don't think so.

If you tried to use an iron-cored inductor intended for 120 volts, on 240 volts, I expect it would destroy the lamp and possibly itself with the extra current.

It definitely would not supply less current to the lamp at the higher voltage.

It is a magnetic ballast that can be used with either 120 V or 240 V. I don't believe the bulb needs to be changed with a change in supply voltage.

Maybe the best way to figure it out would be to just wire it for 120 V and check the amp draw, then change the supply to 240 V and check the amp draw again, because I feel like I'm only confusing myself more.

Thanks for your help, though.
 
  • #14
tpodany88 said:
I am a little confused about how an increase in voltage would increase current.

If I have a 1000 W fixture supplied with 120 VAC, the current will be 8.33 A.
After increasing the voltage to 240 VAC, the same 1000 W fixture would draw 4.17 A.

Please help me understand what I am missing here.


You are exactly correct if the load is constant power. Most modern ballasts will do exactly what you describe.
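The two competing behaviours in the thread can be put side by side, assuming either a fixed resistance (Ohm's law) or an ideal constant-power load; the 14.4 ohm figure is the 120 V / 1000 W lamp from earlier posts:

```python
# Side-by-side sketch of the two load models discussed in the thread:
#   fixed resistance (Ohm's law):        I = V / R -> current doubles with V
#   constant power (regulating ballast): I = P / V -> current halves with V
# R = 14.4 ohms is the 120 V / 1000 W lamp from earlier in the thread.

def current_draw(voltage_v, resistance_ohm=14.4, power_w=1000.0):
    """Return (resistive current, constant-power current) at a given voltage."""
    return voltage_v / resistance_ohm, power_w / voltage_v

for v in (120.0, 240.0):
    i_res, i_cp = current_draw(v)
    print(f"{v:.0f} V: resistive load {i_res:.2f} A, constant-power load {i_cp:.2f} A")
```

At 240 V the resistive figure doubles to about 16.67 A while the constant-power figure halves to about 4.17 A, which is exactly the split the thread is debating.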
 
  • #15
With some ballasts you have to open them up and change a wire (I assume it selects the number of winding turns that are energized?) to switch from 120 V to 240 V (or 208 V)...
 
  • #16
tpodany88 said:
It is a magnetic ballast that can be used with either 120 V or 240 V.
In which case I expect you will find that you need to connect it up differently for the different potentials.

Maybe the best way to figure it out would be to just wire it for 120 V and check the amp draw, then change the supply to 240 V and check the amp draw again, because I feel like I'm only confusing myself more.
That is sure to give you some answers. Stand well back from it when you apply the 240 V.
 
  • #17
tpodany88 said:
I am a little confused about how an increase in voltage would increase current.

If I have a 1000 W fixture supplied with 120 VAC, the current will be 8.33 A.
After increasing the voltage to 240 VAC, the same 1000 W fixture would draw 4.17 A.

Please help me understand what I am missing here.

Are you talking about resistors or transformers?
 
