Does increased voltage increase or decrease current?

In summary, the conversation discusses the relationship between voltage and current in a 1000 W fixture. It is explained that increasing the voltage from 120 VAC to 240 VAC will cause the current to decrease from 8.33 A to 4.16 A. The concept of impedance and its effect on current and power is also mentioned. The conversation also touches on the use of ballasts and how they may affect the current draw of a lamp. It is suggested to test the current draw at both 120 VAC and 240 VAC to better understand the relationship between voltage and current.
  • #1
tpodany88
I am a little confused about how an increase in voltage would increase current.

If I have a 1000 W fixture supplied with 120 VAC, the current will be 8.33 A.
After increasing the voltage to 240 VAC, the same 1000 W fixture would draw 4.16 A.

Please help me understand what I am missing here.
 
  • #2
Well, tell us what you think should happen.
 
  • #3
You seem to have a poor understanding of the relationship among impedance, voltage, current, and power. Try looking up the definitions.
 
  • #4
Exactly what happens in my example. But I have read elsewhere that the current is directly proportional to the increase/ decrease of voltage.
 
  • #5
tpodany88 said:
Exactly what happens in my example. But I have read elsewhere that the current is directly proportional to the increase/ decrease of voltage.

read more
 
  • #6
phinds said:
read more

Obviously that is what I am doing. A little direction would be helpful.
 
  • #7
The lamp will not be the same lamp on 240 volts as you used on 120 volts.

So, if your lamp was 14.4 ohms, the current in it would be 120 volts / 14.4 ohms = 8.33 amps and the power would be 120 volts * 8.3333 amps or 1000 watts, as you said.

But if you now put a different lamp of 57.69 ohms in, and apply 240 volts, the current will be 4.16 amps.
So, the power will be 240 volts * 4.16 amps or 1000 watts.
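The lamp arithmetic in this post can be sketched in a few lines of Python (illustrative only; note that the exact resistance for a 240 V / 1000 W lamp is 57.6 Ω, while the 57.69 Ω above comes from dividing by the rounded 4.16 A):

```python
# Sketch of the lamp arithmetic above: R = V^2 / P and I = P / V.
def lamp_resistance(power_w, voltage_v):
    """Resistance a lamp needs in order to dissipate power_w at voltage_v."""
    return voltage_v ** 2 / power_w

def lamp_current(power_w, voltage_v):
    """Current a power_w lamp draws at its design voltage_v."""
    return power_w / voltage_v

for v in (120, 240):
    r = lamp_resistance(1000, v)
    i = lamp_current(1000, v)
    print(f"{v} V lamp: R = {r:.2f} ohm, I = {i:.2f} A, P = {v * i:.0f} W")
```

Both lamps dissipate the same 1000 W, but only because each has the resistance appropriate to its own supply voltage.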
 
  • #8
vk6kro said:
But if you now put a different lamp of 57.69 ohms in, and apply 240 volts, the current will be 4.16 amps.
So, the power will be 240 volts * 4.16 amps or 1000 watts.

Thanks! I forgot about the fact that the bulb must be replaced... although the ballast remains the same.
 
  • #9
tpodany88 said:
Thanks! I forgot about the fact the bulb must be replaced... although the ballast remains the same.

But now I realize the bulb doesn't need to be replaced and can run on either 120 V or 240 V.
 
  • #10
You didn't say anything about a ballast.

If it was an electronic ballast, you can't really apply Ohm's Law to it. It would just give the lamp the current it requires even if the input voltage changed.
 
  • #11
vk6kro said:
You didn't say anything about a ballast.

If it was an electronic ballast, you can't really apply Ohm's Law to it. It would just give the lamp the current it requires even if the input voltage changed.

Is that the same for a magnetic ballast?
 
  • #12
No, I don't think so.

If you tried to use an iron-cored inductor intended for 120 volts, on 240 volts, I expect it would destroy the lamp and possibly itself with the extra current.

It definitely would not supply less current to the lamp at the higher voltage.
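A rough back-of-envelope check of that warning, treating the 120 V lamp as a fixed 14.4 Ω resistance (a real filament's resistance rises with temperature, so this is only an idealized worst case):

```python
# Back-of-envelope for post #12: a 120 V / 1000 W lamp (14.4 ohm, treated as a
# fixed resistance for simplicity) connected directly to a 240 V supply.
v, r = 240.0, 14.4
i = v / r      # twice the rated 8.33 A
p = v * i      # four times the rated 1000 W -- hence the destroyed lamp
print(f"I = {i:.2f} A, P = {p:.0f} W")
```

Doubling the voltage across a fixed resistance doubles the current and quadruples the power, which is exactly why the higher voltage is dangerous rather than helpful here.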
 
  • #13
vk6kro said:
No, I don't think so.

If you tried to use an iron-cored inductor intended for 120 volts, on 240 volts, I expect it would destroy the lamp and possibly itself with the extra current.

It definitely would not supply less current to the lamp at the higher voltage.

It is a magnetic ballast that can be used with either 120 V or 240 V. I don't believe the bulb needs to be changed with a change in supply voltage.

Maybe the best way to figure it out would be to just wire it at 120 V and check the amp draw, then change the supply to 240 V and check the amp draw again, because I feel like I'm only confusing myself more.

Thanks for your help, though.
 
  • #14
tpodany88 said:
I am a little confused on how an increase in voltage would increase current.

If i have a 1000w fixture with 120vac supplied the current will be 8.333amps
After increasing the voltage to 240vac, the same 1000w fixture would draw 4.16 amps.

Please help me understand what I am missing here.


You are exactly correct if the load is constant power. Most modern ballasts will do exactly what you describe.
 
  • #15
With some ballasts you have to open the case and move a wire (I assume it selects the number of turns that are energized?) to switch from 120 V to 240 V (or 208 V)...
 
  • #16
tpodany88 said:
It is a magnetic ballast that can be used with either 120v or 240v.
In which case I expect you will find that you need to connect it up differently for the different potentials.

Maybe the best way to figure it out would be to just wire it at 120 V and check the amp draw, then change the supply to 240 V and check the amp draw, because I feel like I'm only confusing myself more.
That is sure to give you some answers. Stand well back from it when you apply the 240v.
 
  • #17
tpodany88 said:
I am a little confused on how an increase in voltage would increase current.

If i have a 1000w fixture with 120vac supplied the current will be 8.333amps
After increasing the voltage to 240vac, the same 1000w fixture would draw 4.16 amps.

Please help me understand what I am missing here.

Are you talking about resistors or transformers?
 

1. How does increasing voltage affect current?

For a load with fixed resistance, increasing voltage increases current in proportion, according to Ohm's Law (I = V/R). As voltage increases, the flow of electric charge (current) increases with it.

2. Is there a limit to how much voltage can increase current?

For an ideal resistor, Ohm's Law (I = V/R) places no limit on current: doubling the voltage doubles the current. In practice, the limit is set by what the circuit's components can withstand. Beyond their voltage and current ratings, raising the voltage further causes overheating and failure rather than useful additional current.

3. Does increasing voltage always result in an increase in current?

No, increasing voltage does not always result in an increase in current, because other factors, such as the resistance in the circuit, also govern the flow of current. For example, if the resistance increases along with the voltage, the current may stay the same or even decrease. Likewise, a constant-power load, such as a lamp on an electronic ballast, draws less current at a higher voltage.
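The two load models running through this thread can be contrasted directly. A minimal sketch, using illustrative values from the thread rather than a real device model:

```python
# Two idealized load models: a fixed resistance obeys I = V / R (current rises
# with voltage), while a constant-power load draws I = P / V (current falls).
def current_fixed_resistance(v, r_ohm):
    return v / r_ohm

def current_constant_power(v, p_watt):
    return p_watt / v

R = 14.4    # ohms: the 120 V / 1000 W lamp from the thread
P = 1000.0  # watts: a 1000 W constant-power fixture (e.g. electronic ballast)
for v in (120, 240):
    print(f"{v} V: fixed-R draws {current_fixed_resistance(v, R):.2f} A, "
          f"constant-P draws {current_constant_power(v, P):.2f} A")
```

At 240 V the two models diverge sharply: the fixed resistance draws twice its rated current, while the constant-power load draws half, which is the distinction at the heart of the original question.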

4. How does increased voltage impact the components in a circuit?

Increased voltage can potentially damage or overload components in a circuit. This is because a higher voltage can cause a higher current to flow, and if the current exceeds the maximum capacity of a component, it can lead to overheating and failure.

5. Is it always beneficial to increase voltage to increase current?

No, it is not always beneficial to increase voltage to increase current. In some cases, increasing voltage can lead to unnecessary energy consumption and unnecessary stress on components in the circuit. It is important to carefully consider the effects of increasing voltage before making any changes to a circuit.
