
Increased voltage increases or decreases current?

  1. Jan 21, 2012 #1
    I am a little confused on how an increase in voltage would increase current.

    If I have a 1000 W fixture supplied with 120 VAC, the current will be 8.33 amps.
    After increasing the voltage to 240 VAC, the same 1000 W fixture would draw 4.17 amps.

    Please help me understand what I am missing here.
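    The arithmetic above can be sketched in a few lines. This is a minimal illustration assuming a constant-power load (the function name is made up for this example):

    ```python
    # For a constant-power load (e.g., a fixture rated 1000 W),
    # the current drawn is I = P / V, so raising the supply
    # voltage lowers the current.
    def current_for_constant_power(power_w, voltage_v):
        """Current in amps drawn by a constant-power load."""
        return power_w / voltage_v

    print(current_for_constant_power(1000, 120))  # ~8.33 A
    print(current_for_constant_power(1000, 240))  # ~4.17 A
    ```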
     
  3. Jan 21, 2012 #2

    Pengwuino

    User Avatar
    Gold Member

    Well, tell us what you think should happen.
     
  4. Jan 21, 2012 #3

    phinds

    User Avatar
    Gold Member

    You seem to have a poor understanding of the relationship among impedance, voltage, current, and power. Try looking up the definitions.
     
  5. Jan 21, 2012 #4
    That is exactly what happens in my example. But I have read elsewhere that current is directly proportional to voltage.
     
  6. Jan 21, 2012 #5

    phinds

    User Avatar
    Gold Member

    read more
     
  7. Jan 21, 2012 #6
    Obviously that is what I am doing. A little direction would be helpful.
     
  8. Jan 21, 2012 #7

    vk6kro

    User Avatar
    Science Advisor

    The lamp will not be the same lamp on 240 volts as the one you used on 120 volts.

    So, if your lamp was 14.4 ohms, the current in it would be 120 volts / 14.4 ohms = 8.33 amps, and the power would be 120 volts * 8.33 amps, or 1000 watts, as you said.

    But if you now put in a different lamp of 57.6 ohms and apply 240 volts, the current will be 4.17 amps.
    So the power will be 240 volts * 4.17 amps, or 1000 watts.
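    The two lamps in the post above can be checked numerically. This is a sketch assuming each lamp is a fixed resistance sized as R = V²/P for its rated voltage:

    ```python
    # A fixed resistance obeys Ohm's law: I = V / R.
    def ohms_law_current(voltage_v, resistance_ohm):
        return voltage_v / resistance_ohm

    r_120 = 120**2 / 1000   # 14.4-ohm lamp for 120 V service
    r_240 = 240**2 / 1000   # 57.6-ohm lamp for 240 V service

    print(ohms_law_current(120, r_120))  # ~8.33 A, 1000 W
    print(ohms_law_current(240, r_240))  # ~4.17 A, still 1000 W

    # The same 14.4-ohm lamp left on 240 V would draw ~16.7 A
    # (about 4000 W), which is why the lamp must be swapped.
    print(ohms_law_current(240, r_120))
    ```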
     
  9. Jan 21, 2012 #8
    Thanks! I forgot that the bulb must be replaced... although the ballast remains the same.
     
  10. Jan 21, 2012 #9
    But now I realize the bulb doesn't need to be replaced and can run on 120 V or 240 V.
     
  11. Jan 21, 2012 #10

    vk6kro

    User Avatar
    Science Advisor

    You didn't say anything about a ballast.

    If it was an electronic ballast, you can't really apply Ohm's Law to it. It would just give the lamp the current it requires even if the input voltage changed.
     
  12. Jan 21, 2012 #11
    Is that the same for a magnetic ballast?
     
  13. Jan 21, 2012 #12

    vk6kro

    User Avatar
    Science Advisor

    No, I don't think so.

    If you tried to use an iron-cored inductor intended for 120 volts on 240 volts, I expect it would destroy the lamp, and possibly itself, with the extra current.

    It definitely would not supply less current to the lamp at the higher voltage.
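    The point above can be illustrated numerically: a fixed inductor's reactance is set by the hardware, so current scales directly with voltage. This is a rough sketch; the inductance value is made up purely for illustration:

    ```python
    import math

    # A fixed iron-cored inductor has reactance X_L = 2*pi*f*L,
    # so the current I = V / X_L doubles when the voltage doubles.
    f = 60       # supply frequency in Hz
    L = 0.038    # inductance in henries (hypothetical value)
    x_l = 2 * math.pi * f * L

    for v in (120, 240):
        print(f"{v} V -> {v / x_l:.1f} A")
    ```

    (A real ballast is more complicated, since core saturation at the higher voltage would push the current up even further.)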
     
  14. Jan 21, 2012 #13
    It is a magnetic ballast that can be used with either 120v or 240v. I don't believe the bulb needs to be changed with a change in supply voltage.

    Maybe the best way to figure it out would be to just wire it on 120 V and check the amp draw, then change the supply to 240 V and check the amp draw again, because I feel like I'm only confusing myself more.

    Thanks for your help though.
     
  15. Jan 21, 2012 #14

    You are correct if the load is constant power. Most modern ballasts will do exactly what you describe.
     
  16. Jan 22, 2012 #15
    Some ballasts you have to open and change a wire (I assume it selects the number of turns that are energized?) to switch from 120 to 240 (208?)...
     
  17. Jan 22, 2012 #16

    NascentOxygen

    User Avatar

    Staff: Mentor

    In which case I expect you will find that you need to connect it up differently for the different potentials.

    That is sure to give you some answers. Stand well back from it when you apply the 240 V.
     
  18. Jan 22, 2012 #17

    sophiecentaur

    User Avatar
    Science Advisor
    Gold Member
    2015 Award

    Are you talking about Resistors or Transformers?
     