
How would a light bulb's Lumens increase with a decrease in Watts?

  1. Nov 8, 2014 #1
    I believe I understand that you cannot directly convert watts to lumens because, although they may be related, they represent different quantities (I guess it's akin to comparing height and weight measurements for a person). However, given the same exact light bulb, typically the higher the power, the brighter the light bulb is. That is, the higher the watts, the higher the lumens.

    What could result in higher lumens from the same light bulb at a lower wattage?

    I found the formula "watts = lumens / (lumens / watt)", but this doesn't really tell me how or why (if I rearrange the formula for lumens) higher watts result in higher lumens, or how you could get higher lumens with lower watts.
     
  3. Nov 8, 2014 #2

    mfb


    Staff: Mentor

    watts = lumens / (lumens / watt) is trivial. It is equivalent to 1=1.
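
    (Side note: spelled out, the lm/W factor simply cancels, so the formula is an identity:

    $$\mathrm{watts} \;=\; \frac{\mathrm{lumens}}{\mathrm{lumens}/\mathrm{watt}} \;=\; \mathrm{lumens}\times\frac{\mathrm{watt}}{\mathrm{lumens}} \;=\; \mathrm{watts}$$

    It defines the unit lm/W but says nothing about how bright any particular bulb is.)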

    Less light with more input power? That would be a weird light source. You can certainly construct such a thing with electronics, but what would be the point?
    More light with less input power? Then we would operate the device at the lower power, and have the same question as above. I don't think this can happen by accident.
     
  4. Nov 8, 2014 #3

    Andy Resnick

    Science Advisor
    Education Advisor

    Lumens are a spectrally-weighted 'version' of watts (photometry vs. radiometry).

    I can imagine (but I don't know if any real source can do this) that if the emission spectrum of a bulb increasingly overlapped the spectral response of vision as the driving voltage decreased, the lumens could go up as the watts go down. But it's hard to think of a real bulb that does this.
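
    (For reference, the spectral weighting Andy describes is the standard definition of luminous flux: radiant power weighted by the eye's photopic response curve $V(\lambda)$, which peaks at 555 nm:

    $$\Phi_v \;=\; 683\ \frac{\mathrm{lm}}{\mathrm{W}} \int_0^{\infty} V(\lambda)\, \Phi_{e,\lambda}(\lambda)\, \mathrm{d}\lambda$$

    So a source whose spectrum shifted toward 555 nm as the drive power fell could, in principle, gain lumens per radiated watt even as total watts dropped.)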
     
  5. Nov 8, 2014 #4

    sophiecentaur

    Science Advisor
    Gold Member

    I guess it would need to involve a filament (black body radiation) normally running at such a high temperature that its spectral peak is up in the ultraviolet region.
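
    (For scale: Wien's displacement law puts the blackbody peak at

    $$\lambda_{\max} = \frac{b}{T}, \qquad b \approx 2.898\times10^{-3}\ \mathrm{m\cdot K}$$

    so a peak at, say, 350 nm in the UV would require T ≈ 8300 K, far above tungsten's melting point of about 3700 K — which is why this scenario is hypothetical.)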
     
  6. Nov 8, 2014 #5

    mfb


    Staff: Mentor

    Blackbody radiation increases at every frequency if you increase the temperature. The efficiency goes down, sure, but the light emission still increases as the temperature rises.

    It is easy to design a system that switches from a light bulb (at low power, emits light) to some resistor (at high power, just gets hot and does not emit light); I just don't see the application.
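
    (A quick numerical check of mfb's claim that Planck radiance at a fixed visible wavelength is strictly increasing in temperature — a minimal sketch using only the standard Planck formula:

    ```python
    import math

    # Physical constants (SI units)
    h = 6.626e-34   # Planck constant, J*s
    c = 2.998e8     # speed of light, m/s
    k = 1.381e-23   # Boltzmann constant, J/K

    def planck_spectral_radiance(wavelength_m, temp_k):
        """Blackbody spectral radiance B(lambda, T) in W / (m^2 * sr * m)."""
        a = 2.0 * h * c**2 / wavelength_m**5
        x = h * c / (wavelength_m * k * temp_k)
        return a / (math.exp(x) - 1.0)

    # At 550 nm (near the eye's peak sensitivity), radiance keeps rising
    # with temperature even as the spectral peak moves into the UV.
    for temp_k in (2000, 3000, 4000, 6000, 10000):
        print(temp_k, planck_spectral_radiance(550e-9, temp_k))
    ```

    The printed radiance grows monotonically with temperature, matching mfb's point that a hotter filament always emits more visible light.)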
     
  7. Nov 10, 2014 #6
    Thanks andy, mfb, sophiecentaur! Really appreciate your responses. Let me try to improve my question.

    First, I'm very aware there is no known way of accomplishing this today. Therefore, while it may be possible theoretically, for practical purposes it really isn't possible. I get that. However, what I'm asking (and would still like to know) is: what would it take to make what I asked happen?

    Here's a simplistic example I used elsewhere/before: let's say we have a basic battery-based circuit with a small 10 watt bulb and 2 batteries: one that is able to produce (or be drawn from at) 30 W and another at 20 W. First you use the 30 W battery in the circuit and you get X lumens. Then, using the same 10 W bulb, you switch to the 20 W battery, but this time you get an increase to Y lumens.

    Again, I'm very aware there is no known way of accomplishing this today, that this is impossible in a practical sense, that the load determines the power draw, etc., or any other reason or variation of why someone (including myself) would say "this is not possible in a practical sense".

    But...Again, theoretically...

    What variable would have to change to make that possible? Assuming the same bulb, how could you get the same or higher lumens from that light bulb if you only reduced the power source wattage?

    Appreciate anyone's help and effort on this.

    Thanks
     
  8. Nov 11, 2014 #7

    davenn

    Science Advisor
    Gold Member


    Watts isn't what a battery is rated by ... that is, you don't get XX watt batteries; you get XX V batteries capable of supplying current, with a capacity rated in amp-hours (A·h)

    the watts used are a result of the battery voltage and the load your bulb presents to the battery

    so if you have a 12 V battery and a 12 W @ 12 V bulb, 1 amp will be drawn from the battery
    it doesn't matter how many different individual 12 V batteries you put that 12 W bulb across, 12 W is still going to be used by the bulb

    If you put the 12 W @ 12 V bulb across a lower voltage battery, say a 6 V one, then the bulb will glow much more dimly (a fixed resistance would draw only a quarter of the rated power, since P = V²/R)
    if you put the 12 W @ 12 V bulb across a higher voltage battery, say a 24 V one, then the bulb will draw several times its rated power and is likely to go POP
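
    (A minimal sketch of davenn's arithmetic, treating the bulb as a fixed resistor sized from its rating — a simplification, since a real filament's resistance rises as it heats up:

    ```python
    # Bulb rated 12 W at 12 V, modeled as a fixed resistance.
    RATED_POWER_W = 12.0
    RATED_VOLTAGE_V = 12.0
    resistance_ohm = RATED_VOLTAGE_V**2 / RATED_POWER_W  # R = V^2 / P = 12 ohms

    for supply_v in (6.0, 12.0, 24.0):
        current_a = supply_v / resistance_ohm  # Ohm's law: I = V / R
        power_w = supply_v * current_a         # P = V * I = V^2 / R
        print(f"{supply_v:4.0f} V -> {current_a:.2f} A, {power_w:5.1f} W")
    # 6 V  -> 0.50 A,  3.0 W  (a quarter of rated power: dim)
    # 12 V -> 1.00 A, 12.0 W  (rated)
    # 24 V -> 2.00 A, 48.0 W  (4x rated: POP)
    ```

    The supply voltage and the bulb's resistance fix the power; there is no knob on the battery side that makes the same bulb brighter at lower watts.)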
     
  9. Nov 11, 2014 #8

    russ_watters


    Staff: Mentor

    It isn't possible in theory either.
     
  10. Nov 11, 2014 #9

    Andy Resnick

    Science Advisor
    Education Advisor

    Sometimes bulb manufacturers provide a 'lm/W' specification as a measure of 'lighting efficiency'. For example, tungsten-filament incandescent bulbs have an 'efficiency' of about 15 lm/W, while fluorescent bulbs are around 70 lm/W. LEDs can have higher efficiencies (several hundred or higher) when they are operated in the low-current regime, with an efficiency that decreases as the current increases:

    http://apps1.eere.energy.gov/buildings/publications/pdfs/ssl/led_energy_efficiency.pdf
    http://www.digikey.com/en/articles/techzone/2011/oct/identifying-the-causes-of-led-efficiency-droop

    That said, it's important to realize that while the luminous efficacy can (slightly) vary inversely with LED drive current, the *absolute* emittance of an LED strongly depends on the drive current. So, while the efficiency may slightly increase as the current decreases, the actual radiated power still decreases.
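
    (A toy illustration of Andy's caveat. The droop curve and all the numbers below are hypothetical, invented for illustration — they are not data from the linked references:

    ```python
    # Hypothetical LED droop model: luminous efficacy (lm per electrical W)
    # falls as drive current rises. All numbers are illustrative only.
    def efficacy_lm_per_w(current_a):
        return 150.0 / (1.0 + 2.0 * current_a)  # made-up droop curve

    FORWARD_VOLTAGE_V = 3.0  # assumed roughly constant over this range

    for current_a in (0.1, 0.35, 0.7, 1.0):
        electrical_w = FORWARD_VOLTAGE_V * current_a
        lumens = efficacy_lm_per_w(current_a) * electrical_w
        print(f"{current_a:.2f} A: {efficacy_lm_per_w(current_a):6.1f} lm/W, "
              f"{electrical_w:4.2f} W in, {lumens:6.1f} lm out")
    ```

    Running it shows efficacy highest at the lowest current, yet total lumens still falling as the current (and input watts) fall — higher lm/W, but less light.)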
     
  11. Nov 11, 2014 #10
    davenn: You're correct. Please note, however, that in my example I said "or be drawn from". In my attempt to be quick and simplistic, I wasn't tight enough in my use of language. But the concept of the example is still correct.

    So... with my example adjusted... how could you draw a lower wattage from the power source for the same bulb (correcting the bulb's rating to 30 W in my example)?

    Please note that I'm only holding the power source and bulb the same; there could be something in between that changes. What I'm wondering is what it would take. I know that today it's impossible in practical terms. But assuming it were possible, what would it take?

    And, russ_watters, it is possible in theory, but thanks for your response.
     
  12. Nov 11, 2014 #11

    russ_watters


    Staff: Mentor

    Sorry, but I don't think you understand what that means. Please cite and link the theory you are referring to.
     
  13. Nov 11, 2014 #12

    Danger

    Gold Member

    Based solely upon the wording of the original post, I might have a way that it could theoretically be done. There is nothing in it that specifies the existence of only one filament in the single bulb.
    Suppose then that there's a very high-efficiency light source (maybe an LED) that is totally inert below a transition temperature (say 100 °C) fed from a 6 V battery with a thermostatic switch. A normal tungsten filament acts as in a standard bulb, running from a 12 V battery on the flip side of the same thermostat. So, you would get instant light when the main on/off switch is closed. When the filament heats the interior to 100 °C, the thermoswitch trips over to the secondary circuit.
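
    (A toy sketch of Danger's two-circuit idea — the voltages, currents, efficacies, and threshold below are all hypothetical, chosen just to show the switchover logic:

    ```python
    # Toy model of Danger's thermostatic two-source "bulb": a tungsten
    # filament runs first; once the interior passes the switch temperature,
    # a thermoswitch hands over to a high-efficacy source (an LED).
    FILAMENT = {"volts": 12.0, "amps": 1.0, "efficacy_lm_per_w": 15.0}
    LED      = {"volts": 6.0,  "amps": 1.0, "efficacy_lm_per_w": 100.0}
    SWITCH_TEMP_C = 100.0

    def active_source(interior_temp_c):
        """Thermoswitch: filament below the threshold, LED at or above it."""
        return LED if interior_temp_c >= SWITCH_TEMP_C else FILAMENT

    for temp_c in (25.0, 99.0, 100.0, 150.0):
        src = active_source(temp_c)
        watts = src["volts"] * src["amps"]
        lumens = watts * src["efficacy_lm_per_w"]
        print(f"{temp_c:5.1f} C: {watts:4.1f} W in, {lumens:6.1f} lm out")
    ```

    After the switchover the input power drops (12 W to 6 W) while the light output rises (180 lm to 600 lm) — the effect the OP asked about, achieved by changing the source rather than the physics of a single filament.)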
     
  14. Nov 11, 2014 #13

    davenn

    Science Advisor
    Gold Member


    you can't

    For a bulb that is rated at a certain wattage at a certain voltage, it's always going to draw the same current to produce its rated light output
    if you drop either the supplied voltage or the supplied current, then the light output is going to decrease

    as Scotty from Star Trek said, "ye cannae break the laws of physics"

    As Andy and others have said, if you want more light for less power used, then you must change the bulb to a more efficient one

    Dave
     
  15. Nov 11, 2014 #14
    Correction. I don't want to get too sidetracked, but so as not to mislead anyone else reading this... I have to partially take back what I said about davenn's comments. I fell into the often-made mistake of taking the word "draw" too literally when it's meant more as an expression. Loads on a circuit don't literally draw current; loads more or less complete a circuit and accept or allow current to flow - not draw it. Regardless, I think the heart of what we are saying is the same and correct.

    Interesting, Danger. Thank you! I'll have to roll what you said around in my head to clarify it. But thanks.

    russ_watters: I do know what it means. If you'd like to start a separate conversation, I'd be happy to continue it there rather than take this off topic. But, again, thanks for your engagement on this.
     
  16. Nov 11, 2014 #15
    I do appreciate your responses, but you're not understanding the question. I'm asking what it would take. I'm not asking if it's possible. I get that it's not possible practically, but what would it take to make it possible? Grr... If I had hair I'd be pulling it out. haha... I don't know how else to ask the question. o_O

    A completely acceptable answer is: "It would take X, Y, and/or Z, but none of those things is possible in a practical way. But if they were, you could do it."

    I feel like the responses I'm getting are:

    Me: I know this probably isn't possible today, but what would it take to get you to give me a ride to the movies?
    Response: It's not possible.
    Me: Ok. I understand that, but what would it take if it were possible?
    Response: It's not possible.
    Me: Ok. I get that. I completely understand the difficulties of making that happen. But... just bear with me... assuming it were possible... what would it take? How would we do it?
    Response: It's not possible.
    Me: o_O
     
  17. Nov 11, 2014 #16

    russ_watters


    Staff: Mentor

    My request stands, either way. Discussions on PF are required to adhere to existing scientific theory. This discussion may need to be closed if it is not.
     
  18. Nov 11, 2014 #17

    davenn

    Science Advisor
    Gold Member

    No, you are not understanding the answers all these people are giving you

    one more time..... you CANNOT get more light out of a given bulb when you decrease either the voltage or the current or both
     
  19. Nov 11, 2014 #18

    Vanadium 50

    Staff Emeritus
    Science Advisor
    Education Advisor

    It's more like this:

    Me: I know this probably isn't possible today, but how long would it take to get to the Moon by flapping our arms?
    Response: It's not possible.
    Me: Ok. I understand that, but how long would it take if it were possible?
    Response: It's not possible.
    Me: Ok. I get that. I completely understand the difficulties of making that happen. But... just bear with me... assuming it were possible... how long would it take? How would we do it?
     
  20. Nov 12, 2014 #19
    Thanks everyone for your responses and time! I was able to get the answers I was looking for.
     