
Incandescent bulb heat emission

  1. May 10, 2012 #1
    Hi all,

    I have a question I'd like to hear your opinions and answers on.
    So, I was comparing different light sources for my research, and I started thinking about a statement which I've always taken for granted: incandescent light bulbs waste more than 90% of their power as heat (Wikipedia).

    Now, the spectrum of an incandescent bulb is certainly mostly in the infrared region, and that's why it has such a low "luminous efficacy", but isn't ALL the emitted power converted into heat in the end? I mean: what are we referring to when we declare that a bulb wastes a certain percentage of its emitted power as heat? Is it because radiation at different wavelengths transmits different amounts of heat?

    To my understanding, a 100 W bulb will emit 100 W of heat. The visible fraction (taken into account by luminous efficacy, [lumen/W]) is just the part of the radiation which we can see.

    Thanks for any comment, cheers!
  3. May 10, 2012 #2
    I believe that you are correct. In this case, the word "heat" does not indicate the motion of molecules because the bulb itself is a vacuum. I really doubt that the small amount of matter in that bulb heats up and transfers energy to the glass which then gets conducted out. The efficiency just refers to the amount of energy that goes into useful light.

    I think.
  4. May 10, 2012 #3
    incandescent light bulbs waste more than 90% of their power as heat (wikipedia).

    If you are interested only in the visible light production of a light bulb, then that statement would be true.
    If you are interested in both the heat and the light, then the bulb is 100% efficient.
  5. May 10, 2012 #4
    The point is that many online sources draw a distinction between power transformed into visible light and power lost as heat. My question is whether there's any basis to this distinction or whether, as I think, all the power is transformed into heat, and the fact that part of it is visible to our eye is a separate matter (which is accounted for by luminous efficacy).
  6. May 10, 2012 #5
    Read my post above. That is the distinction. You are paying for electrical power that is converted into heat that you do not want (or might not want) in order to produce a certain amount of light.
    What would you rather have lighting your home: a bulb with 100% conversion of electrical power into light, or one at 10% conversion?

    You have to go back and study the definition of efficiency, which ultimately amounts to:
    efficiency = what is useful / what you pay for

    Who would initially care whether all the energy from the light bulb is eventually converted into heat? That is another calculation, for HVAC systems. What you are initially interested in is the amount of light you will get and how much it will cost. From your Wiki reference, that is less than 10%.
  7. May 10, 2012 #6
    If you read my post above, of course I know the difference between useful light and heat losses. My question, which concerns HVAC loads if you like, but really just heat transfer in general, is simply whether the transmission of heat through radiation might differ in different regions of the spectrum (i.e. whether radiation in the visible part of the spectrum transfers heat exactly as radiation in the infrared part does).

    There are plenty of webpages documenting the heat gain of different light sources, and they calculate the % of drawn power which impacts HVAC. To my understanding this is always 100%, but I was wondering whether I was missing something.
  8. May 11, 2012 #7



    Staff: Mentor

    You aren't missing anything. For the purpose of heat gain, all energy emitted by a light bulb eventually ends up as heat.
  9. May 12, 2012 #8



    It's simple: total input power (100 W) = invisible EM radiation + visible EM radiation + heat.

    Radiation, visible or invisible, is not heat. Infrared EM radiation, in particular is not heat. It is called "heat radiation" because it is emitted by heated objects, like the filament in the lamp.

    Luminosity is a measure of how much your eye responds to visible radiation, and it's different for different wavelengths. Your eye responds really well to green light, but not very well to deep violet or deep red at the same power. When you add up the luminosity of the visible light and divide by the total power of the lamp, you get the "luminous efficiency" of the lamp.

    The EM radiation from the lamp, visible and invisible, is probably eventually absorbed somewhere, unless it shoots off into space, and when it is absorbed, it is turned into heat.
  10. May 12, 2012 #9
    Sure. In lighting technologies, there are actually different parameters to account for these phenomena, i.e. luminous efficacy of a source and luminous efficacy of radiation. I assume with "heat" here you mean heat by conduction and convection from the bulb.

    For a typical incandescent bulb, what are the magnitudes of the three components you listed? Isn't almost all of the power converted into EM radiation (with negligible losses), with only about 2% of this radiation being visible light?
  11. May 12, 2012 #10



    I don't know the numbers, but I think that a lot of the input power is turned into conduction and convection heat. Looking at the Wikipedia page, it says that the spectrum of melting tungsten yields 52 lumens per watt. I think that's lumens per watt of total EM radiation. A 60 watt bulb is 14.5 lumens per lamp watt, so that's 870 lumens. If those lumens came from a filament that was practically melting, that would mean 870/52 = 16.7 watts of EM radiation from the 60 watt lamp, the rest being convected and conducted heat. The glass bulb probably absorbs some of that EM radiation, which is converted to heat at the bulb. For a real lamp, the amount of convected and conducted heat will be even higher, because the filament is cooler. It's a bad calculation (the 60 watt lamp doesn't have a melting filament), but it gives you an idea of the numbers.
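    For what it's worth, that arithmetic can be laid out as a quick script. The 52 lm/W and 14.5 lm/W figures are the ones quoted from Wikipedia above; this is the same back-of-the-envelope sketch, not a measurement:

    ```python
    # Back-of-the-envelope: how much of a 60 W lamp's input power would leave
    # as EM radiation if the filament radiated like melting tungsten.
    # Figures below are the ones quoted from Wikipedia in the post.
    lm_per_radiated_watt = 52.0   # lm per watt of total EM radiation (melting tungsten)
    lamp_watts = 60.0             # electrical input power
    lm_per_lamp_watt = 14.5       # luminous efficacy of a typical 60 W bulb

    lumens = lamp_watts * lm_per_lamp_watt           # 870 lm
    em_watts = lumens / lm_per_radiated_watt         # ~16.7 W of EM radiation
    conducted_convected = lamp_watts - em_watts      # remainder, under these assumptions

    print(f"{lumens:.0f} lm -> {em_watts:.1f} W radiated, "
          f"{conducted_convected:.1f} W conducted/convected")
    ```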

    I have no idea what the "2.1% luminous efficiency" means. They never define luminous efficiency, and sometimes they confuse it with luminous efficacy, which is lumens output per input lamp watt.
    Last edited: May 12, 2012
  12. May 12, 2012 #11
    There are even published papers which confuse those terms!

    Btw, the luminous efficiency Wikipedia is referring to is simply the ratio of the source's luminous efficacy (about 15 lm/W) to the maximum possible luminous efficacy (683 lm/W).
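    Spelled out (15 lm/W is the approximate efficacy quoted in this thread; 683 lm/W is the maximum, for monochromatic 555 nm green light):

    ```python
    source_efficacy = 15.0    # lm/W, typical incandescent (approximate figure from above)
    max_efficacy = 683.0      # lm/W, monochromatic 555 nm green light
    luminous_efficiency = source_efficacy / max_efficacy
    print(f"luminous efficiency ~ {luminous_efficiency:.1%}")   # about 2.2%
    ```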

    Not sure about what you wrote about the fractions of power lost as heat through conduction/convection, though.
    The Wikipedia "luminous efficacy" page gives almost the same value for the incandescent bulb's luminous efficacy (LES) and for its luminous efficacy of radiation (LER), where:

    LES: lumen output / power input at the plug
    LER: lumen output / total EM radiation power

    If LER is close to LES, that tells me the conversion to EM power is almost perfect.
    Last edited: May 12, 2012
  13. May 13, 2012 #12



    Hmm - I think you are right. I did the calculation for a 60 watt bulb (T = 2800 K, 840 lumens) and I get LER = 15.0 lpw, LES = 14.0 lpw. Now that I think about it, it makes some sense. Almost all the power input to the lamp is lost somehow at the filament. The bulb is evacuated and the metal connectors to the filament are very small, so almost all the energy is converted to EM radiation. The bulb wall absorbs some, which is then convected and conducted away. The atmosphere absorbs some, so the air around the bulb heats up and convects it away. Then again, tungsten does not emit as a perfect black body, so maybe there is some complication there.

    The Wikipedia article says 90% of the input power is converted to heat, the rest to visible light. I don't know what that means. If you take the EM power between 360 and 830 nanometers (the "visible range", where the eye response is non-zero) for a black body, and divide it by the total EM power, you get 12.6 percent, so maybe that's what it means. But really, your eye is hardly responding past 700 nanometers. If you take the power between 360 and 700, you get 6.2 percent. The statement doesn't make much sense to me.
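    Both numbers in this post can be reproduced with a short numerical integration of the Planck spectrum. This is only a sketch: it assumes an ideal black body at 2800 K and uses a common Gaussian fit to the photopic luminosity function V(λ) instead of the tabulated CIE data, so the results come out approximate:

    ```python
    import math

    C2 = 1.4388e-2  # second radiation constant h*c/k_B, in m*K

    def planck(lam, T):
        """Black-body spectral radiance vs wavelength (arbitrary scale)."""
        return 1.0 / (lam**5 * math.expm1(C2 / (lam * T)))

    def v_lambda(lam):
        """Gaussian fit to the photopic luminosity function V(lambda):
        V ~ 1.019 * exp(-285.4 * (lam_um - 0.559)^2), lam in metres."""
        lam_um = lam * 1e6
        return 1.019 * math.exp(-285.4 * (lam_um - 0.559) ** 2)

    def integrate(T, lo=100e-9, hi=50e-6, n=100_000):
        """Midpoint integration: (total power, power in 360-830 nm, V-weighted power)."""
        dlam = (hi - lo) / n
        total = band = lum = 0.0
        for i in range(n):
            lam = lo + (i + 0.5) * dlam
            b = planck(lam, T) * dlam
            total += b
            if 360e-9 <= lam <= 830e-9:
                band += b
            lum += b * v_lambda(lam)
        return total, band, lum

    total, band, lum = integrate(2800.0)
    print(f"fraction of EM power in 360-830 nm: {band / total:.1%}")  # close to the ~12.6% above
    print(f"LER: {683.0 * lum / total:.1f} lm per radiated watt")     # close to the ~15 lpw above
    ```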
  14. May 13, 2012 #13
    Same opinion here, I have looked into different sources but haven't really found a good answer to this.
  15. May 13, 2012 #14



    I think I will start screwing around with those articles, see if some clarification occurs.
  16. May 28, 2012 #15
    Hi, did you have any good results from your search?
  17. May 28, 2012 #16



    No, but I think the "10% visible" comes from something like this: 80% of the input power is converted to black-body radiation at 2800 K, and 12.6% of that radiation lies between 360 and 830 nm, so 0.8 x 12.6% ≈ 10% "visible" light.
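    In numbers (both factors are the assumed figures from this thread, not measured values):

    ```python
    em_fraction = 0.80    # assumed fraction of input power radiated as EM (from the post)
    visible_band = 0.126  # fraction of 2800 K black-body power in 360-830 nm (from earlier)
    visible_of_input = em_fraction * visible_band
    print(f"'visible' share of input power: {visible_of_input:.1%}")   # ~10%
    ```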