Trying to understand an aspect of LED technology never discussed in the literature. There are many bright LED lamps on the market that incorporate the optics, chips, board, driver and heat sink into a single module. I have a question about the driver.

My understanding is that the driver converts the supply (say, 12 volts AC) into a regulated constant current at whatever forward voltage the LEDs require. Because the driver regulates current, manufacturers say the source voltage can be anywhere between 9 volts and 15 volts. From the National Semiconductor site (http://www.national.com/pf/LM/LM3431.html), the driver specs (which I struggle to understand) seem to indicate that raising the input voltage above the nominal rating decreases the driver's efficiency. It seems to me that any efficiency lost when running above the nominal 12 volts must show up as heat in the driver.

Heat is a big issue for these newer bright white LEDs, and it matters especially in outdoor lighting, where designers may daisy-chain fixtures so the first fixture sees 15 volts and the last one sees 9 volts. If the 15-volt fixture runs hotter, it will lose brightness sooner, eventually producing a poor display of mixed brightness.

Am I correct in thinking that sourcing an LED lamp at 15 volts will put more heat into the module than a 12-volt source? And if that's true, can anyone direct me to a formula for predicting the amount of excess heat produced? Even a ballpark percentage would be useful. Thanks.
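To show the kind of estimate I'm after, here is the back-of-envelope calculation I have in mind. It assumes the driver delivers a fixed output power to the LEDs and that you can read efficiency off the datasheet curves at each input voltage. The specific numbers (10 W load, 90% at 12 V, 85% at 15 V) are made-up placeholders, not values from the LM3431 datasheet:

```python
def driver_loss_watts(p_out_w: float, efficiency: float) -> float:
    """Power dissipated in the driver itself.

    P_in = P_out / eta, so the loss is P_out * (1/eta - 1).
    """
    return p_out_w * (1.0 / efficiency - 1.0)


if __name__ == "__main__":
    p_led = 10.0          # hypothetical LED load, watts
    eta_12v = 0.90        # hypothetical efficiency at 12 V input
    eta_15v = 0.85        # hypothetical efficiency at 15 V input

    loss_12 = driver_loss_watts(p_led, eta_12v)
    loss_15 = driver_loss_watts(p_led, eta_15v)

    print(f"loss at 12 V: {loss_12:.2f} W")
    print(f"loss at 15 V: {loss_15:.2f} W")
    print(f"excess heat at 15 V: {loss_15 - loss_12:.2f} W "
          f"({100 * (loss_15 / loss_12 - 1):.0f}% more)")
```

Is this the right way to frame it, or does the constant-current regulation change the picture?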