Hi, I've been curious about this subject for quite a while now, and I figure I need to ask here in these forums to get pointed in the right direction. In classical thermodynamics, we define entropy by [tex] dS = \delta Q / T [/tex]

However, when it comes to radiation and photovoltaics, matching the bandgap seems more important than temperature. For instance, say I have a PV material tuned to approximately 500 nm, and I have three lasers of equal output power: one at 490 nm, one at 525 nm, and one at 500 nm. If I shine the lasers on the PV, I may get some energy from the 525 nm laser, maybe a little more from the 490 nm laser, but the most from the 500 nm laser, since the PV is tuned to that wavelength.

So, how does quantum physics define entropy in terms of electromagnetic radiation? Is there a way to define entropy for radiation that accounts for the fact that energy is easier to recover from a laser than from the same amount of energy radiated by a black body?

In addition, Wien's displacement law gives us a trend for the peak wavelength of a black body: [tex] \lambda_{max} T = C_W = 2898 \ \mu m \cdot K [/tex]

This suggests to me that, since higher temperatures correspond to shorter peak wavelengths (and therefore higher average photon energies), higher-energy photons (shorter wavelengths) may generally be more useful in terms of work and/or potential. Could this be true, or is it unrelated?
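To make the numbers in my example concrete, here is a quick sketch that computes the photon energies of the three lasers against an assumed bandgap tuned to 500 nm (the example value above), and the Wien peak wavelength at a few temperatures. The 500 nm gap and the chosen temperatures are just illustrative values, not anything measured:

```python
# Photon energy E = h*c/lambda compared to an assumed 500 nm bandgap,
# plus Wien's displacement law lambda_peak = C_W / T.

H = 6.62607015e-34    # Planck constant, J*s (CODATA exact value)
C = 2.99792458e8      # speed of light, m/s
EV = 1.602176634e-19  # joules per electron-volt
C_WIEN = 2.898e-3     # Wien displacement constant, m*K (2898 um*K)

def photon_energy_ev(wavelength_nm):
    """Energy of a single photon of the given wavelength, in eV."""
    return H * C / (wavelength_nm * 1e-9) / EV

def wien_peak_nm(temperature_k):
    """Peak emission wavelength of a black body at temperature T, in nm."""
    return C_WIEN / temperature_k * 1e9

# Hypothetical bandgap energy for a PV material tuned to 500 nm.
bandgap_ev = photon_energy_ev(500.0)

for wl in (490.0, 500.0, 525.0):
    e = photon_energy_ev(wl)
    # Photons below the gap energy cannot excite an electron across it.
    print(f"{wl:.0f} nm: {e:.3f} eV, at or above gap: {e >= bandgap_ev}")

# Wien peaks: the Sun (~5800 K) peaks near 500 nm, hotter bodies shorter.
for t in (3000.0, 5800.0, 10000.0):
    print(f"T = {t:.0f} K -> peak at {wien_peak_nm(t):.0f} nm")
```

Running this shows the 490 nm photons carry slightly more energy per photon than the gap, the 525 nm photons slightly less, and that a black body would need to be near the Sun's surface temperature for its peak to land at 500 nm.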