Hey guys. I'm struggling to picture the ultraviolet catastrophe. The graph tells me that, at a fixed temperature, the classical prediction for intensity grows without bound as the wavelength decreases. Of course the theory is wrong, but how do I even PICTURE this with the old model?

If I imagine the atoms in a dense material as oscillators, and heat the material to a certain temperature (let's say 500 °C), then at that temperature most of the oscillators would have about the same kinetic energy, right? So even if I manage to find some oscillators with very high frequency (which I'd guess is less likely), the overall intensity of that "light" should still be low, because I only have a few of these high-energy electrons.

What I want to say is: if I have a distribution of kinetic energies in the material, the highest-energy electrons (the ones above average) produce the highest frequencies, but there are just a few of them, so how can I get such great intensity at small wavelengths? I'm just getting into this, so please don't be too mathy. I'd like to get a "feel" for it. Thanks.
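Just so we're on the same page about what "blows up" means: here's a quick numerical sketch I put together (my own, so the formulas are my assumptions) comparing the classical Rayleigh-Jeans law with Planck's law at 773 K (500 °C). The classical curve goes like 1/λ⁴, so it diverges at short wavelengths, while Planck's stays finite:

```python
import math

# Physical constants (SI units)
h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s
k = 1.381e-23   # Boltzmann constant, J/K

def rayleigh_jeans(lam, T):
    """Classical spectral radiance B_lambda = 2*c*k*T / lam^4.
    Grows without bound as lam -> 0 (the 'catastrophe')."""
    return 2.0 * c * k * T / lam**4

def planck(lam, T):
    """Planck's spectral radiance: the exponential in the denominator
    cuts off the short-wavelength modes, so it stays finite."""
    return (2.0 * h * c**2 / lam**5) / (math.exp(h * c / (lam * k * T)) - 1.0)

T = 773.0  # 500 degrees C in kelvin
for lam_um in (10.0, 1.0, 0.1):
    lam = lam_um * 1e-6  # micrometres -> metres
    print(f"{lam_um:5.1f} um   RJ = {rayleigh_jeans(lam, T):10.3e}"
          f"   Planck = {planck(lam, T):10.3e}")
```

Running it, the two laws roughly agree at long wavelengths (around 10 μm), but by 0.1 μm the classical value is astronomically larger than Planck's, which is exactly the mismatch I can't picture.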