I'm wondering if it is possible to slow down beta decay in cobalt 60 by cooling it?
Nope. Radioactivity isn't a thermal effect, nor is it influenced by temperature.
I thought everything was slowed by cooler temperature. Theoretically, would beta decay still occur at Absolute Zero?
I'm now confused as to why my physics teacher said it was a plausible experiment.
But beta decay is governed by the WEAK INTERACTION. Heat/temperature is a matter of molecular vibrations mediated by the EM interaction. These are two completely different mechanisms.
If it were influenced by temperature, then the half-life of every nuclide would have to be tabulated together with the corresponding temperature. You don't see this in the CRC Handbook.
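To make the point concrete, here is a minimal sketch of the radioactive decay law for Co-60 (half-life about 5.27 years): the remaining fraction depends only on elapsed time and the half-life, and temperature appears nowhere in the formula.

```python
# Exponential decay law: N(t) = N0 * 2**(-t / T_half).
# Co-60 half-life is about 5.27 years; note there is no
# temperature parameter anywhere in this law.
CO60_HALF_LIFE_YEARS = 5.27

def remaining_fraction(t_years: float,
                       half_life: float = CO60_HALF_LIFE_YEARS) -> float:
    """Fraction of the original Co-60 nuclei left after t_years."""
    return 2.0 ** (-t_years / half_life)

print(remaining_fraction(5.27))   # one half-life -> 0.5
print(remaining_fraction(10.54))  # two half-lives -> 0.25
```

Cooling the sample changes none of the inputs to this function, which is just another way of stating the argument above.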
The literature on successful attempts to vary decay rates via physico-chemical influence is scarce. I think the greatest reported variation is about 10%, and I cannot remember what substance was involved.
So yes, it is a plausible experiment, much as double beta decay is: unlikely, but measurable, and interesting if you get a positive result.
Hmm, if quantum mechanics were a theory of hidden variables, then we could consider those variables as contributing to temperature. That would amount to a redefinition of the concepts of temperature (and absolute zero), wouldn't it?