So someone comes to me and says he doesn't believe that a supernova 50 ly away would be dangerous enough to pose a lethal threat to us. I contradicted him, because from my general reading, supernovas (depending on their energy output) at distances of anywhere from about 100 up to 3000 light years could expose us and our atmosphere to overwhelming levels of radiation. I then thought of going further and actually doing some calculations myself to prove my point, but I soon discovered the task is much harder than I'd like it to be.

I start with the assumption that the supernova is 50 ly away and radiates 1 foe (10^44 J) of energy. Spreading that over a sphere of radius 50 ly gives about 36 MJ per square metre at our distance (a quick sanity-check script is at the end of this post). However, here come the unknowns:

- Over what duration is this energy released? Is most of it shining through in 100 seconds, a month, or a whole year?
- What percentage of it is short-wavelength radiation? Unless we're getting kilowatts' worth of infrared, we're mainly concerned with the harmful stuff, right?
- How much can our atmosphere take? How many watts' worth of gamma and X-rays would we have to be exposed to in order to deplete the ozone layer and kill us?

I'm assuming there are people here who have dealt with a similar issue at some point and can perhaps at least speak from experience, if not with the maths.
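To make the fluence figure and the duration question concrete, here is the minimal back-of-envelope sketch I mentioned above (assuming isotropic emission, and using three made-up flat-light-curve durations purely for illustration; a real supernova light curve is of course nothing like flat):

```python
import math

FOE_J = 1e44        # 1 foe = 10^51 erg = 1e44 J, assumed total radiated energy
LY_M = 9.4607e15    # one light year in metres
d = 50 * LY_M       # assumed distance to the supernova

# Fluence: total energy spread over a sphere of radius d (isotropic emission)
fluence = FOE_J / (4 * math.pi * d**2)
print(f"fluence: {fluence:.2e} J/m^2")  # ~3.6e7 J/m^2, i.e. ~36 MJ/m^2

# Average flux if that same fluence arrived over different assumed durations.
# For scale, the solar constant at Earth is ~1361 W/m^2.
for label, seconds in [("100 s", 100.0), ("1 month", 2.63e6), ("1 year", 3.156e7)]:
    print(f"average flux over {label}: {fluence / seconds:.3g} W/m^2")
```

The spread is enormous: the same ~36 MJ/m^2 corresponds to roughly 3.6×10^5 W/m^2 if it arrives in 100 seconds (hundreds of times the solar constant), but only about 1 W/m^2 if smeared over a year. That is exactly why I can't get any further without knowing the time profile and the spectral breakdown.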