Determining a "Safe Distance" from a GRB

We are given the total energy of a gamma-ray burst, E, released over a time t, so the average power is

P = E/t.

The question then asks how far one would have to be from the GRB for the average power received from it to equal the average power received from the Sun's radiation at the Earth. We know that the solar constant (i.e. the power received from the Sun per unit area at the Earth's distance) is about 1360 W/m². This is where my problem arises: I don't know how to relate the distance from the GRB to its average power and the solar constant. Any help would be greatly appreciated. Thanks.
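
For concreteness, here is a numerical sketch of the quantities involved, with hypothetical values E = 1e44 J and t = 10 s (these numbers are assumptions, just to have something to compute). It uses the inverse-square spreading of the flux over a sphere of radius d, which I suspect is the relation I'm missing:

```python
import math

# Assumed example numbers for a GRB (hypothetical, not from the problem):
E = 1e44          # total energy of the burst, J (assumed)
t = 10.0          # burst duration, s (assumed)
S = 1360.0        # solar constant, W/m^2

P = E / t         # average power of the burst, W

# At distance d the burst's power is spread over a sphere of area 4*pi*d^2,
# so the flux there is P / (4*pi*d^2). Setting that flux equal to S and
# solving for d gives the "safe distance":
d = math.sqrt(P / (4 * math.pi * S))   # distance in metres

print(f"P = {P:.3e} W, distance d = {d:.3e} m "
      f"({d / 9.461e15:.0f} light-years)")
```

With these assumed numbers the distance comes out on the order of a few thousand light-years, which at least sounds like the right ballpark for this kind of problem.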