NanakiXIII
Say we have a decay of the form
A e^{-a x} + B e^{-b x}.
I haven't had much luck calculating the half-life of such a decay (I'm not sure it's even possible analytically), i.e. solving for x:
A e^{-a x} + B e^{-b x} = \frac{A+B}{2}.
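For what it's worth, the half-life is easy enough to find numerically even without a closed form. Here's a minimal sketch in Python; the values of A, B, a, b are made-up examples, not anything specific:

import numpy as np
from scipy.optimize import brentq

# Made-up example parameters; any A, B, a, b > 0 behave the same way.
A, B = 2.0, 1.0
a, b = 1.0, 0.3

def f(x):
    return A * np.exp(-a * x) + B * np.exp(-b * x)

# The half-life is the x at which f drops to half its initial value f(0) = A + B.
target = (A + B) / 2.0

# For positive parameters f is strictly decreasing, so the root is unique;
# bracket it on [0, 100] and solve.
t = brentq(lambda x: f(x) - target, 0.0, 100.0)
print("half-life:", t)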
However, even if an analytic solution isn't possible, I'm wondering whether there might still be a way to prove that if the above decay has half-life t, then the decay given by
A e^{-a c x} + B e^{-b c x}
has half-life \frac{t}{c}. This seems to hold empirically and would make sense, I think. Does anyone have an idea how one might prove it?
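One sketch of how this might go (assuming A, B, a, b, c > 0 so the half-life is well defined): the scaled decay is just the original evaluated at c x,

f(x) = A e^{-a x} + B e^{-b x}, \qquad g(x) = A e^{-a c x} + B e^{-b c x} = f(c x),

so if f(t) = \frac{A+B}{2}, then

g\left(\frac{t}{c}\right) = f\left(c \cdot \frac{t}{c}\right) = f(t) = \frac{A+B}{2},

i.e. g reaches half its initial value at x = \frac{t}{c}.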