The Mystery of Atomic Decay: Is it Truly Random or is There More at Play?

In summary: It depends on what you mean by "wait". The few leftover atoms might "wait" until the next time they have a chance to decay, or they might "wait" until they get a chance to decay in a different way (say, by being absorbed by another atom).
  • #1
artis
Moderator's note: This is spin-off from
https://www.physicsforums.com/threads/why-randomness-means-incomplete-understanding.975227
Reading this thread I would like to point out one aspect that makes me wonder personally and has also been mentioned here by other users.
The decay of atoms. Surely most here as well as elsewhere answer right away that this is random, and arguably isolating a single atom and then "waiting" for it to decay it, really seems random, as the decay times differ in each case. The problem I think gets more serious when we consider a multi particle system, let's take a radioactive substance X, we can have 1kg of that substance corresponding to N number of particles or 10kg having Nx10 number of particles or any other value of particles in any other system of the same substance. We know that the radioactive decay rate aka half-life will still be the same. Surely the individual atom decay is seemingly random yet somehow when there is a whole system of particles, even if separated by distance, as long as they are of the same type of material they have the same half life.

The next answer I often read is about the probabilistic nature of decay (as well as of other quantum processes).
I can understand probability and its randomness, but what seems very odd to me is the time accuracy of the decay properties of the system, which seemingly operates on a purely random, memoryless mechanism.
I don't claim the dice analogy is the best one, since that is a classical (macro) example rather than a quantum one, but surely U-235 has been accurate in this "random dice rolling" for billions of years, as have other isotopes like C-14.
Maybe I'm missing something here, and writing in the evening is less productive, so pardon any obvious mistakes, but isn't such "perfect randomness" the exact opposite of what is normally considered random?
By perfect randomness I mean not the decay of the individual atoms themselves but the decay of exactly the right number of atoms in each half-life, given that atoms have no memory or consciousness. So we could argue the same about any other system where the individual parts of the microscopic system and their properties seem random, yet when many such individual parts are looked at together they always produce specific results, which is the reason why we have any specific properties in our macro world at all, isn't it?

PS. Shouldn't a purely random process behind nuclear decay result in a shift in the accuracy of the half-life when spread over a very long period of time? Otherwise I feel that saying the process is random is just a way of avoiding saying that we don't have an explanation of why seemingly individual, undetermined atoms produce, on average, a very precise half-life.
 
  • #2
artis said:
By perfect randomness I mean not the decay of individual atoms themselves but the corresponding decay of exactly the right amount of atoms in each half life given atoms have no memory or consciousness.
That's a consequence of the law of large numbers. It is fully compatible with randomness. It is the reason why we observe regularity in the large but randomness in microscopic detail.
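To make the law of large numbers concrete, here is a minimal Python sketch (the decay probability and sample sizes are assumed purely for illustration): each "atom" independently decays with the same probability within a fixed time window, and the decayed fraction fluctuates strongly for a handful of atoms but settles onto the underlying probability as the sample grows.

```python
import random

p = 0.5  # assumed probability that an atom decays within one half-life

for n in (8, 100, 10_000, 1_000_000):
    # simulate n independent atoms; count how many decay in the window
    decayed = sum(random.random() < p for _ in range(n))
    print(f"N = {n:>9,}: fraction decayed = {decayed / n:.4f}")
```

For N = 8 the fraction can easily come out as 0.25 or 0.75, while for N = 1,000,000 it is pinned very close to 0.5: microscopic randomness, macroscopic regularity.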
 
  • #3
artis said:
shouldn't a purely random process behind nuclear decay result in a shift in the accuracy of half life when spread over a very long period of time?

It depends on what you mean by "random". That word has multiple possible meanings.

In the case of radioactive decay, "random" means that we cannot predict when any individual atom will decay, but the probability of decay per unit time is the same for all atoms of the same type. The half-life is a determinate function of the probability of decay per unit time; that's why it's the same for any sample of the same isotope (i.e., anything made of the same type of atoms). So the half-life is not "random"--it's determinate, because the probability of decay per unit time is. And the probability of decay per unit time is determinate, and the same for all atoms of the same type, because it is determined by the quantum state of the undecayed atom, which is the same for all atoms of the same type (that's what "atoms of the same type" means).

artis said:
isn't such "perfect randomness" the exact opposite of what is normally considered random?

No. It's the exact same as what is normally considered random. You dismiss the analogy to die rolling, but that analogy is exactly right if you want to understand what is "normally" considered random: a process where each individual run is unpredictable, but the probability of each possible result is the same for all runs, because the underlying physical system is the same.

artis said:
most of the atoms that have to decay in a single half life can decay in the beginning of the half life yet the few left over ones will "sit and wait" patiently as if they were told by someone of authority to do so

This won't work, because the probability of decay per unit time is constant for all times, not just for one exact half-life. If most of the atoms decayed at the beginning of the half-life, for example, then the probability of decay per unit time would be larger at the beginning of the half-life than at the end.
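A small numerical sketch of this point (the decay rate ##\lambda## is an assumed value): with a constant probability of decay per unit time, i.e. survival law ##P(t)=\exp(-\lambda t)##, the surviving fraction halves over the same interval ##t_{1/2}=\ln 2/\lambda## no matter when you start counting, so there is no sense in which atoms "hurry" early and "wait" late.

```python
import math

lam = 0.1                       # assumed decay rate per unit time
t_half = math.log(2) / lam      # half-life implied by that rate

def surviving_fraction(t):
    """Exponential survival law P(t) = exp(-lambda * t)."""
    return math.exp(-lam * t)

for t0 in (0.0, 3.7, 12.0):     # arbitrary starting times
    ratio = surviving_fraction(t0 + t_half) / surviving_fraction(t0)
    print(f"start at t = {t0:5.1f}: fraction left one half-life later = {ratio:.3f}")
# Every line prints 0.500: the process has no memory of when the clock started.
```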
 
  • #4
@A. Neumaier Ok, I agree that for any system with a large sample size the result is almost always close to the predicted value, but we know that the half-life follows the same time frame for the same element irrespective of the sample size (the number of atoms in the sample), does it not?
Surely in nature all our radioactive elements come in large sample sizes, but in theory we could also have a small sample of an element, couldn't we?

@PeterDonis I agree that the half-life is determinate; it wasn't my intention to say that it's random. What I was arguing is that it is "weird", at least for me, to say that the fundamentally random process of nuclear decay results in a time-accurate, deterministic half-life.
Ok, let's take the dice analogy. This die should probably have only two faces, and each time it is rolled it would have to start from position 1, resembling the undecayed atom, while position 2 would resemble the decayed atom. Close enough? So I suppose you would argue that if I roll my die once every 2 seconds for, say, 24 hours straight, I could build up a large enough sample from which some average result could be extracted, resembling a probabilistic scenario similar to the quantum decay of atoms in a system that has the same number of atoms as my count of die rolls.

How can we, using this approach, explain away the case where most of the atoms decay, say, at the end of the half-life? In my die analogy I would then have to roll the die for most of the 24 hours with it mostly landing on the very position it was "prepared" in, while in the last hour, say, it would constantly land on the second position.
Is there a flaw in my reasoning? Because this seems to be the case if we use this analogy.
 
  • #5
artis said:
It is "weird" at least for me to say that the fundamentally random process of nuclear decay results in a time accurate deterministic half life

Yes, and that's because you're not actually considering what "random" means in this connection. You're thinking it means "everything should be random", when all it actually means is "the timing of individual decays can't be predicted in advance".

artis said:
this dice should probably have only two options on the dice and each time it is rolled it would have to start from position 1 resembling the undecayed atom while position 2 would resemble the decayed atom close enough?

No. The die doesn't start in either position. It starts in a "ready" state in which no position is selected. The act of rolling the die produces a result, either "decay" or "don't decay". For each unit of time, you roll one die for each undecayed atom; if the roll comes up "decay", that atom decays in that unit of time. Each die has a fixed probability of coming up "decay".
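This per-time-step picture is easy to simulate. The rough sketch below (the step probability and sample size are assumptions, not values from the thread) rolls one "die" per surviving atom per step; the surviving fraction tracks the deterministic curve ##(1-p)^{\text{step}}## ever more closely as the sample grows.

```python
import random

p_step = 0.05        # assumed probability that an atom decays in one time step
n_atoms = 100_000    # assumed initial number of atoms

undecayed = n_atoms
for step in range(1, 31):
    # roll one "die" per surviving atom; each comes up "decay" with probability p_step
    undecayed -= sum(random.random() < p_step for _ in range(undecayed))
    if step % 5 == 0:
        print(f"step {step:2d}: surviving fraction = {undecayed / n_atoms:.4f} "
              f"(expected {(1 - p_step) ** step:.4f})")
```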

artis said:
how can we using this approach then explain away the case where most of the atoms decay in say the end of the half life

There is no such case. That never happens. Go read the last paragraph of my post #100 again.
 
  • #6
Well, pardon my arguments, I'm not very good at them. I agree (after all, it's established physics) that for a large sample size the decay rate is constant, and it doesn't matter that we cannot predetermine which exact atoms will decay and when (even if we attached a theoretical counter to each one and gave it a number), since we can be sure that enough will eventually decay for the total to add up to a specific number determining the length of the half-life. And this observation agrees with probability theory.

I guess I have to ask whether this probability applies to a sample of, say, a few atoms. Say we had only 8 atoms; given the intrinsic unpredictability of any individual atom decaying at any specific time, could we still measure a distinct half-life in this case? I guess what I'm asking is: does the precision of the half-life spread out as the sample size decreases?
 
  • #7
It's all statistics. Let's take the simple approximation of the exponential decay law.

The probability for a single nucleus to have NOT decayed at time ##t##, given that it was there at time ##t=0## is
$$P(t)=\exp(-\lambda t),$$
and the probability distribution for the survival time is
$$P_1(t)=-\dot{P}(t)=\lambda \exp(-\lambda t).$$
The meaning of ##\lambda## becomes clear when calculating the mean lifetime:
$$\tau=\langle t \rangle = \int_0^{\infty} \mathrm{d} t \, t P_1(t) = \lambda \int_0^{\infty} \mathrm{d} t \, t \exp(-\lambda t) = -\lambda \frac{\mathrm{d}}{\mathrm{d} \lambda} \int_0^{\infty} \mathrm{d} t \exp(-\lambda t) = -\lambda \frac{\mathrm{d}}{\mathrm{d} \lambda} \frac{1}{\lambda}=\frac{1}{\lambda}.$$
So the mean lifetime is the inverse of the decay rate, i.e., ##\lambda## is the decay rate.

Now consider ##N## atoms. The decays are completely independent, i.e., there is no correlation between the decay of one atom and the decay of any other. That's why the probability that, out of the ##N## atoms present at time ##t=0##, ##S## survive (and consequently ##N-S## decay) is given by the binomial distribution,
$$P_N(S,t)=\binom{N}{S} P^S(t) [1-P(t)]^{N-S}.$$
Now you can calculate the mean number of particles surviving at time ##t##,
$$\langle S \rangle = \sum_{S=0}^{N} S P_N(S,t).$$
To evaluate the sum define
$$f(x,y)=\sum_{S=0}^N \binom{N}{S} x^S y^{N-S}=(x+y)^N.$$
Then
$$\partial_x f(x,y)=\sum_{S=0}^{N} \binom{N}{S} S x^{S-1} y^{N-S}=N(x+y)^{N-1}.$$
Thus
$$\langle S \rangle=P(t)\, \partial_x f(x,y)\big|_{x=P(t),\,y=1-P(t)}=N P(t).$$
Now get the standard deviation ##\Delta S##
$$\Delta S^2=\langle S^2 \rangle-\langle S \rangle^2.$$
To that end we use again ##f(x,y)##:
$$\langle S^2 \rangle=x \partial_x [x \partial_x f(x,y)]\big|_{x=P(t),\,y=1-P(t)}=N P(t)+N(N-1)P^2(t).$$
This gives
$$\Delta S^2 =N P(t) [1-P(t)].$$
So ##\Delta S/\langle S \rangle = \mathcal{O}(1/\sqrt{N})##. For large ##N## the standard deviation becomes negligible compared to the mean.
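As a quick cross-check of these formulas, here is a small Monte Carlo sketch (with arbitrary assumed values of ##N##, ##\lambda## and ##t##): it samples the number of survivors ##S## many times and compares the empirical mean and spread with ##\langle S \rangle = N P(t)## and ##\Delta S=\sqrt{N P(t)[1-P(t)]}##.

```python
import math
import random

N, lam, t = 10_000, 0.1, 5.0
P = math.exp(-lam * t)                       # survival probability P(t)

# draw S (number of survivors) 200 times by simulating N independent atoms
samples = [sum(random.random() < P for _ in range(N)) for _ in range(200)]
mean_S = sum(samples) / len(samples)
var_S = sum((s - mean_S) ** 2 for s in samples) / len(samples)

print(f"<S>      theory {N * P:9.1f}   simulated {mean_S:9.1f}")
print(f"Delta S  theory {math.sqrt(N * P * (1 - P)):9.1f}   simulated {math.sqrt(var_S):9.1f}")
print(f"relative spread ~ 1/sqrt(N) = {1 / math.sqrt(N):.3f}")
```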
 
  • #8
artis said:
I guess I have to ask whether this probability applies to a sample size of say few atoms, say we had only 8 atoms, given the intrinsic unpredictability of any given individual atom decaying at any specific time could we still measure a distinct half life in this case? I guess what I'm asking is does the precision of half life spread out with decrease in sample size.

You have to understand probability - and maybe nobody does! There can be no firm guarantees connecting the observed frequency with which an event happens and the probability of that event happening. An absolute guarantee that an event with a probability of, say, 0.03 will happen 3% of the time in a large number of independent trials would be a contradiction of the concept that the event is probabilistic. It is common practice in applications of probability theory to assume that the probability of an event will definitely be revealed by the actual frequencies with which this event occurs. However, such an assumption is not part of mathematical probability theory. Assuming such a connection is a personal judgement. In a manner of speaking, people assume they will have at least "average luck".

Probability theory is essentially circular. If we are given (or assume) certain probabilities, then we can deduce other probabilities. If we are given that the probability of an event is 0.03, we can determine the probability that it will happen in 5 out of 7 independent trials, or in 32 out of 100 independent trials. But we cannot make any non-probabilistic claims about how often the event will happen. A probability model for a phenomenon tells you about probabilities. All it can tell you about the actual frequencies that will be observed is the probability of observing those frequencies.
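For the kind of calculation mentioned above, a minimal sketch (the numbers simply mirror the examples in the text): the probability of exactly ##k## occurrences in ##n## independent trials with per-trial probability ##p## is the binomial term ##\binom{n}{k} p^k (1-p)^{n-k}##.

```python
from math import comb

def binomial_pmf(k, n, p):
    """Probability of exactly k successes in n independent trials."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

p = 0.03
print(f"P(5 out of 7 trials)    = {binomial_pmf(5, 7, p):.2e}")
print(f"P(32 out of 100 trials) = {binomial_pmf(32, 100, p):.2e}")
```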
 
  • #9
artis said:
does the precision of half life spread out with decrease in sample size

The precision with which the half-life can be measured decreases as the sample size decreases. As @vanhees71 explained, that's just basic statistics; the relative standard deviation of a sample is larger when the number of items in the sample is smaller.
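A numerical illustration of this (the sample sizes are assumed): after exactly one half-life the surviving fraction has mean 1/2 and relative spread ##\sqrt{(1-P)/(NP)}##, which is sizable for 8 atoms and utterly negligible for a macroscopic sample.

```python
import math

P = 0.5  # survival probability after exactly one half-life
for N in (8, 1_000, 6e23):
    rel_spread = math.sqrt((1 - P) / (N * P))
    print(f"N = {N:>8.0e}: relative spread of the surviving count = {rel_spread:.1e}")
```

For N = 8 the spread is about 35%, so a "half-life" measured from such a tiny sample would scatter widely from run to run.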
 
  • #10
artis said:
I am somewhat relating to @georgir post #67 on this issue as well as @julcab12 post #64,
not just the time precision of the total number of decays but also the "Schrodinger factor" is what makes me doubt such absolutely fundamental randomness because from the Schrodinger cat we know that this decay of most of the atoms is also not linear with respect to time but can happen randomly(not taking into account the disturbance potentially caused by an observer interference) which means most of the atoms that have to decay in a single half life can decay in the beginning of the half life yet the few left over ones will "sit and wait" patiently as if they were told by someone of authority to do so. This fact seems incompatible with the general understanding of probability because at least to me the randomness of probability in general over many such atoms in a system seems at odds with the randomness of spontaneous decay of a single atom or in fact of many single atoms at any given time within this system.
This is completely wrong. Have you looked into the Poisson distribution? Independent Poisson processes model the radioactive decay phenomenon accurately. There is no magic required.
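To illustrate the Poisson point, a sketch with assumed numbers: when ##N## is large and the per-interval decay probability ##p## is small, the number of decays per interval is well described by a Poisson distribution with mean ##\mu = N p##.

```python
import math
import random

N, p = 5_000, 0.0004            # assumed atoms and per-interval decay probability
mu = N * p                      # expected number of decays per interval

# simulate the decay count in 1000 independent intervals
counts = [sum(random.random() < p for _ in range(N)) for _ in range(1_000)]
for k in range(6):
    simulated = counts.count(k) / len(counts)
    poisson = math.exp(-mu) * mu**k / math.factorial(k)
    print(f"k = {k}: simulated {simulated:.3f}   Poisson {poisson:.3f}")
```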
 
  • #11
@PeterDonis your language-based confirmation of my suspicion was great, because @vanhees71's math was a bit above my level.
 
  • #12
artis said:
@PeterDonis your language-based confirmation of my suspicion was great, because @vanhees71's math was a bit above my level.
One takeaway from the math is that we have reasonable measurements of N (the number of atoms at time zero) and then of S (the atoms that survive, i.e. do not decay, after some time interval). So N minus S provides a figure to estimate the decay rate of the sample over that time interval. Also, overall probabilities sum to one: if you know the probability of some event NOT happening (NOT-P), then 1 minus NOT-P gives you P, the probability that the event does occur.

The Poisson distribution hint gives you a model that is easier to understand without knowing calculus. You can graph the results and compare them to a standard "bell curve", or use algebra and polynomials instead.
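A minimal sketch of that bookkeeping (the numbers are made up for illustration): compute the survival probability NOT-P from the exponential law, take P = 1 - NOT-P, and use N - S as the expected number of decays over the interval.

```python
import math

N, lam, t = 1_000, 0.05, 10.0
not_p = math.exp(-lam * t)       # probability a single atom has NOT decayed by time t
p = 1 - not_p                    # probability it HAS decayed by time t
S = round(N * not_p)             # expected number of survivors
print(f"P(decay by t) = {p:.3f}, expected decays N - S = {N - S}")
```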
 
  • #13
That's the point of my derivation of the probabilities. You of course have to do an error analysis for all your measurements. There is usually a systematic and a statistical error.

To measure radioactive decay, the simplest way is just to weigh your sample. Say we have 1 mole of the substance; that is ##N_A \simeq 6 \cdot 10^{23}## nuclei. The relative statistical error is
$$\frac{\Delta S}{S} = \frac{1}{\sqrt{N}} \sqrt{1-\exp(-\lambda t)}\exp(+\lambda t/2).$$
If you don't wait too long (say a few lifetimes) you have a relative statistical error of the order ##\mathcal{O}(10^{-10})##.
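Plugging numbers into this formula (the waiting times, in units of the mean lifetime ##1/\lambda##, are assumed) shows just how small the relative statistical fluctuation is for a sample of one mole.

```python
import math

N = 6e23                                    # roughly one mole of nuclei
for lam_t in (0.1, 1.0, 3.0):               # assumed values of lambda * t
    rel_err = math.sqrt(1 - math.exp(-lam_t)) * math.exp(lam_t / 2) / math.sqrt(N)
    print(f"lambda*t = {lam_t:3.1f}: Delta S / S = {rel_err:.1e}")
```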
 
  • #14
vanhees71 said:
That's the point of my derivation of the probabilities. You of course have to do an error analysis for all your measurements. There is usually a systematic and a statistical error.

To measure radioactive decay, the simplest way is just to weigh your sample. Say we have 1 mole of the substance; that is ##N_A \simeq 6 \cdot 10^{23}## nuclei. The relative statistical error is
$$\frac{\Delta S}{S} = \frac{1}{\sqrt{N}} \sqrt{1-\exp(-\lambda t)}\exp(+\lambda t/2).$$
If you don't wait too long (say a few lifetimes) you have a relative statistical error of the order ##\mathcal{O}(10^{-10})##.

May I ask what delta you are weighing? The mass deficit of nuclear binding energy?

George Dowell
 

1. What is atomic decay?

Atomic decay, also known as nuclear decay, is the process by which an unstable atomic nucleus loses energy by emitting radiation. This can result in the formation of a different element or a more stable form of the same element.

2. What causes atomic decay?

Atomic decay is caused by the instability of an atom's nucleus. This instability can be due to an excess of protons or neutrons, or an imbalance between the two. As a result, the nucleus will release energy in the form of radiation to reach a more stable state.

3. How is atomic decay measured?

The rate of atomic decay is measured using the half-life, which is the amount of time it takes for half of the radioactive material to decay. The half-life varies greatly from one isotope to another, but it is a constant property of each isotope.
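As a worked example of the half-life relation N(t) = N0 × (1/2)^(t / t_half), using the approximate carbon-14 half-life of 5730 years purely for illustration (the initial atom count is an assumed value):

```python
t_half = 5730.0          # years, approximate half-life of carbon-14
N0 = 1_000_000           # assumed initial number of atoms

for t in (0, 5730, 11460, 17190):
    remaining = N0 * 0.5 ** (t / t_half)
    print(f"after {t:6d} years: about {remaining:,.0f} atoms remain")
```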

4. Is atomic decay random?

Yes, atomic decay is a random process. While scientists can predict the probability of an atom decaying within a certain time frame, they cannot predict which specific atom will decay at a given time. This randomness is a fundamental aspect of quantum mechanics.

5. How is atomic decay used in everyday life?

Atomic decay has many practical applications, such as in nuclear power plants, medical imaging, and carbon dating. It is also used in smoke detectors, where the decay of a radioactive material creates an electrical current that can detect smoke particles in the air.
