Why didn't radioactive decay probabilities cause the same uproar as QM

BruteForce1
It is equally puzzling why we are confined to probabilities for radioactive decay (RD) as in QM measurements. Newtonian determinism is undermined in both, so why were there still Newtonian determinists around when QM hit the scene?

We still have deterministic equations for both, of course, but they are limited to probability values. That is all we can know.
 
My thought on this is that radiation emission could only be measured in a qualitative way at the turn of the century. Radiation was detected by measuring ionization in a gas, which required many, many particles per second to produce a measurable current whose fluctuations were imperceptible. It is not clear from the early methods used whether radiation was emitted randomly. Rutherford observed alpha-particle emanations on a ZnS screen, but even though it was assumed to be a particle that caused the scintillations in the screen, one might not necessarily deduce that this apparently random flickering wasn't due to a stochastic process such as scattering in the material itself. That is, you could have a constant rate of decay with scattering in unpredictable directions. At this time Einstein published his Brownian-motion paper, demonstrating random collisions in a fluid explained classically.
 
gleem said:
My thought on this is that radiation emission could only be measured in a qualitative way at the turn of the century. Radiation was detected by measuring ionization in a gas, which required many, many particles per second to produce a measurable current whose fluctuations were imperceptible. It is not clear from the early methods used whether radiation was emitted randomly. Rutherford observed alpha-particle emanations on a ZnS screen, but even though it was assumed to be a particle that caused the scintillations in the screen, one might not necessarily deduce that this apparently random flickering wasn't due to a stochastic process such as scattering in the material itself. That is, you could have a constant rate of decay with scattering in unpredictable directions. At this time Einstein published his Brownian-motion paper, demonstrating random collisions in a fluid explained classically.

Would it be fair to say that both QM measurement and radioactive decay exhibit information loss, or would this be a misuse of the term? You have information that gets embedded into probability amplitudes. I don't believe it literally does that, but to our eyes that is what is being observed.
 
Sorry, I am not well versed in issues regarding information so I will not comment.
 
There was nothing like information theory in 1900.
 
BruteForce1 said:
It is equally puzzling why we are confined to probabilities for radioactive decay (RD) as in QM measurements. Newtonian determinism is undermined in both, so why were there still Newtonian determinists around when QM hit the scene?

We still have deterministic equations for both, of course, but they are limited to probability values. That is all we can know.

The problem with QM is not that it is random. The problem is that it requires something that is traditionally called an observer or a measurement device that is "macroscopic" or "classical". It would seem that we should be able to describe the observer using quantum mechanics, but if we do so, then we need another observer to observe the observer. This is traditionally called the measurement problem.

It is discussed briefly by Dirac.
https://blogs.scientificamerican.com/guest-blog/the-evolution-of-the-physicists-picture-of-nature/
"And when this new development occurs, people will find it all rather futile to have had so much of a discussion on the role of observation in the theory, because they will have then a much better point of view from which to look at things."

And in a long article by Bell.
https://m.tau.ac.il/~quantum/Vaidman/IQM/BellAM.pdf
Against ‘measurement’
 
atyy said:
The problem with QM is not that it is random. The problem is that it requires something that is traditionally called an observer or a measurement device that is "macroscopic" or "classical". It would seem that we should be able to describe the observer using quantum mechanics, but if we do so, then we need another observer to observe the observer. This is traditionally called the measurement problem.

It is discussed briefly by Dirac.
https://blogs.scientificamerican.com/guest-blog/the-evolution-of-the-physicists-picture-of-nature/
"And when this new development occurs, people will find it all rather futile to have had so much of a discussion on the role of observation in the theory, because they will have then a much better point of view from which to look at things."

And in a long article by Bell.
https://m.tau.ac.il/~quantum/Vaidman/IQM/BellAM.pdf
Against ‘measurement’

Radioactive decay is not random either. It also has a differential equation behind it.
 
BruteForce1 said:
Radioactive decay is not random either. It also has a differential equation behind it.

The thing that is random is the lifetime of any particular radioactive atom or molecule. The decay follows a "distribution", which is an overall pattern, but it doesn't tell you at which particular time a particular atom will decay. It gives you only a probability for each individual atom.

I don't know the answer, and I'm sure you could find out with your own research what theories, if any, there were to explain the (apparent) randomness of radioactive decay. My guess is that there was an assumption that once you knew enough about a particular atom, you could determine when it was going to decay. In the same way, you could study a macroscopic process that followed a statistical law: by looking closely enough you could explain the apparent randomness, perhaps by a distribution of initial conditions or the randomness of external influences.

The difference with QM is that it postulated a fundamental randomness (i.e. only probabilities) at the lowest level: probabilities that, according to the theory, could never be explained in terms of a more fundamental deterministic model. That said, the Bohmian interpretation of QM tries to do just that.

The short answer to your question is perhaps simply that everyone thought the probabilities of radioactive decay could eventually be explained by a deterministic theory. It would be interesting to know whether anyone took the randomness at face value and postulated that atoms might decay according to fundamentally probabilistic laws.
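The distinction between a random individual lifetime and a deterministic overall distribution can be illustrated with a short simulation (a sketch only; the decay constant and numbers here are arbitrary, not tied to any real isotope):

```python
import math
import random

random.seed(0)

lam = 0.01     # arbitrary decay constant (decays per unit time)
n0 = 100_000   # initial number of atoms

# Each atom's lifetime is an independent exponential random variable:
# individually unpredictable.
lifetimes = [random.expovariate(lam) for _ in range(n0)]

# Fraction still undecayed at time t in the simulation...
t = 100.0
surviving = sum(1 for tau in lifetimes if tau > t) / n0

# ...versus the deterministic ensemble prediction N(t)/N0 = exp(-lambda*t).
predicted = math.exp(-lam * t)

print(surviving, predicted)
```

The two numbers agree closely, yet nothing in the simulation (or in the theory) says when any particular atom will decay.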
 
PeroK said:
The thing that is random is the lifetime of any particular radioactive atom or molecule. The decay follows a "distribution", which is an overall pattern, but it doesn't tell you at which particular time a particular atom will decay. It gives you only a probability for each individual atom.

I don't know the answer, and I'm sure you could find out with your own research what theories, if any, there were to explain the (apparent) randomness of radioactive decay. My guess is that there was an assumption that once you knew enough about a particular atom, you could determine when it was going to decay. In the same way, you could study a macroscopic process that followed a statistical law: by looking closely enough you could explain the apparent randomness, perhaps by a distribution of initial conditions or the randomness of external influences.

The difference with QM is that it postulated a fundamental randomness (i.e. only probabilities) at the lowest level: probabilities that, according to the theory, could never be explained in terms of a more fundamental deterministic model. That said, the Bohmian interpretation of QM tries to do just that.

The short answer to your question is perhaps simply that everyone thought the probabilities of radioactive decay could eventually be explained by a deterministic theory. It would be interesting to know whether anyone took the randomness at face value and postulated that atoms might decay according to fundamentally probabilistic laws.

I don't view this as a problem of determinism, since we have the equations, which are unambiguous and deterministic. A random theory of both would be that they are sometimes trajectory-specific, sometimes probabilistic, and sometimes none of the above.

QM measurements are always "non-classical" (probabilistic), as is radioactive decay. I view the problem more as concerning the limits of our knowledge and the universe itself. It appears that there is no way for us to gain full access in these circumstances, not even in theory. If that is true, then that's a remarkable aspect of the universe. The universe tends to be very open to our inquiries at macroscopic levels. That was the whole beauty of the universe and of science: that it was comprehensible.

Why do we have a very good model for only half a theory (probabilities), one just as reliable as macroscopic determinism but confined to deterministic probabilities? It was initially believed that determinism and probabilities were oxymorons, but scientists have learned to live with both existing simultaneously; it is only the "flight-path" Newtonian determinism that breaks down at smaller scales.
 
  • #10
Why come to ask a question if you already have an answer?

You get an answer and then you say "no, that's not it it's ...".
 
  • #11
PeroK said:
Why come to ask a question if you already have an answer?

You get an answer and then you say "no, that's not it it's ...".

Because we are not discussing the thread's question. My question was about the reception among scientists back then. I simply gave an account of why I don't view it as problematic to speak of determinism and probabilities in the same breath. The math works. In the physics of that era it had to be more controversial, since it was a long-held view that flight-path determinism rules. If our current understanding is correct, then the universe at smaller scales is a clockwork not open to full scrutiny.
 
  • #12
BruteForce1 said:
Radioactive decay is not random either. It also has a differential equation behind it.

Radioactivity is random (the differential equation describes the probability of decay), and quantum mechanics is also random. Quantum mechanics describes radioactive decay. However, the necessity for an observer in quantum mechanics was not apparent around 1900, as quantum mechanical theory was not yet developed. The necessity for an observer in quantum mechanics was only understood later from around 1926, after the quantum formalism was more developed.
 
  • #13
atyy said:
Radioactivity is random (the differential equation describes the probability of decay), and quantum mechanics is also random.

Neither one is random. When I use the word random I go by Merriam-Webster's definition: 1a: lacking a definite plan, purpose, or pattern.
 
  • #14
The decay of an individual atom doesn't have a plan, purpose, or pattern, but I don't think the definition from M-W is useful here.
Quantum mechanics is probabilistic in some interpretations and deterministic (but still appearing random to observers) in others. It doesn't make sense to say it has to be one or the other as interpretations are a matter of taste (and shouldn't be discussed here, we have a separate forum for them).
 
  • #15
The theoretical work on radioactive decay started in 1840 and was apparently finished by 1940. So before 1940 there was no partial differential equation?
 
  • #16
mfb said:
The decay of an individual atom doesn't have a plan, purpose, or pattern,

A probability distribution is a pattern.
 
  • #17
An inanimate object could never have a "plan". Lack of "plan" refers to living, thinking, objects exhibiting random behavior.
 
  • #18
Is your argument really "the existence of a probability distribution means it is not random"? That is an extreme outsider view.
BruteForce1 said:
The theoretical work on radioactive decay started in 1840 and was apparently finished by 1940. So before 1940 there was no partial differential equation?
People didn't stop working on it in 1940. And differential equations are older than 1940.
 
  • #19
BruteForce1 said:
Neither one is random. When I use the word random I go by Merriam-Webster's definition: 1a: lacking a definite plan, purpose, or pattern.
The mathematical/scientific description that applies here is: specific outcomes/values not being predictable, but following, or being described by, a probability distribution. Just like being given a fair coin: the outcome cannot be predicted but is expected to follow a distribution over the long run.
 
  • #20
WWGD said:
The mathematical/scientific description that applies here is: specific outcomes/values not being predictable, but following, or being described by, a probability distribution. Just like being given a fair coin: the outcome cannot be predicted but is expected to follow a distribution over the long run.

A probability distribution is a prediction based on a pattern of behavior. This is elementary.
 
  • #21
mfb said:
People didn't stop working on it in 1940. And differential equations are older than 1940.

I'm talking about the theoretical framework. I didn't say differential equations started in 1940. What I have found so far researching it suggests that the differential equations of radioactive decay began in 1940, unless 1940 refers to something else.
 
  • #22
BruteForce1 said:
Probability distribution is a prediction based on a pattern of behavior. This is elementary.
Then why are you using the everyday definition?
 
  • #23
WWGD said:
Then why are you using the everyday definition?

The everyday definition states that randomness is patternless behavior.

The following is a pattern in my book:

"given a sample of a particular radioisotope, the number of decay events −dN expected to occur in a small interval of time dt is proportional to the number of atoms present N"
 
  • #24
BruteForce1 said:
The following is a pattern in my book:
"given a sample of a particular radioisotope, the number of decay events −dN expected to occur in a small interval of time dt is proportional to the number of atoms present N"
However, that statement is equivalent to stating that a single atom's decay is patternless and completely random - the probability of it decaying at any given moment is the same for all moments.

It appears to me that you have adopted a definition of "pattern" that is inconsistent, in that there is something you consider a pattern in patternless behavior. Such inconsistencies are generally a hint that the terms need to be defined more carefully.
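The "same probability at any given moment" statement above is exactly the memorylessness of the exponential distribution, and it can be checked directly (a sketch; the decay constant here is arbitrary):

```python
import math

lam = 0.2  # arbitrary decay constant

def survival(t):
    """P(lifetime > t) for an exponential lifetime with rate lam."""
    return math.exp(-lam * t)

s, t = 3.0, 5.0

# Conditional survival: P(T > s + t | T > s) = P(T > s + t) / P(T > s).
conditional = survival(s + t) / survival(s)

# Memorylessness: having already survived for time s changes nothing --
# the conditional survival equals the unconditional P(T > t).
print(conditional, survival(t))
```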
 
  • #25
I'd say, there's no difference between the observed randomness of quantum phenomena in general and radioactive decay probabilities. After all the latter are just an application of the former.

I think the apparent problems people have with irreducible randomness are a cultural phenomenon, i.e., it's hard to accept that nature does not behave deterministically at a fundamental level; but nature doesn't care for the philosophical prejudices of humans and just behaves as she does ;-).
 
  • #26
BruteForce1 said:
Neither one is random. When I use the word random I go by Merriam-Webster's definition: 1a: lacking a definite plan, purpose, or pattern.

True randomness means that an event occurring in space and time can on principle never be undone.
 
  • #27
Lord Jestocost said:
True randomness means that an event occurring in space and time can on principle never be undone.
Is this formulation your own or can you ascribe it? ... I rather like it.
 
  • #28
Lord Jestocost said:
True randomness means that an event occurring in space and time can on principle never be undone.

Explain
 
  • #29
BruteForce1 said:
Neither one is random. When I use the word random I go by Merriam-Webster's definition: 1a: lacking a definite plan, purpose, or pattern.
If you wish to play semantic games, I will point out that the definition you misquote several times after this is "lacking a definite pattern".
 
  • #30
hutchphd said:
Is this formulation your own or can you ascribe it? ... I rather like it.

My formulation is based on the following sentence in Sir Arthur Stanley Eddington's book "The Nature of the Physical World" (which I highly recommend): "This follows at once if our fundamental contention is admitted that the introduction of randomness is the only thing which cannot be undone."
 
  • #31
Could one attribute a single radioactive decay to complex external conditions such as climate, similar to how genetic mutation occurs? Or is there no theory behind it at all?
 
  • #32
vanhees71 said:
I'd say, there's no difference between the observed randomness of quantum phenomena in general and radioactive decay probabilities.

One difference is that QM is deterministic when you're not measuring.
 
  • #33
Lord Jestocost said:
My formulation is based on the following sentence in Sir Arthur Stanley Eddington's book "The Nature of the Physical World" (which I highly recommend): "This follows at once if our fundamental contention is admitted that the introduction of randomness is the only thing which cannot be undone."

So he views randomness as noise. I don't see how deterministic systems could be undone either, however.
 
  • #34
BruteForce1 said:
Explain

It means: you cannot trace back along a causal chain in space and time why a truly random individual event has occurred at a certain space-time coordinate.
 
  • #35
My definition of randomness is: a causal process that cannot be attributed to any external or internal antecedent factor. In other words, a causeless causal process.

I don't know if radioactive decay and QM really fit that.
 
  • #36
Lord Jestocost said:
It means: you cannot trace back along a causal chain in space and time why a truly random individual event has occurred at a certain space-time coordinate.

And that's not really true for RD or QM. Not being able to account for why something happens at a particular point in time is different from not being able to account for why it happens at all.
 
  • #37
Nugatory said:
- the probability of it decaying at any given moment is the same for all moments.

That's a pattern. If something always has a 50% probability of failure, then that is a pattern.
 
  • #38
A random physical process, to me, is: the absence of everything, followed by a quantum-fluctuation state with all the parameters involved in generating a universe, followed by a generated universe.

The first link in that chain is completely unaccounted for.
 
  • #39
An ever-existing quantum-fluctuation state generating universes in the same fashion as radioactive nuclei decaying is not random to me, however. We have a source, at least.

The devil is in the details.
 
  • #40
BruteForce1 said:
An ever-existing quantum-fluctuation state generating universes in the same fashion as radioactive nuclei decaying is not random to me, however. We have a source, at least.

The devil is in the details.
Either I'm not seeing some posts or you're debating with yourself here.
 
  • #41
PeroK said:
Either I'm not seeing some posts or you're debating with yourself here.

I'm fleshing out my line of thinking. If we have a source, how can it be random? It is a perfect example of the devil being in the details.

I also gave an example of what would be a sourceless phenomenon.
 
  • #42
Lord Jestocost said:
It means: you cannot trace back along a causal chain in space and time why a truly random individual event has occurred at a certain space-time coordinate.
This is the case for radioactive decay. At least today there's no known way to know when a nucleus will decay, or why a nucleus decayed at precisely that point in time and at that place. It just happens randomly. All we have is a very precise theory predicting the probability for its decay: the Standard Model.
 
  • #43
vanhees71 said:
This is the case for radioactive decay. At least today there's no known way to know when a nucleus will decay, or why a nucleus decayed at precisely that point in time and at that place. It just happens randomly. All we have is a very precise theory predicting the probability for its decay: the Standard Model.

You know it has something to do with the radioactive nucleus, though. That is the most reasonable inference. If your technology breaks down but you can't trace why it broke down, or why it broke down at that point, that doesn't mean it was a random event. Or do you think it was?
 
  • #44
Of course, there's no reasonable doubt that radioactive decay is the decay of a radioactive nucleus. If I have a radium nucleus, I know it will at some time randomly emit an ##\alpha## particle (a He nucleus), with a half-life of about 1600 years. I.e., investigating a large number of Ra nuclei, after 1600 years I will have only about half of them left. We cannot predict when a specific Ra nucleus will decay.
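The arithmetic behind that half-life statement is a one-liner (a sketch using the roughly 1600-year half-life quoted above for radium):

```python
import math

half_life = 1600.0             # years, approximate Ra half-life as quoted
lam = math.log(2) / half_life  # decay constant per year

def fraction_remaining(t_years):
    """Expected fraction of the original nuclei left after t_years."""
    return math.exp(-lam * t_years)

print(fraction_remaining(1600.0))  # about 0.5: one half-life
print(fraction_remaining(3200.0))  # about 0.25: two half-lives
```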
 
  • #45
vanhees71 said:
Of course, there's no reasonable doubt that radioactive decay is the decay of a radioactive nucleus. If I have a radium nucleus, I know it will at some time randomly emit an ##\alpha## particle (a He nucleus), with a half-life of about 1600 years. I.e., investigating a large number of Ra nuclei, after 1600 years I will have only about half of them left. We cannot predict when a specific Ra nucleus will decay.

And external factors like climate have no bearing on it? I'm trying to think of it like failure rates in technology. We know why some batteries fail earlier than others (storage, heat, etc.).

Same with genetic mutation.
 
  • #46
Have I understood the mathematical principle correctly here... If you have a lot of radioactive nuclei and you know any one of them can go off at any time, the probability of a decay is higher since you have more of them?

One of them will go off at T-2, another at T-10, another at T-30, etc., and the more you have, the more T-decay scenarios you have made possible? There is nothing more to it than this, right?
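That intuition can be made precise: for ##N## independent nuclei each decaying at rate ##\lambda##, the waiting time until the first decay anywhere in the sample is itself exponential with rate ##N\lambda##, so the expected wait shrinks like ##1/N##. A quick simulation (arbitrary, hypothetical numbers):

```python
import random
import statistics

random.seed(1)

lam = 0.001    # arbitrary per-nucleus decay rate
n = 50         # number of nuclei in the sample
trials = 20_000

# In each trial, the first decay occurs at the minimum of the n lifetimes.
first_decays = [
    min(random.expovariate(lam) for _ in range(n))
    for _ in range(trials)
]

mean_wait = statistics.mean(first_decays)
print(mean_wait, 1 / (n * lam))  # simulated mean vs theoretical 1/(N*lambda)
```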
 
  • #47
vanhees71 said:
Of course, there's no reasonable doubt that radioactive decay is the decay of a radioactive nucleus. If I have a radium nucleus, I know it will at some time randomly emit an ##\alpha## particle (a He nucleus), with a half-life of about 1600 years. I.e., investigating a large number of Ra nuclei, after 1600 years I will have only about half of them left. We cannot predict when a specific Ra nucleus will decay.
Were there experiments that repeatedly prepared single radioactive atoms and waited until they decayed?
 
  • #48
BruteForce1 said:
And external factors like climate have no bearing on it? I'm trying to think of it like failure rates in technology. We know why some batteries fail earlier than others (storage, heat, etc.).

Same with genetic mutation.
It's very difficult to affect nuclear properties like decay rates, due to the typical energy scales involved (MeV, rather than eV as in atomic physics). The only exceptions are cases like bound ##\beta## decays, where it can make a huge difference whether you look at the atom or the completely ionized bare nucleus: in the atom, due to the Pauli effect, the ##\beta## decay is pretty well blocked, and the half-life of the atom and that of the bare nucleus differ by several orders of magnitude:

https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.77.5190
 
  • #49
A. Neumaier said:
Were there experiments that repeatedly prepared single radioactive atoms and waited until they decayed?
I'd consider the investigations in storage rings as examples for this. This is a pretty interesting field, also for precision measurements. One fascinating example is the GSI storage-ring result on Rhenium bound ##\beta## decay quoted above. Then there was also a high-precision test for time dilation of the life-time of moving unstable nuclei (at moderate speeds of about ##\beta=1/3##), of course confirming the Lorentz ##\gamma## factor result of Special Relativity.
 
  • #50
vanhees71 said:
I'd consider the investigations in storage rings as examples for this. This is a pretty interesting field, also for precision measurements. One fascinating example is the GSI storage-ring result on Rhenium bound ##\beta## decay quoted above.
In these experiments a large number of radioactive atoms are prepared simultaneously, and only the number of decay-product atoms is counted; one does not know which atom decayed when. Thus this is not what I meant.

My question was whether a single radioactive atom prepared on a surface or an ion trap can be observed to decay.
 