Why didn't radioactive decay probabilities cause the same uproar as QM

  • #1

Main Question or Discussion Point

It is puzzling why we are confined to probabilities for radioactive decay (RD), just as in QM measurements. Newtonian determinism is undermined in both, so why were there still Newtonian determinists around when QM hit the scene?

We still have deterministic equations for both, of course, but they yield only probability values. That is all we can know.
 

Answers and Replies

  • #2
gleem
Science Advisor
Education Advisor
My thought on this is that radiation emission could only be measured in a qualitative way at the turn of the century. Radiation was detected by measuring ionization in a gas, which required many, many particles per second to produce a measurable current whose fluctuations were imperceptible. It is not clear from the early methods used that radiation was emitted randomly. Rutherford observed alpha-particle emanations on a ZnS screen, but even though it was assumed that a particle caused the scintillations in the screen, one might not necessarily deduce that this apparently random flickering wasn't due to a stochastic process such as scattering in the material itself. That is, you could have a constant rate of decay with scattering in unpredictable directions. Around this time Einstein published his Brownian-motion paper, demonstrating that random collisions in a fluid could be explained classically.
 
  • #3
Would it be fair to say that both QM measurement and radioactive decay exhibit information loss, or would that be a misuse of the term? You have information that gets embedded into probability amplitudes. I don't believe it literally does that, but to our eyes that is what is being observed.
 
  • #4
gleem
Science Advisor
Education Advisor
Sorry, I am not well versed in issues regarding information so I will not comment.
 
  • #5
Vanadium 50
Staff Emeritus
Science Advisor
Education Advisor
2019 Award
There was nothing like information theory in 1900.
 
  • #6
atyy
Science Advisor
The problem with QM is not that it is random. The problem is that it requires something that is traditionally called an observer or a measurement device that is "macroscopic" or "classical". It would seem that we should be able to describe the observer using quantum mechanics, but if we do so, then we need another observer to observe the observer. This is traditionally called the measurement problem.

It is discussed briefly by Dirac.
https://blogs.scientificamerican.com/guest-blog/the-evolution-of-the-physicists-picture-of-nature/
"And when this new development occurs, people will find it all rather futile to have had so much of a discussion on the role of observation in the theory, because they will have then a much better point of view from which to look at things."

And in a long article by Bell.
https://m.tau.ac.il/~quantum/Vaidman/IQM/BellAM.pdf
Against ‘measurement’
 
  • #7
Radioactive decay is not random either. It also has a differential equation behind it.
 
  • #8
PeroK
Science Advisor
Homework Helper
Insights Author
Gold Member
Radioactive decay is not random either. It also has a differential equation behind it.
The thing that is random is the lifetime of any particular radioactive atom or molecule. The decay follows a "distribution", which is an overall pattern, but it doesn't tell you at which particular time a particular atom will decay. It gives you only a probability for each individual atom.

I don't know the answer and I'm sure you could find out with your own research what, if any, theories there were to explain the (apparent) randomness of radioactive decay. My guess is that there was an assumption that once you knew enough about a particular atom, you could determine when it was going to decay, in the same way that you could study a macroscopic process that followed a statistical law: by looking closely enough you could explain the apparent randomness, perhaps by a distribution of initial conditions or the randomness of external influences.

The difference with QM is that it postulated a fundamental randomness (i.e. only probabilities) at the lowest level: probabilities that, according to the theory, could never be explained in terms of a more fundamental deterministic model. That said, the Bohmian interpretation of QM tries to do just that.

The short answer to your question is perhaps simply that everyone thought the probabilities of radioactive decay could eventually be explained by a deterministic theory. It would be interesting to know whether anyone took the randomness at face value and postulated that atoms might decay according to fundamentally probabilistic laws.
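The distinction drawn above, that individual lifetimes are random while the ensemble follows a distribution, can be made concrete with a small simulation (an illustrative Python sketch, not from the thread): lifetimes drawn from an exponential distribution are individually unpredictable, yet the surviving fraction tracks the deterministic decay curve.

```python
import math
import random

random.seed(0)

# Sketch: each atom's lifetime is drawn independently from an
# exponential distribution with decay constant lam (units: 1/time).
lam = 0.1
n0 = 100_000
lifetimes = [random.expovariate(lam) for _ in range(n0)]

# Any single lifetime is unpredictable, but the fraction of atoms
# surviving to time t closely follows the deterministic law e^(-lam*t).
for t in [0.0, 5.0, 10.0, 20.0]:
    surviving = sum(1 for tau in lifetimes if tau > t) / n0
    predicted = math.exp(-lam * t)
    print(f"t={t:5.1f}  simulated={surviving:.4f}  predicted={predicted:.4f}")
```

With 100,000 atoms the simulated fractions agree with the exponential law to a few parts in a thousand, even though no individual decay time could have been predicted.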
 
  • Like
Likes meopemuk
  • #9
I don't view this as a problem of determinism, since we have the equations, which are unambiguous and deterministic. A random theory of both would be one where they are sometimes trajectory-specific, sometimes probabilistic, and sometimes neither.

QM measurements are always "non-classical" (probabilistic), as is radioactive decay. I view the problem more as a limitation of our knowledge and of the universe itself. It appears that there is no way for us to gain full access in these circumstances, not even in theory. If that is true, then that's a remarkable aspect of the universe. The universe tends to be very open to our inquiries at macroscopic levels. That was the whole beauty of the universe and of science: that it was comprehensible.

Why do we have a very good model for only half a theory (probabilities), one that is just as reliable as macroscopic determinism but confined to deterministic probabilities? It was initially believed that determinism and probabilities were oxymorons, but scientists have learned to live with both existing simultaneously; it is only "flight-path" Newtonian determinism that breaks down at smaller scales.
 
  • Skeptical
Likes weirdoguy
  • #10
PeroK
Science Advisor
Homework Helper
Insights Author
Gold Member
Why come to ask a question if you already have an answer?

You get an answer and then you say "no, that's not it it's ...".
 
  • Like
Likes weirdoguy
  • #11
Because we are not discussing the thread's question. My question was about the reception among scientists back then. I simply gave an account of why I don't view it as problematic to speak of determinism and probabilities in the same breath. The math works. In physics back then it had to be more controversial, since it was a long-held view that flight-path determinism rules. If our current understanding is correct, then the universe at smaller scales is a clockwork not open to full scrutiny.
 
  • #12
atyy
Science Advisor
Radioactive decay is not random either. It also has a differential equation behind it.
Radioactivity is random (the differential equation describes the probability of decay), and quantum mechanics is also random. Quantum mechanics describes radioactive decay. However, the necessity for an observer in quantum mechanics was not apparent around 1900, as quantum mechanical theory was not yet developed. The necessity for an observer in quantum mechanics was only understood later from around 1926, after the quantum formalism was more developed.
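For reference, the differential equation in question is the standard decay law, which fixes only the expected number of surviving atoms (this is textbook material, not from the thread):

```latex
\frac{dN}{dt} = -\lambda N
\quad\Longrightarrow\quad
N(t) = N_0\, e^{-\lambda t},
\qquad
t_{1/2} = \frac{\ln 2}{\lambda}
```

For a single atom, the same law gives only a survival probability, $P(T > t) = e^{-\lambda t}$; the equation is deterministic about the distribution while saying nothing about when any particular atom decays.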
 
  • #13
Radioactivity is random (the differential equation describes the probability of decay), and quantum mechanics is also random.
Neither one is random. When I use the word random I go by Merriam-Webster's definition: "1a: lacking a definite plan, purpose, or pattern".
 
  • #14
The decay of an individual atom doesn't have a plan, purpose, or pattern, but I don't think the definition from M-W is useful here.
Quantum mechanics is probabilistic in some interpretations and deterministic (but still appearing random to observers) in others. It doesn't make sense to say it has to be one or the other as interpretations are a matter of taste (and shouldn't be discussed here, we have a separate forum for them).
 
  • #15
The theoretical work on radioactive decay started in 1840 and was apparently finished by 1940. So before 1940 there was no partial differential equation?
 
  • #16
The decay of an individual atom doesn't have a plan, purpose, or pattern,
A probability distribution is a pattern.
 
  • #17
An inanimate object could never have a "plan". Lack of "plan" refers to living, thinking objects exhibiting random behavior.
 
  • #18
Is your argument really "the existence of a probability distribution means it is not random"? That is an extreme outsider view.
The theoretical work on radioactive decay started in 1840 and was apparently finished by 1940. So before 1940 there was no partial differential equation?
People didn't stop working on it in 1940. And differential equations are older than 1940.
 
  • #19
WWGD
Science Advisor
Gold Member
2019 Award
Neither one is random. When I use the word random I go by the Merriam Websters definition of : 1a : lacking a definite plan, purpose, or pattern
The mathematical/scientific definition that applies here is: specific outcomes/values not being predictable, but following or being described by a probability distribution. Just like with a fair coin: the outcome of a single toss cannot be predicted, but the outcomes are expected to follow a distribution over the long run.
 
  • Like
Likes mfb
  • #20
The Mathematical/Scientific description that applies here is: ( specific outcomes/values) not being predictable but following or being described by a probability distribution. Just like being given a fair coin. Outcome cannot be predicted but expected to follow a distribution over the long run.
Probability distribution is a prediction based on a pattern of behavior. This is elementary.
 
  • #21
People didn't stop working on it in 1940. And differential equations are older than 1940.
I'm talking about the theoretical framework. I didn't say differential equations started in 1940. What I have found so far researching it suggests that the differential equations of radioactive decay began in 1940, unless 1940 refers to something else.
 
  • #22
WWGD
Science Advisor
Gold Member
2019 Award
Probability distribution is a prediction based on a pattern of behavior. This is elementary.
Then why are you using the everyday definition?
 
  • #23
Then why are you using the everyday definition?
The everyday definition states that randomness is a patternless behavior.

The following is a pattern in my book:

"given a sample of a particular radioisotope, the number of decay events −dN expected to occur in a small interval of time dt is proportional to the number of atoms present N"
 
  • #24
Nugatory
Mentor
The following is a pattern in my book:
"given a sample of a particular radioisotope, the number of decay events −dN expected to occur in a small interval of time dt is proportional to the number of atoms present N"
However, that statement is equivalent to stating that a single atom’s decay is patternless and completely random - the probability of it decaying at any given moment is the same for all moments.

It appears to me that you have adopted a definition of "pattern" that is inconsistent, in that there is something you consider a pattern within patternless behavior. Such inconsistencies are generally a hint that the terms need to be defined more carefully.
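The point that the decay probability per moment is the same for all moments is the memoryless property of the exponential distribution, and it can be checked numerically (an illustrative Python sketch, not part of the original post): atoms that have already survived to time s are no more and no less likely to survive a further interval t than fresh atoms.

```python
import random

random.seed(1)

# Draw exponential lifetimes with decay constant lam (units: 1/time).
lam = 0.2
n = 200_000
lifetimes = [random.expovariate(lam) for _ in range(n)]

s, t = 3.0, 5.0
# Atoms that have already survived to time s ...
survivors = [tau for tau in lifetimes if tau > s]
# ... have the same chance of surviving a further t as fresh atoms:
cond = sum(1 for tau in survivors if tau > s + t) / len(survivors)
uncond = sum(1 for tau in lifetimes if tau > t) / n
print(f"P(T > s+t | T > s) = {cond:.3f},  P(T > t) = {uncond:.3f}")
```

Both fractions come out close to e^(-lam*t), confirming that "age" carries no information about when the decay will happen, which is exactly the patternlessness being described.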
 
  • #25
vanhees71
Science Advisor
Insights Author
Gold Member
2019 Award
I'd say there's no difference between the observed randomness of quantum phenomena in general and radioactive decay probabilities. After all, the latter are just an application of the former.

I think the apparent problem people have with irreducible randomness is a cultural phenomenon, i.e., it's hard to accept that nature does not behave deterministically on a fundamental level; but nature doesn't care about the philosophical prejudices of humans and just behaves as she does ;-).
 
  • Like
Likes hutchphd and PeroK
