Why didn't radioactive decay probabilities cause the same uproar as QM?

  • #1
BruteForce1
It is equally puzzling why we are confined to probabilities for radioactive decay, just as in QM measurements. Newtonian determinism is undermined in both, so why were there still Newtonian determinists around when QM hit the scene?

We still have deterministic equations for both, of course, but they only give probability values. That is all we can know.
 
  • #2
My thought on this is that radiation emission could only be measured in a qualitative way at the turn of the century. Radiation was detected by measuring ionization in a gas, which required many, many particles per second to produce a measurable current whose fluctuations were imperceptible. It is not clear from the early methods used that radiation was emitted randomly. Rutherford observed alpha-particle emanations on a ZnS screen, but even though it was assumed that a particle caused the scintillations in the screen, one might not necessarily deduce that this apparently random flickering wasn't due to a stochastic process such as scattering in the material itself. That is, you could have a constant rate of decay with scattering in unpredictable directions. At this time Einstein published his Brownian motion paper, demonstrating that random collisions in a fluid could be explained classically.
 
  • #3
gleem said:
My thought on this is that radiation emission could only be measured in a qualitative way at the turn of the century. Radiation was detected by measuring ionization in a gas, which required many, many particles per second to produce a measurable current whose fluctuations were imperceptible. It is not clear from the early methods used that radiation was emitted randomly. Rutherford observed alpha-particle emanations on a ZnS screen, but even though it was assumed that a particle caused the scintillations in the screen, one might not necessarily deduce that this apparently random flickering wasn't due to a stochastic process such as scattering in the material itself. That is, you could have a constant rate of decay with scattering in unpredictable directions. At this time Einstein published his Brownian motion paper, demonstrating that random collisions in a fluid could be explained classically.

Would it be fair to say that both QM measurement and radioactive decay exhibit information loss, or would this be a misuse of the term? You have information that gets embedded into probability amplitudes. I don't believe it literally does that, but to our eyes that is what is being observed.
 
  • #4
Sorry, I am not well versed in issues regarding information so I will not comment.
 
  • #5
There was nothing like information theory in 1900.
 
  • #6
BruteForce1 said:
It is equally puzzling why we are confined to probabilities for radioactive decay, just as in QM measurements. Newtonian determinism is undermined in both, so why were there still Newtonian determinists around when QM hit the scene?

We still have deterministic equations for both, of course, but they only give probability values. That is all we can know.

The problem with QM is not that it is random. The problem is that it requires something that is traditionally called an observer or a measurement device that is "macroscopic" or "classical". It would seem that we should be able to describe the observer using quantum mechanics, but if we do so, then we need another observer to observe the observer. This is traditionally called the measurement problem.

It is discussed briefly by Dirac.
https://blogs.scientificamerican.com/guest-blog/the-evolution-of-the-physicists-picture-of-nature/
"And when this new development occurs, people will find it all rather futile to have had so much of a discussion on the role of observation in the theory, because they will have then a much better point of view from which to look at things."

And in a long article by Bell.
https://m.tau.ac.il/~quantum/Vaidman/IQM/BellAM.pdf
Against ‘measurement’
 
  • #7
atyy said:
The problem with QM is not that it is random. The problem is that it requires something that is traditionally called an observer or a measurement device that is "macroscopic" or "classical". It would seem that we should be able to describe the observer using quantum mechanics, but if we do so, then we need another observer to observe the observer. This is traditionally called the measurement problem.

It is discussed briefly by Dirac.
https://blogs.scientificamerican.com/guest-blog/the-evolution-of-the-physicists-picture-of-nature/
"And when this new development occurs, people will find it all rather futile to have had so much of a discussion on the role of observation in the theory, because they will have then a much better point of view from which to look at things."

And in a long article by Bell.
https://m.tau.ac.il/~quantum/Vaidman/IQM/BellAM.pdf
Against ‘measurement’

Radioactive decay is not random either. It also has a differential equation behind it.
 
  • #8
BruteForce1 said:
Radioactive decay is not random either. It also has a differential equation behind it.

The thing that is random is the lifetime of any particular radioactive atom or molecule. The decay follows a "distribution", which is an overall pattern, but it doesn't tell you at which particular time a particular atom will decay. It gives you only a probability for each individual atom.

I don't know the answer, and I'm sure you could find out with your own research what, if any, theories there were to explain the (apparent) randomness of radioactive decay. My guess is that there was an assumption that once you knew enough about a particular atom, you could determine when it was going to decay, in the same way that you could study a macroscopic process that followed a statistical law: by looking closely enough, you could explain the apparent randomness, perhaps by a distribution of initial conditions or the randomness of external influences.

The difference with QM is that it postulated a fundamental randomness (i.e. only probabilities) at the lowest level: probabilities that, according to the theory, could never be explained in terms of a more fundamental deterministic model. That said, the Bohmian interpretation of QM tries to do just that.

The short answer to your question is perhaps simply that everyone thought the probabilities of radioactive decay could eventually be explained by a deterministic theory. It would be interesting to know whether anyone took the randomness at face value and postulated that atoms might decay according to fundamentally probabilistic laws.
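A minimal numerical sketch of this point (the 100,000 atoms and the half-life below are arbitrary, assumed values): sample an exponential lifetime for each atom; any single lifetime is unpredictable, yet the survivor counts track the deterministic decay curve closely.

```python
# Sketch: individually random lifetimes, collectively an exponential decay curve.
import math
import random

HALF_LIFE = 10.0                       # arbitrary time units
LAM = math.log(2) / HALF_LIFE          # decay constant lambda

# Each atom's lifetime is an independent exponential random variable:
# unpredictable for any single atom.
lifetimes = [random.expovariate(LAM) for _ in range(100_000)]
print("first few individual lifetimes:", [round(t, 2) for t in lifetimes[:5]])

# The ensemble, however, follows N(t) = N0 * exp(-lambda * t) very closely.
N0 = len(lifetimes)
for t in (0.0, 10.0, 20.0, 30.0):
    surviving = sum(1 for tau in lifetimes if tau > t)
    predicted = N0 * math.exp(-LAM * t)
    print(f"t = {t:4.1f}: surviving = {surviving:6d}, predicted = {predicted:8.0f}")
```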
 
  • Like
Likes meopemuk
  • #9
PeroK said:
The thing that is random is the lifetime of any particular radioactive atom or molecule. The decay follows a "distribution", which is an overall pattern, but it doesn't tell you at which particular time a particular atom will decay. It gives you only a probability for each individual atom.

I don't know the answer, and I'm sure you could find out with your own research what, if any, theories there were to explain the (apparent) randomness of radioactive decay. My guess is that there was an assumption that once you knew enough about a particular atom, you could determine when it was going to decay, in the same way that you could study a macroscopic process that followed a statistical law: by looking closely enough, you could explain the apparent randomness, perhaps by a distribution of initial conditions or the randomness of external influences.

The difference with QM is that it postulated a fundamental randomness (i.e. only probabilities) at the lowest level: probabilities that, according to the theory, could never be explained in terms of a more fundamental deterministic model. That said, the Bohmian interpretation of QM tries to do just that.

The short answer to your question is perhaps simply that everyone thought the probabilities of radioactive decay could eventually be explained by a deterministic theory. It would be interesting to know whether anyone took the randomness at face value and postulated that atoms might decay according to fundamentally probabilistic laws.

I don't view this as a problem of determinism, since we have the equations, which are unambiguous and deterministic. A truly random theory of both would be one in which they are sometimes trajectory-specific, sometimes probabilistic, and sometimes neither.

QM measurements are always "non-classical" (probabilistic), as is radioactive decay. I view the problem more in terms of the limits on our knowledge and of the universe itself. It appears that there is no way for us to gain full access in these circumstances, not even in theory. If that is true, then that's a remarkable aspect of the universe. The universe tends to be very open to our inquiries at macroscopic levels. That was the whole beauty of the universe and of science: that it was comprehensible.

Why do we have a very good model for only half a theory (probabilities), one that is just as reliable as macroscopic determinism but confined to deterministic probabilities? It was initially believed that determinism and probability were mutually exclusive, but scientists have learned to live with both existing simultaneously; it is only the "flight-path" Newtonian determinism that breaks down at smaller scales.
 
  • Skeptical
Likes weirdoguy
  • #10
Why come to ask a question if you already have an answer?

You get an answer and then you say "no, that's not it, it's ...".
 
  • Like
Likes weirdoguy
  • #11
PeroK said:
Why come to ask a question if you already have an answer?

You get an answer and then you say "no, that's not it, it's ...".

Because we are not discussing the thread's question. My question was about the reception among scientists back then. I simply gave an account of why I don't view it as problematic to speak of determinism and probabilities in the same breath. The math works. In physics back then it had to be more controversial, since the long-held view was that flight-path determinism rules. If our current understanding is correct, then the universe at smaller scales is a clockwork not open to full scrutiny.
 
  • #12
BruteForce1 said:
Radioactive decay is not random either. It also has a differential equation behind it.

Radioactivity is random (the differential equation describes the probability of decay), and quantum mechanics is also random. Quantum mechanics describes radioactive decay. However, the necessity for an observer in quantum mechanics was not apparent around 1900, as quantum mechanical theory was not yet developed. The necessity for an observer in quantum mechanics was only understood later from around 1926, after the quantum formalism was more developed.
 
  • #13
atyy said:
Radioactivity is random (the differential equation describes the probability of decay), and quantum mechanics is also random.

Neither one is random. When I use the word random, I go by Merriam-Webster's definition: 1a: lacking a definite plan, purpose, or pattern
 
  • #14
The decay of an individual atom doesn't have a plan, purpose, or pattern, but I don't think the definition from M-W is useful here.
Quantum mechanics is probabilistic in some interpretations and deterministic (but still appearing random to observers) in others. It doesn't make sense to say it has to be one or the other as interpretations are a matter of taste (and shouldn't be discussed here, we have a separate forum for them).
 
  • #15
The theoretical work on radioactive decay started in 1840 and was apparently finished by 1940. So before 1940 there was no partial differential equation?
 
  • #16
mfb said:
The decay of an individual atom doesn't have a plan, purpose, or pattern,

A probability distribution is a pattern.
 
  • #17
An inanimate object could never have a "plan". Lack of "plan" refers to living, thinking objects exhibiting random behavior.
 
  • #18
Is your argument really "the existence of a probability distribution means it is not random"? That is an extreme outsider view.
BruteForce1 said:
The theoretical work on radioactive decay started in 1840 and was apparently finished by 1940. So before 1940 there was no partial differential equation?
People didn't stop working on it in 1940. And differential equations are older than 1940.
 
  • #19
BruteForce1 said:
Neither one is random. When I use the word random, I go by Merriam-Webster's definition: 1a: lacking a definite plan, purpose, or pattern
The mathematical/scientific description that applies here is: specific outcomes/values are not predictable but follow, or are described by, a probability distribution. Just like a fair coin: the outcome of a single toss cannot be predicted, but the outcomes are expected to follow a distribution over the long run.
 
  • Like
Likes mfb
  • #20
WWGD said:
The mathematical/scientific description that applies here is: specific outcomes/values are not predictable but follow, or are described by, a probability distribution. Just like a fair coin: the outcome of a single toss cannot be predicted, but the outcomes are expected to follow a distribution over the long run.

A probability distribution is a prediction based on a pattern of behavior. This is elementary.
 
  • #21
mfb said:
People didn't stop working on it in 1940. And differential equations are older than 1940.

I'm talking about the theoretical framework. I didn't say differential equations started in 1940. What I have found so far researching it suggests that the differential equations of radioactive decay began in 1940, unless 1940 refers to something else.
 
  • #22
BruteForce1 said:
A probability distribution is a prediction based on a pattern of behavior. This is elementary.
Then why are you using the everyday definition?
 
  • #23
WWGD said:
Then why are you using the everyday definition?

The everyday definition states that randomness is patternless behavior.

The following is a pattern in my book:

"given a sample of a particular radioisotope, the number of decay events −dN expected to occur in a small interval of time dt is proportional to the number of atoms present N"
 
  • #24
BruteForce1 said:
The following is a pattern in my book:
"given a sample of a particular radioisotope, the number of decay events −dN expected to occur in a small interval of time dt is proportional to the number of atoms present N"
However, that statement is equivalent to stating that a single atom’s decay is patternless and completely random - the probability of it decaying at any given moment is the same for all moments.

It appears to me that you have adopted a definition of "pattern" that is inconsistent, in that there is something you consider a pattern in patternless behavior. Such inconsistencies are generally a hint that the terms need to be defined more carefully.
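In the same notation, the survival probability of a single atom is exponential and memoryless: having already survived a time $t$ changes nothing about when it will decay,

$$P(T > t) = e^{-\lambda t}, \qquad P(T > t+s \mid T > t) = \frac{e^{-\lambda(t+s)}}{e^{-\lambda t}} = e^{-\lambda s} = P(T > s).$$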
 
  • #25
I'd say there's no difference between the observed randomness of quantum phenomena in general and radioactive decay probabilities. After all, the latter are just an application of the former.

I think the apparent problem people have with irreducible randomness is a cultural phenomenon, i.e., it's hard to accept that nature does not behave deterministically on a fundamental level, but nature doesn't care about the philosophical prejudices of humans and just behaves as she does ;-).
 
  • Like
Likes hutchphd and PeroK
  • #26
BruteForce1 said:
Neither one is random. When I use the word random, I go by Merriam-Webster's definition: 1a: lacking a definite plan, purpose, or pattern

True randomness means that an event occurring in space and time can in principle never be undone.
 
  • #27
Lord Jestocost said:
True randomness means that an event occurring in space and time can in principle never be undone.
Is this formulation your own or can you ascribe it? ... I rather like it.
 
  • #28
Lord Jestocost said:
True randomness means that an event occurring in space and time can in principle never be undone.

Explain
 
  • #29
BruteForce1 said:
Neither one is random. When I use the word random, I go by Merriam-Webster's definition: 1a: lacking a definite plan, purpose, or pattern
If you wish to play semantic games, I will point out that the definition you misquote several times after this is "lacking a definite pattern".
 
  • #30
hutchphd said:
Is this formulation your own or can you ascribe it? ... I rather like it.

My formulation is based on the following sentence in Sir Arthur Stanley Eddington's book "The Nature of the Physical World" (which I highly recommend): "This follows at once if our fundamental contention is admitted that the introduction of randomness is the only thing which cannot be undone."
 
  • Informative
Likes hutchphd
  • #31
Could one attribute a single radioactive decay to complex external conditions such as climate, similar to how genetic mutation occurs? Or is there no theory behind it at all?
 
  • #32
vanhees71 said:
I'd say, there's no difference between the observed randomness of quantum phenomena in general and radioactive decay probabilities.

One difference is that QM is deterministic when you're not measuring.
 
  • #33
Lord Jestocost said:
My formulation is based on the following sentence in Sir Arthur Stanley Eddington's book "The Nature of the Physical World" (which I highly recommend): "This follows at once if our fundamental contention is admitted that the introduction of randomness is the only thing which cannot be undone."

So he views randomness as noise. I don't see how deterministic systems could be undone either, however.
 
  • #34
BruteForce1 said:
Explain

It means you cannot trace back along a causal chain in space and time why a truly random individual event has occurred at a certain space-time point.
 
  • #35
My definition of randomness is: a causal process that cannot be attributed to any external or internal antecedent factor. In other words, a causeless causal process.

I don't know if radioactive decay and QM really fit that.
 
  • Skeptical
Likes weirdoguy
