Is Radioactive Decay Really Random?

Summary
Radioactive decay is fundamentally random, as established by quantum mechanics, which can predict decay probabilities but not specific decay times. Some discussions suggest that environmental factors may influence decay rates, yet this does not negate the inherent randomness of when an individual atom will decay. The analogy of a BB in a ping pong ball illustrates that while decay appears random, it may be influenced by complex interactions, leading to debates about determinism versus randomness. The unpredictability of decay does not equate to proof of randomness, as some argue that hidden variables could exist. Ultimately, the conversation reflects ongoing philosophical questions about the nature of randomness and determinism in quantum mechanics.
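The "decay probabilities but not specific decay times" point can be made concrete with a short sketch. The isotope and numbers below are hypothetical, chosen only to illustrate the formula:

```python
import math

# Quantum mechanics gives only a survival probability for a single
# nucleus: P(still undecayed after time t) = 2**(-t / t_half),
# equivalently exp(-lambda * t) with lambda = ln(2) / t_half.
# It says nothing about *when* that particular nucleus will decay.
def survival_probability(t, t_half):
    """Probability that one nucleus has not yet decayed after time t."""
    decay_constant = math.log(2) / t_half
    return math.exp(-decay_constant * t)

# Hypothetical isotope with a half-life of 10 time units:
print(survival_probability(10, 10))   # ~0.5 after one half-life
print(survival_probability(20, 10))   # ~0.25 after two half-lives
```

The ensemble statistics are fully determined by the half-life; the individual decay event is exactly the part the probability formula leaves open, which is what the thread below argues about.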
  • #31
russ_watters said:
Do you think that we will someday develop the ability to calculate ahead of time where a photon will land after passing through a double-slit?

No I don't. For one thing, the rules laid out in the Heisenberg uncertainty principle forbid it. However, submitting to this does not mean I accept that the double-slit phenomenon has no underlying cause. It's simply a matter of logistics as to why we will probably never get to the bottom of it.

russ_watters said:
Probability certainly is a mathematical, logical construct, but so is the rest of math. Why must the universe follow the rules of addition and subtraction? It just so happens (no, I don't actually think it is a coincidence) that the universe behaves in a logical way, therefore the logic of math applies to it.

Indeed, it does seem to and I have no argument with that.

russ_watters said:
Your position that eventually we will be able to predict the behavior of probabilistic systems is not a popular one in mainstream science. Pretty much everything we know about probabilistic systems implies that they are not just not knowable now, but are inherently unknowable.

This is not my position. I agree that predicting such events is highly unlikely. My only position is that everything happens because something makes it happen. Probability is popular because it is an excellent tool for predicting the outcome of events, but again, it does not and will never explain the reason for those events. You may wish to take the position that because the outcome of an event precisely matches the prediction of a particular law of probability, the law of probability is the underlying cause of the event, simply because you (and I) agree that the universe follows the logic of math. That is a leap of faith. There is no direct connection between the two, only a correlation. To say this another way, I believe the laws of probability describe random events and can be used as a tool to track them, but this does not mean that random events follow the laws of probability, because the laws of probability are not true laws of nature; they simply describe it.

russ_watters said:
Unless you're God, you're not entitled to write such laws.

Oh come now, physicists write laws like this all the time. "The speed of light is a constant" is one of a bazillion examples.
 
  • #32
Buckethead said:
No I don't. For one thing, the rules laid out in the Heisenberg uncertainty principle forbid it. However, submitting to this does not mean I accept that the double-slit phenomenon has no underlying cause. It's simply a matter of logistics as to why we will probably never get to the bottom of it.
So...you accept that the HUP applies to some things, but not others that it is currently applied to?
This is not my position. I agree that predicting such events is highly unlikely. My only position is that everything happens because something makes it happen. Probability is popular because it is an excellent tool for predicting the outcome of events, but again, it does not and will never explain the reason for those events. You may wish to take the position that because the outcome of an event precisely matches the prediction of a particular law of probability, the law of probability is the underlying cause of the event, simply because you (and I) agree that the universe follows the logic of math. That is a leap of faith.
An experimental result is most certainly not a leap of faith except insofar as all experiments depend on the same "leap" that we're not just extraordinarily lucky to see such logic/consistency.
There is no direct connection between the two, only a correlation. To say this another way, I believe the laws of probability describe random events and can be used as a tool to track them, but this does not mean that random events follow the laws of probability, because the laws of probability are not true laws of nature; they simply describe it.
Well let's go a step further: if there is more to it than probability, if we may eventually be able to predict outcomes that are currently modeled as probabilistic, then there must be a pattern that we aren't seeing. If there is a pattern that we aren't seeing, then what we think is random really isn't random -- it means there are errors in our predictions that are so small that we haven't seen them yet. Since we haven't found patterns in the randomness (which would, of course, contradict the concept of "randomness"), evidence supports/strengthens the conclusion that these things really do have a random element. And the more experiments done, the stronger the conclusion that these things really do have a random element and the smaller and smaller the dark corner that a pattern may yet lie in.
Oh come now, physicists write laws like this all the time. "The speed of light is a constant" is one of a bazillion examples.
Most certainly not. That's a postulate and a theory, which has corroborating evidence. You've elevated your idea above that. You believe, despite contradictory evidence, that the true functioning of the universe is a certain way. That's beyond science - the only way to know such a thing would be to be the one who actually wrote those laws into the programming of the universe.
 
  • #33
Interesting question. I can see where Buckethead is coming from. I too, tend to draw a distinction between the inability to measure something precisely on one hand and a total randomness (as in lack of cause) on the other.

For example, classical mechanics is considered fully deterministic. In theory, if you know the positions and velocities of all molecules in a 1-litre volume of gas with sufficient accuracy, you should be able to predict their positions at some future time, say 1 second ahead. In practice, something like moving 1 kg of mass by 10 cm somewhere in the vicinity of Sirius is sufficient to throw a spanner into the works here on Earth and make the positions completely unpredictable (Penrose gave this argument in one of his books). Then there are pesky questions about knowing the positions of all particles in the universe, solving a gazillion-body problem, not to mention photons coming from the very fringes of the observable universe which you can't predict because you haven't seen them yet.

Nevertheless we still call it deterministic. Why is that? I think this is because, when a (classical) particle hits the screen, we don't just say "it's random", we have an explanation ready, we say, well it hit here and not there because the sum total of all forces must have been such as to produce this kind of trajectory. If only we knew the forces beforehand we could've surely predicted where it was going to hit.
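The sensitivity to initial conditions behind Penrose's argument can be shown with a toy deterministic system. The logistic map is my choice of example, not the post's (which talks about gas molecules), but the mechanism is the same:

```python
# A deterministic rule applied to two initial conditions that differ
# by one part in a billion. The tiny difference grows step by step
# until the two "predictions" bear no resemblance to each other --
# perfectly deterministic, yet practically unpredictable.
def trajectory(x0, steps):
    xs = [x0]
    for _ in range(steps):
        xs.append(4.0 * xs[-1] * (1.0 - xs[-1]))  # chaotic logistic map
    return xs

a = trajectory(0.400000000, 50)
b = trajectory(0.400000001, 50)   # perturbed initial condition

print(abs(a[10] - b[10]))   # still small: the error has barely grown
print(abs(a[50] - b[50]))   # the trajectories have long since separated
```

Determinism survives here only as an in-principle claim: the rule is exact, but any finite error in the initial data eventually swamps the prediction.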

Now let's look at HUP. We know what it says but why exactly does it say it, where does it come from? Well, from non-commuting projection operators. And where do these come from? From the observables, measurement, wavefunction collapse, Born rule etc. And these? At this point we are supposed to shut up and calculate.

But but but. Just as with gas pressure, where to get more and more accurate results we would have to look at individual molecules, so with quantum measurement we would have to treat the entire measurement apparatus quantum-mechanically. Obviously we can't just replace a hugely complicated system with lots of interacting degrees of freedom (the measurement apparatus) with a simple operator and expect to get exactly the same results. Surely this must be some kind of idealization, simplification or generalization, just as the gas pressure is a generalization of the forces of individual molecules hitting the wall?

What I'm driving at is that the process of measurement, wavefunction collapse, observables, their projection operators, and therefore the HUP are all likely to be emergent phenomena. In other words, the HUP is valid for an ideal measurement, which is only an approximation of a real measurement, arising from our ignorance of the quantum-mechanical nature of the measurement apparatus and its environment.

So, when (this time quantum) particle hits the screen we could say, well it hit here and not there because the relative phases of the wavefunctions of everything the particle had ever interacted with (the atoms of the screen, the source, the two slits, the CMBR photon that was passing by and everything that was entangled with it since the beginning of time), yeah all these phases just happened to be aligned so. Yeah, if only we knew all these phases beforehand we surely could have predicted where it was going to hit. Honestly :blushing:
 
  • #34
russ_watters said:
So...you accept that the HUP applies to some things, but not others that it is currently applied to?

No, I didn't say that, nor is that my perspective. My understanding is that the HUP can be applied to any subatomic random event. I'm not sure what this has to do with our discussion, however. The HUP is a principle about what we can expect when trying to analyze a random subatomic event; it is not a theory of the random event itself.


russ_watters said:
An experimental result is most certainly not a leap of faith except insofar as all experiments depend on the same "leap" that we're not just extraordinarily lucky to see such logic/consistency.

Again, you seem to be missing my stance that probability calculations and random events are not the same. If you are using probability to make observations about any number of unrelated random events and you find that your calculations match, all you can say is that the random events seem to be random; you cannot say that they are truly random.


russ_watters said:
Well let's go a step further: if there is more to it than probability, if we may eventually be able to predict outcomes that are currently modeled as probabilistic, then there must be a pattern that we aren't seeing. If there is a pattern that we aren't seeing, then what we think is random really isn't random -- it means there are errors in our predictions that are so small that we haven't seen them yet. Since we haven't found patterns in the randomness (which would, of course, contradict the concept of "randomness"), evidence supports/strengthens the conclusion that these things really do have a random element. And the more experiments done, the stronger the conclusion that these things really do have a random element and the smaller and smaller the dark corner that a pattern may yet lie in.

This is a very good point, and I appreciate what you are saying here; it does make me wonder about the "algorithm" used by nature to create a seemingly random event. Perhaps this really gets to the heart of the matter. If our computer-generated statistical outputs almost perfectly match those generated by nature, then one could say that nature uses an algorithm as well, one that is 100% repeatable given the same input. Our computers are not able to generate a truly random event, only a simulation, and perhaps nature is up against the same wall.
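The "repeatable algorithm" idea can be seen in miniature with a pseudorandom generator: a fully deterministic program whose output nevertheless reproduces decay-time statistics. This is illustration only; nothing here shows that nature actually works this way, and the seed and rate are arbitrary choices:

```python
import random

# A seeded PRNG is fully deterministic: the same seed replays the
# exact same sequence of "decay times". Yet the sample statistics
# match the exponential distribution that real decay obeys.
decay_constant = 0.1                      # arbitrary rate; mean lifetime 10

rng1 = random.Random(42)
rng2 = random.Random(42)                  # same seed, same "randomness"
times1 = [rng1.expovariate(decay_constant) for _ in range(100_000)]
times2 = [rng2.expovariate(decay_constant) for _ in range(100_000)]

print(times1 == times2)                   # True: deterministic replay
mean_lifetime = sum(times1) / len(times1)
print(mean_lifetime)                      # close to 1 / decay_constant = 10
```

From the outside, the replayed sequence is statistically indistinguishable from genuine randomness, which is exactly why statistical agreement alone cannot settle the determinism question.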
 
  • #35
FeDeX_LaTeX said:
Hello;

I remember being taught long ago that radioactive decay is random, but no one ever explained to me why. Surely there has to be a reason for it? Or is it simply the case of it not being random? (Particles in gases don't move randomly; their motion depends on various factors.)

Thanks.
Actually, recent observations tend to show that radioactive decay is NOT random. The issue is that, so far, no one is able to say why.
Please read: http://news.stanford.edu/news/2010/august/sun-082310.html
 
  • #36
Mannix99 said:
Actually, recent observations tend to show that radioactive decay is NOT random. The issue is that, so far, no one is able to say why.
Please read: http://news.stanford.edu/news/2010/august/sun-082310.html

I don't think that is evidence one way or another about determinism or indeterminism (i.e. "randomness"). In an indeterministic universe, the effect of the neutrinos could be to change the probability distribution (e.g. by lowering the energy barrier for decay). Altered probabilities are still probabilities.

It's not the case that neutrino flux lets scientists predict when a given particle will decay, so these observations don't tend to show that radioactive decay is nonrandom.

Not to suggest the article isn't extremely interesting. :)
 
