Why randomness means incomplete understanding

In summary, we lack a fundamental understanding of the measurement process in quantum mechanics. This lack of understanding leads to problems with our ability to create a faithful miniuniverse or deterministic simulator.
  • #71
georgir said:
At least intuitively any model involving randomness can be replaced with a deterministic model with extra variables controlling that randomness. Not knowing those variables can be called "incomplete understanding", though I think I'd be content and even consider our understanding complete for such models where certain variables are not even theoretically knowable, as long as their effect is well defined.

Problem with QM is that even such hidden variable models are proven to not work. For me my "incomplete understanding" stems not from the randomness itself, but from the way the random outcomes under different parameters are related. But I guess this is a whole other topic.
To be honest, not all hidden-variable models are proven not to work. E.g., the Bohmian interpretation of non-relativistic QM is a deterministic non-local interpretation. There are no hidden variables, though. The well-known ones are sufficient ;-)).

Only local deterministic hidden-variable theories, as defined by Bell, are ruled out by the many Bell tests done up to now. All of them demonstrate the violation of Bell's inequality with astonishing significance and confirm the predictions of QT. The problem with non-local deterministic HV theories is to formulate them in accordance with (special) relativity. That's the reason why Bohmian QT is not (yet?) satisfactorily formulated for relativistic QFT. It's of course not clear whether there's some non-local deterministic HV theory consistent with relativity; at least there seems to be no proof of such a no-go theorem. On the other hand, up to now nobody has found any such non-local theory.
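
A toy classical illustration of georgir's first point (my own sketch, with an arbitrary seed value): a seeded pseudorandom generator is exactly such a "deterministic model with extra variables", the seed playing the role of the hidden variable. Knowing the seed removes the apparent randomness entirely; as noted above, Bell's theorem is about why this trick cannot be carried over locally to quantum correlations.

```python
# Toy illustration: a seeded pseudorandom generator is a deterministic
# model whose "hidden variable" is the seed. Anyone who knows the seed
# can reproduce the whole "random" sequence.
import random

def coin_flips(hidden_seed: int, n: int) -> list[int]:
    """Deterministically generate n 'coin flips' from a hidden seed."""
    rng = random.Random(hidden_seed)        # all randomness traced to the seed
    return [rng.randint(0, 1) for _ in range(n)]

# Without the seed the output looks random; with it, it is fully determined.
print(coin_flips(hidden_seed=42, n=10))
print(coin_flips(hidden_seed=42, n=10))    # identical: nothing random left
```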
 
  • #72
vanhees71 said:
it is obviously difficult, even after all these decades, to simply accept the simple conclusion that nature behaves inherently randomly.
It will always be, because your conclusion does not follow logically from the phenomenal success of quantum physics. Thus whether or not someone accepts it is a matter of interpretation and philosophical preferences.

In particular, I do not think it is true because the thermal interpretation explains the randomness in quantum objects in precisely the same way as Laplace explained the randomness in classical objects.
vanhees71 said:
Now you claim there are fluctuations. How do you define them?
What is usually called fluctuations are just q-correlations, which the thermal interpretation handles as nonlocal properties of the system, just like the diameter or volume of a classical object is a nonlocal property.
vanhees71 said:
Just renaming a probabilistic theory avoiding the words statistics, randomness and probability, does not change the mathematical content, as you well know!
Just having something that follows the axioms of probability theory does not make it a true probability in the sense of experiment, either. It is no more the case than a function is a vector pointing somewhere simply because it belongs to a vector space of functions.
vanhees71 said:
So why then call it "thermal" (which is misleading anyway, because it seems to be your intention to provide a generally valid reinterpretation and not just one for thermal equilibrium).
The word 'thermal' was never just a shorthand for thermal equilibrium.

This label for my interpretation emphasizes the motivation that comes from the fact that the observable quantities of nonequilibrium thermodynamics (i.e., hydromechanics and elasticity theory) appear in statistical mechanics as q-expectation values, and that this observation is generalized to arbitrary quantum systems, rather than only those with a thermodynamical interpretation.
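
For instance (a standard statistical-mechanics identity, stated here only to illustrate that motivation): the energy and particle densities of hydrodynamics arise as q-expectation values of the corresponding field operators,
$$\langle \hat T^{00}(x)\rangle = \mathrm{Tr}\big(\hat\rho\, \hat T^{00}(x)\big), \qquad \langle \hat n(x)\rangle = \mathrm{Tr}\big(\hat\rho\, \hat n(x)\big),$$
and the thermal interpretation treats such q-expectations as the basic quantities for arbitrary states, not only for (near-)equilibrium ones.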
 
  • #73
PrashantGokaraju said:
This is more than a formal similarity. The Euclidean action essentially is the entropy.
No. The Euclidean action is an unphysical tool to get S-matrix elements, and is only formally analogous to the entropy.
 
  • #74
A. Neumaier said:
No. The Euclidean action is an unphysical tool to get S-matrix elements, and is only formally analogous to the entropy.

It is more than formal. For example, the entropy of a black hole is equal to the Euclidean action.
 
  • #75
A. Neumaier said:
It will always be, because your conclusion does not follow logically from the phenomenal success of quantum physics. Thus whether or not someone accepts it is a matter of interpretation and philosophical preferences.
I don't claim it's a logical conclusion from the success of QT, but I claim that the assumption that the world "in reality" behaves deterministically, and that QT is therefore incomplete, is likewise no logical conclusion from our experience with physics.

Indeed, whether or not someone accepts it is a matter of interpretation and [individual!] philosophical preferences [prejudices?]. Like religious belief, it's something personal to each individual and thus irrelevant and unrelated to the realm of science.

I still have no clue what the correct interpretation of your "thermal interpretation" is:

What is usually called fluctuations are just q-correlations, which the thermal interpretation handles as nonlocal properties of the system, just like the diameter or volume of a classical object is a nonlocal property.
Just to call fluctuations (a probabilistic notion) "q-correlations" now, without giving a meaning to this word, is just empty phrasing.
 
  • #76
vanhees71 said:
I still have no clue what the correct interpretation of your "thermal interpretation" is: Just to call fluctuations (a probabilistic notion) "q-correlations" now, without giving a meaning to this word, is just empty phrasing.
The physical meaning is given by linear response theory, not by an interpretation in terms of deviations from an average.
 
  • #77
PrashantGokaraju said:
It is more than formal. For example, the entropy of a black hole is equal to the Euclidean action.
That's the only example you can give, and it is based on snippets of theory that don't yet form a coherent whole.

In flat space, the Euclidean action is meaningless.
 
  • #78
Since when is the Euclidean action entropy? In the path-integral formalism of imaginary-time (Matsubara formalism) equilibrium statistics (grand canonical ensemble) you calculate thermodynamic potentials related to the effective quantum action, but it's not entropy (it's rather related to the Landau potential of thermodynamics).
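
In symbols (standard imaginary-time relations, quoted here only to make the distinction explicit): the Euclidean path integral gives the grand-canonical partition function, from which one obtains the Landau (grand) potential, and the entropy only follows by a further temperature derivative,
$$Z = \mathrm{Tr}\, e^{-\beta(\hat H-\mu\hat N)} = \int \mathcal D\varphi \; e^{-S_E[\varphi]}, \qquad \Omega = -\frac{1}{\beta}\ln Z, \qquad S = -\frac{\partial \Omega}{\partial T}\bigg|_{V,\mu}.$$
So what the Euclidean action controls directly is ##\Omega## (respectively the free energy), not ##S## itself.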
 
  • #79
A. Neumaier said:
That's the only example you can give, and it is based on snippets of theory that don't yet form a coherent whole.

In flat space, the Euclidean action is meaningless.

Even in flat space, the vacuum state becomes a thermal distribution when viewed from accelerated coordinates. This is exactly analogous to the Hawking effect, and can be obtained from exactly the same Euclidean procedure. The meaning of this procedure is not clear, as you said, but it is present in flat space also.
 
  • #80
PrashantGokaraju said:
Even in flat space, the vacuum state becomes a thermal distribution when viewed from accelerated coordinates. This is exactly analogous to the Hawking effect, and can be obtained from exactly the same Euclidean procedure. The meaning of this procedure is not clear, as you said, but it is present in flat space also.
This doesn't matter.

In QFT, the entropy cannot be the Euclidean action since the latter is fixed by the theory whereas the entropy is state dependent.
 
  • #81
A. Neumaier said:
In QFT, the entropy cannot be the Euclidean action since the latter is fixed by the theory whereas the entropy is state dependent.

It is not clear what this entropy is, or what the Unruh temperature is. It doesn't have any usual interpretation. Why classical solutions of field equations should have thermodynamical attributes has been a mystery since the discovery of black hole entropy. Things like the Euclidean action and the periodicity of fields in Euclidean time are closely connected with corresponding thermodynamic quantities. This connection is inherent in the fact that these thermodynamic properties are real, even though the corresponding microscopic description is not known.
 
Last edited:
  • #82
Swamp Thing said:
That reminds me of this implementation of an external random source: www.youtube.com/watch?v=1cUUfMeOijg
where they claim to be using a video camera watching a lot of lava lamps to generate random keys.

Any thoughts about this -- is it serious technology, or is it just their PR people doing a weird flex?
Note that it's a CCD camera, not an analog video camera. Although it's serious enough for experimental purposes, it requires substantial post-camera selective processing to ensure that it passes rigorous standard tests of apparent randomness, and it's not fast enough for serious data-streaming purposes.
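
For what it's worth, that post-camera processing is conceptually just randomness extraction: the raw frames are a biased, correlated entropy source, and a cryptographic hash whitens them into key material. A minimal generic sketch of the idea (my own illustration, assuming SHA-256 as the extractor and a stand-in frame source; not a description of the actual system in the video):

```python
# Minimal sketch of whitening a noisy entropy source into random bytes.
# In a real setup, grab_frame() would return raw CCD sensor data; here it
# is a stand-in that just produces some noisy bytes so the sketch runs.
import hashlib, os

def grab_frame() -> bytes:
    return os.urandom(1024)        # stand-in for one frame of sensor noise

def random_block(frames: int = 8) -> bytes:
    """Hash several raw frames together into 32 'whitened' bytes."""
    h = hashlib.sha256()
    for _ in range(frames):
        h.update(grab_frame())     # fold in the raw, biased entropy source
    return h.digest()

print(random_block().hex())
```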
 
  • #83
@PeterDonis said here in post 24:

"My understanding of the thermal interpretation (remember I'm not its author so my understanding might not be correct) is that the two non-interfering outcomes are actually a meta-stable state of the detector (i.e., of whatever macroscopic object is going to irreversibly record the measurement result), and that random fluctuations cause this meta-stable state to decay into just one of the two outcomes."

Does this mean that, if an EPR experiment is performed, "random fluctuations" at detector A are entangled with "random fluctuations" at detector B such that the outcomes are correlated?
 
  • #84
The outcomes of measurements are correlated in case of measuring observables which are entangled due to the state the system is prepared in, as described by quantum theory in the standard interpretation. Since the measured quantities do not take determined values, they fluctuate. Fluctuations are described by the standard deviation and higher moments or cumulants of the distributions, and of course these quantities are also correlated in case of entangled states.
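
Spelled out (textbook formulas, just to fix the notation): for an observable ##\hat A## in the state ##\hat\rho##, the fluctuation referred to here is
$$(\Delta A)^2 = \langle \hat A^2\rangle - \langle \hat A\rangle^2, \qquad \langle \hat A^n\rangle = \mathrm{Tr}\big(\hat\rho\,\hat A^n\big),$$
and for entangled states it is the cross-correlators ##\langle \hat A\otimes\hat B\rangle - \langle\hat A\rangle\langle\hat B\rangle## between the two subsystems that the Bell-type correlations quantify.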

I've no clue what the meaning of the "thermal interpretation" might be. On the one hand it's claimed that the q-expectation values (which I'm not allowed to interpret as ensemble averages as in the standard interpretation, for some unknown reason) are determined by the states in QT, of course. On the other hand I have to take "fluctuations" into account. Since it's forbidden to think of the states in terms of the standard probability interpretation, I cannot say what they are in terms of the thermal interpretation.

To say it clearly, the only hitherto found consistent interpretation since 1926 is Born's rule (in the modern generalized form applying to a general pure or mixed state, described by a statistical operator).
 
  • Like
Likes timmdeeg
  • #85
vanhees71 said:
I've no clue what the meaning of the "thermal interpretation" might be. On the one hand it's claimed that the q-expectation values (which I'm not allowed to interpret as ensemble averages as in the standard interpretation, for some unknown reason) are determined by the states in QT, of course. On the other hand I have to take "fluctuations" into account. Since it's forbidden to think of the states in terms of the standard probability interpretation, I cannot say what they are in terms of the thermal interpretation.
They are (like everywhere in the statistics of measurement) just predictions of a typical value for the square of the deviation of each measurement result from the best reproducible value.
vanhees71 said:
To say it clearly, the only hitherto found consistent interpretation since 1926 is Born's rule (in the modern generalized form to apply to a general pure or mixed state, described by a statistical operator).
Born's rule is not consistent in many ways; see my critique in Section 3.3 of Part I, and more explicitly the case for the QED electron in a new thread.
 
Last edited:
  • Like
Likes dextercioby
  • #86
A. Neumaier said:
So to predict/simulate events you need quantum mechanics plus a basis that must be added externally, though in reality, things happen without having to specify a basis. Since according to you these additional choices are necessary (rather than implied by the quantum formalism), quantum mechanics alone is incomplete.

A chosen decomposition is just an expression of the properties/events we want to make predictions about. We always need some procedure to connect the content of a theory to predicted consequences.

If a decomposition was instead used as a physical explanation for why events happen, i.e. if a particular decomposition made things happen, then I would agree.
 
  • #87
Morbert said:
A chosen decomposition is just an expression of the properties/events we want to make predictions about. We always need some procedure to connect the content of a theory to predicted consequences.

If a decomposition was instead used as a physical explanation for why events happen, i.e. if a particular decomposition made things happen, then I would agree.
In a simulated miniverse, this choice is to be made by the simulated detectors (or their simulated observers) and not by us who interpret the simulation from the outside!

Hence the choice must be encoded in the wave function or whatever else goes into the simulation (such as Bohmian particles; but these are taboo in MWI).
 
  • #88
vanhees71 said:
The outcomes of measurements are correlated in case of measuring observables which are entangled due to the state the system is prepared in, as described by quantum theory in the standard interpretation. Since the measured quantities do not take determined values, they fluctuate. Fluctuations are described by the standard deviation and higher moments or cumulants of the distributions, and of course these quantities are also correlated in case of entangled states.
Ok, thanks for clarifying.
 
  • #89
A. Neumaier said:
In a simulated miniverse, this choice is to be made by the simulated detectors (or their simulated observers) and not by us who interpret the simulation from the outside!

Hence the choice must be encoded in the wave function or whatever else goes into the simulation (such as Bohmian particles; but these are taboo in MWI).

We have a miniverse that contains a detector and an observable X. If the detector measures X to some suitable standard, this implies that, if we decompose the identity operator into possible detector properties and values of X, there will be some suitably high correlation between possible detector properties and possible values of X. But the detector in the miniverse doesn't impose this decomposition on us. We are free to use other, equally valid decompositions if we like. They'll just be less helpful (or not helpful at all) for understanding the behaviour of the detector and what it is correlated with. The choice is always made by whoever is using quantum mechanics to understand a system, and they choose based on what it is they are interested in understanding.
 
  • #90
Morbert said:
The choice is always made by whoever is using quantum mechanics to understand a system, and they choose based on what it is they are interested in understanding.
In the simulated miniverse, ''whoever is using quantum mechanics'' is simulated as well, hence their choices must be determined by the simulation alone.
 
  • #91
A. Neumaier said:
In the simulated miniverse, ''whoever is using quantum mechanics'' is simulated as well, hence their choices must be determined by the simulation alone.

Well, so far we have only considered a detector, which obviously does not use quantum mechanics to understand its surroundings. But if the miniverse is suitably prepared, quantum mechanics will permit a description of it in terms of possible planetary formations and emergences of biological systems that understand the frequencies and correlations in their observations by constructing a theory they call quantum mechanics.
 
  • #92
Morbert said:
quantum mechanics will permit a description of it in terms of possible planetary formations and emergences of biological systems that understand the frequencies and correlations in their observations by constructing a theory they call quantum mechanics.
This is pure speculation. How? A simulation must be programmable in principle!
 
  • #93
A. Neumaier said:
Summary: We lack a fundamental understanding of the measurement process in quantum mechanics.

The traditional interpretations are way too vague to allow such a blueprint to be made, even in principle
I think in a traditional Copenhagen view, or extensions of it like decoherent histories, you cannot make this simulation, for the theory only gives one a guide to macroscopic impressions of the microscopic realm. Taking QM as non-representational, they would indeed agree that such a blueprint/simulation cannot be made. As I think @Morbert is getting at, a simulation of QM would encompass a simulation of the range of answers and their associated probabilities that an experimenter would obtain for a specific history of questions. However, it would not be a simulation of reality itself.
 
  • #94
A. Neumaier said:
They are (like everywhere in the statistics of measurement) just predictions of a typical value for the square of the deviation of each measurement result from the best reproducible value.

Born's rule is not consistent in many ways; see my critique in Section 3.3 of Part I, and more explicitly the case for the QED electron in a new thread.
Great! All of a sudden the "Thermal Interpretation" again has the usual statistical meaning. So, what's the difference from the minimal statistical interpretation?

If you then deny the validity of Born's rule, how can you justify the (as far as I see) equivalent statement that the expectation values of arbitrary observables, and thus also all moments of the probability distribution, and thus the probability distribution itself, are determined by the general Born rule, i.e., the usual trace formula ##\langle A \rangle = \mathrm{Tr}(\hat{A} \hat{\rho})##?
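
For definiteness, the "general Born rule" meant here is the trace formula applied to the spectral projectors: for an observable ##\hat A=\sum_a a\,\hat P_a##,
$$p(a) = \mathrm{Tr}\big(\hat\rho\,\hat P_a\big), \qquad \langle A\rangle = \sum_a a\,p(a) = \mathrm{Tr}\big(\hat\rho\,\hat A\big),$$
so fixing the expectation values ##\mathrm{Tr}(\hat\rho\,\hat A)## for arbitrary observables fixes the whole probability distribution.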

Concerning your criticism in your paper: Can you prove your claim about the ion trap, i.e., can you show, using standard relativistic QFT, that there's faster-than-light propagation? Of course, in non-relativistic QT, nothing prevents this, but why should it, since non-relativistic QT is not supposed to obey relativistic causality to begin with?
 
  • #95
A. Neumaier said:
This is pure speculation. How? A simulation must be programmable in principle!

I don't see how this is different from our previous talk about the detector in the miniverse. My understanding so far:

We both agree that a fully quantum theory of the miniverse will consist of an appropriate density operator and dynamics. I say this theory lets us run a simulation that returns probabilities for alternative possible histories of both the detector (or detector + scientist in the miniverse, if you like) and the variable it is detecting, i.e., a fully quantum treatment of both the detector and the detected. You say this implies our theory is incomplete, since our simulation expects not only a density operator + dynamics, but also a set of alternatives for which probabilities are returned.

Before I respond: Have I correctly described the issue, or is there some other issue?
 
  • #96
A. Neumaier said:
In the simulated miniverse, ''whoever is using quantum mechanics'' is simulated as well, hence their choices must be determined by the simulation alone.

How do we evaluate whether this requirement is met? Suppose the writer of a simulation says it simulates a user of QM by a certain collection of processes. Can we disprove this? In fact, if the writer of the simulation says the simulation doesn't simulate a user of QM, can we be sure he is correct? Another person might pick a collection of processes within the simulation and claim it can represent a user of QM.

Is the representation or non-representation of a user of QM a matter of the intent and interpretation of the writer of the simulation or his critics?
 
  • #97
I am somewhat relating to @georgir's post #67 on this issue, as well as @julcab12's post #64.

It is not just the time precision of the total number of decays but also the "Schrödinger factor" that makes me doubt such absolutely fundamental randomness. From Schrödinger's cat we know that the decay of most of the atoms is also not linear with respect to time but can happen randomly (not taking into account the disturbance potentially caused by an observer's interference). This means most of the atoms that have to decay in a single half-life can decay at the beginning of the half-life, yet the few left-over ones will "sit and wait" patiently, as if they were told by someone of authority to do so. This seems incompatible with the general understanding of probability, because at least to me the randomness of probability over many such atoms in a system seems at odds with the randomness of spontaneous decay of a single atom, or in fact of many single atoms, at any given time within this system.

Many such peculiar details make me personally lean more towards the idea of built-in determinism, but by mechanisms which we still don't understand or maybe have no way of understanding or even getting at; maybe that is also a fundamental property of nature. Surely, without being able to prove or disprove this I should not speculate, but then again we could argue that saying that nature is fundamentally random can also be speculation, just a popular one.
 
  • #98
FactChecker said:
This criterion for "random" has problems. Ruling out repeatable series means that any "random" series that is recorded for future replays could not be considered random. That would include running the random generator first, recording the numbers, and then using them. Being able to repeat the series should not rule it out from being "random". A definition of "random" that has advantages is that it is not possible to compress the series at all. With that definition, most computer pseudo-random number generators are not random, regardless of how they are seeded.

You have to talk about the source (or process), not a given sequence. Note that how compressible a given sequence is has nothing to do with how random its source is, after all there are many realizations of the fair coin that are quite ordered, e.g. 11111111111111. Shannon entropy is a measure of randomness, but it is defined for discrete random variables, not strings. With a long enough string, you can get a crude estimation. For strings you can talk about Kolmogorov complexity, but it's not computable, and its interpretation is still open.
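
To illustrate the "crude estimation" point, here is a rough sketch that estimates the per-symbol Shannon entropy from the empirical symbol frequencies. It only captures single-symbol statistics, not higher-order structure, which is exactly why a string of all 1s scores zero while still being a possible output of a fair coin:

```python
# Crude per-symbol Shannon entropy estimate from empirical frequencies.
# Captures only single-symbol statistics, not correlations between symbols.
from collections import Counter
from math import log2

def empirical_entropy(s: str) -> float:
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * log2(c / n) for c in counts.values())

print(empirical_entropy("11111111111111"))        # 0.0 bits/symbol
print(empirical_entropy("1011001110001011"))      # close to 1 bit/symbol
```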
 
Last edited:
  • #99
Jarvis323 said:
You have to talk about the source (or process), not a given sequence.
Why?
Note that how compressible a given sequence is has nothing to do with how random its source is, after all there are many realizations of the fair coin that are quite ordered, e.g. 11111111111111.
I would not use a sequence of ones as a random sequence no matter what the source was. I would say that you are giving an example here that contradicts your first statement.
 
  • #100
FactChecker said:
Why?
I would not use a sequence of ones as a random sequence no matter what the source was. I would say that you are giving an example here that contradicts your first statement.

Then how are you choosing your random sequences? Are you pulling out strings that look too ordered? If so, then the sequence that shows up after you've selected it will not be random; some of its properties will be predictable. You've chosen it for particular reasons.
 
  • Like
Likes FactChecker
  • #101
I agree. I see your point that a long string of ones allows compression. So that violates the criterion that I had mentioned. I thought you were talking about a sequence of all ones, which I would not use no matter how they were created.
 
  • #102
Too "choose" random sequences (e.g., for doing Monte-Carlo simulations with a computer) is a quite difficult task, and it must be done with great care and precision! FAPP there are "deterministic" sequences which look pretty much like random numbers (usually in very good approximation uniformly distributed over an interval of real numbers), but they are not really random numbers of course.

According to the physical laws we know today, quantum theory can provide "true random numbers", i.e., numbers that are really undetermined. The simplest case to produce a (discrete) sequence of random numbers is to prepare a polarization-entangled photon pair in a Bell state (like the singlet state), which nowadays is easy using parametric downconversion. You can use one of the photons as a trigger, heralding the presence of the other photon. Then you are sure to get a truly random outcome when determining the other photon's polarization (encoding, say, horizontal polarization as 1 and vertical as 0, you get a "truly random" sequence of 0s and 1s).
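
As a toy numerical counterpart (my own sketch, just simulating the Born-rule statistics of that scheme on a classical computer, so of course the result is only pseudorandom; a real device gets the bits from actual detection events):

```python
# Toy simulation of the heralded-photon scheme described above:
# each heralded photon is measured in the H/V basis; for the singlet-state
# marginal the outcome is H or V with probability 1/2, encoded as 1 or 0.
import numpy as np

rng = np.random.default_rng()                  # stand-in for the detector

def heralded_bits(n: int) -> np.ndarray:
    """Return n bits, one per heralded photon (1 = horizontal, 0 = vertical)."""
    return (rng.random(n) < 0.5).astype(int)   # Born-rule probabilities 1/2, 1/2

print(heralded_bits(20))
```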
 
  • #103
vanhees71 said:
According to the physical laws we know today, quantum theory can provide "true random numbers", i.e., numbers that are really undetermined. The simplest case to produce a (discrete) sequence of random numbers is to prepare a polarization-entangled photon pair in a Bell state (like the singlet state), which nowadays is easy using parametric downconversion. You can use one of the photons as a trigger, heralding the presence of the other photon. Then you are sure to get a truly random outcome when determining the other photon's polarization (encoding, say, horizontal polarization as 1 and vertical as 0, you get a "truly random" sequence of 0s and 1s).

Which ones? Many laws of physics are known to be non-truths (e.g. Newton's laws). And what does it mean to say a theory can provide "true random numbers"? It is the physical thing itself that does that. You can argue that the theory (or the randomness in the theory) cannot be replaced, but that in itself doesn't tell us about the physical thing itself. Unpredictability, or uncertainty, and randomness are not the same thing. Heck, even simple, stationary, bounded deterministic dynamical systems (e.g. the logistic map) are mathematically impossible to predict. You would need at least to (1) store infinite amounts of information, (2) process infinite amounts of information, and (3) compute iterative solutions at infinite temporal frequency, all that just to get a time-invariant bound on accuracy at all, or even a reasonable bound on accuracy over a given fixed length of time. We could go deeper, analyzing what we know about the thermodynamic costs of information storage, processing, and so forth.
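
The logistic-map point is easy to see numerically: two initial conditions differing by ##10^{-12}## become completely decorrelated after a few dozen iterations, so any finite-precision knowledge of the state is eventually useless (a standard illustration, nothing specific to the argument above):

```python
# Sensitive dependence in the logistic map x -> r*x*(1-x) at r = 4:
# two initial conditions differing by 1e-12 diverge to O(1) separation.
def logistic_orbit(x0: float, r: float = 4.0, steps: int = 60) -> list[float]:
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_orbit(0.4)
b = logistic_orbit(0.4 + 1e-12)
for t in (0, 20, 40, 60):
    print(t, abs(a[t] - b[t]))     # the gap grows roughly like 2**t until O(1)
```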

If that isn't enough, even simple discrete computational problems cannot be solved, like finding a method to determine the outcome of the Game of Life for arbitrary initial conditions. Our inability to make predictions (notwithstanding even bigger surprises than wrong laws of physics) is logically deducible from even our best non-physical axioms.
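
For the Game of Life example: the only general way to learn the outcome is to run the update rule step by step; there is provably no shortcut procedure that decides the long-term fate of an arbitrary initial pattern. A minimal step function (standard rules, my own sketch on a small wrap-around grid) looks like this:

```python
# One synchronous update of Conway's Game of Life on a small toroidal grid.
# There is no general shortcut: long-term behaviour must be simulated.
import numpy as np

def life_step(grid: np.ndarray) -> np.ndarray:
    # count the 8 neighbours of every cell (wrapping around the edges)
    neighbours = sum(
        np.roll(np.roll(grid, dx, axis=0), dy, axis=1)
        for dx in (-1, 0, 1) for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # a cell is alive next step if it has 3 neighbours,
    # or if it is alive now and has 2 neighbours
    return ((neighbours == 3) | ((grid == 1) & (neighbours == 2))).astype(int)

glider = np.zeros((8, 8), dtype=int)
glider[1, 2] = glider[2, 3] = glider[3, 1] = glider[3, 2] = glider[3, 3] = 1
print(life_step(glider))
```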

That said, what goes on at the quantum level, or beyond, is a mystery, and we cannot rule out weirdness that changes the limitations we assume based on our limited models of reality.

More interesting facts that are relevant to the discussion might be that (under classical models of computation) it is possible for a system to simulate itself, but not without an increase in time complexity. That is, supposing we could somehow get around all of the other obstacles and simulate our reality, we could not use it to predict the future, because the predictions will always come later and the system is open (we need the whole state to do the prediction). We could do a hypothetical simulation far in advance (assuming we have all of this power), but then it would still be impossible to know when/if a general hypothetical situation will ever arise. And all of this is true even if we are talking about systems with finite but unbounded states. Again, I'm not sure to what extent quantum weirdness could change these things. But it is also an interesting fact overlooked in the simulation hypothesis (which I won't go into because it's off subject).

My main thought is that we cannot in general make assumption-proof claims about reality (and what is really going on with photons and electrons and so forth). It's just a mystery. Perhaps it will be possible to whittle away at what we see as randomness, through new models and assumptions, indirect measurements, logical deduction, and so forth, but (notwithstanding big surprises) we will never be able to reach the bottom with any predictive model, and this is an issue that is independent of whether or not our perceived randomness is actually deterministic. Even supposing we could measure outcomes perfectly, and something seemed perfectly random, it would still be infeasible to tell the difference between randomness and deterministic chaos. At some point, further intellectual inquiry into the matter isn't physics anymore.
 
Last edited:
  • Like
Likes vortextor and vanhees71
  • #104
Of course, you are right in saying that nature provides the "random numbers", not the theory. What I meant to say is that, according to today's knowledge as formulated in QT, my example provides "true random numbers", i.e., the polarization states are really undetermined and do not take definite values which we merely don't know for lack of information about "hidden variables".

You are also right that FAPP "deterministic chaos" provides "random numbers", but they are not "true random numbers" in the sense that in classical physics they are in fact determined, though lacking the precise initial conditions we cannot predict them.

I'd not say "Newton's laws" are "non-truths". We only know today that they have a limited realm of validity. They are still very good descriptions of phenomena, where they are applicable. 50 years ago NASA brought men to the moon, successfully using it!

That said, it's of course also true that it may well be that QT is not the final word on the description of nature. Maybe one day some empirical fact will tell us that it has also its limited realm of validity, and maybe we find a more comprehensive theory revealing the successful QT as some of its approximately valid cases under certain special circumstances, such as Newtonian mechanics applies in the limit of small speeds and accelerations (as an approximation of relativity) and macroscopic objects (as coarse grained descriptions of relevant collective observables as averages over many microscopic degrees of freedom as an approximation of QT).
 
  • Like
Likes Jarvis323
  • #105
An interesting example:
One man's random is not another man's random. In a recent test of entanglement at a distance, the light from two distant galaxies was used. They were not just any distant galaxies -- they had to be two galaxies on opposite sides of the visible universe!
 
