Why randomness means incomplete understanding

  • Thread starter: A. Neumaier
  • Tags: Means, Randomness
  • #61
vanhees71 said:
Well, then can you explain to me why QT is considered the most successful physical theory ever?
Because it actually determines the statistics with phenomenal success. This is quite a feat!
vanhees71 said:
What is undetermined in your opinion?
Each single outcome, and all details of the fluctuations. Thus most of the stuff that is observed.
But only in the traditional interpretations.

In my opinion, the true, complete quantum physics is the quantum formalism plus the thermal interpretation. It accounts for each single outcome, and for all details of the fluctuations.
 
  • #62
vanhees71 said:
I think there's no reason to think that nature may not be random at the most fundamental level of describability.
This is a completely unverifiable statement of your faith in the traditional quantum philosophy.
 
  • #63
vanhees71 said:
Nature is not deterministic on the fundamental level...

To my mind, there is need for more profound thinking. The onsets of individual clicks in a counter seem to be totally lawless events, coming by themselves and thus being uncaused. Or can one denote a cause which compels these individual effects?
 
  • #64
Randomness is an artifact of measurables. That's not to say it doesn't exist, for lack of a better word: it is effective in its domain (dynamics/relations, average outcomes), like how flat space is treated in geometry and GR. The TI accounts for both. I lack confidence in the unmalleability/universality of time in QM, which accounts for every predicted value and observable in the SR/GR domain, even the weird ones. I can only suspect that randomness/indeterminism is not the underlying factor but a mere artifact of probability, in the same manner that flat space is not observable.
 
  • #65
The statistical character is not something we can get away from. This statistical behavior is described by the partition function

$$\int d\phi\; e^{-\frac{1}{2}\phi^T D\phi} = (2\pi)^{n/2}\,(\det D)^{-1/2} = (2\pi)^{n/2}\exp\left(-\tfrac{1}{2}\,\mathrm{Tr}\ln D\right),$$

where ##F = \frac{1}{2}\,\mathrm{Tr}\ln D## is the free energy (up to an additive constant). The first integral is the path integral

$$\int e^{-S} = e^{-F},$$

where ##F## is the free energy and the action is ##S = \frac{1}{2}\phi^T D\phi##. The formal similarity to thermodynamics is something that tells us that these expressions can only be interpreted statistically. The free energy is defined as a Legendre transform in terms of the conjugate variables ##J## and ##\phi##. Expressions such as ##S = -\mathrm{Tr}\,\rho\ln\rho## cannot be interpreted in terms of usual microstates because they are defined through Wick rotation.
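A quick numerical check of the determinant identity above, ##(\det D)^{-1/2} = \exp(-\frac{1}{2}\mathrm{Tr}\ln D)## (a minimal numpy sketch; the positive-definite matrix ##D## is just a random example):

```python
import numpy as np

# Check det(D)^(-1/2) = exp(-1/2 Tr ln D) for a symmetric positive-definite D.
rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
D = A @ A.T + n * np.eye(n)      # random symmetric positive-definite example

lhs = np.linalg.det(D) ** (-0.5)

# Tr ln D computed from the eigenvalues of D (D is symmetric)
eigvals = np.linalg.eigvalsh(D)
rhs = np.exp(-0.5 * np.sum(np.log(eigvals)))

print(lhs, rhs)                  # agree to floating-point precision
```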
 
  • #66
PrashantGokaraju said:
The formal similarity to thermodynamics is something that tells us that these expressions can only be interpreted statistically.
No. There are many formal similarities in mathematics and physics that cannot be ascribed to a similar interpretation.

The formal similarity only tells us that there is a possibility of interpreting it statistically.
 
  • #67
At least intuitively, any model involving randomness can be replaced with a deterministic model with extra variables controlling that randomness. Not knowing those variables can be called "incomplete understanding", though I'd be content, and would even consider our understanding complete, with models where certain variables are not even theoretically knowable, as long as their effect is well defined.

The problem with QM is that even such hidden-variable models are proven not to work. For me, my "incomplete understanding" stems not from the randomness itself but from the way the random outcomes under different parameters are related. But I guess this is a whole other topic.
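A toy sketch of the first point (purely illustrative, nothing quantum about it): an apparently random coin flip reproduced by a deterministic rule plus a hidden variable, here called `lam` (a hypothetical name):

```python
import numpy as np

def coin_deterministic(lam):
    """Fully determined once the hidden variable lam is known."""
    return int(lam < 0.5)

# If lam is uniformly distributed but hidden from the observer, the outcomes
# are statistically indistinguishable from fair random coin flips.
rng = np.random.default_rng(7)
lams = rng.uniform(size=100_000)
freq = np.mean([coin_deterministic(l) for l in lams])
print(freq)   # ~0.5, as for a genuinely random coin
```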
 
  • #68
A. Neumaier said:
Because it actually determines the statistics with phenomenal success. This is quite a feat!

Each single outcome, and all details of the fluctuations. Thus most of the stuff that is observed.
But only in the traditional interpretations.

In my opinion, the true, complete quantum physics is the quantum formalism plus the thermal interpretation. It accounts for each single outcome, and for all details of the fluctuations.
Ok, so we agree on the basic facts concerning QT as a very successful physical theory.

Now, it is obviously difficult, even after all these decades, to simply accept the simple conclusion that nature behaves inherently randomly. If this is true, as strongly suggested by QT and the many successful experimental tests it has survived up to this day, then there's no way to predict a single measurement's outcome with certainty (except, of course, when the system is prepared in a state in which the measured observable takes a determined value), because the observable doesn't take a determined value. Then the complete description is indeed probabilistic, and to test these probabilities you need an ensemble. Fluctuations also refer to an ensemble. So if you accept the probabilistic description as complete, there's nothing lacking in QT simply because the outcome of an individual measurement is inherently random.

I still don't understand the thermal interpretation: Recently you claimed that within the thermal interpretation the observables are what in the usual interpretation of QT is called the expectation value of the observable given the state, i.e., ##\langle O \rangle = \mathrm{Tr}(\hat{\rho} \hat{O})##. This is a single value, i.e., it's determined, given the state. Now you claim there are fluctuations. How do you define them? In the usual interpretations, where the state is interpreted probabilistically, it's clear: the fluctuations are determined by the moments or cumulants of the probability distribution or, equivalently, all expectation values of powers of ##O##, i.e., ##O_n =\mathrm{Tr} (\hat{\rho} \hat{O}^n)##, ##n \in \mathbb{N}##. But then you have the usual probabilistic interpretation back again (no matter which flavor of additional "ontology" and "metaphysics" you prefer). Just renaming a probabilistic theory, avoiding the words statistics, randomness and probability, does not change the mathematical content, as you well know!
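For concreteness, a minimal numpy sketch of these moments for a qubit (the state and observable are arbitrary illustrative choices):

```python
import numpy as np

# Moments O_n = Tr(rho O^n) for the observable sigma_z in a mixed qubit state.
sz = np.diag([1.0, -1.0])          # observable O
rho = np.diag([0.7, 0.3])          # density matrix, Tr rho = 1

mean = np.trace(rho @ sz)          # first moment <O>    = 0.4
second = np.trace(rho @ sz @ sz)   # second moment <O^2> = 1.0
variance = second - mean ** 2      # fluctuation (second cumulant) = 0.84
print(mean, second, variance)
```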

So why then call it "thermal"? (The name is misleading anyway, because it seems to be your intention to provide a generally valid reinterpretation and not just one for thermal equilibrium.)
 
  • #69
A. Neumaier said:
No. There are many formal similarities in mathematics and physics that cannot be ascribed to a similar interpretation.

The formal similarity only tells us that there is a possibility of interpreting it statistically.

This is more than a formal similarity. The Euclidean action essentially is the entropy.
 
  • #70
Lord Jestocost said:
To my mind, there is need for more profound thinking. The onsets of individual clicks in a counter seem to be totally lawless events, coming by themselves and thus being uncaused. Or can one denote a cause which compels these individual effects?
That's precisely my point (take the example of radioactive decay and a Geiger counter): The individual clicks ARE random according to QT. In the absence of any deterministic explanation (in view of all these accurate Bell tests confirming QT), my conclusion simply is that nature is inherently random, i.e., when the individual atom decays, and thus when the Geiger counter clicks, is random.

The mathematical model to describe random events is probability theory, and QT is another theory providing the probability measures to be used to describe measurement outcomes in experiments (which are necessarily random experiments due to the inherent randomness of nature), and, as it turns out, it is everything but "lawless". On the contrary, QT provides the best estimates of probabilities for a vast number of observations (in fact, all observations so far!) ever. I don't know a single other application of probability theory/applied statistics which gives predictions for probabilities/statistics as accurate as QT does! Thus we have a precise probabilistic description of the "inherent randomness of nature". No more, no less.
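A toy simulation of this point (the half-life value is arbitrary): QT fixes the decay-time distribution sharply, while each individual click time remains unpredictable:

```python
import numpy as np

rng = np.random.default_rng(1)
half_life = 10.0                             # arbitrary units
tau = half_life / np.log(2)                  # mean lifetime

clicks = rng.exponential(tau, size=100_000)  # individual decay times
print(clicks[:5])                            # single outcomes: random
print(np.mean(clicks))                       # statistics: ~tau, as predicted
```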
 
  • #71
georgir said:
At least intuitively, any model involving randomness can be replaced with a deterministic model with extra variables controlling that randomness. Not knowing those variables can be called "incomplete understanding", though I'd be content, and would even consider our understanding complete, with models where certain variables are not even theoretically knowable, as long as their effect is well defined.

The problem with QM is that even such hidden-variable models are proven not to work. For me, my "incomplete understanding" stems not from the randomness itself but from the way the random outcomes under different parameters are related. But I guess this is a whole other topic.
To be honest, not all hidden-variable models are proven not to work. E.g., the Bohmian interpretation of non-relativistic QM is a deterministic non-local interpretation. There are no hidden variables, though. The well-known ones are sufficient ;-)).

Only local deterministic hidden-variable theories, as defined by Bell, are ruled out by the many Bell tests done up to now. All demonstrate the violation of Bell's inequality with astonishing significance and confirm the predictions of QT. The problem with non-local deterministic HV theories is to formulate them in accordance with (special) relativity. That's the reason why Bohmian QT is not (yet?) satisfactorily formulated for relativistic QFT. It's of course not clear whether there's some non-local deterministic HV theory consistent with relativity. At least there seems to be no proof of such a no-go theorem. On the other hand, up to now nobody has found any such non-local theory.
 
  • #72
vanhees71 said:
it is obviously difficult, even after all these decades, to simply accept the simple conclusion that nature behaves inherently randomly.
It will always be, because your conclusion does not follow logically from the phenomenal success of quantum physics. Thus whether or not someone accepts it is a matter of interpretation and philosophical preferences.

In particular, I do not think it is true because the thermal interpretation explains the randomness in quantum objects in precisely the same way as Laplace explained the randomness in classical objects.
vanhees71 said:
Now you claim there are fluctuations. How do you define them?
What is usually called fluctuations are just q-correlations, which the thermal interpretation handles as nonlocal properties of the system, just like the diameter or volume of a classical object is a nonlocal property.
vanhees71 said:
Just renaming a probabilistic theory, avoiding the words statistics, randomness and probability, does not change the mathematical content, as you well know!
Just having something that follows the axioms of probability theory does not make it a true probability in the sense of experiment, any more than belonging to a vector space of functions makes a function a vector pointing somewhere.
vanhees71 said:
So why then call it "thermal"? (The name is misleading anyway, because it seems to be your intention to provide a generally valid reinterpretation and not just one for thermal equilibrium.)
The word 'thermal' was never just a shorthand for thermal equilibrium.

This label for my interpretation emphasizes the motivation that comes from the fact that the observable quantities of nonequilibrium thermodynamics (i.e., hydromechanics and elasticity theory) appear in statistical mechanics as q-expectation values, and that this observation is generalized to arbitrary quantum systems, rather than only those with a thermodynamical interpretation.
 
  • #73
PrashantGokaraju said:
This is more than a formal similarity. The Euclidean action essentially is the entropy.
No. The Euclidean action is an unphysical tool to get S-matrix elements, and is only formally analogous to the entropy.
 
  • #74
A. Neumaier said:
No. The Euclidean action is an unphysical tool to get S-matrix elements, and is only formally analogous to the entropy.

It is more than formal. For example, the entropy of a black hole is equal to the Euclidean action.
 
  • #75
A. Neumaier said:
It will always be, because your conclusion does not follow logically from the phenomenal success of quantum physics. Thus whether or not someone accepts it is a matter of interpretation and philosophical preferences.
I don't claim it's a logical conclusion from the success of QT, but I claim that the assumption that the world "in reality" behaves deterministically, and that QT is thus incomplete, is no logical conclusion from our experience with physics either.

Indeed, whether or not someone accepts it is a matter of interpretation and [individual!] philosophical preferences [prejudices?]. Like religious belief, it's something personal to each individual and thus irrelevant and unrelated to the realm of science.

I still have no clue what the correct interpretation of your "thermal interpretation" is:

A. Neumaier said:
What is usually called fluctuations are just q-correlations, which the thermal interpretation handles as nonlocal properties of the system, just like the diameter or volume of a classical object is a nonlocal property.
Just to call fluctuations (a probabilistic notion) "q-correlations", without giving a meaning to this word, is just empty phrasing.
 
  • #76
vanhees71 said:
I still have no clue what the correct interpretation of your "thermal interpretation" is: Just to call fluctuations (a probabilistic notion) "q-correlations", without giving a meaning to this word, is just empty phrasing.
The physical meaning is given by linear response theory, not by an interpretation in terms of deviations from an average.
 
  • #77
PrashantGokaraju said:
It is more than formal. For example, the entropy of a black hole is equal to the Euclidean action.
That's the only example you can give, and it is based on snippets of theory that don't yet form a coherent whole.

In flat space, the Euclidean action is meaningless.
 
  • #78
Since when is the Euclidean action entropy? In the path-integral formalism of imaginary-time (Matsubara) equilibrium statistics (grand canonical ensemble) you calculate thermodynamic potentials related to the effective quantum action, but not the entropy (it's rather related to the Landau potential of thermodynamics).
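For reference, the standard grand canonical relations: the imaginary-time path integral yields the partition function ##Z##, from which the Landau potential ##\Omega## follows; the entropy is then a derivative of ##\Omega##, not the action itself:

$$Z = \mathrm{Tr}\, e^{-\beta(\hat{H}-\mu\hat{N})}, \qquad \Omega = -\frac{1}{\beta}\ln Z, \qquad S = -\left.\frac{\partial \Omega}{\partial T}\right|_{V,\mu}.$$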
 
  • #79
A. Neumaier said:
That's the only example you can give, and it is based on snippets of theory that don't yet form a coherent whole.

In flat space, the Euclidean action is meaningless.

Even in flat space, the vacuum state becomes a thermal distribution when viewed from accelerated coordinates. This is exactly analogous to the Hawking effect, and can be obtained from exactly the same Euclidean procedure. The meaning of this procedure is not clear, as you said, but it is present in flat space also.
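For reference, the Unruh temperature seen by an observer with proper acceleration ##a## is

$$k_B T_U = \frac{\hbar a}{2\pi c},$$

and the corresponding Euclidean procedure identifies the inverse temperature with the periodicity of the fields in imaginary time.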
 
  • #80
PrashantGokaraju said:
Even in flat space, the vacuum state becomes a thermal distribution when viewed from accelerated coordinates. This is exactly analogous to the Hawking effect, and can be obtained from exactly the same Euclidean procedure. The meaning of this procedure is not clear, as you said, but it is present in flat space also.
This doesn't matter.

In QFT, the entropy cannot be the Euclidean action since the latter is fixed by the theory whereas the entropy is state dependent.
 
  • #81
A. Neumaier said:
In QFT, the entropy cannot be the Euclidean action since the latter is fixed by the theory whereas the entropy is state dependent.

It is not clear what this entropy is, or what the Unruh temperature is. It doesn't have any usual interpretation. Why classical solutions of field equations should have thermodynamic attributes has been a mystery since the discovery of black hole entropy. Things like the Euclidean action and the periodicity of fields in Euclidean time are closely connected with the corresponding thermodynamic quantities. This connection is inherent in the fact that these thermodynamic properties are real, even though the corresponding microscopic description is not known.
 
  • #82
Swamp Thing said:
That reminds me of this implementation of an external random source: www.youtube.com/watch?v=1cUUfMeOijg
where they claim to be using a video camera watching a lot of lava lamps to generate random keys.

Any thoughts about this -- is it serious technology, or is it just their PR people doing a weird flex?
Note that it's a CCD camera, not an analog video camera; and although it's serious enough for experimental purposes, it requires substantial post-camera selective processing to ensure that it passes rigorous standard tests of apparent randomness, and it's not fast enough for serious data-streaming purposes.
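One classical example of the kind of post-processing involved is von Neumann debiasing (a minimal sketch; this is not a description of the actual pipeline used with the lava lamps):

```python
# Map each bit pair 01 -> 0 and 10 -> 1; discard 00 and 11. This removes
# bias provided the raw bits are independent; real pipelines do much more.
def von_neumann_extract(bits):
    out = []
    for a, b in zip(bits[0::2], bits[1::2]):
        if a != b:            # keep only the unequal pairs
            out.append(a)     # (0,1) -> 0, (1,0) -> 1
    return out

raw = [1, 1, 0, 1, 1, 0, 0, 0, 0, 1]   # hypothetical biased camera bits
print(von_neumann_extract(raw))        # [0, 1, 0]
```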
 
  • #83
@PeterDonis said here in post 24:

"My understanding of the thermal interpretation (remember I'm not its author so my understanding might not be correct) is that the two non-interfering outcomes are actually a meta-stable state of the detector (i.e., of whatever macroscopic object is going to irreversibly record the measurement result), and that random fluctuations cause this meta-stable state to decay into just one of the two outcomes."

Does this mean that if an EPR experiment is performed "random fluctuations" at detector A are entangled with "random fluctuations" at detector B such that the outcomes are correlated?
 
  • #84
The outcomes of measurements are correlated when measuring observables that are entangled due to the state the system is prepared in, as described by quantum theory in the standard interpretation. Since the measured quantities do not take determined values, they fluctuate. Fluctuations are described by the standard deviation and higher moments or cumulants of the distributions, and of course these quantities are also correlated in the case of entangled states.
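For concreteness, a minimal numpy sketch of such correlations for the singlet state, for which QT predicts ##E(a,b) = -a\cdot b## (the measurement axes are arbitrary examples):

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def n_dot_sigma(n):
    """Spin observable along the unit vector n."""
    return n[0] * sx + n[1] * sy + n[2] * sz

# Singlet state (|01> - |10>)/sqrt(2) in the basis |00>, |01>, |10>, |11>
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

a = np.array([0.0, 0.0, 1.0])                   # axis at detector A
b = np.array([np.sin(0.4), 0.0, np.cos(0.4)])   # axis at detector B

E = (psi.conj() @ np.kron(n_dot_sigma(a), n_dot_sigma(b)) @ psi).real
print(E, -a @ b)                                # both equal -cos(0.4)
```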

I've no clue what the meaning of the "thermal interpretation" might be. On the one hand, it's claimed that the observables are the q-expectation values (which, for some unknown reason, I'm not allowed to interpret as ensemble averages as in the standard interpretation), which are of course determined by the states in QT. On the other hand, I have to take "fluctuations" into account. Since it's forbidden to think of the states in terms of the standard probability interpretation, I cannot say what they are in terms of the thermal interpretation.

To say it clearly, the only consistent interpretation found since 1926 is Born's rule (in the modern generalized form, applying to a general pure or mixed state described by a statistical operator).
 
  • #85
vanhees71 said:
I've no clue what the meaning of the "thermal interpretation" might be. On the one hand, it's claimed that the observables are the q-expectation values (which, for some unknown reason, I'm not allowed to interpret as ensemble averages as in the standard interpretation), which are of course determined by the states in QT. On the other hand, I have to take "fluctuations" into account. Since it's forbidden to think of the states in terms of the standard probability interpretation, I cannot say what they are in terms of the thermal interpretation.
They are (like everywhere in the statistics of measurement) just predictions of a typical value for the square of the deviation of each measurement result from the best reproducible value.
vanhees71 said:
To say it clearly, the only consistent interpretation found since 1926 is Born's rule (in the modern generalized form, applying to a general pure or mixed state described by a statistical operator).
Born's rule is not consistent in many ways; see my critique in Section 3.3 of Part I, and more explicitly the case for the QED electron in a new thread.
 
  • #86
A. Neumaier said:
So to predict/simulate events you need quantum mechanics plus a basis that must be added externally, though in reality, things happen without having to specify a basis. Since according to you these additional choices are necessary (rather than implied by the quantum formalism), quantum mechanics alone is incomplete.

A chosen decomposition is just an expression of the properties/events we want to make predictions about. We always need some procedure to connect the content of a theory to predicted consequences.

If a decomposition was instead used as a physical explanation for why events happen, i.e. if a particular decomposition made things happen, then I would agree.
 
  • #87
Morbert said:
A chosen decomposition is just an expression of the properties/events we want to make predictions about. We always need some procedure to connect the content of a theory to predicted consequences.

If a decomposition was instead used as a physical explanation for why events happen, i.e. if a particular decomposition made things happen, then I would agree.
In a simulated miniverse, this choice is to be made by the simulated detectors (or their simulated observers) and not by us who interpret the simulation from the outside!

Hence the choice must be encoded in the wave function or whatever else goes into the simulation (such as Bohmian particles; but these are taboo in MWI).
 
  • #88
vanhees71 said:
The outcomes of measurements are correlated in case of measuring observables which are entangled due to the state the system is prepared in, as described by quantum theory in the standard interpretation. Since the measured quantities do not take determined values, they fluctuate. Fluctuations are described by the standard deviation and higher moments or cumulants of the distributions, and of course these quantities are also correlated in case of entangled states.
Ok, thanks for clarifying.
 
  • #89
A. Neumaier said:
In a simulated miniverse, this choice is to be made by the simulated detectors (or their simulated observers) and not by us who interpret the simulation from the outside!

Hence the choice must be encoded in the wave function or whatever else goes into the simulation (such as Bohmian particles; but these are taboo in MWI).

We have a miniverse that contains a detector and an observable X. If the detector measures X to some suitable standard, this implies that, if we decompose the identity operator into possible detector properties and values of X, there will be some suitably high correlation between possible detector properties and possible values of X. But the detector in the miniverse doesn't impose this decomposition on us. We are free to use other, equally valid decompositions if we like. They'll just be less helpful (or not helpful at all) for understanding the behaviour of the detector and what it is correlated with. The choice is always made by whoever is using quantum mechanics to understand a system, and they choose based on what it is they are interested in understanding.
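A minimal sketch of "equally valid decompositions" for a qubit (the ##\sigma_z## and ##\sigma_x## projector families are just the simplest example):

```python
import numpy as np

# Two decompositions of the qubit identity: projectors onto the sigma_z
# eigenstates and onto the sigma_x eigenstates.
P_z = [np.array([[1.0, 0.0], [0.0, 0.0]]),     # |0><0|
       np.array([[0.0, 0.0], [0.0, 1.0]])]     # |1><1|
P_x = [np.array([[0.5, 0.5], [0.5, 0.5]]),     # |+><+|
       np.array([[0.5, -0.5], [-0.5, 0.5]])]   # |-><-|

# Both families sum to the identity; which one is useful depends on which
# detector correlation one wants to describe.
print(P_z[0] + P_z[1])
print(P_x[0] + P_x[1])
```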
 
  • #90
Morbert said:
The choice is always made by whoever is using quantum mechanics to understand a system, and they choose based on what it is they are interested in understanding.
In the simulated miniverse, ''whoever is using quantum mechanics'' is simulated as well, hence their choices must be determined by the simulation alone.
 
