Why randomness means incomplete understanding


A. Neumaier

I still have no clue what the correct interpretation of your "thermal interpretation" is:


To just call fluctuations (a probabilistic notion) "q-correlations" without giving a meaning to this word is just empty phrasing.
The physical meaning is given by linear response theory, not by an interpretation in terms of deviations from an average.
 

A. Neumaier

It is more than formal. For example, the entropy of a black hole is equal to the Euclidean action.
That's the only example you can give, and it is based on snippets of theory that don't yet form a coherent whole.

In flat space, the Euclidean action is meaningless.
 

vanhees71

Since when is the Euclidean action entropy? In the imaginary-time (Matsubara) path-integral formalism of equilibrium statistics (grand canonical ensemble), you calculate thermodynamic potentials related to the effective quantum action, but that is not the entropy (it is rather related to the Landau potential of thermodynamics).
 
That's the only example you can give, and it is based on snippets of theory that don't yet form a coherent whole.

In flat space, the Euclidean action is meaningless.
Even in flat space, the vacuum state becomes a thermal distribution when viewed from accelerated coordinates. This is exactly analogous to the Hawking effect, and can be obtained from exactly the same Euclidean procedure. The meaning of this procedure is not clear, as you said, but it is present in flat space as well.
 

A. Neumaier

Science Advisor
Insights Author
7,006
2,910
Even in flat space, the vacuum state becomes a thermal distribution when viewed from accelerated coordinates. This is exactly analogous to the Hawking effect, and can be obtained from exactly the same Euclidean procedure. The meaning of this procedure is not clear, as you said, but it is present in flat space as well.
This doesn't matter.

In QFT, the entropy cannot be the Euclidean action since the latter is fixed by the theory whereas the entropy is state dependent.
 
In QFT, the entropy cannot be the Euclidean action since the latter is fixed by the theory whereas the entropy is state dependent.
It is not clear what this entropy is, or what the Unruh temperature is. It doesn't have any usual interpretation. Why classical solutions of field equations should have thermodynamical attributes has been a mystery since the discovery of black hole entropy. Things like the Euclidean action and the periodicity of fields in Euclidean time are closely connected with corresponding thermodynamic quantities. This connection is inherent in the fact that these thermodynamic properties are real, even though the corresponding microscopic description is not known.
 
That reminds me of this implementation of an external random source: www.youtube.com/watch?v=1cUUfMeOijg
where they claim to be using a video camera watching a lot of lava lamps to generate random keys.

Any thoughts about this -- is it serious technology, or is it just their PR people doing a weird flex?
Note that it's a CCD camera, not an analog video camera, and although it's serious enough for experimental purposes, it requires substantial selective post-camera processing to ensure that the output passes rigorous standard tests of apparent randomness, and it's not fast enough for serious data-streaming purposes.
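As an aside, the kind of post-camera processing alluded to here usually amounts to feeding the noisy frames through a randomness extractor. A minimal sketch in Python, assuming nothing about the actual system in the video (the frame is faked with OS noise purely for illustration):

```python
import hashlib
import os

def extract_random_bytes(raw_frame: bytes, counter: int) -> bytes:
    """Whiten a noisy camera frame into 32 bytes of key material.

    The frame only needs to contain enough unpredictable noise; hashing
    removes pixel bias and correlations (it cannot add entropy that the
    frame does not contain).
    """
    return hashlib.sha256(counter.to_bytes(8, "big") + raw_frame).digest()

# Stand-in for a captured lava-lamp frame; a real pipeline would read the CCD.
fake_frame = os.urandom(1024)
print(extract_random_bytes(fake_frame, counter=1).hex())
```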
 

timmdeeg

@PeterDonis said here in post 24:

"My understanding of the thermal interpretation (remember I'm not its author so my understanding might not be correct) is that the two non-interfering outcomes are actually a meta-stable state of the detector (i.e., of whatever macroscopic object is going to irreversibly record the measurement result), and that random fluctuations cause this meta-stable state to decay into just one of the two outcomes."

Does this mean that if an EPR experiment is performed, "random fluctuations" at detector A are entangled with "random fluctuations" at detector B such that the outcomes are correlated?
 

vanhees71

The outcomes of measurements are correlated when one measures observables that are entangled due to the state the system is prepared in, as described by quantum theory in the standard interpretation. Since the measured quantities do not take determinate values, they fluctuate. Fluctuations are described by the standard deviation and higher moments or cumulants of the distributions, and of course these quantities are also correlated in the case of entangled states.
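For concreteness, here is a minimal numerical sketch of that statement for the spin singlet (state and observables chosen purely for illustration): the individual spin components have q-expectation zero and standard deviation one, while products along a common axis are perfectly anticorrelated.

```python
import numpy as np

# Pauli matrices, identity, and the two-qubit singlet state (|01> - |10>)/sqrt(2)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)
rho = np.outer(singlet, singlet.conj())

def expval(op):
    """Expectation value <A> = Tr(rho A)."""
    return np.trace(rho @ op).real

A = np.kron(sz, I2)   # spin-z of particle 1
B = np.kron(I2, sz)   # spin-z of particle 2

print(expval(A), expval(B))                   # 0.0 0.0 : no determinate single values
print(np.sqrt(expval(A @ A) - expval(A)**2))  # 1.0     : standard deviation (fluctuation)
print(expval(A @ B))                          # -1.0    : perfect anticorrelation along z
print(expval(np.kron(sx, sx)))                # -1.0    : and along x as well
```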

I've no clue what the meaning of the "thermal interpretation" might be. On the one hand, it's claimed that the q-expectation values (which I'm not allowed to interpret as ensemble averages as in the standard interpretation, for some unknown reason) are determined by the states in QT, of course. On the other hand, I have to take "fluctuations" into account. Since it's forbidden to think of the states in terms of the standard probability interpretation, I cannot say what they are in terms of the thermal interpretation.

To say it clearly, the only hitherto found consistent interpretation since 1926 is Born's rule (in the modern generalized form applying to a general pure or mixed state, described by a statistical operator).
 

A. Neumaier

I've no clue what the meaning of the "thermal interpretation" might be. On the one hand, it's claimed that the q-expectation values (which I'm not allowed to interpret as ensemble averages as in the standard interpretation, for some unknown reason) are determined by the states in QT, of course. On the other hand, I have to take "fluctuations" into account. Since it's forbidden to think of the states in terms of the standard probability interpretation, I cannot say what they are in terms of the thermal interpretation.
They are (like everywhere in the statistics of measurement) just predictions of a typical value for the square of the deviation of each measurement result from the best reproducible value.
To say it clearly, the only hitherto found consistent interpretation since 1926 is Born's rule (in the modern generalized form to apply to a general pure or mixed state, described by a statistical operator).
Born's rule is not consistent in many ways; see my critique in Section 3.3 of Part I, and more explicitly the case for the QED electron in a new thread.
 
So to predict/simulate events you need quantum mechanics plus a basis that must be added externally, though in reality things happen without having to specify a basis. Since, according to you, these additional choices are necessary (rather than implied by the quantum formalism), quantum mechanics alone is incomplete.
A chosen decomposition is just an expression of the properties/events we want to make predictions about. We always need some procedure to connect the content of a theory to predicted consequences.

If a decomposition was instead used as a physical explanation for why events happen, i.e. if a particular decomposition made things happen, then I would agree.
 

A. Neumaier

A chosen decomposition is just an expression of the properties/events we want to make predictions about. We always need some procedure to connect the content of a theory to predicted consequences.

If a decomposition was instead used as a physical explanation for why events happen, i.e. if a particular decomposition made things happen, then I would agree.
In a simulated miniverse, this choice is to be made by the simulated detectors (or their simulated observers) and not by us who interpret the simulation from the outside!

Hence the choice must be encoded in the wave function or whatever else goes into the simulation (such as Bohmian particles; but these are taboo in MWI).
 

timmdeeg

The outcomes of measurements are correlated when one measures observables that are entangled due to the state the system is prepared in, as described by quantum theory in the standard interpretation. Since the measured quantities do not take determinate values, they fluctuate. Fluctuations are described by the standard deviation and higher moments or cumulants of the distributions, and of course these quantities are also correlated in the case of entangled states.
Ok, thanks for clarifying.
 
In a simulated miniverse, this choice is to be made by the simulated detectors (or their simulated observers) and not by us who interpret the simulation from the outside!

Hence the choice must be encoded in the wave function or whatever else goes into the simulation (such as Bohmian particles; but these are taboo in MWI).
We have a miniverse that contains a detector and an observable X. If the detector measures X to some suitable standard, this implies that, if we decompose the identity operator into possible detector properties and values of X, there will be some suitably high correlation between possible detector properties and possible values of X. But the detector in the miniverse doesn't impose this decomposition on us. We are free to use other, equally valid decompositions if we like. They'll just be less helpful (or not helpful at all) for understanding the behaviour of the detector and what it is correlated with. The choice is always made by whoever is using quantum mechanics to understand a system, and they choose based on what it is they are interested in understanding.
 

A. Neumaier

The choice is always made by whoever is using quantum mechanics to understand a system, and they choose based on what it is they are interested in understanding.
In the simulated miniverse, ''whoever is using quantum mechanics'' is simulated as well, hence their choices must be determined by the simulation alone.
 
In the simulated miniverse, ''whoever is using quantum mechanics'' is simulated as well, hence their choices must be determined by the simulation alone.
Well, so far we have only considered a detector, which obviously does not use quantum mechanics to understand its surroundings. But if the miniverse is suitably prepared, quantum mechanics will permit a description of it in terms of possible planetary formations and emergences of biological systems that understand the frequencies and correlations in their observations by constructing a theory they call quantum mechanics.
 

A. Neumaier

quantum mechanics will permit a description of it in terms of possible planetary formations and emergences of biological systems that understand the frequencies and correlations in their observations by constructing a theory they call quantum mechanics.
This is pure speculation. How? A simulation must be programmable in principle!
 

DarMM

Summary: We lack a fundamental understanding of the measurement process in quantum mechanics.

The traditional interpretations are way too vague to allow such a blueprint to be made, even in principle
I think in a traditional Copenhagen view, or extensions of it like decoherent histories, you cannot make this simulation, for the theory only gives one a guide to macroscopic impressions of the microscopic realm. Taking QM as non-representational, they would indeed agree that such a blueprint/simulation cannot be made. As I think @Morbert is getting at, a simulation of QM would encompass a simulation of the range of answers and their associated probabilities that an experimenter would obtain for a specific history of questions. However, it would not be a simulation of reality itself.
 

vanhees71

They are (like everywhere in the statistics of measurement) just predictions of a typical value for the square of the deviation of each measurement result from the best reproducible value.

Born's rule is not consistent in many ways; see my critique in Section 3.3 of Part I, and more explicitly the case for the QED electron in a new thread.
Great! All of a sudden the "Thermal Interpretation" again has the usual statistical meaning. So what's the difference from the minimal statistical interpretation?

If you deny the validity of Born's rule, how can you then justify the (as far as I can see) equivalent statement that expectation values of arbitrary observables, and thus all moments of the probability distribution, and thus the probability distribution itself, are determined by the general Born's rule, i.e., the usual trace formula ##\langle A \rangle = \mathrm{Tr}(\hat{A} \hat{\rho})##?
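For reference, the trace formula itself is easy to exercise numerically. A small sketch with an arbitrarily chosen qubit density operator and observable (the numbers mean nothing; they only show that all moments follow from the same formula):

```python
import numpy as np

# Arbitrary qubit example: a mixed state (Hermitian, positive, trace 1) and an observable.
rho = np.array([[0.7, 0.2], [0.2, 0.3]], dtype=complex)
A   = np.array([[1.0, 0.5], [0.5, -1.0]], dtype=complex)

def moment(n: int) -> float:
    """n-th moment <A^n> = Tr(A^n rho), the general Born rule / trace formula."""
    return np.trace(np.linalg.matrix_power(A, n) @ rho).real

mean = moment(1)
variance = moment(2) - mean**2
print(mean, variance)  # higher moments, and hence the full distribution, follow the same way
```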

Concerning the criticism in your paper: can you prove your claim about the ion trap, i.e., can you show, using standard relativistic QFT, that there's faster-than-light propagation? Of course, in non-relativistic QT nothing prevents this, but why should it, since non-relativistic QT is not supposed to obey relativistic causality to begin with?
 
This is pure speculation. How? A simulation must be programmable in principle!
I don't see how this is different from our previous talk about the detector in the miniverse. My understanding so far:

We both agree that a fully quantum theory of the miniverse will consist of an appropriate density operator and dynamics. I say this theory lets us run a simulation that returns probabilities for alternative possible histories of both the detector (or detector + scientist in the miniverse, if you like) and the variable it is detecting, i.e., a fully quantum treatment of both the detector and the detected. You say this implies our theory is incomplete, since our simulation expects not only a density operator + dynamics, but also a set of alternatives for which probabilities are returned.

Before I respond: Have I correctly described the issue, or is there some other issue?
 

Stephen Tashi

In the simulated miniverse, ''whoever is using quantum mechanics'' is simulated as well, hence their choices must be determined by the simulation alone.
How do we evaluate whether this requirement is met? Suppose the writer of a simulation says it simulates a user of QM by a certain collection of processes. Can we disprove this? In fact, if the writer of the simulation says the simulation doesn't simulate a user of QM, can we be sure he is correct? Another person might pick a collection of processes within the simulation and claim it can represent a user of QM.

Is the representation or non-representation of a user of QM a matter of the intent and interpretation of the writer of the simulation or his critics?
 
I somewhat relate to @georgir's post #67 on this issue, as well as @julcab12's post #64.

Not just the time precision of the total number of decays but also the "Schrödinger factor" is what makes me doubt such absolutely fundamental randomness. From the Schrödinger cat we know that the decay of most of the atoms is not linear with respect to time but can happen randomly (not taking into account the disturbance potentially caused by observer interference), which means most of the atoms that have to decay within a single half-life can decay at the beginning of that half-life, yet the few left-over ones will "sit and wait" patiently, as if they were told by someone of authority to do so. This seems incompatible with the general understanding of probability, because, at least to me, the randomness of probability over many such atoms in a system seems at odds with the randomness of the spontaneous decay of a single atom, or indeed of many single atoms at any given time within this system.


Many such peculiar details draw me more towards the idea of built-in determinism, but by mechanisms which we still don't understand, or maybe have no way of understanding or even getting at; maybe that too is a fundamental property of nature. Without being able to prove or disprove this I should not speculate, but then again one could argue that saying nature is fundamentally random is also speculation, just a popular one.
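For what it's worth, the numbers themselves show no tension between memoryless single-atom randomness and the smooth half-life law. A quick simulation sketch (sample size and half-life chosen arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(0)
half_life = 1.0
N = 1_000_000

# Each atom decays at an independent, memoryless (exponential) random time.
lifetimes = rng.exponential(scale=half_life / np.log(2), size=N)

# Aggregate behaviour: the surviving fraction tracks 2**(-t / half_life).
for t in (0.5, 1.0, 2.0):
    print(t, np.mean(lifetimes > t), 2.0 ** (-t / half_life))

# Memorylessness: atoms that have "sat and waited" one half-life are statistically
# fresh -- about half of them survive yet another half-life, with nobody telling them to.
survivors = lifetimes[lifetimes > half_life] - half_life
print(np.mean(survivors > half_life))  # ~0.5
```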
 
This criterion for "random" has problems. Ruling out repeatable series means that any "random" series that is recorded for future replays could not be considered random. That would include running the random generator first, recording the numbers, and then using them. Being able to repeat the series should not rule it out from being "random". A definition of "random" that has advantages is that it is not possible to compress the series at all. With that definition, most computer pseudo-random number generators are not random, regardless of how they are seeded.
You have to talk about the source (or process), not a given sequence. Note that how compressible a given sequence is has nothing to do with how random its source is; after all, there are many realizations of the fair coin that are quite ordered, e.g. 11111111111111. Shannon entropy is a measure of randomness, but it is defined for discrete random variables, not strings. With a long enough string, you can get a crude estimation. For strings you can talk about Kolmogorov complexity, but it's not computable, and its interpretation is still open.
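To make that "crude estimation" point concrete, here is a rough sketch that estimates single-symbol Shannon entropy from observed frequencies and uses an off-the-shelf compressor as a crude stand-in for incompressibility (string length and compressor are arbitrary choices):

```python
import zlib
import numpy as np

def empirical_entropy_bits(symbols: np.ndarray) -> float:
    """Plug-in Shannon entropy estimate (bits per symbol) from observed frequencies."""
    _, counts = np.unique(symbols, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
fair = rng.integers(0, 2, size=100_000, dtype=np.uint8)  # output of a fair-coin-like source
ones = np.ones(100_000, dtype=np.uint8)                   # a perfectly ordered realization

for name, s in (("fair", fair), ("ones", ones)):
    print(name, round(empirical_entropy_bits(s), 3), "bits/symbol,",
          len(zlib.compress(s.tobytes(), 9)), "bytes after zlib")
# '111...1' is a legal (if astronomically unlikely) output of a fair coin: entropy and
# compressibility characterize the source or the typical string, not every single string.
```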
 

FactChecker

You have to talk about the source (or process), not a given sequence.
Why?
Note that how compressible a given sequence is has nothing to do with how random its source is; after all, there are many realizations of the fair coin that are quite ordered, e.g. 11111111111111.
I would not use a sequence of ones as a random sequence no matter what the source was. I would say that you are giving an example here that contradicts your first statement.
 
Why?
I would not use a sequence of ones as a random sequence no matter what the source was. I would say that you are giving an example here that contradicts your first statement.
Then how are you choosing your random sequences? Are you pulling out strings that look too ordered? If so, then whatever sequence shows up after your selection will not be random; some of its properties will be predictable. You've chosen it for particular reasons.
 
