Why randomness means incomplete understanding

  • Thread starter: A. Neumaier
  • Tags: Means, Randomness
  • #51
Morbert said:
a basis of mutually exclusive sequences of events
Well, into which basis? If any basis is allowed, then anything can happen. But then it is not determined by the simulation of the wave function but by the additional choice of the basis. This would mean that in our real universe, what happens depends not only on the Schrödinger dynamics but also on choosing a basis. In other words, the basis elements constitute additional hidden variables needed to get real events from quantum mechanics.
 
  • #52
vanhees71 said:
It is important to be clear about the concepts. Quantum theory is completely causal, even in a strong sense: Knowing the state at time ##t_0## and knowing the Hamiltonian of the system, you know the state at any time ##t>t_0##.
The wave packet of a particle without interaction/measurement can spread throughout the universe.

/Patrick
 
  • Likes: julcab12
  • #53
microsansfil said:
Absolutely not. The prediction of quantum mechanics, in general, is not based on knowledge of the past, as far as measurement is concerned.

What would be the usefulness of a predictive theory that would not require any measurements?

/Patrick
I'm not sure what you are asking. Quantum mechanics (which applies to everything as long as you can use non-relativistic physics) just predicts the outcome of experiments. What do you mean by "from the past"? As any dynamical theory QT starts from the description of the state the system is prepared in (or is observed to be prepared in) at time ##t_0## and provides via the dynamical laws what's to be expected to be observed later in a measurement, and that it does, within its realm of applicability, very well.
 
  • #54
microsansfil said:
The wave packet of a particle without interaction/measurement can spread throughout the universe.

/Patrick
Yes, of course, that's what comes out of a calculation you do in the QM 1 lecture in the first or second week. So what?
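For concreteness, the textbook spreading formula ##\sigma(t)=\sigma_0\sqrt{1+(\hbar t/2m\sigma_0^2)^2}## for a free Gaussian wave packet is easy to evaluate numerically. A minimal sketch (the electron/1 nm numbers are purely illustrative):

```python
import numpy as np

HBAR = 1.054571817e-34  # reduced Planck constant, J*s

def packet_width(t, sigma0, mass):
    """Width sigma(t) of a free Gaussian wave packet that starts
    with position uncertainty sigma0 (standard textbook result)."""
    tau = 2 * mass * sigma0**2 / HBAR   # characteristic spreading time
    return sigma0 * np.sqrt(1 + (t / tau)**2)

# Example: an electron initially localized to 1 nm
m_e = 9.1093837015e-31  # electron mass, kg
sigma0 = 1e-9           # initial width, m
for t in (0.0, 1e-15, 1e-12):
    print(f"t = {t:.0e} s  ->  sigma = {packet_width(t, sigma0, m_e):.3e} m")
```

Already after a picosecond the packet is tens of times wider than it started, which is the spreading referred to above.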
 
  • #55
vanhees71 said:
I'm not sure what you are asking. Quantum mechanics (which applies to everything as long as you can use non-relativistic physics) just predicts the outcome of experiments. What do you mean by "from the past"? As any dynamical theory QT starts from the description of the state the system is prepared in (or is observed to be prepared in) at time ##t_0## and provides via the dynamical laws what's to be expected to be observed later in a measurement, and that it does, within its realm of applicability, very well.
Pictures are often worth more than speeches:

Classical Mechanics

[attached slide]

Quantum Mechanics

[attached slides]

That's what comes out of a presentation you can see in the QM 1 lecture in the first or second week. Did you miss this passage during your studies?

Without measurements, it is only possible to predict probabilities, as if the properties are accessible only through measurement operations that at least disturb them or at most generate them. They are not deduced from the past in a deterministic way.

/Patrick
 

  • #56
That's quite a nice summary of QT, though I don't like the very problematic collapse postulate. What I meant with my statement was the spread of a free wave packet in non-relativistic QT. Usually you get the propagation of a Gaussian wave packet according to the Schrödinger equation as a problem in the first few recitation sessions. Its meaning is of course given as on your French slide: ##|\psi(t,x)|^2## is the position-probability distribution at time ##t##, i.e., it gives the probability for a detector to click at time ##t## when sitting at the point ##x## per (small) detector volume. That's all you need to know to make predictions concerning this position measurement.

What the particle does after detection is a question that cannot be part of the general formalism. If you have a von Neumann filter measurement, you indeed have to adapt the wave function due to the interaction of the particle with the measurement device, based on the knowledge that it registered the particle at time ##t## at a place ##x## with some uncertainty given by the position resolution of the detector. In this (and only in this) case the "collapse postulate" is a valid FAPP description of a state-preparation procedure, but no more.
 
  • #57
A. Neumaier said:
Well, into which basis? If any basis is allowed, then anything can happen. But then it is not determined by the simulation of the wave function but by the additional choice of the basis. This would mean that in our real universe, what happens depends not only on the Schrödinger dynamics but also on choosing a basis. In other words, the basis elements constitute additional hidden variables needed to get real events from quantum mechanics.

The quantum theory of the miniverse is in the dynamics and the initial conditions, but not the choice of basis. Different bases make clear different features of the miniverse we might wish to understand. They are not separate, alternative theories of the miniverse.

The theory does constrain our choice insofar as our decomposition has to be one that returns approximately standard probabilities, which is the case if ##Re[\mathbf{Tr}[C_{\alpha'}\rho C^\dagger_\alpha]]\approx 0## for ##\alpha'\neq\alpha##. But this is a feature, not a bug, as it ensures our physical understanding of the miniverse is always logically valid.
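The consistency condition above can be checked numerically in a toy model. Here is a sketch for a single qubit with z-basis projectors at two times and a rotation in between; all specific choices (angle, initial states) are hypothetical illustrations, not anything from the thread:

```python
import numpy as np

# Toy check of Re Tr[C_a' rho C_a^dagger] ~ 0 for a single qubit.
# Histories: outcome a at t0 then b at t1; class operator C = P_b U P_a.
P = [np.diag([1.0, 0.0]).astype(complex),   # P0 = |0><0|
     np.diag([0.0, 1.0]).astype(complex)]   # P1 = |1><1|
th = np.pi / 4
U = np.array([[np.cos(th), -np.sin(th)],
              [np.sin(th),  np.cos(th)]], dtype=complex)  # evolution t0 -> t1

def max_offdiag(rho):
    """Largest |Re Tr[C_h rho C_h'^dag]| over distinct histories h = (a, b)."""
    hist = [(a, b) for a in (0, 1) for b in (0, 1)]
    C = {h: P[h[1]] @ U @ P[h[0]] for h in hist}
    return max(abs(np.trace(C[h1] @ rho @ C[h2].conj().T).real)
               for h1 in hist for h2 in hist if h1 != h2)

rho_z = np.diag([1.0, 0.0]).astype(complex)    # initial |0><0|
rho_x = 0.5 * np.ones((2, 2), dtype=complex)   # initial |+><+|
print(max_offdiag(rho_z))  # ~0: this family is consistent
print(max_offdiag(rho_x))  # ~0.25: interference spoils consistency
```

The same decomposition passes the condition for one initial state and fails it for another, which illustrates how the theory constrains (without dictating) the choice.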
 
  • #58
vanhees71 said:
Quantum mechanics (which applies to everything as long as you can use non-relativistic physics) just predicts the outcome of experiments.
No. It leaves most details about the outcomes of experiments undetermined; only their gross statistics is determined.

According to all traditional interpretations, quantum mechanics alone never predicts the outcomes of any single experiment but only the statistics of a large ensemble of similarly prepared experiments.

In contrast, the thermal interpretation predicts the outcomes of experiments individually (from the state of the universe) in terms of the quantum formalism alone, and only our limited knowledge of the latter forces us to statistical considerations.
 
  • Likes: julcab12
  • #59
Morbert said:
The quantum theory of the miniverse is in the dynamics and the initial conditions, but not the choice of basis. Different bases make clear different features of the miniverse we might wish to understand. They are not separate, alternative theories of the miniverse.

The theory does constrain our choice insofar as our decomposition has to be one that returns approximately standard probabilities, which is the case if ##Re[\mathbf{Tr}[C_{\alpha'}\rho C^\dagger_\alpha]]\approx 0## for ##\alpha'\neq\alpha##. But this is a feature, not a bug, as it ensures our physical understanding of the miniverse is always logically valid.
So to predict/simulate events you need quantum mechanics plus a basis that must be added externally, though in reality things happen without having to specify a basis. Since according to you these additional choices are necessary (rather than implied by the quantum formalism), quantum mechanics alone is incomplete.
 
  • #60
A. Neumaier said:
No. It leaves most details about the outcomes of experiments undetermined; only their gross statistics is determined.

According to all traditional interpretations, quantum mechanics alone never predicts the outcomes of any single experiment but only the statistics of a large ensemble of similarly prepared experiments.

In contrast, the thermal interpretation predicts the outcomes of experiments individually (from the state of the universe) from the quantum formalism alone, and only our ignorance of the latter forces us to statistical considerations.
Well, then can you explain to me, why QT is considered the most successful physical theory ever? What is undetermined in your opinion?

You say, it's "only the statistics". But that's the point! Nature is not deterministic on the fundamental level according to QT. E.g., if you have a single radioactive atom (and today you can deal with single atoms, e.g., in traps or storage rings) there's no way to predict the precise time, when it decays (given it is "here" now).

Of course, there's always the possibility that QT is not complete, and we simply do not know the complete set of observables which might determine the precise time when the atom decays, but so far we don't have any hint that this might be true, and from the various Bell experiments, all confirming QT and disproving local deterministic HV theories, I tend to believe that QT is rather complete (apart from the description of gravity, which is today the only clear indication that QT is not complete). That's of course a belief I can't prove, but under this assumption QT tells us that nature is inherently probabilistic, i.e., certain things, like the decay of the unstable atom, simply are random. I don't see where a problem with this might be. It's rather amazing how accurately we are able to describe this inherent randomness with probability theory (a mathematical axiomatic system, which doesn't tell anything about the concrete probability measure for a given real-world situation) together with QT (a physical theory that provides precise predictions for the probabilities of the inherently random processes observed in nature).

I think there's no reason to think that nature may not be random at the most fundamental level of describability.
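The radioactive-decay example can be made concrete numerically: QT fixes the exponential distribution of decay times exactly, while each individual sampled time is irreducibly random. A sketch (the half-life is an arbitrary illustrative number, not a real isotope):

```python
import numpy as np

rng = np.random.default_rng(0)
half_life = 10.0                 # arbitrary units; hypothetical isotope
lam = np.log(2) / half_life      # decay constant

# QT fixes only the distribution: decay times are exponential with rate lam,
# but nothing in the formalism predicts a particular atom's decay time.
times = rng.exponential(1 / lam, size=100_000)

print(times[:3])                 # three individual decays: unpredictable
print(times.mean(), 1 / lam)     # ensemble mean matches the predicted 1/lam
```

The statistics (mean lifetime, distribution shape) are reproduced to high accuracy; the individual entries are not determined by anything in the model.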
 
  • Likes: meopemuk
  • #61
vanhees71 said:
Well, then can you explain to me, why QT is considered the most successful physical theory ever?
Because it actually determines the statistics with phenomenal success. This is quite a feat!
vanhees71 said:
What is undetermined in your opinion?
Each single outcome, and all details of the fluctuations. Thus most of the stuff that is observed.
But only in the traditional interpretations.

In my opinion, the true, complete quantum physics is the quantum formalism plus the thermal interpretation. It accounts for each single outcome, and for all details of the fluctuations.
 
  • #62
vanhees71 said:
I think there's no reason to think that nature may not be random at the most fundamental level of describability.
This is a completely unverifiable statement of your faith in the traditional quantum philosophy.
 
  • #63
vanhees71 said:
Nature is not deterministic on the fundamental level...

To my mind, there is a need for more profound thinking. The onsets of individual clicks in a counter seem to be totally lawless events, coming by themselves and thus being uncaused. Or can one denote a cause which compels these individual effects?
 
  • Likes: meopemuk
  • #64
Randomness is an artifact of measurables. That's not to say it doesn't exist, for lack of a better word: it is effective in its domain -- dynamics/relations -- average outcomes, like how flat space is treated in geometry/GR. The TI accounts for both. I have little confidence in the immutability/universality of time in QM, which accounts for every predicted value and observable in the SR/GR domain -- even the weird ones. I can only suspect that randomness/indeterminism is not the underlying factor but a mere artifact of probability, in the same manner that flat space is not observable.
 
  • #65
The statistical character is not something we can get away from. This statistical behavior is described by the partition function

$$\int d\phi\, e^{-\frac{1}{2}\phi^T D\phi} = (2\pi)^{n/2}(\det D)^{-1/2} = (2\pi)^{n/2}\, e^{-\frac{1}{2}\mathrm{Tr}\ln D}$$

where ##F = \frac{1}{2}\mathrm{Tr}\ln D## is the free energy (up to an additive constant). The first integral is the path integral

$$\int d\phi\, e^{-S} = e^{-F}$$

where ##F## is the free energy and the action is ##S = \frac{1}{2}\phi^T D\phi##. The formal similarity to thermodynamics is something that tells us that these expressions can only be interpreted statistically. The free energy is defined as a Legendre transform in terms of the conjugate variables ##J## and ##\phi##. Expressions such as ##S = -\mathrm{Tr}\,\rho\ln\rho## cannot be interpreted in terms of the usual microstates because they are defined through Wick rotation.
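The identity connecting the two forms of the Gaussian integral, ##\det D = \exp(\mathrm{Tr}\ln D)##, is easy to verify numerically for a finite-dimensional stand-in for ##D## (a sketch only, not the field-theoretic calculation):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(4, 4))
D = A @ A.T + 4 * np.eye(4)    # a positive-definite stand-in for the operator D

# (det D)^(-1/2) from the Gaussian integral equals exp(-F) with
# F = 1/2 Tr ln D, because det D = exp(Tr ln D) for positive-definite D.
eig = np.linalg.eigvalsh(D)    # eigenvalues give Tr ln D = sum(log eig)
F = 0.5 * np.log(eig).sum()    # F = 1/2 Tr ln D

lhs = np.linalg.det(D) ** -0.5
rhs = np.exp(-F)
print(lhs, rhs)                # the two expressions agree to machine precision
```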
 
  • #66
PrashantGokaraju said:
The formal similarity to thermodynamics is something that tells us that these expressions can only be interpreted statistically.
No. There are many formal similarities in mathematics and physics that cannot be ascribed to a similar interpretation.

The formal similarity only tells that there is a possibility of interpreting it statistically.
 
  • #67
At least intuitively any model involving randomness can be replaced with a deterministic model with extra variables controlling that randomness. Not knowing those variables can be called "incomplete understanding", though I think I'd be content and even consider our understanding complete for such models where certain variables are not even theoretically knowable, as long as their effect is well defined.

The problem with QM is that even such hidden-variable models are proven not to work. For me, my "incomplete understanding" stems not from the randomness itself but from the way the random outcomes under different parameters are related. But I guess this is a whole other topic.
 
  • #68
A. Neumaier said:
Because it actually determines the statistics with phenomenal success. This is quite a feat!

Each single outcome, and all details of the fluctuations. Thus most of the stuff that is observed.
But only in the traditional interpretations.

In my opinion, the true, complete quantum physics is the quantum formalism plus the thermal interpretation. It accounts for each single outcome, and for all details of the fluctuations.
Ok, so we agree on the basic facts concerning QT as a very successful physical theory.

Now, it is obviously difficult, even after all these decades, to simply accept the simple conclusion that nature behaves inherently randomly. If this is true, as strongly suggested by QT and the many successful experimental tests it has survived up to this day, then there's no way to predict a single measurement's outcome with certainty (of course except in the case where the system is prepared in a state where the measured observable takes a certain determined value), because the observable doesn't take a determined value. Then the complete description is indeed probabilities, and to test these probabilities you need an ensemble. Fluctuations also refer to an ensemble. So if you accept the probabilistic description as complete, there's nothing lacking with QT simply because the outcome of an individual measurement is inherently random.

I still don't understand the thermal interpretation: Recently you claimed that within the thermal interpretation the observables are what in the usual interpretation of QT is called the expectation value of the observable given the state, i.e., ##\langle O \rangle = \mathrm{Tr}(\hat{\rho} \hat{O})##. This is a single value, i.e., it's determined, given the state. Now you claim there are fluctuations. How do you define them? In the usual interpretations, where the state is interpreted probabilistically, it's clear: the fluctuations are determined by the moments or cumulants of the probability distribution or, equivalently, all expectation values of powers of ##O##, i.e., ##O_n =\mathrm{Tr} (\hat{\rho} \hat{O}^n)##, ##n \in \mathbb{N}##. But then you have the usual probabilistic interpretation back (no matter which flavor of additional "ontology" and "metaphysics" you prefer). Just renaming a probabilistic theory, avoiding the words statistics, randomness and probability, does not change the mathematical content, as you well know!

So why then call it "thermal" (which is misleading anyway, because it seems to be your intention to provide a generally valid reinterpretation and not just one for thermal equilibrium)?
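The moments ##O_n = \mathrm{Tr}(\hat\rho\hat O^n)## mentioned above can be illustrated with a concrete toy computation; the qubit state below uses made-up numbers purely for illustration:

```python
import numpy as np

# Moments <O^n> = Tr(rho O^n) for a qubit; illustrative numbers only.
rho = np.array([[0.7, 0.2], [0.2, 0.3]], dtype=complex)  # a valid density matrix
O = np.array([[1, 0], [0, -1]], dtype=complex)           # Pauli sigma_z

def moment(n):
    """n-th moment Tr(rho O^n)."""
    return np.trace(rho @ np.linalg.matrix_power(O, n)).real

mean = moment(1)
var = moment(2) - mean**2   # the "fluctuation" (variance) of O in the state rho
print(mean, var)
```

Here the first moment is 0.4 and the variance 0.84, exactly the quantities that the usual probabilistic reading identifies with mean and fluctuation.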
 
  • Likes: meopemuk
  • #69
A. Neumaier said:
No. There are many formal similarities in mathematics and physics that cannot be ascribed to a similar interpretation.

The formal similarity only tells that there is a possibility of interpreting it statistically.

This is more than a formal similarity. The euclidean action essentially is the entropy.
 
  • #70
Lord Jestocost said:
To my mind, there is a need for more profound thinking. The onsets of individual clicks in a counter seem to be totally lawless events, coming by themselves and thus being uncaused. Or can one denote a cause which compels these individual effects?
That's precisely my point (take the example of radioactive decay and a Geiger counter): The individual clicks ARE random according to QT. In the absence of any deterministic explanation (in view of all these accurate Bell tests confirming QT), my conclusion simply is that nature is inherently random, i.e., when the individual atom decays and thus the Geiger counter clicks is random.

The mathematical model to describe random events is probability theory, and QT is a theory providing the probability measures to be used to describe measurement outcomes in experiments (which are necessarily random experiments due to the inherent randomness of nature), and, as it turns out, they are anything but "lawless". To the contrary, QT provides the best estimates of probabilities for a vast number of observations (in fact all observations so far!) ever. I don't know a single other application of probability theory/applied statistics which gives as accurate predictions for probabilities/statistics as QT! Thus we have a precise probabilistic description of the "inherent randomness of nature". No more, no less.
 
  • Likes: meopemuk
  • #71
georgir said:
At least intuitively any model involving randomness can be replaced with a deterministic model with extra variables controlling that randomness. Not knowing those variables can be called "incomplete understanding", though I think I'd be content and even consider our understanding complete for such models where certain variables are not even theoretically knowable, as long as their effect is well defined.

The problem with QM is that even such hidden-variable models are proven not to work. For me, my "incomplete understanding" stems not from the randomness itself but from the way the random outcomes under different parameters are related. But I guess this is a whole other topic.
It's not all hidden-variable models that are proven not to work, to be honest. E.g., the Bohmian interpretation of non-relativistic QM is a deterministic non-local interpretation. There are no hidden variables, though. The well-known ones are sufficient ;-)).

Only local deterministic hidden-variable theories, as defined by Bell, are ruled out by the many Bell tests done up to now. All demonstrate the violation of Bell's inequality with astonishing significance and confirm the predictions of QT. The problem with non-local deterministic HV theories is to formulate them in accordance with (special) relativity. That's the reason why Bohmian QT is not (yet?) satisfactorily formulated for relativistic QFT. It's of course not clear whether there's some non-local deterministic HV theory consistent with relativity; at least there seems to be no proof of such a no-go theorem. On the other hand, up to now nobody has found any such non-local theory.
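The violation referred to here can be made concrete with the singlet-state correlation ##E(a,b) = -\cos(a-b)## and the standard CHSH angle choices. A sketch of the textbook calculation:

```python
import numpy as np

# CHSH combination for the singlet state, whose QT correlation
# between spin measurements along directions a and b is E(a,b) = -cos(a-b).
def E(a, b):
    return -np.cos(a - b)

a, a2 = 0.0, np.pi / 2            # Alice's two measurement angles
b, b2 = np.pi / 4, 3 * np.pi / 4  # Bob's two measurement angles

S = abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))
print(S)  # 2*sqrt(2) ~ 2.828, exceeding Bell's local-HV bound of 2
```

Any local deterministic hidden-variable theory is bounded by ##|S| \le 2##; the quantum value ##2\sqrt{2}## is what the Bell tests confirm.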
 
  • #72
vanhees71 said:
it is obviously difficult, even after all these decades, to simply accept the simple conclusion that nature behaves inherently randomly.
It will always be, because your conclusion does not follow logically from the phenomenal success of quantum physics. Thus whether or not someone accepts it is a matter of interpretation and philosophical preferences.

In particular, I do not think it is true because the thermal interpretation explains the randomness in quantum objects in precisely the same way as Laplace explained the randomness in classical objects.
vanhees71 said:
Now you claim there are fluctuations. How do you define them?
What is usually called fluctuations are just q-correlations, which the thermal interpretation handles as nonlocal properties of the system, just like the diameter or volume of a classical object is a nonlocal property.
vanhees71 said:
Just renaming a probabilistic theory avoiding the words statistics, randomness and probability, does not change the mathematical content, as you well know!
Just having something that follows the axioms of probability theory does not make it a true probability in the sense of experiment, either. It is no more the case than a function is a vector pointing somewhere simply because it belongs to a vector space of functions.
vanhees71 said:
So why then call it "thermal" (which is misleading anyway, because it seems to be your intention to provide a generally valid reinterpretation and not just one for thermal equilibrium)?
The word 'thermal' was never just a shorthand for thermal equilibrium.

This label for my interpretation emphasizes the motivation that comes from the fact that the observable quantities of nonequilibrium thermodynamics (i.e., hydromechanics and elasticity theory) appear in statistical mechanics as q-expectation values, and that this observation is generalized to arbitrary quantum systems, rather than only those with a thermodynamical interpretation.
 
  • #73
PrashantGokaraju said:
This is more than a formal similarity. The euclidean action essentially is the entropy.
No. The Euclidean action is an unphysical tool to get S-matrix elements, and is only formally analogous to the entropy.
 
  • #74
A. Neumaier said:
No. The Euclidean action is an unphysical tool to get S-matrix elements, and is only formally analogous to the entropy.

It is more than formal. For example, the entropy of a black hole is equal to the Euclidean action.
 
  • #75
A. Neumaier said:
It will always be, because your conclusion does not follow logically from the phenomenal success of quantum physics. Thus whether or not someone accepts it is a matter of interpretation and philosophical preferences.
I don't claim it's a logical conclusion from the success of QT, but I claim that the assumption that the world "in reality" behaves deterministically, and thus that QT is incomplete, is no logical conclusion from our experience with physics either.

Indeed, whether or not someone accepts it is a matter of interpretation and [individual!] philosophical preferences [prejudices?]. Like religious belief, it's something personal to each individual and thus irrelevant and unrelated to the realm of science.

I still have no clue what the correct interpretation of your "thermal interpretation" is:

What is usually called fluctuations are just q-correlations, which the thermal interpretation handles as nonlocal properties of the system, just like the diameter or volume of a classical object is a nonlocal property.
Just to call fluctuations (a probabilistic notion) "q-correlations" without giving a meaning to this word is just empty phrasing.
 
  • #76
vanhees71 said:
I still have no clue what the correct interpretation of your "thermal interpretation" is: Just to call fluctuations (a probabilistic notion) "q-correlations" without giving a meaning to this word is just empty phrasing.
The physical meaning is given by linear response theory, not by an interpretation in terms of deviations from an average.
 
  • #77
PrashantGokaraju said:
It is more than formal. For example, the entropy of a black hole is equal to the Euclidean action.
That's the only example you can give, and it is based on snippets of theory that don't yet form a coherent whole.

In flat space, the Euclidean action is meaningless.
 
  • #78
Since when is the Euclidean action entropy? In the path-integral formalism of imaginary-time (Matsubara formalism) equilibrium statistics (grand canonical ensemble) you calculate thermodynamic potentials related to the effective quantum action, but it's not entropy (it's rather related to the Landau potential of thermodynamics).
 
  • #79
A. Neumaier said:
That's the only example you can give, and it is based on snippets of theory that don't yet form a coherent whole.

In flat space, the Euclidean action is meaningless.

Even in flat space, the vacuum state becomes a thermal distribution when viewed from accelerated coordinates. This is exactly analogous to the Hawking effect, and can be obtained from exactly the same Euclidean procedure. The meaning of this procedure is not clear, as you said, but it is present in flat space also.
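For scale, the Unruh temperature ##T = \hbar a/(2\pi c k_B)## of the effect mentioned above can be evaluated directly (the accelerations chosen below are purely illustrative):

```python
import math

# Unruh temperature T = hbar * a / (2*pi*c*kB); SI constants.
HBAR = 1.054571817e-34   # J*s
C = 2.99792458e8         # m/s
KB = 1.380649e-23        # J/K

def unruh_temperature(a):
    """Temperature (K) seen by an observer with proper acceleration a (m/s^2)."""
    return HBAR * a / (2 * math.pi * C * KB)

print(unruh_temperature(9.81))   # everyday acceleration: ~4e-20 K
print(unruh_temperature(1e20))   # extreme acceleration needed for ~0.4 K
```

The minuscule numbers for ordinary accelerations show why the effect has never been directly measured.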
 
  • #80
PrashantGokaraju said:
Even in flat space, the vacuum state becomes a thermal distribution when viewed from accelerated coordinates. This is exactly analogous to the Hawking effect, and can be obtained from exactly the same Euclidean procedure. The meaning of this procedure is not clear, as you said, but it is present in flat space also.
This doesn't matter.

In QFT, the entropy cannot be the Euclidean action since the latter is fixed by the theory whereas the entropy is state dependent.
 
  • #81
A. Neumaier said:
In QFT, the entropy cannot be the Euclidean action since the latter is fixed by the theory whereas the entropy is state dependent.

It is not clear what this entropy is, or what the Unruh temperature is. It doesn't have any usual interpretation. Why classical solutions of field equations should have thermodynamical attributes has been a mystery since the discovery of black hole entropy. Things like the Euclidean action and the periodicity of fields in Euclidean time are closely connected with corresponding thermodynamic quantities. This connection is inherent in the fact that these thermodynamic properties are real, even though the corresponding microscopic description is not known.
 
  • #82
Swamp Thing said:
That reminds me of this implementation of an external random source: www.youtube.com/watch?v=1cUUfMeOijg
where they claim to be using a video camera watching a lot of lava lamps to generate random keys.

Any thoughts about this -- is it serious technology, or is it just their PR people doing a weird flex?
Note that it's a CCD camera, not an analog video camera, and although it's serious enough for experimental purposes, it requires substantial post-camera selective processing to ensure that it passes rigorous standard tests of apparent randomness, and it's not fast enough for serious data-streaming purposes.
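The kind of post-processing meant here can be sketched with the classic von Neumann debiasing step; the actual pipeline used by such systems is not specified in the thread, so this is only an illustrative whitening technique applied to a synthetic biased source:

```python
import random

def von_neumann_extract(bits):
    """Debias an independent but biased bit stream:
    pair up bits, keep 01 -> 0 and 10 -> 1, discard 00 and 11."""
    out = []
    for b1, b2 in zip(bits[::2], bits[1::2]):
        if b1 != b2:
            out.append(b1)
    return out

random.seed(0)
biased = [1 if random.random() < 0.8 else 0 for _ in range(100_000)]  # 80% ones
out = von_neumann_extract(biased)
ones = sum(out) / len(out)
print(len(out), round(ones, 3))   # far fewer bits, but ~0.5 frequency of ones
```

The trick works because, for independent bits with bias p, the pairs 01 and 10 each occur with probability p(1-p), so the surviving output is unbiased at the cost of throughput.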
 
  • #83
@PeterDonis said here in post 24:

"My understanding of the thermal interpretation (remember I'm not its author so my understanding might not be correct) is that the two non-interfering outcomes are actually a meta-stable state of the detector (i.e., of whatever macroscopic object is going to irreversibly record the measurement result), and that random fluctuations cause this meta-stable state to decay into just one of the two outcomes."

Does this mean that if an EPR experiment is performed "random fluctuations" at detector A are entangled with "random fluctuations" at detector B such that the outcomes are correlated?
 
  • #84
The outcomes of measurements are correlated in case of measuring observables which are entangled due to the state the system is prepared in, as described by quantum theory in the standard interpretation. Since the measured quantities do not take determined values, they fluctuate. Fluctuations are described by the standard deviation and higher moments or cumulants of the distributions, and of course these quantities are also correlated in case of entangled states.

I've no clue what the meaning of the "thermal interpretation" might be. On the one hand, it's claimed that the observables are the q-expectation values (which, for some unknown reason, I'm not allowed to interpret as ensemble averages as in the standard interpretation), and these are of course determined by the states in QT. On the other hand, I have to take "fluctuations" into account. Since it's forbidden to think of the states in terms of the standard probability interpretation, I cannot say what they are in terms of the thermal interpretation.

To say it clearly, the only hitherto found consistent interpretation since 1926 is Born's rule (in the modern generalized form to apply to a general pure or mixed state, described by a statistical operator).
 
  • Likes: timmdeeg
  • #85
vanhees71 said:
I've no clue what the meaning of the "thermal interpretation" might be. On the one hand, it's claimed that the observables are the q-expectation values (which, for some unknown reason, I'm not allowed to interpret as ensemble averages as in the standard interpretation), and these are of course determined by the states in QT. On the other hand, I have to take "fluctuations" into account. Since it's forbidden to think of the states in terms of the standard probability interpretation, I cannot say what they are in terms of the thermal interpretation.
They are (like everywhere in the statistics of measurement) just predictions of a typical value for the square of the deviation of each measurement result from the best reproducible value.
vanhees71 said:
To say it clearly, the only hitherto found consistent interpretation since 1926 is Born's rule (in the modern generalized form to apply to a general pure or mixed state, described by a statistical operator).
Born's rule is not consistent in many ways; see my critique in Section 3.3 of Part I, and more explicitly the case for the QED electron in a new thread.
 
  • Likes: dextercioby
  • #86
A. Neumaier said:
So to predict/simulate events you need quantum mechanics plus a basis that must be added externally, though in reality things happen without having to specify a basis. Since according to you these additional choices are necessary (rather than implied by the quantum formalism), quantum mechanics alone is incomplete.

A chosen decomposition is just an expression of the properties/events we want to make predictions about. We always need some procedure to connect the content of a theory to predicted consequences.

If a decomposition was instead used as a physical explanation for why events happen, i.e. if a particular decomposition made things happen, then I would agree.
 
  • #87
Morbert said:
A chosen decomposition is just an expression of the properties/events we want to make predictions about. We always need some procedure to connect the content of a theory to predicted consequences.

If a decomposition was instead used as a physical explanation for why events happen, i.e. if a particular decomposition made things happen, then I would agree.
In a simulated miniverse, this choice is to be made by the simulated detectors (or their simulated observers) and not by us who interpret the simulation from the outside!

Hence the choice must be encoded in the wave function or whatever else goes into the simulation (such as Bohmian particles; but these are taboo in MWI).
 
  • #88
vanhees71 said:
The outcomes of measurements are correlated in case of measuring observables which are entangled due to the state the system is prepared in, as described by quantum theory in the standard interpretation. Since the measured quantities do not take determined values, they fluctuate. Fluctuations are described by the standard deviation and higher moments or cumulants of the distributions, and of course these quantities are also correlated in case of entangled states.
Ok, thanks for clarifying.
 
  • #89
A. Neumaier said:
In a simulated miniverse, this choice is to be made by the simulated detectors (or their simulated observers) and not by us who interpret the simulation from the outside!

Hence the choice must be encoded in the wave function or whatever else goes into the simulation (such as Bohmian particles; but these are taboo in MWI).

We have a miniverse that contains a detector and an observable X. If the detector measures X to some suitable standard, this implies that, if we decompose the identity operator into possible detector properties and values of X, there will be some suitably high correlation between possible detector properties and possible values of X. But the detector in the miniverse doesn't impose this decomposition on us. We are free to use other, equally valid decompositions if we like. They'll just be less helpful (or not helpful at all) for understanding the behaviour of the detector and what it is correlated with. The choice is always made by whoever is using quantum mechanics to understand a system, and they choose based on what it is they are interested in understanding.
 
  • #90
Morbert said:
The choice is always made by whoever is using quantum mechanics to understand a system, and they choose based on what it is they are interested in understanding.
In the simulated miniverse, ''whoever is using quantum mechanics'' is simulated as well, hence their choices must be determined by the simulation alone.
 
  • #91
A. Neumaier said:
In the simulated miniverse, ''whoever is using quantum mechanics'' is simulated as well, hence their choices must be determined by the simulation alone.

Well, so far we have only considered a detector, which obviously does not use quantum mechanics to understand its surroundings. But if the miniverse is suitably prepared, quantum mechanics will permit a description of it in terms of possible planetary formations and emergences of biological systems that understand the frequencies and correlations in their observations by constructing a theory they call quantum mechanics.
 
  • #92
Morbert said:
quantum mechanics will permit a description of it in terms of possible planetary formations and emergences of biological systems that understand the frequencies and correlations in their observations by constructing a theory they call quantum mechanics.
This is pure speculation. How? A simulation must be programmable in principle!
 
  • #93
A. Neumaier said:
Summary: We lack a fundamental understanding of the measurement process in quantum mechanics.

The traditional interpretations are way too vague to allow such a blueprint to be made, even in principle
I think that in a traditional Copenhagen view, or in extensions of it like decoherent histories, you cannot make this simulation, because the theory only gives one a guide to macroscopic impressions of the microscopic realm. Taking QM as non-representational, they would indeed agree that such a blueprint/simulation cannot be made. As I think @Morbert is getting at, a simulation of QM would encompass a simulation of the range of answers, and their associated probabilities, that an experimenter would obtain for a specific history of questions. However, it would not be a simulation of reality itself.
 
  • #94
A. Neumaier said:
They are (like everywhere in the statistics of measurement) just predictions of a typical value for the square of the deviation of each measurement result from the best reproducible value.

Born's rule is not consistent in many ways; see my critique in Section 3.3 of Part I, and more explicitly the case for the QED electron in a new thread.
Great! All of a sudden the "Thermal Interpretation" again has the usual statistical meaning. So, what's the difference to the minimal statistical interpretation?

If you then deny the validity of Born's rule, how can you justify the (as far as I can see) equivalent statement that expectation values of arbitrary observables, and thus all moments of the probability distribution, and thus the probability distribution itself, are determined by the general Born's rule, i.e., the usual trace formula ##\langle A \rangle = \mathrm{Tr}(\hat{A} \hat{\rho})##?

Concerning the criticism in your paper: can you prove your claim about the ion trap, i.e., can you show, using standard relativistic QFT, that there is faster-than-light propagation? Of course, in non-relativistic QT nothing prevents this, but why should it, since non-relativistic QT is not supposed to obey relativistic causality to begin with?
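For what it's worth, the trace formula and the "typical squared deviation" reading discussed above can be checked in a few lines (a toy sketch of mine, with ##\sigma_z## as the observable and ##|+\rangle## as the state): the variance follows from the first two moments, both computed by the same trace rule.

```python
import numpy as np

# <A> = Tr(A rho); the second moment Tr(A^2 rho) then gives the variance,
# i.e. the predicted "typical squared deviation" of single results
A = np.array([[1, 0], [0, -1]], dtype=float)      # sigma_z as a toy observable
psi = np.array([1, 1], dtype=float) / np.sqrt(2)  # |+>, equal superposition
rho = np.outer(psi, psi)

mean = np.trace(A @ rho)
second = np.trace(A @ A @ rho)
var = second - mean**2
print(mean, var)  # 0.0 1.0
```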
 
  • #95
A. Neumaier said:
This is pure speculation. How? A simulation must be programmable in principle!

I don't see how this is different from our previous talk about the detector in the miniverse. My understanding so far:

We both agree that a fully quantum theory of the miniverse will consist of an appropriate density operator and dynamics. I say this theory lets us run a simulation that returns probabilities for alternative possible histories of both the detector (or detector + scientist in the miniverse, if you like) and the variable it is detecting, i.e., a fully quantum treatment of both the detector and the detected. You say this implies our theory is incomplete, since our simulation expects not only a density operator + dynamics but also a set of alternatives for which probabilities are returned.

Before I respond: Have I correctly described the issue, or is there some other issue?
 
  • #96
A. Neumaier said:
In the simulated miniverse, ''whoever is using quantum mechanics'' is simulated as well, hence their choices must be determined by the simulation alone.

How do we evaluate whether this requirement is met? Suppose the writer of a simulation says it simulates a user of QM by a certain collection of processes. Can we disprove this? In fact, if the writer of the simulation says the simulation doesn't simulate a user of QM, can we be sure he is correct? Another person might pick a collection of processes within the simulation and claim it can represent a user of QM.

Is the representation or non-representation of a user of QM a matter of the intent and interpretation of the writer of the simulation or his critics?
 
  • #97
I am somewhat relating to @georgir's post #67 on this issue, as well as @julcab12's post #64.

It is not just the time precision of the total number of decays but also the "Schrödinger factor" that makes me doubt such absolutely fundamental randomness. From Schrödinger's cat we know that the decay of most of the atoms is not linear with respect to time but can happen randomly (not taking into account the disturbance potentially caused by an observer's interference). This means most of the atoms that have to decay within a single half-life can decay at the beginning of the half-life, yet the few left over will "sit and wait" patiently, as if they had been told by someone of authority to do so. This seems incompatible with the general understanding of probability, because, at least to me, the randomness of probability over many such atoms in a system seems at odds with the randomness of the spontaneous decay of a single atom (or of many single atoms) at any given time within the system.

Many such peculiar details draw me personally towards the idea of built-in determinism, but through mechanisms which we still don't understand, or maybe have no way of understanding or even reaching; maybe that is also a fundamental property of nature. Surely, without being able to prove or disprove this, I should not speculate, but then again one could argue that saying nature is fundamentally random is also speculation, just a popular one.
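As an aside on the "sit and wait" puzzle: for standard exponential decay this is exactly the memoryless property, not atoms obeying an authority. A short Monte Carlo (my sketch, assuming a half-life of 1 in arbitrary units) shows that the atoms surviving past one half-life go on to decay with the very same half-life as the fresh ensemble.

```python
import numpy as np

rng = np.random.default_rng(0)
half_life = 1.0
lam = np.log(2) / half_life  # decay constant

# Sample decay times for a large ensemble of identical atoms
t = rng.exponential(1 / lam, size=1_000_000)

# Memorylessness: atoms that survived one half-life have the same
# remaining-lifetime distribution as the original ensemble
survivors = t[t > half_life] - half_life
print(np.median(t))          # ~ 1.0 (one half-life)
print(np.median(survivors))  # ~ 1.0 as well
```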
 
  • #98
FactChecker said:
This criterion for "random" has problems. Ruling out repeatable series means that any "random" series that is recorded for future replays could not be considered random. That would include running the random generator first, recording the numbers, and then using them. Being able to repeat the series should not rule it out from being "random". A definition of "random" that has advantages is that it is not possible to compress the series at all. With that definition, most computer pseudo-random number generators are not random, regardless of how they are seeded.

You have to talk about the source (or process), not a given sequence. Note that how compressible a given sequence is has nothing to do with how random its source is; after all, there are many realizations of the fair coin that are quite ordered, e.g. 11111111111111. Shannon entropy is a measure of randomness, but it is defined for discrete random variables, not strings. With a long enough string you can get a crude estimate. For strings you can talk about Kolmogorov complexity, but it is not computable, and its interpretation is still open.
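A few lines of Python make the source-versus-sequence point concrete (my sketch, using zlib as a crude compressibility probe): both byte strings below are possible outputs of the same fair-coin source, yet only one of them compresses.

```python
import zlib
import random

random.seed(0)

ordered = b"\x00" * 10_000                               # 80,000 "heads" in a row
typical = random.getrandbits(80_000).to_bytes(10_000, "big")  # a typical realization

# Compressibility is a property of the sequence, not of the source:
# the ordered realization shrinks enormously, the typical one does not
print(len(zlib.compress(ordered)))  # a few dozen bytes
print(len(zlib.compress(typical)))  # roughly 10,000 bytes (incompressible)
```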
 
Last edited:
  • #99
Jarvis323 said:
You have to talk about the source (or process), not a given sequence.
Why?
Note that how compressible a given sequence is has nothing to do with how random its source is, after all there are many realizations of the fair coin that are quite ordered, e.g. 11111111111111.
I would not use a sequence of ones as a random sequence no matter what the source was. I would say that you are giving an example here that contradicts your first statement.
 
  • #100
FactChecker said:
Why?
I would not use a sequence of ones as a random sequence no matter what the source was. I would say that you are giving an example here that contradicts your first statement.

Then how are you choosing your random sequences? Are you pulling out strings that look too ordered? If so, then the sequence that remains after your selection will not be random: some of its properties will be predictable, because you have chosen it for particular reasons.
 
  • Like
Likes FactChecker