Why randomness means incomplete understanding

In summary, we lack a fundamental understanding of the measurement process in quantum mechanics. This lack of understanding leads to problems with our ability to create a faithful miniuniverse or deterministic simulator.
  • #1
A. Neumaier
TL;DR Summary: We lack a fundamental understanding of the measurement process in quantum mechanics.

vanhees71 said:
why it is considered a problem that nature seems to be "irreducibly probabilistic/random" in the precise sense defined by QT?
julcab12 said:
Randomness is just the absence of knowledge about what will happen when we do something we don't have complete control over.
akvadrako said:
computers can't create randomness – they always need an outside source. We can't write a RAND() function based off other primitive operations.
Actually every programming language has such a function, and it is executed to everyone's satisfaction.
vanhees71 said:
That's precisely what I asked! Why do you think that randomness is "just the absence of knowledge"? Why shouldn't nature behave randomly in a way as described by QT?
Suppose we want to create a device that faithfully simulates some aspect of Nature. To do so, we need to know enough about the workings of this aspect to know how to build the simulation. Being able to create a detailed blueprint for a perfect simulation means that we understand this aspect; not being able to do so implies a lack of understanding.

Suppose now that the task is to create a faithful miniuniverse, a dynamical system with a state space sufficiently complex that there are internal detectors which make measurements working according to the Rules of Quantum Mechanics. Then we need to know in sufficient detail how those many-particle systems called detectors get their measurement results, based solely on the dynamical law of the miniuniverse, and hence independent of the notion of measurement.

The traditional interpretations are way too vague to allow such a blueprint to be made, even in principle. The reason is that on their basis one cannot describe how the dynamics of the complete system, the miniuniverse, implies that the detectors get their particular measurement values. This is impossible because there is no known conceptual relationship between the assumed irreducible randomness (if any) applied on the level of the complete system and the assumed irreducible randomness applied on the level of the detector results.

The current knowledge only allows a piecemeal partial understanding, as the understanding of measurement processes is only a heuristic mix of quantum mechanics, classical mechanics, and ad hoc simplifying assumptions not justified by theory but only by practice, applied separately to models of particles, materials, and components.

This shows that we lack a fundamental understanding of the measurement process in quantum mechanics.
 
  • #2
The only way to deterministically predict the future is if you know ##x## and ##p##.

Now you have to answer the following question: is the uncertainty principle about a lack of knowledge, or about a lack of simultaneous existence of these quantities?

Quantum physics says that they don't exist simultaneously, so the unpredictability is not due to incomplete knowledge. If you assume that a particle has to pass through one or the other slit of a double-slit experiment, then logically you would have to conclude that the number of particles arriving at a point ##x## is the sum of the particles arriving via slit 1 and via slit 2. That means no interference pattern. So when there is an interference pattern, you cannot say that the particle passes through one of the slits, and you cannot say "there is a ##p##, but we don't know it." Assuming that it passes through one slit or the other leads to logically wrong conclusions. So ##p## and ##x## don't exist at the same time, and the foundation for a deterministic description is removed (not due to lack of knowledge, or incomplete knowledge).

Randomness/statistical behavior is not due to incompleteness in our knowledge.
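A minimal numerical sketch of the argument above (wavelength, slit separation, and angles are illustrative, not from any particular experiment): summing the probabilities of the two ways gives a flat distribution, while summing the amplitudes gives fringes.

```python
import numpy as np

# Toy double-slit: two point sources a distance d apart, observed on a
# distant screen at angle theta. All numbers are illustrative.
wavelength = 1.0
d = 5.0
k = 2 * np.pi / wavelength
theta = np.linspace(-0.5, 0.5, 5)

# Unit-magnitude complex amplitudes for "via slit 1" and "via slit 2",
# differing only by the path-length phase.
psi1 = np.exp(1j * k * (d / 2) * np.sin(theta))
psi2 = np.exp(-1j * k * (d / 2) * np.sin(theta))

classical = np.abs(psi1) ** 2 + np.abs(psi2) ** 2  # sum of probabilities
quantum = np.abs(psi1 + psi2) ** 2                 # sum of amplitudes

print(classical)  # constant 2.0 everywhere: no interference pattern
print(quantum)    # varies between 0 and 4: an interference pattern
```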
 
  • #3
You didn't define exactly what is "perfect" about the "perfect simulator".
Even if we fully understood the measurement process, that would not imply that we would be able to create a simulator that would be able to predict the exact result of a specific proposed measurement.

We could (in the future) fully understand how Nature works and thus build that simulator, but the simulator would operate from whatever starting point we gave it. We would still be unable to "download" the universe to it - simply because we can't create a complete-enough copy of it.

What we would be able to do is demonstrate the same kind of apparent "randomness" that we see when QM measurements are made in our own "real", non-simulated world.
 
  • #4
PrashantGokaraju said:
The only way to deterministically predict the future is if you know x and p.

...

So p and x don't exist at the same time, and the foundation for a deterministic description is removed. (not due to lack of knowledge, or incomplete knowledge)
No, you don't need to know ##x## and ##p## unless you are using the wrong model.
The right model would fully recognize the rules of QM - and also know how a measurement result becomes real.
The real world knows how to make a decision without fully determined ##p## and ##x##; so should the perfect model.
 
  • #5
That seems as meaningless as saying we can know ##x(t + \epsilon)## without knowing ##v(t)##, or without ##v## existing at all.
 
  • #6
Perfect simulator = a simulator that matches the specifications to a sufficient degree. I detailed the specifications required for the miniuniverse. We don't need to have the power to build the simulator; my point is that we don't even have the knowledge to make a blueprint whose implementation would work.
 
  • #7
.Scott said:
No, you don't need to know ##x## and ##p## unless you are using the wrong model.
The right model would fully recognize the rules of QM - and also know how a measurement result becomes real.
The real world knows how to make a decision without fully determined ##p## and ##x##; so should the perfect model.
The real world knows how to make a decision, but the point is that its decision is a probabilistic decision.
 
  • #8
A. Neumaier said:
The reason is that on their basis one cannot describe how the dynamics of the complete system, the miniuniverse, implies that the detectors get their particular measurement values.

My understanding is that you can't infer probabilities regarding a sequence of internal detection events based solely on the dynamics of the miniuniverse. But if you also had information about the initial state of the miniuniverse (i.e. not just a Hamiltonian/action but also a density matrix), I think you could in principle assign probabilities to possible alternative sequences of internal detection events, provided the alternative sequences decohere with one another. The internal detection/measurement events would become internal correlation events.
 
  • #9
PrashantGokaraju said:
The real world knows how to make a decision, but the point is that its decision is a probabilistic decision.
In that case, "the point" has not been demonstrated. We describe it with probabilities. But it always comes out to a specific result.
 
  • #10
.Scott said:
In that case, "the point" has not been demonstrated. We describe it with probabilities. But it always comes out to a specific result.
I think the issue is this. These results have to be consistent with everything we know about physics. Even if ##x## and ##p## don't exist simultaneously, they can both be measured in different situations, and these measurements have to be consistent with what we know about the meaning of energy, momentum, etc. This is why quantum mechanics cannot be defined independently of classical mechanics. To make predictions from quantum mechanics, you must use the classical ideas of ##x## and ##p##, Hamilton's equations, etc. So it cannot be the "wrong model". ##H = P^2/2m + V(X)## must be used in Schrödinger's equation, and it comes from classical mechanics. This way everything is consistent.
 
  • #11
PrashantGokaraju said:
These results have to be consistent with everything we know about physics. Even if ##x## and ##p## don't exist simultaneously, they can both be measured in different situations, and these measurements have to be consistent with what we know about the meaning of energy, momentum, etc. This is why quantum mechanics cannot be defined independently of classical mechanics. To make predictions from quantum mechanics, you must use the classical ideas of ##x## and ##p##, Hamilton's equations, etc. So it cannot be the "wrong model".
All that means is that both the classical model and QM do not qualify as the "right model". There is something else going on. QM is consistent with this "right model", but there are some other non-local rules we haven't worked out yet.
 
  • #12
Morbert said:
My understanding is that you can't infer probabilities regarding a sequence of internal detection events based solely on the dynamics of the miniuniverse. But if you also had information about the initial state of the miniuniverse (i.e. not just a Hamiltonian/action but also a density matrix), I think you could in principle assign probabilities to possible alternative sequences of internal detection events, provided the alternative sequences decohere with one another. The internal detection/measurement events would become internal correlation events.
Of course every dynamical system has initial conditions. But in quantum mechanics, there is no principle that would tell you what the internal detection events are, hence no in-principle way of saying what values these would produce. For a simulation, you'd need to specify the stochastic dynamics for the whole miniuniverse and deduce that the individual detection events behave according to Born's rule. This is equivalent to solving the measurement problem.
 
  • #13
A. Neumaier said:
Of course every dynamical system has initial conditions. But in quantum mechanics, there is no principle that would tell you what the internal detection events are, hence no in-principle way of saying what values these would produce. For a simulation, you'd need to specify the stochastic dynamics for the whole miniuniverse and deduce that the individual detection events behave according to Born's rule. This is equivalent to solving the measurement problem.

You might have to bear with me if I'm saying anything obvious, but consider a miniverse in some initial state ##|\Psi\rangle##. If you want to know the probability that a time-ordered sequence of events ##(\epsilon_1,\epsilon_2,\dots,\epsilon_n)## occurs within the miniverse, you would compute
$$||\Pi^{\epsilon_n}_{t_n}\dots\Pi^{\epsilon_2}_{t_2}\Pi^{\epsilon_1}_{t_1}|\Psi\rangle||^2$$
where ##\Pi^{\epsilon_i}_{t_i}## are the relevant projection operators and ##t_i## are the times at which they occur. And provided your alternative possible sequences of events decohere (e.g. if they constitute alternative quasiclassical histories), you don't have to invoke some external observer to talk about these events happening. There would be no a priori special category of "detection events", though you could presumably describe these events as measuring any quantities that end up correlating with them.
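As a hedged toy illustration of this prescription (not the thread's miniuniverse): take a one-qubit "miniverse" with trivial dynamics, so the Heisenberg-picture projectors coincide with the fixed ones; state and basis names below are illustrative.

```python
import numpy as np

# One-qubit toy miniverse with trivial Hamiltonian, so projectors at
# different times are the same matrices. All names are illustrative.
up = np.array([1, 0], dtype=complex)
down = np.array([0, 1], dtype=complex)
plus = (up + down) / np.sqrt(2)

def proj(v):
    # Projector |v><v| onto a normalized state v
    return np.outer(v, v.conj())

Psi = plus  # initial state of the miniverse

def history_prob(*projectors):
    # Probability of a time-ordered event sequence:
    # || P_en ... P_e2 P_e1 |Psi> ||^2, with P_e1 applied first
    v = Psi
    for P in projectors:
        v = P @ v
    return float(np.vdot(v, v).real)

# All four z-then-z histories; with trivial dynamics they decohere.
probs = {(a, b): history_prob(proj(sa), proj(sb))
         for a, sa in (("up", up), ("down", down))
         for b, sb in (("up", up), ("down", down))}
print(probs)
# The probabilities of the mutually exclusive histories sum to 1.
assert abs(sum(probs.values()) - 1.0) < 1e-12
```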
 
  • #14
A. Neumaier said:
Actually every programming language has such a function, and it is executed to everyone's satisfaction.

This is not really true. You can create a pseudo-random generator, but it needs to be seeded with an outside source of randomness, otherwise you'll get deterministic behavior. And even that isn't good enough for cryptography, where every bit should be from some outside source, otherwise you'll create patterns that open new avenues for attack. This is why modern processors include an RDRAND instruction which is based on electrical white noise or other quantum sources.
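For concreteness, a small Python sketch of the seeding point, using only the standard-library `random` and `secrets` modules: a seeded pseudo-random generator is fully reproducible, while `secrets` draws from the operating system's entropy pool instead of a seeded algorithm.

```python
import random
import secrets

# A seeded pseudo-random generator is deterministic: the same seed
# always reproduces exactly the same sequence.
a = random.Random(42)
b = random.Random(42)
assert [a.random() for _ in range(5)] == [b.random() for _ in range(5)]

# For cryptographic use, CPython's `secrets` module draws from the OS
# entropy pool (which on modern CPUs may mix in hardware sources such
# as RDRAND) rather than from a seeded algorithm.
token = secrets.token_bytes(16)
print(token.hex())
```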
 
  • #15
A. Neumaier said:
Summary: We lack a fundamental understanding of the measurement process in quantum mechanics.

...

This shows that we lack a fundamental understanding of the measurement process in quantum mechanics.
Well, the experimentalists obviously know very well how to describe their detectors, because otherwise they couldn't successfully measure what they measure in particle-physics experiments. A big part of detector development consists of doing just what you describe above, namely a computer simulation of an experiment, usually using Monte Carlo methods; i.e., all you need to know are the detection probabilities for the particles/photons you want to measure.
 
  • #16
vanhees71 said:
Well, the experimentalists obviously know very well how to describe their detectors, because otherwise they couldn't successfully measure what they measure in particle-physics experiments. A big part of detector development consists of doing just what you describe above, namely a computer simulation of an experiment, usually using Monte Carlo methods; i.e., all you need to know are the detection probabilities for the particles/photons you want to measure.
The problem is not on the experimental side but on the theoretical side.

Theorists predicting standard-model properties simulate the behavior of a handful of particles and their detection, based on few-particle quantum physics together with mostly classical physics for the equipment. This is far from simulating, by the rules of quantum theory, a quantum miniuniverse within which some quantum detector measures some quantum particle properties. How to do the latter even in blueprint fashion (i.e., assuming unrestricted computer power, speed, and memory) is terra incognita.
 
  • #17
Well, all of condensed-matter physics is also quite well understood theoretically. It is of course never a first-principles standard-model calculation leading to the correct predictions of material properties, but effective models, partially derivable from first principles, partially guessed as phenomenological models with the aid of empirical guidance. It doesn't even make sense to simulate a detector from the first principles of relativistic QFT, because such a simulation wouldn't use the relevant degrees of freedom to begin with.
 
  • #18
A. Neumaier said:
The problem is not on the experimental side but on the theoretical side.

Theorists predicting standard-model properties simulate the behavior of a handful of particles and their detection, based on few-particle quantum physics together with mostly classical physics for the equipment. This is far from simulating, by the rules of quantum theory, a quantum miniuniverse within which some quantum detector measures some quantum particle properties. How to do the latter even in blueprint fashion (i.e., assuming unrestricted computer power, speed, and memory) is terra incognita.
Well, in a sense, the pure Everett approach should be subject to simulation. You should, for example, find a superposition of detectors with definite readings, if it all works as its adherents hope.
 
  • #19
akvadrako said:
This is not really true. You can create a pseudo-random generator, but it needs to be seeded with an outside source of randomness, otherwise you'll get deterministic behavior. And even that isn't good enough for cryptography, where every bit should be from some outside source, otherwise you'll create patterns that open new avenues for attack. This is why modern processors include an RDRAND instruction which is based on electrical white noise or other quantum sources.
This criterion for "random" has problems. Ruling out repeatable series means that any "random" series recorded for future replays could not be considered random. That would include running the random generator first, recording the numbers, and then using them. Being able to repeat the series should not rule it out from being "random". A definition of "random" that has advantages is that it is not possible to compress the series at all. With that definition, most computer pseudo-random number generators are not random, regardless of how they are seeded.
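A rough sketch of the compressibility criterion, using `zlib` as a stand-in for an ideal compressor (which it is not; the byte counts are only indicative):

```python
import random
import zlib

# A highly patterned byte string compresses well; output from a
# well-mixed pseudo-random generator does not compress noticeably,
# even though in the Kolmogorov sense it is compressible: the short
# program generating it is its compressed form.
patterned = b"ABCD" * 25_000                       # 100 kB, obvious pattern
rng = random.Random(0)                             # deterministic PRNG
pseudo = bytes(rng.randrange(256) for _ in range(100_000))

print(len(zlib.compress(patterned)))  # tiny: far from random
print(len(zlib.compress(pseudo)))     # roughly 100 kB: looks random to zlib
```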
 
  • #20
akvadrako said:
This is not really true. You can create a pseudo-random generator, but it needs to be seeded with an outside source of randomness, otherwise you'll get deterministic behavior. And even that isn't good enough for cryptography, where every bit should be from some outside source, otherwise you'll create patterns that open new avenues for attack. This is why modern processors include an RDRAND instruction which is based on electrical white noise or other quantum sources.

That reminds me of this implementation of an external random source: www.youtube.com/watch?v=1cUUfMeOijg
where they claim to be using a video camera watching a lot of lava lamps to generate random keys.

Any thoughts about this -- is it serious technology, or is it just their PR people doing a weird flex?
 
  • #21
Morbert said:
You might have to bear with me if I'm saying anything obvious, but consider a miniverse in some initial state ##|\Psi\rangle##. If you want to know the probability that a time-ordered sequence of events ##(\epsilon_1,\epsilon_2,\dots,\epsilon_n)## occurs within the miniverse, you would compute
$$||\Pi^{\epsilon_n}_{t_n}\dots\Pi^{\epsilon_2}_{t_2}\Pi^{\epsilon_1}_{t_1}|\Psi\rangle||^2$$
where ##\Pi^{\epsilon_i}_{t_i}## are the relevant projection operators and ##t_i## are the times at which they occur. And provided your alternative possible sequences of events decohere (e.g. if they constitute alternative quasiclassical histories), you don't have to invoke some external observer to talk about these events happening. There would be no a priori special category of "detection events", though you could presumably describe these events as measuring any quantities that end up correlating with them.
This still does not say what an event is, and how a quantum detector in the miniverse would recognize it. What does it mean for a quantum detector to record an event associated with some ##\Pi^{\epsilon}##? When does it happen?
 
  • #22
PAllen said:
Well, in a sense, the pure Everett approach should be subject to simulation. You should, for example, find a superposition of detectors with definite readings, if it all works as its adherents hope.
Again, what is missing is a definition of what is a reading of a detector. A real measurement takes time - when is the reading a measurement?
 
  • #23
A. Neumaier said:
But in quantum mechanics, there is no principle that would tell you what the internal detection events are, hence no in-principle way of saying what values these would produce.

A. Neumaier said:
Again, what is missing is a definition of what is a reading of a detector. A real measurement takes time - when is the reading a measurement?

I've never understood how the major interpretations of quantum mechanics view a day in ordinary life. We judge that certain macroscopic events definitely do happen. Is the occurrence of a macroscopic event to be interpreted as an outcome of a set of measurements?

How much of the theoretical problem is unique to quantum mechanics? If we consider a macroscopic event like "I go to the grocery store" then, in any sort of mechanics, is it safe to assume that such an event has an (unambiguous) definition? As time passes, both "I" and the grocery store may gain or lose a few atoms and still remain macroscopically the "same", at least in the opinion of the "I".

vanhees71 said:
Well, the experimentalists know obviously very well how to describe their detectors, because otherwise they couldn't successfully measure what they measure in particle physics experiments.

The experimenters don't actually know how to describe their detectors in atom-by-atom detail. The macroscopic experimenter can give a macroscopic description of a detector.

If we use a moderate dose of pure mathematics, we can simply assert that there is a subset ##S## of the set of all "real" trajectories of the universe through its possible states, and this set ##S## consists exactly of those where "I go to the grocery store". But a heavy dose of pure mathematics may question whether this approach actually defines a set. After all, it assumes the macroscopic judgement "I go to the grocery store" is unambiguous. So there is a circular procedure used in defining the set.
 
  • #24
A. Neumaier said:
Again, what is missing is a definition of what is a reading of a detector. A real measurement takes time - when is the reading a measurement?
Well, if you're an Everett purist, you don't care. You assume that if you have a state sufficient to describe your mini-world and evolve it per the Schrödinger equation, you are done. Reading out classical results from your simulation may involve problems of definition, but running the simulation, in principle, does not.

(just FYI - I am not an Everett purist, just presenting what I understand of that interpretation).
 
  • #25
A. Neumaier said:
This still does not say what an event is, and how a quantum detector in the miniverse would recognize it. What does it mean for a quantum detector to record an event associated with some ##\Pi^{\epsilon}##? When does it happen?

An event which occurs at some time ##t## is a property that obtains at time ##t##, under a consistent quantum description of the system. A detector in the miniverse measures an event if the history of the detector is correlated with the occurrence of the event. Measurement itself isn't the event. It's the correlation between histories of events.
 
  • #26
Morbert said:
An event which occurs at some time ##t## is a property that obtains at time ##t##, under a consistent quantum description of the system. A detector in the miniverse measures an event if the history of the detector is correlated with the occurrence of the event. Measurement itself isn't the event. It's the correlation between histories of events.
So everything measures everything, to some extent, since zero correlations are the exception. And all events happen all the time simultaneously. A very strange miniverse! What then guarantees the coherence of behavior of the objects? How to select one reasonable scenario from the chaos where everything happens?
 
  • #27
akvadrako said:
This is not really true. You can create a pseudo-random generator, but it needs to be seeded with an outside source of randomness, otherwise you'll get deterministic behavior.
Good point, that is correct.
FactChecker said:
A definition of "random" that has advantages is if it is not possible to compress the series at all. With that definition, most computer pseudo-random number generators are not random, regardless of how they are seeded.
Yes. Shannon's information entropy can be used to evaluate the "randomness". A stream of data that has maximum information entropy (which is 8 bits per symbol if each symbol is coded with 8 bits) is random. If the entropy is less than 8, it's not random. I should also add that here I assume the sample size, i.e. the length of the data stream, is sufficiently long to make an accurate calculation of the entropy.
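A minimal sketch of this entropy estimate, assuming 8-bit symbols and a sample long enough for the frequencies to be meaningful:

```python
import math
from collections import Counter

def byte_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte (maximum 8 for 8-bit symbols)."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

print(byte_entropy(b"\x00" * 1000))          # 0.0: a constant stream
print(byte_entropy(bytes(range(256)) * 40))  # 8.0: uniform over 256 symbols
```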
 
  • #28
Stephen Tashi said:
The experimenters don't actually know how to describe their detectors in atom-by-atom detail. The macroscopic experimenter can give a macroscopic description of a detector.

If we use a moderate dose of pure mathematics, we can simply assert that there is a subset ##S## of the set of all "real" trajectories of the universe through its possible states, and this set ##S## consists exactly of those where "I go to the grocery store". But a heavy dose of pure mathematics may question whether this approach actually defines a set. After all, it assumes the macroscopic judgement "I go to the grocery store" is unambiguous. So there is a circular procedure used in defining the set.
Of course you cannot describe macroscopic systems in all microscopic detail. As the application of standard many-body treatments shows, the macroscopic "relevant observables" tend to behave in a way well described by classical physics.

On the contrary, it's pretty difficult to observe the quantum behavior of macroscopic observables, though this is possible today thanks to refined techniques; these days they have counted single phonons of a nano-mechanical oscillator:

https://arxiv.org/abs/1902.04681
https://doi.org/10.1038/s41586-019-1386-x
 
  • #29
A. Neumaier said:
And all events happen all the time simultaneously. A very strange miniverse!

Not all events happen, simultaneously or otherwise. Quantum mechanics will assign probabilities to mutually exclusive sequences of events. If one sequence happens, the others won't. Quantum mechanics won't tell you which sequence actually happens, of course. But that's to be expected if the world really is probabilistic.
 
  • #30
Morbert said:
Not all events happen, simultaneously or otherwise. Quantum mechanics will assign probabilities to mutually exclusive sequences of events. If one sequence happens, the others won't. Quantum mechanics won't tell you which sequence actually happens, of course. But that's to be expected if the world really is probabilistic.
But then how does the simulation proceed in such a way that each simulated detector knows which value it has to display at which time (so that an event happens), while only propagating the wave function of the total system?

It now seems that in the simulation according to your recipe, nothing happens at all.
 
  • #31
vanhees71 said:
Of course you cannot describe macroscopic systems in all microscopic detail.
I think that you are speaking in a practical sense - i.e. such a task is too complex to be done in practice. However, what I am suggesting is that, in addition to being impractical, the task may be theoretically impossible. For example, to define theoretically and microscopically the macroscopic event "I go to the grocery store" presupposes there is some algorithm that can look at a subsequence of microscopic events and declare that it represents "I go to the grocery store".
 
  • #32
Stephen Tashi said:
I think that you are speaking in a practical sense - i.e. such a task is too complex to be done in practice. However, what I am suggesting is that, in addition to being impractical, the task may be theoretically impossible. For example, to define theoretically and microscopically the macroscopic event "I go to the grocery store" presupposes there is some algorithm that can look at a subsequence of microscopic events and declare that it represents "I go to the grocery store".
Why would you think this is theoretically impossible? The only reason I can think of is that you believe there is no such subsequence - that is that such a statement has no microscopic underpinning. More specifically, that you believe the microscopic representation of this, by itself, does not represent the macroscopic event; that there is something fundamentally missing, in principle, from any microscopic description. This would imply that if you took all the atoms and states involved, they would not implement the macroscopic event unless you added something else. Do you have any suggestion of what that something else is?
 
  • #33
Hi,

"Why randomness means incomplete understanding": it is a point of view that stems, in my opinion, from the fact that quantum mechanics is an essentially predictive theory. It is not an explanatory theory, hence the large number of interpretations. A scientific explanation is traditionally defined as an assignment of causes, but quantum mechanics challenges our commonsense picture of causality, for example by implying that some things happen at random, with no apparent cause.

Heisenberg: We have to relearn what understanding really means.

/Patrick
 
  • #34
It is important to be clear about the concepts. Quantum theory is completely causal, even in a strong sense: Knowing the state at time ##t_0## and knowing the Hamiltonian of the system, you know the state at any time ##t>t_0##.

The difference from classical physics is that QT is indeterministic, i.e., having prepared a system in a minimum-entropy state (i.e., a pure state) doesn't imply that all observables take determined values. All the tests of QT indicate that this is not due to some incompleteness of possible knowledge but is an inherent feature of nature.
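A hedged toy illustration of this distinction, with an assumed qubit Hamiltonian ##H=\sigma_x## and ##\hbar=1##: the state at any later time is uniquely determined by unitary evolution, while a z-measurement on that pure state still yields only Born-rule statistics.

```python
import numpy as np

rng = np.random.default_rng(0)

# Causal part: given the state at t0 and the Hamiltonian, the state at
# any later t is fixed by unitary evolution. Toy qubit, H = sigma_x.
sx = np.array([[0, 1], [1, 0]], dtype=complex)

def evolve(psi0, H, t):
    # U = exp(-i H t) via the eigendecomposition of H
    w, V = np.linalg.eigh(H)
    U = V @ np.diag(np.exp(-1j * w * t)) @ V.conj().T
    return U @ psi0

psi0 = np.array([1, 0], dtype=complex)
psi_t = evolve(psi0, sx, t=np.pi / 4)  # uniquely determined state

# Indeterministic part: a z-measurement on this pure state still has
# probabilistic outcomes, with probabilities given by the Born rule.
p_up = abs(psi_t[0]) ** 2
outcomes = rng.choice([+1, -1], size=10_000, p=[p_up, 1 - p_up])
print(p_up, outcomes.mean())
```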
 
  • #35
vanhees71 said:
It is important to be clear about the concepts. Quantum theory is completely causal, even in a strong sense: Knowing the state at time ##t_0## and knowing the Hamiltonian of the system, you know the state at any time ##t>t_0##.

The difference from classical physics is that QT is indeterministic, i.e., having prepared a system in a minimum-entropy state (i.e., a pure state) doesn't imply that all observables take determined values. All the tests of QT indicate that this is not due to some incompleteness of possible knowledge but is an inherent feature of nature.
Elsewhere you said that quantum mechanics is solely about predicting experiments. In that sense it predicts nothing in the microscopic domain in a causal fashion. This was the view of Heisenberg at the time he wrote his paper about the uncertainty relation [referred to as equation (1) below], directly opposing what you wrote above:
Werner Heisenberg said:
Since the statistical character of quantum theory is so closely tied to the imprecision of all perception, one might be led to conjecture that behind the perceived statistical world there is still hidden a "real" world in which the law of causality holds. But such speculations seem to us, we emphasize this explicitly, fruitless and meaningless. Physics should describe only the formal connection between perceptions. The true state of affairs can much better be characterized as follows: because all experiments are subject to the laws of quantum mechanics, and thus to equation (1), quantum mechanics definitively establishes the invalidity of the law of causality.
(Translated from the German original.)
 
