Could QM Arise From Wilson's Ideas?

  • Thread starter: bhobba
  • Tags: Ideas, QM
Summary:
The discussion revolves around the relationship between quantum field theory (QFT), quantum mechanics (QM), and the concept of effective field theories. Participants explore the idea that QFT serves as a low-energy approximation of various theories, with QM potentially arising as a limiting case of QFT. There is debate over the claim that effective theories at low energies appear renormalizable, particularly in the context of quantum gravity, which is considered non-renormalizable. The conversation also touches on the implications of Wilson's effective theory framework and how it relates to the fundamental nature of quantum theories. Ultimately, the complexities of integrating quantum mechanics with gravitational interactions remain a central challenge in theoretical physics.
  • #31
WernerQH said:
I've been thinking about this for a long time, and I've become convinced that the measurement problem will dissolve once we've learned to view QFT from the right angle.
For a long time I've been thinking exactly the opposite. But then I learned about the condensed matter view of QFT, where the field is often an effective long distance non-fundamental thing. Then Wilson's ideas started to make more sense to me and the idea that relativity is not fundamental started to look like a very natural idea. With relativity being non-fundamental, Bell nonlocality suddenly becomes non-problematic. And when non-locality ceases to be a problem, additional variable approaches to the measurement problem like Bohmian mechanics become more natural.
 
Last edited:
  • Likes: bhobba
  • #32
Demystifier said:
With relativity being non-fundamental, Bell nonlocality suddenly becomes non-problematic.
With propagators reaching backwards in time, Bell nonlocality poses no problem for QFT either. And I see the measurement problem as a non-problem.
 
  • Likes: bhobba
  • #33
WernerQH said:
With propagators reaching backwards in time, Bell nonlocality poses no problem for QFT either. And I see the measurement problem as a non-problem.

The measurement problem has a pragmatic answer, but it doesn't (yet, as far as I know) have a mathematically consistent answer that doesn't go beyond accepted quantum mechanics.

Suppose you have an electron that has a spin state ##\alpha |U\rangle + \beta |D\rangle## where ##|\alpha|^2 + |\beta|^2 = 1## and ##|U\rangle## and ##|D\rangle## are eigenstates of the z-component of spin with spin +1/2 and -1/2, respectively. Does this mean that the electron has a probability of ##|\alpha|^2## of being spin-up in the z-direction, and a probability of ##|\beta|^2## of being spin-down?

I think most people would say "no". Until it's measured, the particle doesn't have a spin in the z-direction. It's not that we just don't know what its spin is. As a matter of fact, you could say that there is nothing at all uncertain about the electron's spin: It has a definite spin of +1/2 along the axis ##\vec{S}## satisfying

##S_z = \frac{1}{2} (|\alpha|^2 - |\beta|^2)##
##S_x = \frac{1}{2} (\alpha \beta^* + \alpha^* \beta)##
##S_y = \frac{i}{2} (\alpha \beta^* - \alpha^* \beta)##
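These components (the Bloch vector of the state) can be checked numerically. A minimal sketch with NumPy, verifying that the state is a +1/2 eigenstate of the spin operator along ##\vec{S}## (the amplitudes chosen here are just illustrative):

```python
import numpy as np

# Pauli matrices
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def bloch_components(alpha, beta):
    """Spin expectation values <S_i> for the state alpha|U> + beta|D>."""
    Sz = 0.5 * (abs(alpha)**2 - abs(beta)**2)
    Sx = (0.5 * (alpha * np.conj(beta) + np.conj(alpha) * beta)).real
    Sy = (0.5j * (alpha * np.conj(beta) - np.conj(alpha) * beta)).real
    return Sx, Sy, Sz

# An arbitrary normalized spin state (illustrative amplitudes)
alpha, beta = 0.6, 0.8j
psi = np.array([alpha, beta])

Sx, Sy, Sz = bloch_components(alpha, beta)
# For a pure state |S| = 1/2, and the state is a +1/2 eigenstate of spin
# along S/|S|, i.e. (S . sigma) psi = (1/2) psi.
op = Sx * sx + Sy * sy + Sz * sz
print(np.allclose(op @ psi, 0.5 * psi))  # True
```

The check confirms that nothing about the spin is "uncertain" in this sense: the state singles out one definite axis.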

So now put that electron through a Stern-Gerlach device, so that electrons that are spin-up in the z-direction are deflected to the left, to make a dot on the left side of a photographic plate, and electrons that are spin-down in the z-direction make a dot on the right side.

Most people would say that now probabilities come into play. There will be a probability of ##|\alpha|^2## of a dot on the left, and a probability of ##|\beta|^2## of a dot on the right.

But why? Presumably, Stern-Gerlach devices and photographic plates are made up of electrons and protons and photons and neutrons. Each of these constituents is like the original electron, in having a quantum state. So why isn't it the case for this huge system that there are no probabilities until someone measures ITS state? (And whatever you use to measure its state, you can ask why isn't it described by a quantum state, requiring yet another measurement to give probabilities.)

The measurement problem is basically this: why do probabilities apply to measurements (which presumably are just quantum mechanical interactions involving a huge number of particles) but not to interactions involving a small number of particles (one or two or three electrons, for example)?
 
  • Likes: eloheim and physika
  • #34
Demystifier said:
With relativity being non-fundamental, Bell nonlocality suddenly becomes non-problematic.

I would say that before relativity was invented, people had qualms about action at a distance. I seem to recall that Newton had problems with it, even though his theory of gravity was the paradigm example.
 
  • #35
WernerQH said:
With propagators reaching backwards in time, Bell nonlocality poses no problem for QFT either. And I see the measurement problem as a non-problem.
Yes, but propagation backwards in time seems to be at odds with the observed arrow of time. See also https://arxiv.org/abs/1011.4173v5
 
  • #36
stevendaryl said:
I would say that before relativity was invented, people had qualms about action at a distance. I seem to recall that Newton had problems with it, even though his theory of gravity was the paradigm example.
True, but there is no technical problem with nonlocality in the absence of Einstein relativity. Finding nonlocal equations for time evolution is only problematic if one requires that those equations should be relativistically covariant.
 
  • #37
stevendaryl said:
Presumably, Stern-Gerlach devices and photographic plates are made up of electrons and protons and photons and neutrons. Each of these constituents is like the original electron, in having a quantum state.
I think the measurement problem is based on two misconceptions:
(1) Quantum theory is deterministic because of Schrödinger's equation.
(2) The wave function represents an individual system.

It is better to look at QT as a statistical theory of events. Any experiment can be represented by a series of operators, and the probability of a particular sequence of events is proportional to the trace of the product of those operators. (Usually the shortcut is taken of considering only the forward direction of time, yielding probability amplitudes, but the full story requires adding operators in the backward direction to the operator product as well. This is the Schwinger/Keldysh closed time-path formalism.) It is a waste of time to contemplate which "quantum state" a system was "really" in at a particular time. These states are always summed over.
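The "trace of a product of operators" rule is easiest to see in its simplest single-time, projective form (the full closed-time-path machinery is more than a snippet can show). A hedged sketch, with illustrative amplitudes:

```python
import numpy as np

# Probabilities as traces of operator products, in the simplest
# (single-time, projective) case: p = Tr(P rho P) for a projector P.
up = np.array([1, 0], dtype=complex)
down = np.array([0, 1], dtype=complex)

alpha, beta = 0.6, 0.8j             # illustrative, |alpha|^2 + |beta|^2 = 1
psi = alpha * up + beta * down
rho = np.outer(psi, np.conj(psi))   # the state, written as an operator

P_up = np.outer(up, np.conj(up))    # projector for the "spin-up" event
P_down = np.outer(down, np.conj(down))

p_up = np.trace(P_up @ rho @ P_up).real     # = |alpha|^2 = 0.36
p_down = np.trace(P_down @ rho @ P_down).real
print(p_up, p_down)  # 0.36 0.64
```

No ket is ever used on its own here; every prediction combines operators under a trace, which is the statistical reading described above.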

The wave function is but a single piece in a bigger mathematical apparatus. Observable quantities require kets to be combined with bras, and the states have to be summed over. A wave function can characterize a beam of completely polarized particles, but it still is a statistical description. In special cases the experimental results can be predicted with close to 100% probability. I would not call this evidence for an underlying determinism, but just a special probability distribution. Quantum theory makes nothing but statistical predictions.

Ultimately the randomness may of course have its root in the environment, whose influence can never be eliminated completely. But considering randomness as intrinsic to "quantum objects" is more economical. We don't have to include the entire universe in our calculations.
 
  • #38
Demystifier said:
Yes, but propagation backwards in time seems to be at odds with the observed arrow of time.
So what? Does the "arrow of time" invalidate classical mechanics or electrodynamics?
 
  • #39
WernerQH said:
I think the measurement problem is based on two misconceptions:
(1) Quantum theory is deterministic because of Schrödinger's equation.
(2) The wave function represents an individual system.

That's a misconception.

The measurement problem is unsolved in the sense that nobody has given a sensible resolution to it. The ensemble view doesn't solve the measurement problem.

It is better to look at QT as a statistical theory of events

Okay, what is an "event"?
 
  • Likes: Lord Jestocost
  • #40
stevendaryl said:
Okay, what is an "event"?

The point of view that quantum mechanics is simply a way to make statistical predictions raises the issue of what the predictions are about. An instrumentalist would say that ultimately the predictions are about measurement results. That IS the measurement problem! That isn't the solution to the measurement problem. A measurement result is something along the lines of "there was a black dot on the left side of the photographic plate behind the Stern-Gerlach device". Or if you want to be statistical, "out of 100 trials, 26 produced dots on the right side and 74 produced dots on the left side". But a Stern-Gerlach device and the photographic plate are made up of electrons, protons, neutrons, etc., each of which is a particle described by quantum mechanics. Why do measurement results have definite values, if simpler things such as "the electron has spin-up in the z-direction" do not?

To say that a theory is only probabilistic, rather than deterministic, presupposes that some events really happen. Statistics is counting something, right? So what are those events? Why does a macroscopic measurement result count as an event, but a microscopic claim such as "the electron has spin up" does not?
 
  • Likes: PeterDonis
  • #41
WernerQH said:
So what? Does the "arrow of time" invalidate classical mechanics or electrodynamics?
Sometimes yes. For example, in classical electrodynamics there is an advanced Green function, but it does not correctly describe radiation from a source as seen in nature.
 
  • Likes: bhobba
  • #42
stevendaryl said:
The ensemble view doesn't solve the measurement problem.

Okay, what is an "event"?
It depends on what the members of the ensemble are. If they are particles that always have definite properties, then there's a problem of course.

Events are points in spacetime. The absorption of a photon is often considered an event, and I hope you agree with that. In terms of Feynman diagrams it happens at a single point (vertex). But this is a simplification. In the real world the absorption of a photon is not instantaneous, but involves at least two events (points in spacetime). If you look back at the equation in my earlier comment (#26), you can see that it involves two currents ##j_\mu(x_2,t_2)## and ##j_\nu(x_1,t_1)##. These "current" events may themselves be composite events, signifying successive locations of an electron. I consider them real.

You are absolutely right that a theory must say what it is about. And it should be clear what it was that has been measured. The classical world emerges from a vast number of events. We speak of a "track" of a particle in a cloud chamber, though it consists only of separate tiny droplets. Particles and fields are essentially classical concepts, names we attach to particular patterns of events in spacetime.
 
  • #43
Demystifier said:
Sometimes yes. For example, in classical electrodynamics there is an advanced Green function, but it does not correctly describe radiation from a source as seen in nature.

I'm not sure what to think about that. Suppose you have a source of radiation, say a charged particle that is at rest for ##t \lt t_1##, jiggles back and forth during the interval ##t_1 \lt t \lt t_2##, and then is at rest for ##t \gt t_2##. That information doesn't uniquely determine the electromagnetic field. But if you add the information that there was no radiation prior to ##t_1##, then that additional information determines that there will be radiation spreading out away from the jiggling particle. If instead, you assume that there will be no radiation after ##t_2##, then that implies that there will be radiation converging toward the jiggling particle.

So the use of retarded versus advanced Green functions is connected with the fact that our "initial conditions" are about the past, rather than the future.
 
  • #44
WernerQH said:
Events are points in spacetime.

That's a different notion of "event".

The absorption of a photon is often considered an event, and I hope you agree with that.

I'm asking you what you mean by an event. Photon absorption is an event, but an electron having spin-up is not?

In terms of Feynman diagrams it happens at a single point (vertex).

Feynman diagrams don't represent anything that happens. They are terms of an expansion of an amplitude in powers of the coupling constant.

##j_\mu(x_2,t_2)## and ##j_\nu(x_1,t_1)##. These "current" events may themselves be composite events, signifying successive locations of an electron. I consider them real.

So does a current have an actual value, as opposed to something like electron spin? Or do they both have actual values at all times?

With a single electron, many people say that it simply doesn't have a value for, say ##S_z## except in the special case where it was prepared as an eigenstate. Do you agree with that? Or do you say that it has definite values for all observables at all times, but we just don't know what those values are until they are measured?

I don't think that the latter is a consistent thing to believe, in light of Bell's inequality.

But if you say the former, then there is the question of why macroscopic quantities have definite values but not microscopic quantities.

You are absolutely right that a theory must say what it is about. And it should be clear what it was that has been measured.

Well, that's the measurement problem. Why should measurements have definite values, when they are made up of lots and lots of microscopic events that don't have definite values?
 
  • #45
stevendaryl said:
That's a different notion of "event"
Perhaps I should have said a "speck" in spacetime, not a point.

So does a current have an actual value, as opposed to something like electron spin? Or do they both have actual values at all times?
Nothing has actual values at all times (except of course the fundamental constants :-)
Current is something intermittent, since it is carried by electrons and the charge comes in lumps. A value of 1 ampere can only be a time average.

Why should measurements have definite values, when they are made up of lots and lots of microscopic events that don't have definite values?
Can you speak of a breeze when the molecules of the air have actual speeds of several hundred meters per second? Of course you can. Averages can have very well defined values.

Electron spin or photon polarization is something that cannot be defined for a single instant of time. It describes correlations between closely spaced but separate events. (In the case of photons between the directions of the electric field a quarter period apart.)
 
  • #46
WernerQH said:
Can you speak of a breeze when the molecules of the air have actual speeds of several hundred meters per second? Of course you can. Averages can have very well defined values.

The question is: Does each molecule have a speed, or not? If not, then what does it mean to take an average of a quantity that doesn't really exist?

In classical statistical mechanics, each molecule actually has a speed, but of course, we don't know those ##10^{26}## values, so we use averages.
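The point that averages over definitely valued quantities can be sharp is easy to illustrate with a small Monte Carlo sketch, assuming the classical picture in which every molecule actually has a velocity (all numbers are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Classical picture: each of N molecules has a definite velocity. The
# "breeze" is just the mean. Thermal speeds of a few hundred m/s average
# out, leaving a sharply defined bulk drift.
N = 1_000_000
thermal = rng.normal(0.0, 300.0, size=N)  # 1-d thermal velocities, m/s
drift = 1.0                               # bulk wind, m/s
v = thermal + drift

# Individual velocities are hundreds of m/s, but the mean fluctuates
# only by about 300/sqrt(N) ~ 0.3 m/s around the drift value.
print(v.mean())
```

This is exactly the classical situation: the probabilities (here, the Maxwellian spread) are about quantities each molecule definitely has, which is what the quantum case is missing.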

WernerQH said:
Electron spin or photon polarization is something that cannot be defined for a single instant of time. It describes correlations between closely spaced but separate events.

Did you answer the question: "What is an event?" Are you talking points in spacetime or things happening?
 
  • #47
stevendaryl said:
Did you answer the question: "What is an event?" Are you talking points in spacetime or things happening?
Both. I'm talking of things (really) happening. At a definite location and a definite time. But of course there may be "big" events like the click of a Geiger counter that are composed of a large number of microscopic events. Those big events can have only an approximately defined time and location.

Demokritos said there are only atoms and the void. I suggest extending the idea of atomism from ordinary 3-dimensional space to spacetime.
 
  • #48
WernerQH said:
In terms of Feynman diagrams it happens at a single point (vertex).
Feynman diagrams are usually drawn in momentum space, not position space, so a single point does not represent a point in spacetime.
 
  • #49
WernerQH said:
But of course there may be "big" events like the click of a Geiger counter that are composed of a large number of microscopic events. Those big events can have only an approximately defined time and location.
But what is a microscopic event?
 
  • Likes: Demystifier and PeterDonis
  • #50
PeterDonis said:
Feynman diagrams are usually drawn in momentum space, not position space, so a single point does not represent a point in spacetime.
True. It obviously simplifies the calculations. But this does not imply that correlation functions in position space are meaningless. One integrates over space because a scattering process can happen anywhere; it does not mean that the scattering is strictly delocalized.
 
  • #51
stevendaryl said:
But what is a microscopic event?
Theorists can create an infinite number of events like "the particle arrives at location ##x_{2731}## ...". This is not the kind of event I have in mind. I think of events as real and finite in number in a given spacetime region. As you decrease the volume and time interval you eventually find regions of spacetime that are empty.

Heisenberg rejected the idea of trajectories of electrons. Unfortunately he offered no alternative. Looking at the world-line of an electron with ever increasing resolution, according to Heisenberg, produces nothing but a diffuse haze. This cannot be resolved experimentally because observations at the zeptosecond scale (511 keV) cause pair creation, confounding the issue. But we do not have to assume that electrons vacillate between "actual" positions at measurements and spread-out, uncertain, "potential" positions in between. We can consider isolated interaction events (not necessarily involving "measurements"!) as real, and the electron as a derived concept. A construction of our mind. Residual metaphysical baggage from the classical world. Just as we connect the droplets in the cloud chamber to form a continuous "track".

Rather than a theory of photons and electrons, it may be more appropriate to view QED as a theory of point processes in spacetime. Electrons and photons enter only as "propagators", correlation functions between events.
 
  • #52
WernerQH said:
Theorists can create an infinite number of events like "the particle arrives at location ##x_{2731}## ...". This is not the kind of event I have in mind. I think of events as real and finite in number in a given spacetime region.

I don't understand what you have in mind, but it seems to me that elucidating what they are is an important part of solving the measurement problem.
 
  • #53
stevendaryl said:
I don't understand what you have in mind, but it seems to me that elucidating what they are is an important part of solving the measurement problem.
As I've said before, I don't see the measurement problem. The problem with QM is its ontology, the talk of "quantum objects" with or without well defined properties, or only when "measured". These objects can emerge as special patterns of events in spacetime, just as pixels on your screen may form the letter "Q" without Q-ness being a fundamental property of the screen.
 
  • #54
WernerQH said:
As I've said before, I don't see the measurement problem.

Well no offense, but to me, the problem is illustrated by the frustrating vagueness of explanations such as

These objects can emerge as special patterns of events in spacetime, just as pixels on your screen may form the letter "Q" without Q-ness being a fundamental property of the screen.

To me, that’s just mush.
 
  • Likes: Lord Jestocost
  • #55
WernerQH said:
this does not imply that correlation functions in position space are meaningless.
I wasn't saying they were. I was only saying that you can't view a vertex in a Feynman diagram as describing an event at a point in spacetime.
 
  • #56
stevendaryl said:
Well no offense, but to me, the problem is illustrated by the frustrating vagueness of explanations such as
[deleted]

I apologize if my post was rude. Let me try one more time to explain the issue.

Quantum mechanics can be described as a way of calculating probabilities. The question is: probabilities of what? If the answer is "of measurement results", then that raises the question of what a measurement result is, and that is the measurement problem. If the answer is something else, then that raises the question of what, precisely, that something else is.

One answer that is at least clear is the Bohmian answer: the basic probabilities in QM are probabilities for particles to be in particular locations. Whether they are observed there, or not.

Another answer that I have heard is that the basic probabilities are for macroscopic collective properties. In the limit as the number of particles goes to infinity, collective properties such as total momentum and center of mass commute, even though single-particle momentum and position do not.

And then the relational interpretation or the quantum Bayesianism interpretation say that the probabilities are subjective.
 
  • #57
stevendaryl said:
I apologize if my post was rude. Let me try one more time to explain the issue.
No offense taken. In a way I find your reaction fairly reasonable. I've been trying for years to make my ideas more precise.

The list of your possible answers is rather short, and I have obviously failed to indicate the general direction of my answer. I'm optimistic that progress can be made through
(1) Studying the mathematics of point processes. Functionals play a key role in this mathematical field as they do in QFT.
(2) The Keldysh closed time-path formalism. It eliminates the need for "measurements". It further indicates that there are at least two types of events (on forward and backward time sheets).
(3) Non-commutative geometry. Alain Connes has derived the general features of the Standard Model from as little as two spacetimes "glued" together, if I interpret it correctly. Unfortunately it is way above my mathematical abilities.

I'm not sure I understand your concern about probabilities. They don't pose real problems in statistical mechanics, and QFT shares many features with statistical mechanics.
 
  • #58
WernerQH said:
I'm not sure I understand your concern about probabilities. They don't pose real problems in statistical mechanics, and QFT shares many features with statistical mechanics.

But it doesn’t share the features that make probabilities problematic in quantum mechanics.

Statistical mechanics makes a very specific and understandable use of probabilities: The state of the system is assumed (in some formulations, anyway) to be a point in phase space. The probability distribution (the Boltzmann distribution, for example) gives the probability that the system is in this state versus that state (or more accurately, this neighborhood of phase space, since the probability of being at a specific point is zero).
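In that classical setting the probabilities attach to definite alternatives. A minimal sketch for a toy discrete system (energies and names are purely illustrative):

```python
import numpy as np

# Boltzmann weights for a toy three-level system: the probabilities are
# ABOUT something definite, namely which microstate the system occupies.
E = np.array([0.0, 1.0, 2.0])   # energies in units of kT (illustrative)
w = np.exp(-E)
p = w / w.sum()                  # probability of each microstate
print(p)                         # decreasing with energy, sums to 1
```

Each `p[i]` is the probability that the system really is in state `i`; there is no analogue of this "is in" for a generic quantum superposition, which is the contrast being drawn.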

So that’s what is missing from an interpretation of quantum mechanics: what are the probabilities about?
 
  • #59
stevendaryl said:
So that’s what is missing from an interpretation of quantum mechanics: what are the probabilities about?

Knowledge? Is there a better candidate for what probabilities are about?
 
  • Skeptical: Motore
  • #60
EPR said:
Knowledge? Is there a better candidate for what probabilities are about?

That just raises the question: knowledge about what?
 
