Peter Morgan (QM ~ random field, non-commutative lossy records?)

In summary, the author argues that we should rethink how we engage with experiment, approaching it from a more realistic perspective.
  • #1
Fra
TL;DR Summary
Trying to jumpstart my understanding of Peter Morgan's vision, starting from the big picture rather than details...
"One way to ground everything in reality is to think purely about the records of experiments that are stored in computer memory. Very often, that's a list of times at which events happened."
-- Peter Morgan, old thread meaning-of-wave-function-collapse

"If we are to understand the relationship between classical and quantum mechanics better than through the somewhat ill–defined process of quantization, one way to do so is to construct a Hilbert space formalism for classical mechanics"
-- Peter Morgan, https://arxiv.org/abs/1709.06711v6

"Experimental raw data is taken here to be a finite, lossily compressed record of an arbitrarily detailed collection of possible measurements of noisy signal voltage measurements
...
it is helpful to think of the mathematics of quantum field theory as a continuously indexed field of measurements, not as a field that is measured.
...
quantization valiantly attempts to construct a map from the commutative algebra generated by classical phase space position and momentum observables q, p, to the noncommutative Heisenberg algebra generated by quantum observables q̂, p̂, but, to say it bluntly, fails.
...
The paper shares with Wetterich[10] a concern to motivate and to justify the use of noncommutativity as a natural classical tool
"

-- Peter Morgan, https://arxiv.org/abs/1901.00526

I am trying to identify what you aim at here: Do I understand it right that, dissatisfied with the "quantization procedure" and the ad hoc quantum logic, you seek to understand QM from more realistic "classical" random fields, but ones that are "made quantum" by replacing the classical phase space with another classical space, which is non-commutative and lossy? But in a way that doesn't obey the Bell inequalities? Where the non-commutative nature of the lossy records is a key?

/Fredrik
 
  • #2
gentzen said:
And maybe also in general, can you try to reduce your spelling mistakes a bit? Sometimes you make so many spelling mistakes that guessing what you wanted to write gets challenging for me.
This is true. I am aware of this; I wrote that too fast in one go without spell checking. This is not just due to a different native tongue; it happens in my native tongue as well.

/Fredrik
 
  • #3
One of the difficulties I have with presenting this material is that in my experience there are not many articles in QM that have a similar overall "vision". My initial motivation, back in the 90s, was that I could only see events in the standard electron diffraction pictures, for example. From the patterns of such events I can see that we can construct statistics for the events associated with each region where there's a device (each of which we construct to exhibit and record events) but I couldn't see any trajectories, so it seemed to me that our axioms should not begin with something like "For every system we introduce a Hilbert space", which many axiomatizations of QM do. Instead, we associate statistics with regions of space-time where we have placed a device and we associate correlations with pairs (or any number of) regions where we have placed a device. This seems to me rather closer to field theoretic axiomatizations and to algebraic axiomatizations of QM. Say "no" to particles and systems.
If we're trying to rethink how we engage with experiment, I think we have to back out of much of the success of QM and think about how we engage with experiment from somewhere out of the box (but later I think we must make contact through isomorphisms so we don't have to redescribe every single experiment ever.) This is more-or-less my choice of different perspective: events, regions, probabilities, statistics, and a form of empiricism. If this doesn't resonate as a starting point, then I think there is still some of the mathematics in these papers that can be worthwhile, but getting past my motivation may make that difficult. It is not easy to find the right journals and referees to accept these ideas, but nor is it impossible.
Note that there is nothing about quantum or classical in the above. Everything is records of experimental results. In modern experiments, these records are stored to computer memory at Megabytes or more per second, so whatever involvement of the human mind there may have been in the 1920s seems to me so much less relevant now that I think the human mind's involvement is essentially in designing an experimental apparatus and protocol, building that apparatus, and operating it. That is not insignificant, but I see no utility in thinking that I, as a theorist right now, am collapsing the wave function of an experimenter who is actually working with their apparatus right now.
Once we have this classical raw data, just as the Copenhagen interpretation says we should, we construct perfectly ordinary relative frequencies, add them up and subtract them in the usual way, and then we say: look, that's much greater than 2, so the Bell inequalities are violated and CM is bad. Everything we have done is just a sequence of elementary classical algorithms, however, so why can we not describe those algorithms within classical mechanics? The answer to that is that if we introduce noncommutativity into CM, we have a mathematical structure of Hilbert spaces and operators that is as powerful as that of QM, and we can describe those algorithms as well as QM can.
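To make "elementary classical algorithms" concrete, here is a minimal sketch of that computation: relative frequencies turned into correlations and combined into the CHSH quantity. The coincidence counts below are invented purely for illustration, not taken from any experiment.

```python
# Hedged sketch: the CHSH combination computed from recorded coincidence counts.
# The counts are made-up numbers of roughly the right shape; in a real analysis
# they would be extracted from time-tagged detector records.

def correlation(counts):
    """E(a, b) from coincidence counts {(+1, +1): n, (+1, -1): n, ...}."""
    total = sum(counts.values())
    return sum(s1 * s2 * n for (s1, s2), n in counts.items()) / total

# Four detector-setting pairs: (a, b), (a, b'), (a', b), (a', b')
runs = {
    ("a", "b"):   {(+1, +1): 402, (+1, -1): 98,  (-1, +1): 95,  (-1, -1): 405},
    ("a", "b'"):  {(+1, +1): 410, (+1, -1): 90,  (-1, +1): 93,  (-1, -1): 407},
    ("a'", "b"):  {(+1, +1): 398, (+1, -1): 102, (-1, +1): 99,  (-1, -1): 401},
    ("a'", "b'"): {(+1, +1): 97,  (+1, -1): 403, (-1, +1): 406, (-1, -1): 94},
}

E = {pair: correlation(c) for pair, c in runs.items()}
S = E[("a", "b")] + E[("a", "b'")] + E[("a'", "b")] - E[("a'", "b'")]
print(f"S = {S:.3f}")  # any commutative (local hidden-variable) model keeps |S| <= 2
```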

The violation of Bell inequalities is often said to be about both realism and locality; however, there is an algebraic strand in the literature, starting with Tsirelson in 1980 and most commonly attributed to Arthur Fine in 1982. The version I like best is the very mathematical presentation by Lawrence Landau in 1987, for which we need almost only the first page of his paper, so here it is,
[Attached image: the first page of Landau's 1987 paper.]

My apologies to anyone who already knows about this, but I think the key aspect is that everything in this approach is about noncommutativity. If a formalism is only commutative it must satisfy the Bell inequalities; otherwise not. If we can add noncommutativity into the classical formalism, therefore, which we can, giving us CM+, the violation of Bell inequalities is not a no-go for CM+, even though it is for the old-time CM. I discuss the violation of the Bell inequalities in the video of a talk I gave at IQOQI Vienna in March 2021, starting at about the 10 minute mark, for about five minutes; however, what I say now has changed somewhat, which you can see in the slides of the talk I gave to the particle physics seminar in Bogotá that I linked to earlier, https://www.academia.edu/86450002/The_connection_between_QFT_and_random_fields (for which there is no video).
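To put the algebraic point numerically (a standard textbook-style check of my own, not anything specific to the papers above): the norm of the CHSH operator reaches 2√2 for a noncommuting choice of ±1-valued observables and stays at 2 for a commuting choice.

```python
import numpy as np

# Pauli matrices: +/-1-valued observables on a qubit.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def chsh_norm(A, A2, B, B2):
    """Largest |eigenvalue| of the CHSH operator A⊗(B+B') + A'⊗(B-B')."""
    C = np.kron(A, B + B2) + np.kron(A2, B - B2)
    return max(abs(np.linalg.eigvalsh(C)))

# Noncommuting choice (the standard Tsirelson setup): reaches 2*sqrt(2).
B, B2 = (sz + sx) / np.sqrt(2), (sz - sx) / np.sqrt(2)
print(chsh_norm(sz, sx, B, B2))    # ~2.828

# Commuting choice: bounded by 2, i.e. the Bell inequality is satisfied.
print(chsh_norm(sz, sz, sz, -sz))  # 2.0
```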

I've somewhat banged this out, so it's more lightly edited than usual for me and it may have logic holes, but I have some IRL stuff now, so I'll just go with it as it is. I certainly haven't addressed everything in your initial post, so let me know if there's something you want me to focus on. I'm also willing to set up a PF Zoom session if there's interest (I don't care whether it's just one or 20 people, because seeing why people like or dislike specific aspects is good for me).
 
  • #4
Peter Morgan said:
not many articles in QM that have a similar overall "vision"
From my perspective this is certainly not a bad thing. (I represent an odd stance myself, so whatever I like or don't like will most probably not help you get published in the main channels I'm afraid!)
Peter Morgan said:
it seemed to me that our axioms should not begin with something like "For every system we introduce a Hilbert space", which many axiomatizations of QM do. Instead, we associate statistics with regions of space-time where we have placed a device and we associate correlations with pairs (or any number of) regions where we have placed a device.
...
If we're trying to rethink how we engage with experiment, I think we have to back out of much of the success of QM and think about how we engage with experiment from somewhere out of the box (but later I think we must make contact through isomorphisms so we don't have to redescribe every single experiment ever.) This is more-or-less my choice of different perspective: events, regions, probabilities, statistics, and a form of empiricism. If this doesn't resonate as a starting point
I will check your YouTube later, but possibly not today. A short note is that I do sympathise with the above, i.e. start from the actual recorded data and construct from there. But one KEY point for me is that this data is a lossy retention of the observation history, and there are then "choices" of what to keep and what to discard.
Because in lossy retention you have to keep the most significant part. For example, you can't afford to fill up your memory with useless noise; instead, transforming the data and finding patterns in the noise is the better way. But WHICH patterns? It depends on what you expect from the future. Fourier transforms seem very natural when you have periodic phenomena, for sure. So I agree that optimal coding arguments and signal processing can explain the quantum logic rather than postulate it. IF this is a little bit of what you think, I totally agree on that sub-vision. But this is entangled with lots of other problems, of course.
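As a toy illustration of the "which patterns to keep" point (the signal and parameters are made up, purely for illustration): lossy retention of a noisy periodic signal by keeping only its few largest Fourier coefficients.

```python
import numpy as np

# A noisy periodic signal, of which we retain only the few largest Fourier
# coefficients: a lossy compression that keeps the "significant" structure
# and discards most of the noise.
rng = np.random.default_rng(0)
n = 1024
t = np.arange(n)
signal = np.sin(2 * np.pi * 5 * t / n) + 0.5 * rng.standard_normal(n)

spectrum = np.fft.rfft(signal)
keep = 8                                     # retain only the 8 largest components
discard = np.argsort(np.abs(spectrum))[:-keep]
compressed = spectrum.copy()
compressed[discard] = 0                      # the lossy step
reconstruction = np.fft.irfft(compressed, n)

print("fraction of coefficients kept:", keep / spectrum.size)
print("relative error of reconstruction:",
      np.linalg.norm(signal - reconstruction) / np.linalg.norm(signal))
```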

More when I've checked your references.

/Fredrik
 
  • #5
Fra said:
I will check your YouTube later, but possibly not today. A short note is that I do sympathise with the above, i.e. start from the actual recorded data and construct from there. But one KEY point for me is that this data is a lossy retention of the observation history, and there are then "choices" of what to keep and what to discard.
Because in lossy retention you have to keep the most significant part. For example, you can't afford to fill up your memory with useless noise; instead, transforming the data and finding patterns in the noise is the better way. But WHICH patterns? It depends on what you expect from the future. Fourier transforms seem very natural when you have periodic phenomena, for sure. So I agree that optimal coding arguments and signal processing can explain the quantum logic rather than postulate it. IF this is a little bit of what you think, I totally agree on that sub-vision. But this is entangled with lots of other problems, of course.
No worries about going slowly. To echo your comment, one thing I emphasize in that YouTube video is that we make a remarkable choice about lossy compression when we store only the time at which an event happens in a device (which I usually assume is an Avalanche PhotoDiode, but we can use any device that has an output signal on which events can be identified), which typically takes a time that is of the order of a nanosecond, instead of storing the signal level every nanosecond, say, a compression factor that is in the thousands. If we had analog-to-digital convertors that could operate at picosecond scales, then the compression factor would be in the millions, et cetera. We might say, and I would agree, that most of that unrecorded signal level data is information about the inner structure of the APD, so it's not pertinent to knowing about the quantum state we might say we're interested in —that of the noisy electromagnetic field, with all its quantum and thermal fluctuations— but, if we're thinking classically as well as quantumly, every feature of the output signals from many APDs can be used as one contribution in an analysis of the shape over time of the whole experiment (you know, "Can one hear the shape of a drum?") We're allowed to put a new APD wherever we like, which will affect the correlations of events and of signal levels for the original APDs, but we can describe this effect for quantum fluctuations in CM+ just as well as we can in QM because both are Hilbert space formalisms.
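A back-of-envelope version of that compression estimate (the rates below are illustrative assumptions, not figures from any particular experiment):

```python
# Back-of-envelope check of the compression factor mentioned above.
# All numbers here are illustrative assumptions.

samples_per_second = 1e9   # one signal-level sample per nanosecond
events_per_second  = 1e6   # APD events arriving at roughly MHz rates

stored_values_raw    = samples_per_second   # store every sample
stored_values_events = events_per_second    # store only one timestamp per event

print("compression factor ~", stored_values_raw / stored_values_events)  # ~1000
# With picosecond-scale ADCs (1e12 samples per second) the same events-only
# record would instead be ~10^6 times smaller.
```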

It may be, @Fra, that you can use the mathematics of this better than I can. I encourage you to take what you like and ignore wherever you think I make mistakes. I think the overall picture of QM-and-CM+ that I'm suggesting will eventually be capable of a relatively simple presentation that someone will write up in a neat Dirac Book version, but the transition period from QM-is-better-than-CM to QM-and-CM+ is not as simple to present.

I'm discussing the signal analysis aspect in this reply, so it occurs to me to mention here that if we take the difference between quantum and thermal fluctuations to be associated with the Lorentz group, then we have to work in at least 1+1 dimensions, because the Lorentz group is defined in terms of a metric on an n+1-dimensional space-time. I take this and a signal analysis approach to be a new reason why we have to work with field theories. I don't think we can understand QM without entering into QFT and its random field equivalent, whereas much of the philosophy and foundations of Bell-EPR et cetera tries to (and has often said that we should only try to) answer questions at the level of QM and will only try to understand field theoretic models when we have succeeded with QM.

Finally, I note that to me there is no part of the signal that is a priori the "most significant part". We have to be careful because if we allow arbitrary transformations we can, for example, see the Face of Christ in every slice of toast, which for abstract signals can become a kind of generalized pareidolia. Perhaps transformations that give us what we want to see are significant, but which of the other transformations are we going to discard?

Finally turns into "lastly": do you have somewhere I can look for some kind of summary of your ideas (I've linked to three 20 page articles, a YouTube video of the IQOQI talk, and a PDF of another talk for an account of my evolving ideas over the last five years(!), and there's another more recent paper on arXiv about renormalization, which is what I much more want to understand than the measurement problem and all that, but I'm hoping you have a nice five page summary?)
 
  • #6
Just an interim note:

1) I am busy with some IRL stuff so it's slow, but I've started to digest your writings. The issue is that I connect to several keys you mention, but my preferred "perspective" and abstractions are very different, so I need to translate things and redigest them. About your point that the non-commutative measures constructed from joint measures can be understood with normal probability: that is something I resonate with. But you call it classical mechanics+, and it makes me think there is a lot of baggage. I do not think of the joint probability measures as classical mechanics; I think of them as more abstract, but where one starts with joint measure spaces that have relations (such as, for example, the Fourier transform). But the way I see it, the "quantum interactions" may well result from when two such structures communicate... this sort of thing is what I felt I connected to, but while you describe it in terms of "Frank" the APD, I prefer to think of abstract material agents that literally encode the joint measures in their microstate. I felt this was reasonably close to what you seem to say, but I need to think about it another round.

(About referencing my own ideas: as I don't do this for a living/funding, I do not "have to" publish anything until I have at minimum convinced myself that I have solved some actual problems. Any "partial progress" along the road I haven't published anywhere. The journey there is indeed not a straight path, and I don't expect any sane person to take an interest in it unless they're a similar thinker, which I have yet to find. If you are into one of the major specific research programs (say string theory), then you have a community of peers to publish all the immature stuff in, and your peers may love it no matter how far from solving an actual problem you are.

Anyway, I subscribe to a somewhat unique interacting "agent/IGUS/observer" view, which is a derivative and mix of QBism where the agents are associated with matter. So the search for the optimal agents and their inference system at any complexity scale is dual to the search for the elementary particles and their naked interaction rules at various energy scales. I have been inspired by ideas of many other researchers who want to see the laws of physics as following from rules of inference, but in an agent-centered way. Even those I have been greatly inspired by fall short on things, but that doesn't prevent you from being inspired by lines of reasoning. But as this is not a major interpretation, it's not allowed to discuss it on here. And it's also not just an interpretation; it inspires one to reconstruct the measurement theory, including the laws and their unification, which is a massive, not to say impossible, task. I am also not yet old enough to think that I should start noting things down for the benefit of future thinkers; there is still time to do so.)


Anyway, I will continue with your paper and slides when I get to it.

/Fredrik
 
  • #7
Fra said:
... But you call it classical mechanics+, and it makes me think there is a lot of baggage. ... I prefer to think of abstract material agents that literally encode the joint measures in their microstate. I felt this was reasonably close to what you seem to say, but I need to think about it another round.
Thanks, Fra. Again, speed doesn't matter too much here. I'll likely be watching this thread until the day I die, right? I completely understand that some aspects of my work may resonate while others may be a huge turnoff. Take what you like and turn away from the rest. I hope there will be enough you like that you'll take some time and trouble to tell me which is which for you, as you already have done, for which again thanks. My intention in asking you about your own thinking is to make the process of you deciding what you like or dislike in my work as efficient as possible for both of us. I will try not to be too adversarial about what you say you dislike unless I feel that your thinking might move mine. With apologies, it's whether I can find ways to get out of my box that motivates me to talk to you and anyone else.
You're right that CM+ introduces a lot of baggage, but QMics straw-manning CMics seems to me, at the moment, an almost cardinal sin that is all too common in the physics journals. The way I untie CM to give us CM+ is not completely unproblematic, but I think it is natural enough that the argument deserves serious consideration.
For myself, focusing on what you say about "optimal agents and their inference system", I try to identify where a conscious agent does or does not have agency relative to a given physics experiment. I have personally found it very striking that whereas in the 1920s a physicist would literally see and write down the position of a pointer in their lab notebook, in the 2020s a physicist goes for coffee or for a vacation while computers run the carefully designed experimental protocol. The physicist has agency in the design and incremental construction and testing of a new experimental apparatus, but I think they have very little agency in the moment-by-moment operations and I can't see that they have any serious agency at all in the individual "collapses" that some interpretations say cause events that happen at MegaHertz or higher rates. If nothing else, who can think or subconsciously react anywhere near that fast? Because I think CM+ licenses me to think classically as well as QMically, those events are thermodynamic transitions of materials that are chosen and engineered with extreme care, reflecting centuries and more of physical and engineering knowledge. That thermodynamics, however, is a system of classical physics models that include noncommutativity and quantum fluctuations as well as thermal fluctuations. After the experiment has run and there are a few Terabytes or Exabytes of experimental results available to analyze, the physicist again has agency in their decisions about what analysis to run (although good practice has become to specify what analysis will be run before the data is available to avoid cherry-picking and other statistical sins). Crucially, there are some analyses that can be described much more effectively if we allow classical physics to use the tools of noncommutative probability. To me, it seems that this paragraph makes contact with you and others seeing "the laws of physics as following rules of inference, but in an agent centered way", but it leaves me wondering whether the way I am including agency might not be enough for your liking?
I'm quite reserved about the details of QBism, but I think it does encompass a practical computational reality: we do not and cannot reanalyze all the data that has been recorded since the beginning of time whenever we add a few more Terabytes to our cache. We have to have algorithms and ad hoc rules that allow us to use those new Terabytes to update our models or theories without reanalyzing everything. I suppose the formal side of that updating will be some generalization or variant of the Bayesian rule in the context of noncommutative probability. I think a postdoc here at Yale puts something like this reserved approach to QBism quite well in the first section of a talk he gave to the series of seminars run by Foundations of Physics@Harvard (I met Alex for the first time last night, so he's fresh in my memory, which I hope meeting him next week over coffee will consolidate.)
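For what it's worth, one standard candidate for such a noncommutative Bayesian update is the Lüders rule; here is a minimal sketch of it (my gloss, not something taken from the papers above):

```python
import numpy as np

def luders_update(rho, P):
    """Lüders rule for a projector P: a standard noncommutative analogue of
    Bayesian conditioning. When rho and P are both diagonal this reduces to
    ordinary Bayes' rule on the diagonal probabilities."""
    post = P @ rho @ P
    return post / np.trace(post)

# Toy example on a qubit: condition a mixed state on the projector onto |0>.
rho = np.array([[0.7, 0.2], [0.2, 0.3]], dtype=complex)
P0  = np.array([[1, 0], [0, 0]], dtype=complex)
print(luders_update(rho, P0))  # ends up as |0><0|
```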
 
  • #8
Some quick associations without
Peter Morgan said:
For myself, focusing on what you say about "optimal agents and their inference system", I try to identify where a conscious agent does or does not have agency relative to a given physics experiment.
...
The physicist has agency in the design and incremental construction and testing of a new experimental apparatus, but I think they have very little agency in the moment-by-moment operations and I can't see that they have any serious agency at all in the individual "collapses" that some interpretations say cause events that happen at MegaHertz or higher rates. If nothing else, who can think or subconsciously react anywhere near that fast?
I do not use the term "conscious"; it tends to cause misunderstandings on par with how I think many misunderstood some of the founders of QM when they talked about the "observer".
About the agent's agency: I think of the agent's action as ultimately a random walk, but the random walk is relative to an agent-specific prior. This is a bit in line with this...

Peter Morgan said:
those events are thermodynamic transitions of materials that are chosen and engineered with extreme care, reflecting centuries and more of physical and engineering knowledge.

Peter Morgan said:
That thermodynamics, however, is a system of classical physics models that include noncommutativity and quantum fluctuations as well as thermal fluctuations.
It might be that we think alike, if you mean that it's essentially stochastically driven, BUT because there are non-commutative and lossy structures involved, the result is dynamics that is not just simple dissipation but essentially any complex physics? If so, that is pretty much in line with how I think!
Peter Morgan said:
After the experiment has run and there are a few Terabytes or Exabytes of experimental results available to analyze, the physicist again has agency in their decisions about what analysis to run (although good practice has become to specify what analysis will be run before the data is available to avoid cherry-picking and other statistical sins). Crucially, there are some analyses that can be described much more effectively if we allow classical physics to use the tools of noncommutative probability. To me, it seems that this paragraph makes contact with you and others seeing "the laws of physics as following rules of inference, but in an agent centered way", but it leaves me wondering whether the way I am including agency might not be enough for your liking?
A summary here is that I imagine explaining, say, a Hamiltonian flow instead as a random walk, and instead of pulling the Hamiltonian out of a hat, the random-walk guide should have evolved and should correspond to an optimal inference rule. And the optimal inference (to be proven, of course) implies joint non-commutative statistical structures being retained.
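A toy sketch of what I mean by "a random walk relative to an agent-specific prior" (purely illustrative; the priors are invented): the walker's step distribution is the agent's prior over moves, so different priors give different effective dynamics.

```python
import numpy as np

# Toy reading of "a random walk relative to an agent-specific prior":
# each agent samples its next step from its own prior over moves.
rng = np.random.default_rng(1)
steps = np.array([-1, 0, +1])

prior_a = np.array([0.25, 0.50, 0.25])  # a cautious agent: mostly stays put
prior_b = np.array([0.10, 0.20, 0.70])  # a biased agent: drifts to the right

def walk(prior, n=1000):
    """Cumulative position of a walker whose steps are drawn from `prior`."""
    return np.cumsum(rng.choice(steps, size=n, p=prior))

print("agent a final position:", walk(prior_a)[-1])
print("agent b final position:", walk(prior_b)[-1])
```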
Peter Morgan said:
I'm quite reserved about the details of QBism, but I think it does encompass a practical computational reality: we do not and cannot reanalyze all the data that has been recorded since the beginning of time whenever we add a few more Terabytes to our cache. We have to have algorithms and ad hoc rules that allow us to use those new Terabytes to update our models or theories without reanalyzing everything. I suppose the formal side of that updating will be some generalization or variant of the Bayesian rule in the context of noncommutative probability.
By QBism derivative, I meant mainly the agent stance and the guiding take on probability. I am not a strict QBist at all. I also happen not to stick to the plain Bayesian rule. So this paragraph is roughly to my liking! Actually my starting point is deeper, for a reason, and comes down to the assignment of a degree of belief to a real number: I do not accept the physical basis for such an uncountable measure, because I find it sooner or later leads to bad problems with divergences etc. that require ambiguous renormalisation to be cured. I think it does not have to be like that.

/Fredrik
 
  • #9
Fra said:
... Actually my starting point is deeper, for a reason, and comes down to the assignment of a degree of belief to a real number: I do not accept the physical basis for such an uncountable measure, because I find it sooner or later leads to bad problems with divergences etc. that require ambiguous renormalisation to be cured. I think it does not have to be like that.
I think I'm not seeing any serious disagreements here? I fully accept that my expression here and in my papers is not perfect (indeed that some of my thinking and how I express it will certainly be unclear or wrong or wrong-headed, on past experience) and that other people would say some of it differently, but it's always my hope to write something today that helps other people go to greater heights so I can admire the sparkles from safely on the ground.

I'll pick up on your last worry about the real numbers. I think it's OK to use the reals because although all our past measurements are finitely accurate and precise, so that we represent them by a finite number of bits (if we're finicky, there will be error bars, which will also be represented by a finite number of bits), our imagination about the future is not so limited. We can always imagine what the results would be if we decide in future to measure something between what we have previously measured. A particular idea of distance and imagining that there can always be a between are all we need to construct the reals in our imagination. With a different idea of distance, we can also play with p-adics, but so far I haven't wanted to do that. Nobody has time to do everything ¯\_(ツ)_/¯
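For concreteness, the "different idea of distance" remark is just the standard completion picture (my gloss):

$$\mathbb{R} \;=\; \text{the completion of } \mathbb{Q} \text{ with respect to } |x-y|, \qquad \mathbb{Q}_p \;=\; \text{the completion of } \mathbb{Q} \text{ with respect to the }p\text{-adic distance } |x-y|_p .$$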

I mention above that there is a section in my talk in Bogotá that discusses renormalization, which cites my (unpublished) paper https://arxiv.org/abs/2109.04412. In other words, I share your concern that renormalization is a problem. Many physicists now will tell you that the mathematics of the renormalization group solves the problem, sometimes even vehemently. Given that we think that despite the renormalization group there still is a problem, people will consider different aspects of how we construct interacting quantum fields as a focus for how to fix that problem. My view has become that I either want an axiomatic system or, keeping Gödel's take-down of Hilbert's ambitions in mind, I would like to understand at least a little why we can't have an axiomatic system. With that in mind, I have wondered what we can change about the Wightman axioms, which seem to me to be pretty minimal, but there are gaps. It turns out that it's rather natural, and not only in my Koopman-motivated world, to introduce a nonlinearity into the Wightman axioms by not taking the measurement operators of the theory to be generated by an operator-valued distribution, so that's what I've been doing. If we consider how we decide what renormalization scale we should use when we model a particular experiment, that seems to give us quite a solid basis for saying that there is already some nonlinearity in our current practice, so we should look for ways to introduce nonlinearity into the Wightman or some other axioms.
 
  • #10
Peter Morgan said:
I think I'm not seeing any serious disagreements here? I fully accept that my expression here and in my papers is not perfect (indeed that some of my thinking and how I express it will certainly be unclear or wrong or wrong-headed, on past experience) and that other people would say some of it differently, but it's always my hope to write something today that helps other people go to greater heights so I can admire the sparkles from safely on the ground.
My views are admittedly odd, so I focus on possible common elements rather than what differs. The "baggage" in your papers (which I try to see beyond) is to a large part of course just common conventional theory, so similar baggage is in other papers. One such example is the notion of 4D spacetime, and all the other notions from classical mechanics.

You wrote in a previous thread
"One way to ground everything in reality is to think purely about the records of experiments that are stored in computer memory. Very often, that's a list of times at which events happened."

I like to take this to extremes and consider the agent's whole illusion of reality to emerge from the encoded, indeed "recorded", events in the agent's "hardware". I mean this in an abstract but literal sense. I felt that MAYBE this resonated with your agent Frank, except Frank is a sort of classical macroscopic device that can share information by his "signal lines", right? I like what you write, but I think that maybe I want to take this to a more extreme position than you suggest?
Peter Morgan said:
I'll pick up on your last worry about the real numbers. I think it's OK to use the reals because although all our past measurements are finitely accurate and precise, so that we represent them by a finite number of bits (if we're finicky, there will be error bars, which will also be represented by a finite number of bits), our imagination about the future is not so limited. We can always imagine what the results would be if we decide in future to measure something between what we have previously measured. A particular idea of distance and imagining that there can always be a between are all we need to construct the reals in our imagination.
My issue with the reals is quite specific to my own thinking and hard to convey without going into inappropriate details. Indeed there is no problem with the reals as such, as they can represent rational numbers as well, and the reals have obvious advantages when you work with complex systems; the "continuum" is simply easier than considering the complexity of a possible discrete fine structure. I don't want to do away with the reals; I just think that, for the intermediate reconstruction I think we need, they are an inappropriate measure of degree of belief in the [0,1] range.
The problem starts when and if you attempt to consider the possible information states and want to make inferences. Then the mathematical embedding is imprecise: the agent doing the random walk will soon get lost, as the map is uncountable, giving us a fine-tuning problem that isn't physical. I think it's a fiction of the model. Uncountable as a limit would be fine as an "approximation" to very LARGE systems, but the way theories are constructed today, the tracks of the limits are irreversibly lost even once we have pulled out the Hamiltonian, so we are forced to try to cure it. I think this is one of the reasons for non-renormalisability problems, and it's one reason why I do not find renormalisation theory satisfactory enough for the open problems such as unification of forces (including gravity).
Peter Morgan said:
With that in mind, I have wondered what we can change about the Wightman axioms, which seem to me to be pretty minimal, but there are gaps.
From my perspective the issue with the Wightman axioms is that they presume there is a Minkowski spacetime, or a preferred Poincaré group, so they're far from "minimal" IMO. In my view, spacetime should be emergent (and thus also Poincaré symmetry).

/Fredrik
 

