Information as the key underlying physical principle

  • #31
A pure state is an extreme point in a convex set of states. Hardy gives examples of pure states in Eq 2.

I don't know if this is true in general, but in both classical physics and in quantum mechanics, it means that within the theory, a pure state can be taken to be the complete state of a single object. Then an ensemble in which 50% of the objects are in pure state A and 50% are in pure state B is said to be in a mixed state.

In classical physics, relying on classical probability, the convex set of states is a simplex. In quantum mechanics, the convex set of states is not a simplex (usually drawn as a circle).
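As a small numerical illustration (mine, not from Hardy's paper) of why the quantum state space cannot be a simplex: the same mixed state can arise from two different ensembles of pure states, whereas a classical probability vector decomposes uniquely into the vertices of the simplex. A rough sketch in Python:

```python
# Toy illustration: the maximally mixed qubit state admits two different
# decompositions into pure states, while a classical probability vector
# decomposes uniquely into the simplex vertices.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
ketp = (ket0 + ket1) / np.sqrt(2)   # |+>
ketm = (ket0 - ket1) / np.sqrt(2)   # |->

def proj(psi):
    """Projector |psi><psi| onto a pure state."""
    return np.outer(psi, psi.conj())

# Two different 50/50 ensembles of pure states...
rho_z = 0.5 * proj(ket0) + 0.5 * proj(ket1)
rho_x = 0.5 * proj(ketp) + 0.5 * proj(ketm)

# ...give exactly the same mixed state (I/2), so the decomposition of a
# quantum mixed state into pure states is not unique.
print(np.allclose(rho_z, rho_x))   # True

# Classically, the 50/50 mixture of two point distributions is the vector
# (0.5, 0.5), and its decomposition into the simplex vertices is unique.
p = 0.5 * np.array([1.0, 0.0]) + 0.5 * np.array([0.0, 1.0])
print(p)                           # [0.5 0.5]
```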
 
  • #32
This leaves me still confused about the relationship between "N" and "M" in Hardy's argument. I had initially envisioned N as the total of all potential states of an entangled quantum system, and interpreted M as the subset of those states to which the system might be reduced by establishing a specific parameter (K) of the system through measurement/observation.
All of this is well above my competency level in mathematics, quantum mechanics and/or information science, but I had speculated that it might be something like the conditional "rescaling" of probabilities as described on page 13 of this paper on "States of Convex Sets".
http://www.cs.ru.nl/B.Jacobs/PAPERS/convex-states.pdf
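If I understand that paper, the conditional "rescaling" is just the classical rule of restricting to the conditioning subset and renormalizing; a toy sketch with made-up numbers:

```python
# A minimal sketch of conditional "rescaling": conditioning a probability
# vector over N outcomes on the event that the outcome lies in a subset M
# restricts the vector to M and renormalizes. (Made-up numbers; this only
# illustrates the classical rescaling rule, not Hardy's N and M themselves.)
import numpy as np

p = np.array([0.1, 0.2, 0.3, 0.4])   # distribution over N = 4 outcomes
M = [1, 3]                           # the conditioning subset of outcomes

p_given_M = np.zeros_like(p)
p_given_M[M] = p[M] / p[M].sum()     # rescale within M, zero outside

print(p_given_M)                     # [0.  0.333...  0.  0.666...]
```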

But, if N is the quantity of outcome states from a SINGLE measurement, how is the subset M of N defined?
 
  • #34
atyy said:
Let's avoid Bohmian Mechanics here. Bohmian Mechanics aims to remove the notion of observers from quantum mechanics.

http://arxiv.org/abs/1011.6451
Informational derivation of Quantum Theory
G. Chiribella, G. M. D'Ariano, P. Perinotti
Sorry, I disagree that we should avoid BM here. In http://arxiv.org/abs/1103.3506 I have developed (or, more accurately, proposed to develop) a Bayesian variant of BM, where we have a real configuration q(t) -- the Bohmian trajectory -- but the wave function has a Bayesian interpretation: it describes only our incomplete knowledge about q(t), and not some independently really existing animal.

And now, thank you for this link; I was not aware of it. It appears that they have found a derivation of QM which fits very nicely into such a Bayesian approach. We have, in this paper, a list of axioms which are fulfilled also by classical theory, and then a single principle which distinguishes quantum mechanics. And what is it?

"Informally speaking, our postulate states that the ignorance about a part is always compatible with a maximal knowledge of the whole." Wow. It remains to understand what this exactly means, but I think "maximal knowledge" can be translates as "maximal knowledge possible/available in quantum mechanics", so quantum states can be understood as states of knowledge, which are restricted by the ignorance "ignorance about a part".

This looks like a chance to understand quantum wave functions in a way similar to maximal-entropy states in thermodynamics: not as special physical states that predefine frequencies, but as states which describe the information we have about the real states. So, very nice news for me. (It was, obviously, a big error to ignore this direction after seeing a lot of papers in it which I did not find interesting at all.)

The problem I have seen with the informational approach has always been the question "information about what?", and this is a question nicely answered by the Bohmian approach: information about the trajectory.
 
  • #35
I've figured out an interpretation of Hardy's approach, which I shall write up. Briefly, the facts are:

I will call Hardy's "Measurement Device" an "Outcome device" because, in Hardy's terminology, the device produces "outcomes", not "measurements". You may think of the Outcome device as having L+1 real numbers stamped on it and a pointer that points to one of these numbers after the system enters it. Hence the set of possible "outcomes" is the same for all experiments; it doesn't change as a function of anything. Denote the set of outcomes by {r[0], r[1], r[2], ..., r[L]}. We stipulate that the pointer points to r[0] = 0 when no system is present in the device.

A "measurement" is the following procedure: Fix the knobs on all the devices. Define a particular event E in terms of the possible outcomes. The event E will be a statement that the observed outcome is in some given subset of the possible outcomes. (It might be a statement such as "The outcome is r[1] " or a statement such as "The outcome is r[1] or r[2] or r[5] ".) Perform repeated experiments and determine the probability p of the event E. The probability p is the result of the "measurement". The result of a measurement is not a vector of several probabilities. It is single number.

Because "measurement" and "outcome" are different concepts in the context of Hardy's approach, it is best to avoid hybrid phrases like "outcome of a measurement" or "measurement of an outcome", since they are ambiguous..

To define the "degrees of freedom" K of the population of physical systems being input into the experimental equipment, it is necessary to talk about a subset of a certain kind of functions being a "basis" for the entire set of functions. Considering a "measurement" to be a function, it computes a probability p as a function of the following variables:
S_P : the knob setting on the Preparation device.
S_T: the knob setting on the Transformation device
S_O: the knob setting on the Outcome device
E: The event that has been defined as a subset of the possible outcomes

I (not Hardy) define a "measurement function" to be a function h(S_P, S_T) defined by fixing the variables S_O and E. So the set of "measurement functions" is a family of functions whose variables are (S_P,S_T) and the family is parameterized by parameters S_O and E.

(Hardy says in the beginning of the paper that unless otherwise noted, we are to assume the Transformation device does not change the state of the system. However, I think no harm is done by including the variable S_T in my definition. I will assume there exists a knob setting S_Tnull on the transformation device that effectively removes the Transformation device from the experiment and allows the system to go directly from the Preparation device to the Outcome device. If the knob is to be set to S_Tnull, I will say so explicitly.)

To express the idea that there is a set of information for a state that allows us to deduce all other information about it, make the following assumption:
There exists at least one subset B = {h1, h2, ..., hK} of the family of measurement functions such that any measurement function h can be expressed as some function F(h1, h2, ..., hK) of the measurement functions in B, and B has the smallest cardinality possible for a set with this property. The function F may be different for different measurement functions h, but for the particular h in question, its particular F "works" for all values of (S_P, S_T).

The cardinality K of B is the "degrees of freedom" of the population of physical systems that are inputs to the experiments.

We can define "the number of distinguishable states" of a population of systems by avoiding any technical discussion of "state" and instead defining "state vectors". Define the function H(S_P,, ST) to be the vector valued function H(S_P,S_T) = ( h1(S_P, S_T), h2(S_P, S_T),...hk(S_P, S_T) ), where the h's are the functions in B. Evaluated at a particular (S_P, S_T) , the value of the H is a vector of probabilities. A vector of such probabilities is defined to be a "state vector".

Set S_T = S_Tnull. Vary S_P over all possible settings and compute H(S_P, S_Tnull) for each S_P. We will assume only N distinct values of H occur (i.e. only N distinct "state vectors" are observed). The number N is the "number of distinguishable states" of the population of physical systems. The number N is also the "dimension" of the population of physical systems.

The assumption that N is some function of K treats K as a variable. The interpretation is that there is a (single) function G such that any population of physical systems that has degrees of freedom K has dimension N = G(K). It is a statement that considers varying the population of physical systems that are used as inputs to experiments. (I find this a surprising assumption.)
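To make the K bookkeeping concrete, here is a rough sketch (my own illustration, not Hardy's notation) for a single qubit: K = 4 fiducial measurement functions form a basis B in the above sense, and only N = 2 states can be reliably distinguished in a single shot (Hardy's sense of "distinguishable"), consistent with the quantum relation K = N^2.

```python
# Toy version of K vs N for a qubit. Take K = 4 fiducial "measurement
# functions": the normalization tr(rho) and the probabilities of spin-up
# along x, y and z. From those four numbers alone, the probability of any
# other measurement can be recomputed, so they form a basis B (K = 4),
# while only N = 2 qubit states are single-shot distinguishable.
import numpy as np

I2 = np.eye(2, dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def up_projector(sigma):
    """Projector onto the +1 eigenstate of a spin operator n.sigma."""
    return (I2 + sigma) / 2

rng = np.random.default_rng(1)
psi = rng.normal(size=2) + 1j * rng.normal(size=2)
psi /= np.linalg.norm(psi)
rho = np.outer(psi, psi.conj())               # some preparation (knob setting)

# The K = 4 fiducial probabilities: the "state vector" H of the post above.
h = np.array([np.trace(rho).real,
              np.trace(rho @ up_projector(sx)).real,
              np.trace(rho @ up_projector(sy)).real,
              np.trace(rho @ up_projector(sz)).real])

# Reconstruct rho from the state vector alone: rho = (tr(rho) I + r.sigma)/2
r = 2 * h[1:] - h[0]
rho_from_h = (h[0] * I2 + r[0] * sx + r[1] * sy + r[2] * sz) / 2

# Any other measurement function is then some function F of (h1, ..., h4):
theta = 0.7                                   # an arbitrary measurement axis
M = up_projector(np.cos(theta) * sz + np.sin(theta) * sx)
p_direct = np.trace(rho @ M).real
p_from_h = np.trace(rho_from_h @ M).real
print(np.isclose(p_direct, p_from_h))         # True: the 4 numbers suffice
```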
 
  • #36
I feel as though "information" is just the keyword or buzzword within the physics community currently that allows for multidisciplinary problems to be formalized rigorously. You can see similar ideas within other disciplines such as mathematics
 
  • #37
Stephen Tashi said:
The assumption that N is some function of K treats K as a variable. The interpretation is that there is a (single) function G such that any population of physical systems that has degrees of freedom K has dimension N = G(K). It is a statement that considers varying the population of physical systems that are used as inputs to experiments. (I find this a surprising assumption.)

Is this like saying that as systems get bigger or smaller, one doesn't get a transition from a classical to a quantum system?
 
  • #38
In this paper the author writes:

As we have dealt with characterization of quantum information it is natural to ask about its role and status in quantum physics. In particular, our motivation to discuss quantum information in the context of philosophy of physics follows in part from the fact that its impact on interpretative problems is rather little. For instance, in a recent interesting review article on interpretations of Quantum Mechanics the term "quantum information" does not occur even once.

I think that you will get no definition of "quantum information". Maybe information is not encoded in the wave function but is the wave function itself.
 
  • #39
Digitalism said:
I feel as though "information" is just the keyword or buzzword within the physics community currently that allows for multidisciplinary problems to be formalized rigorously. You can see similar ideas within other disciplines such as mathematics

I think that this misses the initial conjecture of the thread... that the "information" contained in the quantum state vector doesn't simply describe physical existence, but at a fundamental level, it "is" physical existence.
 
  • #40
atyy said:
But perhaps one should also remember that "information is physical" :)
I'd appreciate it if atyy could expound on this idea. I'm intrigued.
 
  • #41
  • #42
Thanks Naima... Yes, that's the general idea that the thread began with. As suggested by John Wheeler, ". . . one enormous difference separates the computer and the universe--chance. In principle, the output of a computer is precisely determined by the input . . . . Chance plays no role. In the universe, by contrast, chance plays a dominant role. The laws of physics tell us only what may happen. Actual measurement tells us what is happening (or what did happen). Despite this difference, it is not unreasonable to imagine that information sits at the core of physics, just as it sits at the core of a computer."
Yet, while this implication seems to be indicated by quantum physics, I still struggle with the conceptualization of physical existence consisting of only information. That's why I was hoping atyy would clarify his statement earlier.
 
  • #43
Feeble Wonk said:
I think that this misses the initial conjecture of the thread... that the "information" contained in the quantum state vector doesn't simply describe physical existence, but at a fundamental level, it "is" physical existence.
I did not miss the point; I was specifically guarding against that conjecture, which I view to be an error. Perhaps I am incorrect.
 
  • #44
Sorry Digitalism. I meant no offense. I was simply trying to return to the initial line of inquiry. But, on second thought, perhaps your statement is not off the mark at all... because the crux of the debate is precisely the question of whether the "information" contained in the state vector (and/or quantum state) is merely mathematical formalism or the fundamental essence of physical existence.
 
  • #45
It is interesting to see that information (like energy) cannot be destroyed.
Look at Fig. 1.
When you try to hide information, it skips somewhere else in the environment.
We can say that all the information which was in particle 1 skipped to particle 3.
But can we say that particle 3 was replaced by particle 1?
 
  • #46
Feeble Wonk said:
Sorry Digitalism. I meant no offense. I was simply trying to return to the initial line of inquiry. But, on second thought, perhaps your statement is not off the mark at all... because the crux of the debate is precisely the question of whether the "information" contained in the state vector (and/or quantum state) is merely mathematical formalism or the fundamental essence of physical existence.

You are overkind. By no means did I mean to stifle inquiry. It is an interesting question, I was simply advising caution. Thank you for listening.
 
  • #47
naima said:
It is interesting to see that information (like energy) cannot be destroyed.
Look at Fig. 1.
When you try to hide information, it skips somewhere else in the environment.
We can say that all the information which was in particle 1 skipped to particle 3.
But can we say that particle 3 was replaced by particle 1?

Sorry Naima, but I'm not sure I understand what you are saying. Did you mean to ask whether we can say that your hypothetical particle 1 was replaced by particle 3? If so, I think that's a very pertinent question.

I have heard of speculative descriptions of quantized space-time, such as in loop quantum gravity, where the Hilbert space is thought of as interconnected yet discrete nodes across which particles "hop". So, if we think of a unitary transformation from one quantum state to the next... as you asked (if you meant to)... if the "information" of particle 1 skips to particle 3, is it meaningful to say that particle 3 "replaced" particle 1? I'm not sure about that. Yet, the information content describing particle 1 would be maintained in the subsequent quantum state (now referred to as particle 3?).
 
  • #48
We see that when you have perfect knowledge about a particle, if you try to erase all this information (ending with maximum entropy), all of it skips elsewhere (here, onto another particle in the environment). Here the particle and its state are the same.
Things become more difficult when only a part of the information is hidden. In the case of a Bell pair, where each particle has maximal entropy, we often read that the whole information is in the correlations. Pati writes that information can be neither created nor destroyed, but I have never seen something like an energy-style balance: at the beginning we had a total information of 10 bits here and here and here, and at the end we have 3.5 bits here in the correlations between a and b and c and ..., and 6.5 bits in particles p and q and ...
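For one Bell pair, a rough numerical version of the balance I have in mind (assuming the usual von Neumann entropy, in bits) looks like this:

```python
# Information balance for a single Bell pair: each particle alone has maximal
# entropy (1 bit), the pair as a whole has entropy 0, and the 2 bits of
# mutual information S(A) + S(B) - S(AB) are the part that sits "in the
# correlations".
import numpy as np

def von_neumann_entropy(rho):
    """Entropy in bits, ignoring zero eigenvalues."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-(evals * np.log2(evals)).sum())

# Bell state |phi+> = (|00> + |11>)/sqrt(2)
phi = np.zeros(4, dtype=complex)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho_AB = np.outer(phi, phi.conj())

# Partial traces give the reduced states of particles A and B.
rho4 = rho_AB.reshape(2, 2, 2, 2)     # indices (a, b, a', b')
rho_A = np.einsum('abcb->ac', rho4)   # trace out B
rho_B = np.einsum('abac->bc', rho4)   # trace out A

S_A, S_B, S_AB = map(von_neumann_entropy, (rho_A, rho_B, rho_AB))
print(S_A, S_B, S_AB)        # 1.0  1.0  0.0
print(S_A + S_B - S_AB)      # 2.0 bits "in the correlations"
```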
 
  • #49
atyy said:
A pure state is an extreme point in a convex set of states. Hardy gives examples of pure states in Eq 2.

I don't know if this is true in general, but in both classical physics and in quantum mechanics, it means that within the theory, a pure state can be taken to be the complete state of a single object. Then an ensemble in which 50% of the objects are in pure state A and 50% are in pure state B is said to be in a mixed state.

In classical physics, relying on classical probability, the convex set of states is a simplex. In quantum mechanics, the convex set of states is not a simplex (usually drawn as a circle).

OK, but can a person interpret Hardy's words as a stand-alone document? Does his paper really describe a precise model? (I wonder if people who claim to interpret his paper actually interpret what he wrote, or do they have the usual approach to quantum mechanics so much in the back of their minds that they just make a "free association" on the phrases that appear in it. Do they think "Oh, he's really talking about ..." and substitute in a different model? Do other papers in the Parade Of Links for this thread have similar problems?)

(Hardy's lecture "Reconstructing quantum theory from reasonable postulates" and other lectures are available at http://pirsa.org/index.php?p=speaker&name=Lucien_Hardy. )Hardy writes about the "N distinguishable states" and later in the paper says these are the "pure" states. That conflicts with my interpretation in previous post that the N distinguishable states are those with distinguishable state vectors. We we take the state vectors of the N distinguishable states by my definition and form their convex hull then the corners are the "pure states". But would it follow that these mathematically defined corners represent states that can actually be output by the Preparation device?

Hardy says in a lecture that the Preparation device may emit "composite" systems (on some knob settings). As far as I can see, the Preparation device might also emit some mixed states - just as long as running over all possible knob settings on it only produces a finite number N of distinguishable states - however those are to be defined.
 
  • #50
If there is information, it is information about something. Else, I would not name it information. So, information IMHO presupposes the existence of something, else it would be meaningless. Thus, it is something derived from real existence. So, "bit is about it", which makes "it from bit" circular.
Moreover, information is always stored in something which really exists. This storage is, of course, something completely different than what the information is about. The nice pictures on the stick are usually not pictures of the stick. But, nonetheless, this is a second direction where the bit is impossible without a preexisting it.

So I would clearly reject any attempts to consider information as fundamental.

On the other hand, I think we should learn the lessons of the interpretation of statistical physics. There we have the frequency interpretation, where probability theory looks physical, with frequencies defined by reality, by physical law, and entropy being something which can be measured. Against this, we have the Bayesian interpretation, which derives entropy and frequencies and all of statistical physics from the available information. The second approach seems to me much more justified (ergodicity, even if it could be proven, and usually it isn't, fails to justify the statistics because of the astronomical time which would be necessary to obtain it), and it has a much wider domain of applicability (it is much more natural to apply it in non-equilibrium situations). Here, the error was to interpret as real and physical something which is in fact not about reality but about our information about reality.

To correct this, we have to make a shift in the interpretation of, in particular, entropy: from something real to information about something real. That means, from it to bit. After this, entropy is no longer something real, but information. And I think a similar shift is necessary also in the interpretation of quantum theory.

But, note: entropy in the Bayesian approach does not become some "pure information". It remains information about something, namely information about the real configuration of the system, which exists and is even well-defined by the equations; it is simply unknown, with only very restricted information available about it.

And this reality is what I miss in the "it from bit" concept.
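As a toy illustration of the Bayesian / maximum-entropy point (with invented numbers): the distribution, and hence the entropy, is fixed entirely by the available information, here a single constraint on the mean "energy" of a three-level system.

```python
# Maximum-entropy toy example: given only a mean-energy constraint, the
# least-biased distribution is Boltzmann-like, and its entropy quantifies
# the information we lack about the real configuration. Numbers are invented.
import numpy as np
from scipy.optimize import brentq

energies    = np.array([0.0, 1.0, 2.0])   # hypothetical level energies
mean_energy = 0.6                          # the only information we assume

def boltzmann(beta):
    w = np.exp(-beta * energies)
    return w / w.sum()

# Solve <E>(beta) = mean_energy; MaxEnt with a mean constraint is exponential.
beta = brentq(lambda b: boltzmann(b) @ energies - mean_energy, -50.0, 50.0)
p = boltzmann(beta)

entropy = -(p * np.log(p)).sum()
print(p, entropy)   # the least-biased distribution compatible with the data
```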
 
  • #51
Ilja said:
If there is information, it is information about something. Else, I would not name it information. So, information IMHO presupposes the existence of something, else it would be meaningless. Thus, it is something derived from real existence. So, "bit is about it", which makes "it from bit" circular.
Moreover, information is always stored in something which really exists. This storage is, of course, something completely different than what the information is about. The nice pictures on the stick are usually not pictures of the stick. But, nonetheless, this is a second direction where the bit is impossible without a preexisting it. ... And this reality is what I miss in the "it from bit" concept.

Precisely Ilja! This is exactly the discussion I was hoping for. This seems to be the intuitively obvious position that I've always held myself.
Yet, I struggle with being able to conceptualize an objective, substantive "it" that is consistent with the physical action described by quantum physics (at least to the feeble degree that I understand it).

So, again, I was hoping for atyy to clarify what he meant by his statement that "information" is "physical".
 
  • #52
atyy said:
But perhaps one should also remember that "information is physical" :)

I don't mean to press, but I know that you're posting on multiple threads which take your attention. So I just wanted to bump this thread in the hope that you would take a little time to explain what you meant.

The definition of *physical* (other than the biological meanings) according to dictionary.reference.com is..."of or relating to that which is material:
the physical universe; the physical sciences."

The freedictionary.com is slightly more inclusive, offering two (nonbiological) definitions that might apply... "3. Of or relating to material things: a wall that formed a physical barrier; the physical environment.
4. Of or relating to matter and energy or the sciences dealing with them, especially physics."

All of these refer to "material" existence with respect to being something *physical*. In what manner do you view information as being physical?
 
  • #53
I suspect that there is an elephant in this particular room, namely the distinction between what we are and what we do.

Just as an elephant is an animal, so are we. And, although elephants do communicate well enough for elephant purposes, we excel in this respect -- as in this interesting thread -- having invented various languages to serve the human purpose of exchanging 'information'. But a language, even quantitative mathematics, is only a mental construct; not something as physical as, say, a brick, despite the way we physically represent it as 'squiggles on paper' or binary bits.

That's why I also
Feeble Wonk # 42 said:
... struggle with the conceptualization of physical existence (as) consisting of only information.
Could this concept be just a bit of human foolishness?
 
  • #54
Ilja said:
If there is information, it is information about something. ...
Moreover, information is always stored in something which really exists.
But the basis of matter is the quantum mechanical wave function, which seems to be a probabilistic creature by nature. So it seems the basis of reality is probabilistic. What is the wavefunction a distribution of, if not pure possibility from which we get information?
 
  • #55
Paulibus said:
Just as an elephant is an animal, so are we. And, although elephants do communicate well enough for elephant purposes, we excel in this respect -- as in this interesting thread -- having invented various languages to serve the human purpose of exchanging 'information'. But a language, even quantitative mathematics, is only a mental construct; not something as physical as, say, a brick, despite the way we physically represent it as 'squiggles on paper' or binary bits.
Very well written Paulibus. I would agree that, intuitively, the assertion that information (and only information) is the fundamental essence of *physical* existence would appear on its face to be utter "human foolishness".
Yet, having said that, I'd also suggest that there is a fundamental difference between human spoken/written language, which is a human creation, and quantitative mathematics, which is not. I've often heard it said that Newton and/or Leibniz created "The Calculus". But that's sheer silliness. It's like claiming that some ancient pebble pusher created "The Addition"... as if 2+2 had not equaled 4 prior to that. At best, Newton and/or Leibniz "discovered" calculus. Or you could say that they developed the mathematical "language" to manipulate the formulas that represent the underlying mathematics itself. However, the logical and quantitative relationships expressed by the mathematical "language" simply are what they are because they are what they are. That self-referential consistency, which appears to be reflected in nature, gives me sufficient pause not to reflexively give in to my intuitive inclinations.
 
  • #56
I'm afraid I agree with the ancient pebble pusher. Even at the risk of being thought silly, I resist the proposition that "two and two make four" can be characterised as some sort of eternal truth, and prefer to think of calculus as an evolved and heroic human invention; certainly not as a complex of discoveries. When I walk in the woods I don't expect an abstract descriptive label like a number to jump out of a bush and bite my leg, as it were. I maintain that abstractions are invented, however cleverly, and not discovered; and that it's only long familiarity that tempts us to confuse abstract concepts with real things. Perhaps a matrix is more easily recognised as an abstraction than a counting number formula? I see mathematical language not as a compendium of relationships that 'simply are what they are because they are what they are', but as a human construct that wonderfully serves to usefully describe the physical situation we find ourselves in. Viva mathematics, viva!
 
  • #57
Paulibus said:
I see mathematical language not as a compendium of relationships that 'simply are what they are because they are what they are', but as a human construct that wonderfully serves to usefully describe the physical situation we find ourselves in. Viva mathematics, viva!
I certainly didn't mean to belittle the accomplishments of mathematicians throughout history. On the contrary, advanced mathematics, particularly its application in the physical sciences, would have to be considered one of the pinnacles of human intellectual achievement.
Yet, as miraculous as that achievement is, it still seems to me that what they have done is to recognize, decipher and manipulate the extant mathematical patterns, not create them. Can you tell me that the ancient brute, before our pebble pusher, when holding two rocks in one hand and two in the other, was not holding four rocks?
 
  • #58
Again, what we actually are (walking, talking, and now writing primates) is key here. We describe what matters to us because we can. In your example the ancient brute created a four-rock pattern, which you then described with the help of the extant language of arithmetic; an ancient, abstract human construct, not an eternal truth that always existed waiting to be recognised. Mathematics is revered because it has a predictive and therefore verifiable character, which helps amazingly with living, prospering and surviving in this physically complex universe, so strangely equipped with past, present and future. But I think that mathematical patterns are 'only' intangible constructs of our minds, rather than tangible realities. As they say in France, à chacun son goût.
 
  • #59
Paulibus said:
I suspect that there is an elephant in this particular room, namely the distinction between what we are and what we do.

Just as an elephant is an animal, so are we. And, although elephants do communicate well enough for elephant purposes, we excel in this respect -- as in this interesting thread -- having invented various languages to serve the human purpose of exchanging 'information'. But a language, even quantitative mathematics, is only a mental construct; not something as physical as, say, a brick, despite the way we physically represent it as 'squiggles on paper' or binary bits.

That's why I also
[as Feeble Wonk # 42 said:]
"... struggle with the conceptualization of physical existence (as) consisting of only information."
Could this concept be just a bit of human foolishness?
I think it's fair to be skeptical of ideas about what existence IS or CONSISTS of. But I wouldn't object to the idea that "physics is ABOUT information".
Physics is about measurement and interaction, which are exchanges of information. Time is about changing from one quantum state to another, and this is a change of information. Entropy is information that is unavailable or irrelevant to the observer. The idea of "observer" is an information-theoretic idea. Rovelli channeled Bohr when he said "we are not concerned with what Nature IS but with how she responds to measurement", or something like that. Theories do not say what Nature IS; they predict, which is again information.
So maybe we can throw this idea of nature "consisting" of information into the garbage.

OK, physics is ABOUT information -- we all know that, and it is not a new idea, but that is not the same thing as "Nature consists of information".

Excuse me if I am talking vaguely and haven't studied the thread enough. Just saw a couple of posts that I liked, and wanted to say something.
 
  • #60
Paulibus said:
... prefer to think of calculus as an evolved and heroic human invention; certainly not as a complex of discoveries.
... viva!
I think that is right. And the idea of numbers as mental constructs agrees with how numbers appear in the foundations of mathematics. In the set-theoretic picture, the cardinal number 3 can be taken to be the collection of all sets with three elements: there is a one-to-one mapping between a set of three tigers and a set of three lions, so both of those sets belong to the cardinal number 3.
And the (von Neumann) ordinal number 3 is {∅, {∅}, {∅, {∅}}}.
If S is an ordinal number, you get the NEXT ordinal by forming the set S ∪ {S}, i.e. by adjoining S itself as an element to S. You can see how the ordinal number 3 is formed, by taking the number 2 = {∅, {∅}} and forming the next ordinal after that. One can also represent the ordinals as a sequence of tree graphs.
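A small sketch of that construction, with ordinals as nested frozensets (my own toy code, just to show the successor rule in action):

```python
# Von Neumann ordinals as nested frozensets: 0 = {}, succ(S) = S U {S}.
def succ(S):
    """The next ordinal after S."""
    return frozenset(S) | {frozenset(S)}

zero  = frozenset()
one   = succ(zero)    # { {} }
two   = succ(one)     # { {}, {{}} }
three = succ(two)     # { {}, {{}}, {{}, {{}}} }

# Each ordinal built this way has exactly that many elements.
print(len(one), len(two), len(three))   # 1 2 3
```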

Clearly the numbers, in mathematics, are not "discovered". They did not jump from behind a bush and bite Pythagoras on the leg as he was ambling through the woods in Magna Graecia, as in Paulibus's example.

However it is just possible that some aliens orbiting a nearby star, perhaps only 1000 light years from here, who were busy developing their civilization, could ALSO have thought up numbers. If they have thought of axiomatic set theory, all the better! It could be a bond between us, so that love, or at least toleration, could grow up between intelligent (to use a flattering term) species.
 
