# Please Define Information

1. Sep 7, 2015

### Staff: Mentor

Searches of PF archives for Information Conservation turn up many threads in which the question is more or less dismissed. It sounds like many PF regulars don't believe in it, as if it were fringe science or metaphysics. Contrast that with the prominence given to the recent thread Stephen Hawking offers new resolution of black hole paradox. Pity the poor layman. It is hard to imagine scientific views being more polarized on a physics topic.

My interest in this topic was renewed by this article by Ethan Siegel that I saw today. It said:

The problem with black holes is with the information that goes into them. In the Universe as we understand it, there are certain properties of matter and energy that contain information. A particle like a proton or an electron contains not only a mass, an electric charge and a spin, but also other quantum properties like baryon number, lepton number, weak hypercharge, color charge, and quantum entanglements connecting one particle to another. If you form a black hole out of a whole host of particles (it normally takes some 10^58 of them), each with their own unique properties, including not just protons and electrons but also neutrons, photons, neutrinos, antineutrinos, positrons and more, you expect that information to meaningfully make its way into the black hole.
I would like to start simple. The phrase most often heard in dismissals of the question in PF threads seems to be "It depends on how you define information." The next most common is a diversion into discussions of entropy. Therefore, before asking questions about information conservation, I need a scientific consensus definition of information suitable to the context of conservation and the black hole paradox. Can someone please help me with this?

A PF Insights article on Information would be very welcome if a simple reply to this thread is not enough. Even a link to an external layman's article would be welcome.

Last edited by a moderator: May 7, 2017
2. Sep 9, 2015

### Bystander

Is there anything in particular missing from https://en.wikipedia.org/wiki/Information_theory ? The topic intrigues me, and I'm not above being tutored if you/we/I can put together meaningful questions for the resident experts.

3. Sep 9, 2015

### Staff: Mentor

Thanks. Yours has been the only reply. The wiki article talks about measuring information, encoding it, and manipulating it. But a bit is to information as a kg is to mass: defining a kg does not define mass.

The subject that really interests me is that information is said to be conserved in physical processes, analogous to energy conservation. I wanted a physics definition of information, not a human cognitive definition.

4. Sep 9, 2015

### Staff: Mentor

Wiki does have an article on information, and has this to say:

At its most fundamental, information is any propagation of cause and effect within a system. Information is conveyed either as the content of a message or through direct or indirect observation of some thing. That which is perceived can be construed as a message in its own right, and in that sense, information is always conveyed as the content of a message.

5. Sep 9, 2015

### DocZaius

That doesn't seem very easy to measure. There could be aspects of an observation that I consider to be information and that you do not. Is there a more rigorous definition in the context of physics? Unfortunately, I have a feeling that the Information Theory wiki article linked earlier is going to be as good as it gets.

6. Sep 9, 2015

### Staff: Mentor

Is it that, or is it just information that I don't want?

No idea.

7. Sep 9, 2015

### Staff: Mentor

Thank you, Drakkith. That explains the lack of responses to this thread. But it also means that the many people discussing the featured thread about Stephen Hawking and the black hole information paradox have no definition of what they are talking about.

8. Sep 9, 2015

### DrewD

The rigorous definition of information in physics is essentially the same as the information theoretic definition. If I get a chance I'll grab my Nielsen and Chuang, but from memory it is the negative log of the trace over the system's density matrix(?).

9. Sep 9, 2015

### Staff: Mentor

Thank you. And is that quantity conserved with unitary evolutions?

10. Sep 9, 2015

### DrewD

Don't thank me too quickly. I made a few errors. The information is essentially defined the same way as the (information-theoretic) entropy, which makes sense because high-entropy systems require more resources to describe (they can't be as easily compressed). This is $-\operatorname{tr}\left(\rho\ln\rho\right)$. Since the trace is cyclic, $\operatorname{tr}\left(U\rho U^\dagger\ln(U\rho U^\dagger)\right) = \operatorname{tr}\left(\rho\ln\rho\right)$, so this should be conserved under unitary transformations, I think. Look up von Neumann entropy and Shannon entropy if this doesn't make sense.

To be honest, I am not completely sure that this is the same definition that the big guns are talking about when they talk about black holes, but I think that it probably is. I recall seeing Susskind talking about BH information and mentioning the similarity to Shannon Entropy.
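The conservation claim above is easy to check numerically. A minimal sketch using NumPy, assuming a simple two-level (qubit) mixed state and a randomly generated unitary (both are illustrative choices, not anything specific to black holes):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -tr(rho ln rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]      # 0 * ln(0) -> 0 by convention
    return -np.sum(evals * np.log(evals))

# An example mixed qubit state with eigenvalues 0.75 and 0.25.
rho = np.diag([0.75, 0.25]).astype(complex)

# A random unitary U, from the QR decomposition of a random complex matrix.
rng = np.random.default_rng(0)
A = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
U, _ = np.linalg.qr(A)

# Unitary evolution rho -> U rho U^dagger leaves the eigenvalues,
# and hence the von Neumann entropy, unchanged.
rho_evolved = U @ rho @ U.conj().T
print(von_neumann_entropy(rho))          # ≈ 0.5623
print(von_neumann_entropy(rho_evolved))  # same value: entropy conserved
```

Because $U\rho U^\dagger$ is a similarity transformation, the eigenvalues of $\rho$ (and any function of them, including the entropy) are unchanged, which is what the printout confirms.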

11. Sep 9, 2015

### atyy

Forget about the rest of the posts. In the context of the black hole information paradox, "information loss" just means non-unitary evolution.

There may be a link to the quantum mutual information, which is identical to the zero-temperature entanglement entropy, but that is still very much a subject of research.

http://arxiv.org/abs/1104.3712
http://arxiv.org/abs/0704.3906v2

Last edited: Sep 9, 2015
12. Sep 9, 2015

### DrewD

I'm confused about what you are saying here. Are you saying the information has something to do with non-unitary evolution? I thought that the issue is that in QM information is conserved by unitary evolution and something similar is expected from a BH. Are you saying that the paradox has to do with black holes causing non-unitary evolution? I should probably stop guessing what you are saying and just wait.

13. Sep 9, 2015

### atyy

Time evolution in quantum mechanics is unitary. That is broken by quantum black holes. That is all that "information loss" means in that context.

Last edited: Sep 9, 2015
14. Sep 9, 2015

### DrewD

I agree. I was always confused about why we should assume only unitary evolution, other than the fact that it is a postulate of QM. If I recall correctly, the von Neumann entropy would decrease if a number of states mapped to a single state, and that would cause energy issues. It has been a while since I thought about this, so my memory may be doing me a disservice.
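The contrast between unitary and many-to-one evolution can be illustrated with a small sketch: a rotation (unitary) keeps two orthogonal states perfectly distinguishable, while a hypothetical many-to-one linear map sends both inputs to the same output, so the original state can no longer be recovered. The specific matrices here are made-up examples:

```python
import numpy as np

# Two perfectly distinguishable (orthogonal) qubit states.
psi0 = np.array([1, 0], dtype=complex)
psi1 = np.array([0, 1], dtype=complex)

# Unitary evolution (a rotation) preserves inner products, so the
# states stay orthogonal and remain distinguishable: nothing is lost.
theta = 0.7
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]], dtype=complex)
overlap_after_U = np.vdot(U @ psi0, U @ psi1)
print(abs(overlap_after_U))  # 0.0 (still orthogonal)

# A many-to-one map M sends both psi0 and psi1 to the same vector.
# M is not unitary (M† M != I), and the inputs become indistinguishable.
M = np.array([[1, 1],
              [0, 0]], dtype=complex)
print(M @ psi0, M @ psi1)  # identical outputs: the input is unrecoverable
```

This is the sense in which "non-unitary evolution" and "information loss" go together: a unitary map is invertible, so in principle the initial state can always be reconstructed from the final one.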

15. Sep 9, 2015

### atyy

You already got the definition of information wrong above. It is not the same as entropy. Classically, information is a difference in entropy. It is more related to a relative entropy, than to an entropy.
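The distinction between entropy and information-as-entropy-difference can be made concrete with a small sketch. The two distributions below are hypothetical; note that when the reference distribution is uniform, the relative entropy of the posterior with respect to it equals the drop in Shannon entropy exactly:

```python
import numpy as np

def shannon_entropy(p):
    """H(p) = -sum p log2(p), in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def relative_entropy(p, q):
    """D(p || q) = sum p log2(p/q): distance of p from reference q, in bits."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

prior = [0.5, 0.5]      # before an observation: maximal uncertainty, H = 1 bit
posterior = [0.9, 0.1]  # after: mostly resolved

gain = shannon_entropy(prior) - shannon_entropy(posterior)
print(gain)                                # ≈ 0.531 bits of uncertainty removed
print(relative_entropy(posterior, prior))  # ≈ 0.531 bits, same value here
```

So "information gained" is not the entropy of either distribution by itself, but the difference between them, which is the point being made above.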

16. Sep 9, 2015

### atyy

So in the context of the black hole information paradox, there are at least 3 different definitions of "information" and/or "entropy".

(1) In the context of the black hole information paradox, the most basic definition of "information loss" is "non-unitary evolution".

(2) Because Hawking radiation has a thermal spectrum with a well-defined temperature, we can give it a thermodynamic entropy. We are of course tempted to associate it with a statistical mechanical entropy in the sense of "number of microscopic states equivalent to a macroscopic state". Whether this can be done is a matter of research. A celebrated calculation showing that this is the case for some black holes in string theory was carried out by Strominger and Vafa.

(3) The entanglement entropy is related to the quantum mutual information, and is identical to it at zero temperature. Is the black hole thermodynamic entropy an entanglement entropy? This is also still being researched.

As a note, in classical information theory, it is the statistical mechanical entropy (Boltzmann-Gibbs-Shannon entropy) that is used. The statistical mechanical entropy is the number of microscopic states consistent with the macroscopic state, so it is in some sense an uncertainty about the microscopic state. The mutual information is (roughly) a difference in statistical mechanical entropy or uncertainty, such that one can think of gaining information as a reduction in uncertainty.
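The "mutual information as reduction in uncertainty" idea above can be sketched numerically. The joint distribution below, a fair bit copied with 10% noise, is a made-up example:

```python
import numpy as np

def H(p):
    """Shannon entropy in bits; works on joint tables too."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Joint distribution p(x, y): Y agrees with X 90% of the time.
# Rows index X, columns index Y.
pxy = np.array([[0.45, 0.05],
                [0.05, 0.45]])
px = pxy.sum(axis=1)  # marginal of X: [0.5, 0.5]
py = pxy.sum(axis=0)  # marginal of Y: [0.5, 0.5]

# Mutual information I(X;Y) = H(X) + H(Y) - H(X,Y): how much learning Y
# reduces the uncertainty about X.
I = H(px) + H(py) - H(pxy)
print(I)  # ≈ 0.531 bits
```

A noiseless copy would give the full 1 bit; completely independent variables would give 0. The 10% noise leaves about half a bit of information about X in Y.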

Last edited: Sep 9, 2015