Is Information Truly Conserved in Black Hole Dynamics?

In summary, the conversation discusses information conservation in the context of physics. The original poster expresses frustration with the dismissive attitude towards the topic in the Physics Forums (PF) community and asks for a scientific definition of information suitable for understanding conservation and the black hole paradox, citing an article by Ethan Siegel that renewed their interest. The conversation then shifts to the Information Theory wiki article and the difficulty of defining and measuring information in the context of physics, with the poster pressing for a more rigorous definition.
  • #1
anorlunda
Searches of PF archives for Information Conservation turn up many threads in which the question is more or less dismissed. It sounds like many PF regulars don't believe in it, as if it were fringe science or metaphysics. Contrast that with the prominence given to the recent thread Stephen Hawking offers new resolution of black hole paradox. Pity the poor layman. It is hard to imagine scientific views being more polarized on a physics topic.

My interest in this topic was renewed by this article by Ethan Siegel that I saw today. It said:

The problem with black holes is with the information that goes into them. In the Universe as we understand it, there are certain properties of matter and energy that contain information. A particle like a proton or an electron contains not only a mass, an electric charge and a spin, but also other quantum properties like baryon number, lepton number, weak hypercharge, color charge, and quantum entanglements connecting one particle to another. If you form a black hole out of a whole host of particles (it normally takes some 10^58 of them), each with their own unique properties, including not just protons and electrons but also neutrons, photons, neutrinos, antineutrinos, positrons and more, you expect that information to meaningfully make its way into the black hole.
I would like to start simple. The phrase most often heard in dismissals of the question in PF threads seems to be "It depends on how you define information." The next most common is a diversion into discussions of entropy. Therefore, before asking questions about information conservation, I need a scientific consensus definition of information suitable to the context of conservation and the black hole paradox. Can someone please help me with this?

A PF Insights article on Information would be very welcome if a simple reply to this thread is not enough. Even a link to an external layman's article would be welcome.
 
  • #2
anorlunda said:
a scientific consensus definition of information suitable to the context of conservation and the black hole paradox.
Is there anything in particular missing from https://en.wikipedia.org/wiki/Information_theory ? The topic intrigues me, and I'm not above being tutored if you/we/I can put together meaningful questions for the resident experts.
 
  • #3
Bystander said:
Is there anything in particular missing from https://en.wikipedia.org/wiki/Information_theory ? The topic intrigues me, and I'm not above being tutored if you/we/I can put together meaningful questions for the resident experts.

Thanks. After several hours, yours has been the only reply. The wiki article talks about measuring information, encoding it, and manipulating it. But a bit is to information as a kg is to mass. Defining a kg does not define mass.

The subject that really interests me is that information is said to be conserved in physical processes, analogous to energy conservation. I wanted a physics definition of information, not a human cognitive definition.
 
  • #4
Wiki does have an article on information, and has this to say:

At its most fundamental, information is any propagation of cause and effect within a system. Information is conveyed (https://en.wikipedia.org/wiki/Conveyed_concept) either as the content of a message or through direct or indirect observation of some thing. That which is perceived can be construed as a message in its own right, and in that sense, information is always conveyed as the content of a message.
 
  • #5
Drakkith said:
Wiki does have an article on information, and has this to say:

At its most fundamental, information is any propagation of cause and effect within a system. Information is conveyed (https://en.wikipedia.org/wiki/Conveyed_concept) either as the content of a message or through direct or indirect observation of some thing. That which is perceived can be construed as a message in its own right, and in that sense, information is always conveyed as the content of a message.

That doesn't seem very easy to measure. There could be aspects of an observation that I consider to be information and that you do not consider to be information. Is there a more rigorous definition in the context of physics? Unfortunately, I have a feeling that the Information Theory wiki article linked earlier is going to be as good as it gets.
 
  • #6
DocZaius said:
That doesn't seem very easy to measure. There could be aspects of an observation that I consider to be information and that you do not consider to be information.

Is it that, or is it just information that I don't want?

Is there a more rigorous definition in the context of physics?

No idea.
 
  • #7
Drakkith said:
No idea.

Thank you, Drakkith. That explains the lack of responses to this thread. But it also means that the many people discussing the featured thread about Stephen Hawking and the black hole information paradox have no definition of what they are talking about.
 
  • #8
DocZaius said:
Is there a more rigorous definition in the context of physics?

The rigorous definition of information in physics is essentially the same as the information theoretic definition. If I get a chance I'll grab my Nielsen and Chuang, but from memory it is the negative log of the trace over the system's density matrix(?).
 
  • #9
DrewD said:
The rigorous definition of information in physics is essentially the same as the information theoretic definition. If I get a chance I'll grab my Nielsen and Chuang, but from memory it is the negative log of the trace over the system's density matrix(?).

Thank you. And is that quantity conserved with unitary evolutions?
 
  • #10
anorlunda said:
Thank you. And is that quantity conserved with unitary evolutions?

Don't thank me too quickly. I made a few errors. The information is essentially defined the same way as the (information theoretic) entropy, which makes sense because high entropy systems require more resources to describe (they cannot be compressed as easily). This is ##-\mathrm{tr}\left(\rho\ln\rho\right)##. Since the trace is invariant under cyclic permutations of a matrix product, this should be conserved under unitary transformations, I think. Look up von Neumann entropy and read more about Shannon entropy if this doesn't make sense.

To be honest, I am not completely sure that this is the same definition that the big guns are talking about when they talk about black holes, but I think that it probably is. I recall seeing Susskind talking about BH information and mentioning the similarity to Shannon Entropy.
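
As a quick numerical sanity check of the invariance claim above, here is a minimal sketch using NumPy (the random test state and the variable names are just mine for illustration):

Python:
import numpy as np

def von_neumann_entropy(rho):
    # S(rho) = -tr(rho ln rho), computed from the eigenvalues of rho
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # drop numerically zero eigenvalues
    return -np.sum(evals * np.log(evals))

# A random mixed state on a 4-dimensional Hilbert space (two qubits)
A = np.random.randn(4, 4) + 1j * np.random.randn(4, 4)
rho = A @ A.conj().T
rho /= np.trace(rho).real

# A random unitary from the QR decomposition of a complex Gaussian matrix
Q, _ = np.linalg.qr(np.random.randn(4, 4) + 1j * np.random.randn(4, 4))
rho_evolved = Q @ rho @ Q.conj().T  # "unitary evolution" of the state

print(von_neumann_entropy(rho), von_neumann_entropy(rho_evolved))
# The two numbers agree to rounding error: S(rho) is unchanged by any unitary Q.

Running it with a few different random ##Q##'s gives the same result, which is just the cyclic-trace argument in numerical form.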
 
  • #11
anorlunda said:
Thank you. And is that quantity conserved with unitary evolutions?

Forget about the rest of the posts. In the context of the black hole information paradox, "information loss" just means non-unitary evolution.

There may be a link to the quantum mutual information, which is identical to the zero-temperature entanglement entropy, but that is still very much a subject of research.

http://arxiv.org/abs/1104.3712
http://arxiv.org/abs/0704.3906v2
 
  • #12
atyy said:
In the context of the black hole information paradox, "information loss" just means non-unitary evolution.

I'm confused about what you are saying here. Are you saying the information has something to do with non-unitary evolution? I thought that the issue is that in QM information is conserved by unitary evolution and something similar is expected from a BH. Are you saying that the paradox has to do with black holes causing non-unitary evolution? I should probably stop guessing what you are saying and just wait.
 
  • #13
DrewD said:
I'm confused about what you are saying here. Are you saying the information has something to do with non-unitary evolution? I thought that the issue is that in QM information is conserved by unitary evolution and something similar is expected from a BH. Are you saying that the paradox has to do with black holes causing non-unitary evolution? I should probably stop guessing what you are saying and just wait.

Time evolution in quantum mechanics is unitary. That is broken by quantum black holes. That is all that "information loss" means in that context.
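
For anyone following along who is rusty on the jargon, the standard statement being assumed here (ordinary quantum mechanics, nothing specific to black holes) is that the state at time ##t## comes from the initial state by a unitary operator,
$$|\psi(t)\rangle = U(t)\,|\psi(0)\rangle, \qquad U^\dagger(t)\,U(t) = 1,$$
so inner products, and with them the distinguishability of states, are preserved:
$$\langle\phi(t)|\psi(t)\rangle = \langle\phi(0)|U^\dagger(t)\,U(t)|\psi(0)\rangle = \langle\phi(0)|\psi(0)\rangle.$$
"Information loss" is then the statement that black hole formation and evaporation cannot be described by any such ##U##.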
 
  • #14
I agree. I was always confused about why we should assume only unitary evolution, other than the fact that it is a postulate of QM. If I recall correctly, the von Neumann entropy would decrease if a number of states mapped to a single state, and that would cause energy issues. It has been a while since this was something I thought about, so my memory may be doing me a disservice.
 
  • #15
DrewD said:
I agree. I was always confused about why we should assume only unitary evolution, other than the fact that it is a postulate of QM. If I recall correctly, the von Neumann entropy would decrease if a number of states mapped to a single state, and that would cause energy issues. It has been a while since this was something I thought about, so my memory may be doing me a disservice.

You already got the definition of information wrong above. It is not the same as entropy. Classically, information is a difference in entropy. It is more closely related to a relative entropy than to an entropy.
 
  • #16
So in the context of the black hole information paradox, there are at least 3 different definitions of "information" and/or "entropy".

(1) In the context of the black hole information paradox, the most basic definition of "information loss" is "non-unitary evolution".

(2) Because Hawking radiation has a thermal spectrum with a well-defined temperature, we can give it a thermodynamic entropy. We are of course tempted to associate it with a statistical mechanical entropy in the sense of "number of microscopic states equivalent to a macroscopic state". Whether this can be done is a matter of research. A celebrated calculation showing that this is the case for some black holes in string theory was carried out by Strominger and Vafa.

(3) The entanglement entropy is related to the quantum mutual information, and is identical to it at zero temperature. Is the black hole thermodynamic entropy an entanglement entropy? This is also still being researched.

As a note, in classical information theory, it is the statistical mechanical entropy (Boltzmann-Gibbs-Shannon entropy) that is used. The statistical mechanical entropy is (the logarithm of) the number of microscopic states consistent with the macroscopic state, so it is in some sense an uncertainty about the microscopic state. The mutual information is (roughly) a difference in statistical mechanical entropies or uncertainties, so one can think of gaining information as a reduction in uncertainty.
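
To make the last two posts concrete, here is a small classical toy example, a sketch with NumPy in which the joint distribution is just made-up numbers. It shows the mutual information both as a difference of entropies (uncertainties) and as a relative entropy, namely the Kullback-Leibler divergence between the joint distribution and the product of its marginals:

Python:
import numpy as np

def shannon_entropy(p):
    # H(p) = -sum p ln p, in nats
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# A made-up joint distribution p(x, y) over two binary variables
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

H_x = shannon_entropy(p_xy.sum(axis=1))   # marginal entropy H(X)
H_y = shannon_entropy(p_xy.sum(axis=0))   # marginal entropy H(Y)
H_xy = shannon_entropy(p_xy)              # joint entropy H(X,Y)

# Mutual information as a difference of entropies: I(X;Y) = H(X) + H(Y) - H(X,Y)
I_diff = H_x + H_y - H_xy

# The same quantity as a relative entropy: KL( p(x,y) || p(x)p(y) )
p_prod = np.outer(p_xy.sum(axis=1), p_xy.sum(axis=0))
I_kl = np.sum(p_xy * np.log(p_xy / p_prod))

print(I_diff, I_kl)  # both about 0.19 nats: learning Y reduces uncertainty about X

Whether and how this classical picture carries over to the quantum and black hole cases is exactly the research question mentioned above.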
 
