Entropy, Relative entropy, Mutual information

  • #1 atyy (Science Advisor)
The Bekenstein-Hawking entropy is expected to arise from counting microstates, and in some cases it has been derived that way.

However, entropy is not defined for continuous probability densities, and so I have heard it said that relative entropy (of which the mutual information is a form) is more fundamental.

I think that in classical statistical mechanics the entropy is computed using canonical coordinates, since the phase space is continuous; this is one way to get around the need for discrete probability distributions.

In the context of string theory and quantum gravity, is entropy or the relative entropy more fundamental?

Some possibly related comments:
"the mutual information offers a more refined probe of the entanglement structure of quantum field theories because it remains finite in the continuum limit" http://arxiv.org/abs/1010.4038
 
  • #2
I would like to offer my perspective on the question of whether entropy or relative entropy is more fundamental in the context of string theory and quantum gravity.

Firstly, let's define what we mean by entropy and relative entropy. Entropy is a measure of the disorder or randomness in a system, and is typically calculated using the probability distribution of the system's states. In classical statistical mechanics, as mentioned in the forum post, this is done using canonical coordinates to deal with the continuous nature of the system's phase space.

On the other hand, relative entropy is a measure of the difference between two probability distributions. It is often used to compare the information content of two systems and is closely related to the concept of mutual information, which measures the amount of information shared between two systems.

In the context of string theory and quantum gravity, the Bekenstein-Hawking entropy measures the entropy of a black hole, and in some cases it has been derived by counting the black hole's quantum microstates. This suggests that entropy is a fundamental concept in these theories.
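For reference, the Bekenstein-Hawking formula ties this entropy to geometry: S_BH = k c³ A / (4 G ħ), i.e. one quarter of the horizon area A in Planck units. Reproducing this area law from a microscopic state count is what the derivations referred to above accomplish.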

However, as noted in the original post, entropy is not well-defined for continuous probability distributions, and in quantum field theory the entanglement entropy of a region diverges in the continuum limit. This is where relative entropy becomes important: the mutual information, which is a particular relative entropy, remains finite in the continuum limit and therefore offers a more refined probe of the entanglement structure of quantum field theories. This suggests that relative entropy may be the more fundamental concept in the context of string theory and quantum gravity.
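To make the divergence-versus-finiteness point concrete, here is a minimal numerical sketch (an illustration, not part of the original answer): discretize two Gaussian probability densities on a grid and refine the grid. The discrete Shannon entropy grows without bound (roughly like the log of the number of bins), while the relative entropy between the two distributions converges to a finite value, 1/2 nat for unit-variance Gaussians whose means differ by 1.

```python
import numpy as np

def shannon_entropy(p):
    """Discrete Shannon entropy H = -sum p_i ln(p_i), in nats."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p||q) = sum p_i ln(p_i / q_i), in nats."""
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

for n_bins in (10, 100, 1000, 10000):
    centers = np.linspace(-10, 10, n_bins)
    # Two unit-variance Gaussians with means 0 and 1, discretized and normalized
    p = np.exp(-centers**2 / 2.0)
    p /= p.sum()
    q = np.exp(-(centers - 1.0)**2 / 2.0)
    q /= q.sum()
    # H(p) keeps growing as the grid is refined; D(p||q) settles near 0.5 nats
    print(f"{n_bins:6d} bins   H = {shannon_entropy(p):7.3f}   D = {relative_entropy(p, q):.3f}")
```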

In conclusion, both entropy and relative entropy are important concepts in the study of string theory and quantum gravity. While entropy may be more fundamental in the context of black holes, relative entropy offers a more refined understanding of the entanglement structure of these theories. Further research and study are needed to fully understand the role of these concepts in the context of string theory and quantum gravity.
 

1. What is entropy?

Entropy is a measure of the amount of uncertainty or randomness in a system. In thermodynamics, it is a measure of the disorder or randomness of molecules in a system. In information theory, it is a measure of the average amount of information contained in a message.

2. How is entropy calculated?

In statistical thermodynamics, entropy is calculated using the formula S = k ln(W), where k is the Boltzmann constant and W is the number of microstates consistent with the system's macrostate. In information theory, entropy is calculated using the formula H = -Σ p(i) log(p(i)), where p(i) is the probability of outcome i; the base of the logarithm sets the unit (bits for base 2, nats for base e).
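As a concrete illustration (a minimal Python sketch, not part of the original answer), both formulas are straightforward to evaluate for a discrete system:

```python
import numpy as np

K_B = 1.380649e-23  # Boltzmann constant in J/K

def boltzmann_entropy(num_microstates):
    """Thermodynamic entropy S = k ln(W) for W equally likely microstates."""
    return K_B * np.log(num_microstates)

def shannon_entropy(probs, base=2.0):
    """Information entropy H = -sum p_i log(p_i); terms with p_i = 0 contribute nothing."""
    p = np.asarray(probs, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p)) / np.log(base)

print(boltzmann_entropy(2**10))        # ~9.57e-23 J/K for W = 1024 microstates
print(shannon_entropy([0.5, 0.5]))     # 1.0 bit for a fair coin
print(shannon_entropy([0.9, 0.1]))     # ~0.469 bits for a biased coin
```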

3. What is relative entropy?

Relative entropy, also known as Kullback-Leibler divergence, is a measure of the difference between two probability distributions. It indicates how one distribution differs from another and can be used to compare the amount of information contained in two messages.

4. How is relative entropy calculated?

Relative entropy is calculated using the formula D(p||q) = Σ p(i) log(p(i)/q(i)), where p(i) and q(i) are the probabilities that the two distributions assign to event i. It is important to note that relative entropy is not a symmetric measure: D(p||q) is in general not equal to D(q||p).
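A small Python sketch (illustrative only) makes both the formula and the asymmetry explicit:

```python
import numpy as np

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p||q) = sum p_i log(p_i / q_i), in nats.
    Assumes q_i > 0 wherever p_i > 0; otherwise the divergence is infinite."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

p = [0.7, 0.2, 0.1]
q = [0.4, 0.4, 0.2]
print(relative_entropy(p, q))  # ~0.184 nats
print(relative_entropy(q, p))  # ~0.192 nats, a different value: D(p||q) != D(q||p)
```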

5. What is mutual information?

Mutual information is a measure of the amount of information shared between two random variables. It indicates how much knowing the value of one variable reduces the uncertainty about the other. Formally, I(X;Y) is the relative entropy between the joint distribution p(x,y) and the product of the marginals p(x)p(y), which is why it was described above as a form of relative entropy. It is often used in machine learning and data compression.
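Since the mutual information is the relative entropy between the joint distribution and the product of its marginals, it can be computed with the same ingredients; here is an illustrative Python sketch for a pair of binary variables:

```python
import numpy as np

def mutual_information(joint):
    """I(X;Y) = D( p(x,y) || p(x) p(y) ), in nats, for a joint probability table."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)   # marginal over columns, p(x)
    py = joint.sum(axis=0, keepdims=True)   # marginal over rows, p(y)
    mask = joint > 0
    return np.sum(joint[mask] * np.log(joint[mask] / (px * py)[mask]))

# Two perfectly correlated bits share ln(2) ~ 0.693 nats (one full bit);
# two independent bits share nothing.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # ~0.693
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
```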
