Undergrad Von Neumann entropy for "similar" pvm observables

SUMMARY

The thread asks about a von Neumann entropy "of an observable," defined as ##S=-\sum\lambda\log\lambda##, where ##\lambda## ranges over the eigenvalues of the observable. For two projection-valued measure (PVM) observables ##A## and ##B## that share the same resolution of the identity but have different eigenvalues, with ##\lambda_{A_i}>\lambda_{B_i}## throughout, this formula gives ##s_A>s_B##. The responses point out that the von Neumann entropy is fundamentally an entropy of a quantum state, represented by a positive density matrix ##\rho## satisfying the normalization condition ##{\rm Tr}\,\rho=1##. The entropy is therefore independent of the choice of observable and reflects the information content of the quantum state itself.

PREREQUISITES
  • Understanding of von Neumann entropy and its mathematical formulation
  • Familiarity with projection-valued measures (PVMs) in quantum mechanics
  • Knowledge of density matrices and their properties
  • Basic concepts of quantum state representation and Hilbert spaces
NEXT STEPS
  • Study the mathematical derivation of von Neumann entropy in quantum mechanics
  • Explore the role of projection-valued measures (PVMs) in quantum measurements
  • Investigate the relationship between density matrices and quantum state information
  • Learn about Shannon-Jaynes entropy and its application in quantum information theory
USEFUL FOR

Quantum physicists, researchers in quantum information theory, and students studying advanced quantum mechanics will benefit from this discussion, particularly those interested in the properties of quantum states and measurements.

forkosh
TL;DR
Why is the entropy different if the PVMs use the same resolution of the identity?
The von Neumann entropy for an observable can be written ##s=-\sum\lambda\log\lambda##, where the ##\lambda##'s are its eigenvalues. So suppose you have two different PVM observables, say ##A## and ##B##, that both represent the same resolution of the identity but simply have different eigenvalues, with ##\lambda_{A_i}>\lambda_{B_i}## always. Then ##s_A>s_B##, but why should that be?

If they both represent the same resolution of the identity, then exactly the same experimental apparatus measures them both. Just change the labels on the pointer dial from the ##A##-values to the ##B##-values. For example, the ##A##-measurement could be mass in grams, whereas ##B## is simply in kilograms. Why should the entropy of those two measurements be any different?
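To make the puzzle concrete, here is a minimal numerical sketch, with made-up eigenvalues (the 700 g / 0.7 kg values are purely illustrative, not from the thread): the "entropy of an observable" formula above is not invariant under rescaling the eigenvalues, which is exactly the grams-versus-kilograms worry.

```python
import numpy as np

# Hypothetical eigenvalues of the same two-outcome apparatus, read off
# in grams (A) and in kilograms (B = A / 1000).
lam_A = np.array([700.0, 300.0])   # grams
lam_B = lam_A / 1000.0             # kilograms

def naive_entropy(lam):
    """The 'entropy of an observable' from the question: -sum(lam * log(lam))."""
    return -np.sum(lam * np.log(lam))

print(naive_entropy(lam_A))  # one value in grams...
print(naive_entropy(lam_B))  # ...a different value in kilograms, same apparatus
```

The relabeled dial gives a different number, which is the tension the replies below resolve: the formula is meant for the eigenvalues of a density matrix, which are probabilities summing to 1, not for arbitrary measurement outcomes.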
 
There is no such thing as von Neumann entropy of an observable. The von Neumann entropy is an entropy of a state, represented by a positive density matrix ##\rho## which satisfies ##{\rm Tr}{\rho}=1##. Due to the latter condition, it's impossible that all eigenvalues of one ##\rho## are larger than all eigenvalues of another ##\rho## in the same Hilbert space.
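A quick numerical illustration of that normalization constraint (the random-matrix construction is my own sketch, not from the thread): because every density matrix has nonnegative eigenvalues summing to 1, the spectrum of one ##\rho## can never dominate another's entry by entry.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_density_matrix(d):
    """A random d-dimensional density matrix: positive semidefinite, trace 1."""
    G = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    rho = G @ G.conj().T          # G G^dagger is positive semidefinite
    return rho / np.trace(rho)    # normalize so Tr(rho) = 1

rho1, rho2 = random_density_matrix(3), random_density_matrix(3)
ev1 = np.linalg.eigvalsh(rho1)   # eigenvalues, ascending
ev2 = np.linalg.eigvalsh(rho2)
print(ev1.sum().real, ev2.sum().real)  # both 1 by construction
```

Since both spectra sum to 1, having every eigenvalue of `rho1` strictly exceed the corresponding eigenvalue of `rho2` would force its trace above 1, contradicting the normalization.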
 
The von Neumann entropy is simply
$$S=-k_{\text{B}} \langle \ln \hat{\rho} \rangle=-k_{\text{B}} \mathrm{Tr}[\hat{\rho} \ln \hat{\rho}],$$
and this is independent of any choice of basis or which observable is being measured. It gives you the missing information, given the state ##\hat{\rho}##. You have "full information" in the information-theoretical sense when using the von Neumann entropy (which is the Shannon-Jaynes entropy for quantum theory) iff ##\hat{\rho}=|\psi \rangle \langle \psi|##, i.e., if the state is pure.
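A small sketch of that formula (with ##k_{\text{B}}=1## and example states of my own choosing): the entropy depends only on the spectrum of ##\hat{\rho}##, so it is unchanged under a change of basis, vanishes for a pure state, and equals ##\ln 2## for a maximally mixed qubit.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S = -Tr[rho ln rho] with k_B = 1, computed from the eigenvalues of rho."""
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]        # 0 * log(0) -> 0 by convention
    return -np.sum(lam * np.log(lam))

pure = np.array([[1.0, 0.0], [0.0, 0.0]])   # |0><0|, a pure state
mixed = np.eye(2) / 2                       # maximally mixed qubit

print(von_neumann_entropy(pure))    # 0: full information
print(von_neumann_entropy(mixed))   # ln 2: one bit of missing information

# Basis independence: conjugating by a unitary (here a rotation) leaves S unchanged.
theta = 0.3
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
print(von_neumann_entropy(U @ pure @ U.T))  # still 0, up to floating-point error
```

This is why relabeling the pointer dial cannot change the entropy: the eigenvalues that enter the formula are the probabilities carried by the state, not the outcome values printed on the dial.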
 
