# Are density matrices part of a real vector space?

Gere
TL;DR Summary
An attempt to do quantum mechanics in no more than a real vector space
Is the following a correct demonstration that quantum mechanics can be done in a real vector space?

If you simply stack the entries of density matrices into a column vector, then the expression ##\textrm{Tr}(AB^\dagger)## is the same as the dot product in a complex vector space (the Frobenius inner product). Therefore density matrices as objects can be seen as vectors with an inner product. Of course, not all vectors are valid density matrices, but only those with trace 1. However, as we will be interested only in inner products or convex combinations, we are not concerned about other vectors (hence "part of" a vector space).
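A quick numerical check of this identity, as a sketch in NumPy (the helper `random_density` is mine, not from the thread; `np.vdot` conjugates its first argument, matching ##\sum_{ij} A_{ij}\overline{B_{ij}}##):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_density(n):
    # Random valid density matrix: positive semidefinite with unit trace.
    M = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    rho = M @ M.conj().T
    return rho / np.trace(rho).real

A = random_density(3)
B = random_density(3)

# Frobenius inner product Tr(A B^dagger) ...
frobenius = np.trace(A @ B.conj().T)

# ... equals the complex dot product of the stacked (flattened) entries.
stacked = np.vdot(B.flatten(), A.flatten())

assert np.isclose(frobenius, stacked)
```

For Hermitian ##A## and ##B## this value is also real, which is what makes the later restriction to a real vector space possible.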

Since the density matrix is Hermitian, entries mirrored across the diagonal are complex conjugates of each other, so we can change basis with a similarity transformation to get an all-real matrix. The easiest way to do that is to keep only the upper triangular part: for the off-diagonal terms we simply append ##\sqrt{2}\Re(x_{ij})## and ##\sqrt{2}\Im(x_{ij})## to our column vector instead of the complex entries ##x_{ij}## and ##x_{ji}##. Think of using the basis ##|x_i\rangle\langle x_j|+|x_j\rangle\langle x_i|## and ##i(|x_i\rangle\langle x_j|-|x_j\rangle\langle x_i|)##, normalized.
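The stacking just described can be sketched as follows (the helper name `realify` and the test matrices are mine; the ##\sqrt{2}## weights on the upper-triangular entries are exactly what makes the ordinary real dot product reproduce ##\textrm{Tr}(AB)##):

```python
import numpy as np

def realify(H):
    # Stack a Hermitian matrix into a real vector: diagonal entries as-is,
    # then sqrt(2)*Re and sqrt(2)*Im of each upper-triangular entry.
    n = H.shape[0]
    v = [H[i, i].real for i in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            v += [np.sqrt(2) * H[i, j].real, np.sqrt(2) * H[i, j].imag]
    return np.array(v)

rng = np.random.default_rng(1)
M = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
A = (M + M.conj().T) / 2        # Hermitian
N = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
B = (N + N.conj().T) / 2        # Hermitian

# The plain real dot product reproduces Tr(A B) (= Tr(A B^dagger) here).
assert np.isclose(realify(A) @ realify(B), np.trace(A @ B).real)
```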

Therefore the density matrix can be made real in a suitable basis. If we take an eigenvector of an observable and use ##|\lambda\rangle\langle\lambda|## in our inner product, we see that the probability of observing this eigenvector is itself an inner product, ##P(\lambda)=\rho \cdot \lambda##. Here ##|\lambda\rangle\langle\lambda|## acts like an operator whose observable value is 1 for ##|\lambda\rangle## and 0 for all other eigenvectors, so the expectation value we calculate is the probability. We use the same basis so that everything stays real. Note that the observable outcomes (the eigenvalues of the observable operator) are not used.

We conclude that to project out probabilities of observables, we can use a real vector space and an inner product. As an example, a single spin in this notation would simply be $$\psi_{\hat{n}}=\frac{1}{\sqrt{2}}(\hat{n}+e_4)$$. This is similar to the equation at the end of the paragraph Density matrix, but note that ##\frac{1}{\sqrt{2}}(\hat{n}+e_4)## needs neither complex numbers nor Pauli matrices. There is even a simple geometric way to get this solution right away. It is straightforward to see that a measurement of ##|\uparrow\rangle## in an arbitrary direction comes out as $$\frac{1}{\sqrt{2}}(\hat{n}_1+e_4)\cdot\frac{1}{\sqrt{2}}(\hat{n}_2+e_4)=\frac{1}{2}(\cos\theta +1)=\cos^2\frac{\theta}{2}$$ as required.
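The spin example can be checked numerically against the standard formulation; a minimal sketch, assuming the usual Bloch-vector form ##\rho = \frac{1}{2}(I + \hat{n}\cdot\vec{\sigma})## on the complex side:

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def rho(n):
    # Standard spin-1/2 density matrix (I + n.sigma)/2 for Bloch vector n.
    return (np.eye(2) + n[0] * sx + n[1] * sy + n[2] * sz) / 2

def psi(n):
    # The proposed real 4-vector (n-hat + e_4)/sqrt(2).
    return np.append(n, 1.0) / np.sqrt(2)

theta = 1.1
n1 = np.array([0.0, 0.0, 1.0])
n2 = np.array([np.sin(theta), 0.0, np.cos(theta)])

p_standard = np.trace(rho(n1) @ rho(n2)).real   # Tr(rho1 rho2)
p_real = psi(n1) @ psi(n2)                      # plain dot product in R^4

assert np.isclose(p_standard, np.cos(theta / 2) ** 2)
assert np.isclose(p_real, np.cos(theta / 2) ** 2)
```

Both routes give ##\cos^2(\theta/2)##; the real 4-vector one needs nothing beyond the Euclidean dot product.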

The only issue for now is that the von Neumann equation for time evolution will make the density matrix complex again. However, if at each time instant we make our matrix real again, then at least within that instant we can do the observable projection part in our real vector space. We cannot combine vectors from different times as long as we have not derived a good mapping between them. Such a mapping must exist, though, because we can take our real vector, rebuild the complex density matrix, evolve it, and finally read off a new real vector at the later time.
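The round trip described here (real vector → complex density matrix → evolve → real vector) works because unitary evolution preserves hermiticity and the trace; a minimal sketch, with the hypothetical choice ##H = \sigma_z## so the propagator is diagonal:

```python
import numpy as np

# Hamiltonian H = sigma_z, so U(t) = exp(-iHt) = diag(e^{-it}, e^{it}).
def U(t):
    return np.diag([np.exp(-1j * t), np.exp(1j * t)])

rho0 = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)  # |+><+|
rho_t = U(0.7) @ rho0 @ U(0.7).conj().T

# The evolved matrix has complex entries, but it is still Hermitian with
# unit trace, so at each instant it can be re-stacked into a real vector.
assert np.allclose(rho_t, rho_t.conj().T)
assert np.isclose(np.trace(rho_t).real, 1.0)
```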

Does this show that quantum mechanics can be done in a real vector space (once you have written out the form of the time evolution)?

Update: To make it clearer: effectively I'm doing all of QM with real vectors, and operators are no longer needed. Only the time evolution needs a fix - probably geometric algebra.

I'm going to try and address your question by stepping a level back in the ladder of abstraction:

If you look at how density matrices are actually used, operationally speaking, they are always traced against operators to yield expectation values. Now the SU(n) Lie algebras and Lie groups are fundamentally real objects (you don't complexify the parameters, else you'd have SL(n,C)).

Now the Lie algebra generators are the anti-hermitian operators associated with our hermitian observables. However, there is no reason IMNSHO to keep the two separate: one can absorb the imaginary unit used to get real expectation values (via "i times the trace of the density operator times the generator = trace of the density operator times the observable operator") and just combine these into elements of the real dual space to the real vector space ##\mathfrak{su}(n)##.

In detail, for an observable ##X## associated with the generator ##\Xi = -iX## (up to a sign convention), define $$\langle X \rangle = \varrho[\Xi] = i\cdot \mathrm{tr}(\rho \cdot (-iX)).$$

So mathematically there's no need to utilize complex numbers per se. One does need to restate the eigenvalue principle (invoking imaginary eigenvalues for the "anti-observable" generators), but that's in the interpretation, not in the mathematical reformulation.

Functionally speaking, density matrices act as co-operators, i.e. dual vectors on the operators treated as elements of a real vector space, mapping those operators to the expectation values of the corresponding observables.

I got this understanding from my thesis advisor. I think it is mentioned in his book: https://www.amazon.com/dp/3540570845/?tag=pfamazon01-20

I don't recall if this observation is original to him.

I hope this is helpful to you.

jambaugh said:
I'm going to try and address your question by stepping a level back in the ladder of abstraction:

If you look at how the density matrices are actually used operationally speaking, they are always traced over with operators to yield expectation values. [...]
I may need more time to understand all of the above as it introduces a lot of new concepts.

However, as a conclusion, is this a "yes" or a "no" to the question whether QM can be done in a purely real vector space with the dot product, without even using operators?

Gere said:
Does this show that quantum mechanics can be done in a real vector space (once you have written out the form of the time evolution)?
Yes, it can be done, but everything becomes conceptually and computationally ugly.

Efficiency and transparency are the most important reasons why the mathematical side of quantum mechanics is formulated the way it is.

A. Neumaier said:
Yes, it can be done, but everything becomes conceptually and computationally ugly.
Thanks. Could you please suggest some basic example I could try to calculate to see how it becomes ugly?
Ideally you would also write it out in conventional notation (it should be simple?!).

I've only tried the spin case so far and
$$\hat{n}+e_4$$
was a lot easier than
$$I+\sigma_x n_x+\sigma_y n_y+\sigma_z n_z$$
where in the latter case I need Pauli matrices and complex algebra, whereas in the first case ##\hat{n}_1\cdot\hat{n}_2=\cos\theta## is rather basic. And the results are identical.

Let's see... starting with a two-"state" system, be it photon polarization or the spin of a spin-1/2 particle, the mathematical framework is the same.

We consider the real algebra of complex 2x2 matrices. The group of probability conserving transformations is U(2), consisting of 2x2 unitary matrices which is generated (via exponentiation) from the Lie algebra of 2x2 anti-hermitian matrices (and ##i\cdot \boldsymbol{1}##). The Pauli spin matrices form a basis of the traceless hermitian matrices so multiplying by i gives a basis for the anti-hermitian traceless ones: ## \xi_x=i\sigma_x, \xi_y=i\sigma_y, \xi_z=i\sigma_z##. Now add to that ##\eta = i\boldsymbol{1}## and you have the 4 generators of U(2).
$$U = \exp(\theta_0 \eta +\theta_1\xi_x + \theta_2 \xi_y + \theta_3\xi_z )$$
So, what we actually use in QM are the unitary transformations, the elements of U(2), and the observables which generate it (when multiplied by i) with which we derive expectation values from density operators.
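The exponentiation of the generators can be sketched as follows (a sketch with arbitrary parameter values; it computes ##\exp(iX)## through the eigendecomposition of the Hermitian ##X## instead of a general matrix exponential):

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)
eye = np.eye(2, dtype=complex)

def u2_element(th0, th1, th2, th3):
    # exp(th0*eta + th1*xi_x + th2*xi_y + th3*xi_z), with xi_k = i*sigma_k
    # and eta = i*1, i.e. exp(iX) for the Hermitian X built from the thetas.
    X = th0 * eye + th1 * sx + th2 * sy + th3 * sz
    lam, V = np.linalg.eigh(X)
    return V @ np.diag(np.exp(1j * lam)) @ V.conj().T

U = u2_element(0.3, 0.1, -0.4, 0.2)
assert np.allclose(U @ U.conj().T, np.eye(2))  # probability conserving
```

Any choice of the four real parameters lands in U(2), exactly because the exponent is anti-hermitian.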

Now we associate observables with the hermitian operators ## X = x_0 \boldsymbol{1}+x_1 \sigma_x + x_2\sigma_y + x_3\sigma_z## but we could as easily pin them down by giving the corresponding anti-hermitian operator, ## \Xi= i\cdot X = x_0\eta + x_1\xi_x + x_2\xi_y+x_3\xi_z##.

Let's use a column matrix representation from this basis:
$$\Xi \sim \left[ \begin{array}{c}x_0 \\ x_1 \\ x_2\\ x_3\end{array}\right]$$

What we need to fully "realize" our formulation is an expectation value formula for the generators which does not invoke ##i## and is equivalent to the usual trace with density operators. Density matrices must be hermitian, positive semidefinite, and have unit trace. But our real goal is not the density operators themselves but their action on the observables, in that they yield expectation values: ##\langle X\rangle = \mathrm{tr}(\rho\cdot X)##. We will use two important properties:
• This action is linear, ##\langle aX+bY\rangle = a\langle X\rangle + b\langle Y \rangle##.
• The normalized positive weighted sum of valid density operators is a valid density operator, ##\rho = p\rho_1 + q\rho_2## where ##p+q=1, p,q\ge 0##.
Using linearity we can express the expectation of our observable via a valid dual vector, the density functional:
$$\langle X\rangle = \varrho[\Xi] = \left[ \begin{array}{cccc} r_0 &r_1 & r_2 & r_3\end{array}\right]\left[ \begin{array}{c}x_0 \\ x_1 \\ x_2\\ x_3\end{array}\right]$$

Now you are welcome to think of this as a 4-vector dot product, ##\boldsymbol{r}\bullet\boldsymbol{x}##, but I think it's better to understand it as the contraction of a vector with a dual vector, as this avoids inserting an implicit extra metric (the dot product). That keeps the transformations clearer.

I'll leave it as an exercise to work out the conditions on the density coefficients ##r_1, r_2, r_3## for this to be a valid expectation value, but to begin with, the unit-trace requirement forces ##r_0 = 1##, since it is the same as requiring ##\langle \boldsymbol{1} \rangle = 1##.
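As a numerical illustration of the contraction (a sketch; the density coefficients are read off from a standard-formulation ##\rho## as ##r_k = \mathrm{tr}(\rho\,\sigma_k)##, with the Bloch vector chosen arbitrarily):

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)
basis = [np.eye(2, dtype=complex), sx, sy, sz]

# An arbitrary valid density matrix, Bloch vector (0.6, 0, 0.8).
rho = 0.5 * (basis[0] + 0.6 * sx + 0.8 * sz)

# Coefficients of the observable X = x0*1 + x1*sx + x2*sy + x3*sz.
x = np.array([0.2, 1.0, -0.5, 0.3])
X = sum(c * b for c, b in zip(x, basis))

# Dual-vector components: r0 = tr(rho) = 1 and r_k = tr(rho sigma_k).
r = np.array([np.trace(rho @ b).real for b in basis])

assert np.isclose(r[0], 1.0)                      # unit trace forces r0 = 1
assert np.isclose(r @ x, np.trace(rho @ X).real)  # contraction = tr(rho X)
```

The real row vector ##r## contracted with the real column ##x## reproduces ##\mathrm{tr}(\rho X)## with no ##i## in sight.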

The final stroke is to work out the matrix form of the adjoint representation of the Lie group ##U(2)##.

With this scheme, we can do much of QM purely within the Lie group, the Lie algebra, and the dual space of the Lie algebra. I do, however, believe we'll have to reconstruct the enveloping algebra (the associative product on the generators) in some form to calculate variances of observables. We would need these to distinguish e.g. the density functionals associated with sharp modes (zero-variance cases), and thus exact measurements, i.e. spectra.

To my mind, it is sufficient to understand that we can do this and then work within the usual operator trace formulation on the complex representation. I see the complex numbers here as no different from using complex numbers or quaternions instead of matrices to represent rotations. But what I get out of it conceptually is a sense that we should not take the Hilbert space too seriously; I think the density functional is more operationally fundamental. It helps one appreciate that QM is always and necessarily a statistical theory: we are describing statistical distributions of phenomena, not states of reality.

Gere said:
Thanks. Could you please suggest some basic example I could try to calculate to see how it becomes ugly?
Try an anharmonic oscillator and first order perturbation theory for it!

jambaugh said:
With this scheme, we can do much of QM purely within the Lie group, Lie algebra and the dual space of the Lie algebra.
But only for a single spin.

A. Neumaier said:
But only for a single spin.
How is that? My recipe above works for any density matrix, so an arbitrary number of spins can be modeled with a dot product between real vectors.

Gere said:
How is that? My recipe above works for any density matrix, so an arbitrary number of spins can be modeled with a dot product between real vectors.
This comment was for jambaugh's remark.

Your recipe is always valid but becomes cumbersome when the dimension increases. The transformation matrices U(t) double in size, and manifest unitarity is lost. All formulas look twice as messy as in the standard formulation. This is not something for a practitioner.

Physicists use complex numbers whenever they simplify the formulas. You can see this in Fourier analysis, which becomes much more elegant with complex numbers even though it was originally - in the early 1800s - defined in terms of real numbers. Electrical circuits are likewise handled in practice using complex numbers, though the basic theory begins in a purely real setting.

Nothing at all is gained by trying to avoid i.


## 1. What is a density matrix?

A density matrix is a mathematical representation of a quantum system that describes the probability of finding the system in a particular state. It takes into account both the quantum state and the statistical mixture of states that the system can be in.

## 2. How is a density matrix related to a real vector space?

A density matrix is a Hermitian matrix, meaning it is equal to its own conjugate transpose. An n x n Hermitian matrix is determined by n² real parameters, so it can be represented as a vector in an n²-dimensional real vector space, where each independent real component of the matrix corresponds to one coordinate of the vector.

## 3. Can density matrices be used to describe classical systems?

No, density matrices are specifically used to describe quantum systems. Classical systems can be described using probability distributions, but these do not have the same mathematical properties as density matrices.

## 4. How do density matrices differ from state vectors?

Density matrices and state vectors both describe the state of a quantum system, but they have different purposes. State vectors represent pure states, while density matrices can also represent mixed states. Additionally, state vectors evolve according to the Schrödinger equation, while density matrices evolve according to the von Neumann equation.

## 5. What is the significance of density matrices in quantum mechanics?

Density matrices play a crucial role in quantum mechanics as they allow for the description of mixed states, which cannot be described by state vectors. They also allow for the calculation of observables and the prediction of measurement outcomes in quantum systems. Additionally, density matrices are used in various quantum information processing tasks, such as quantum state tomography and quantum error correction.
