# Proof of trace of density matrix in pure/mixed states

1. Apr 27, 2010

### barnflakes

Can someone help me prove that $tr(\rho^2) \leq 1$?

Using that $$\rho = \sum_i p_i | \psi_i \rangle \langle \psi_i |$$

$$\rho^2 = \sum_i p_i^2 | \psi_i \rangle \langle \psi_i |$$

$$tr(\rho^2) = \sum_{i, j} p_i^2 \langle j | \psi_i \rangle \langle \psi_i | j \rangle$$

Where do I go from here? Thanks guys.

2. Apr 27, 2010

### genneth

Density matrices may be diagonalised, and their trace is one:

$$Tr(\rho) = \sum_i p_i = 1$$

Then you need a bit of basic algebra:

$$\sum_i p_i^2 \le \left(\sum_i p_i\right)^2 = 1$$
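This step can be sanity-checked numerically. A minimal sketch in Python; the probability distribution below is an arbitrary example of mine, not from the thread:

```python
# Sanity check (not a proof): for p_i >= 0 with sum_i p_i = 1,
# sum_i p_i^2 <= (sum_i p_i)^2 = 1, since the expanded square also
# contains the non-negative cross terms p_i * p_j with i != j.
# The distribution below is an arbitrary example.
p = [0.5, 0.3, 0.2]

total = sum(p)
purity_bound = sum(x * x for x in p)

assert abs(total - 1.0) < 1e-12
assert purity_bound <= total ** 2
print(round(purity_bound, 2))  # 0.38
```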

3. Apr 27, 2010

### barnflakes

Can you be a bit more explicit? So they can be diagonalised, i.e. $$\rho = \sum_i \lambda_i p_i| i \rangle \langle i |$$

$$\rho^2 = \sum_i \lambda_i^2 p_i^2 | i \rangle \langle i |$$
$$tr(\rho^2) = \sum_j \lambda_j^2 p_j^2 \leq (\sum_j \lambda_j p_j)^2$$

?? does that final term = 1?

4. Apr 27, 2010

### Fredrik

Staff Emeritus
Your expression for $\rho^2$ isn't always true, since the $|\psi_i\rangle$ don't have to be orthogonal: the cross terms $p_i p_j |\psi_i\rangle\langle\psi_i|\psi_j\rangle\langle\psi_j|$ with $i \neq j$ only vanish when the states are orthogonal.

Start with $$\operatorname{Tr}\rho^2=\sum_n\langle n|\rho^2|n\rangle$$, where the $|n\rangle$ are members of an arbitrary orthonormal basis. Use the correct expression for $$\rho^2$$. Then rearrange some stuff and recognize the identity operator in what you've got. Then you're almost done, but you'll need the Cauchy-Schwarz inequality to finish it.
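The warning above can be seen numerically: for non-orthogonal states, $\sum_i p_i^2$ is not $\operatorname{Tr}\rho^2$, yet the bound $\operatorname{Tr}\rho^2\leq 1$ still holds. A small sketch in plain Python; the two qubit states and the weights are arbitrary example choices:

```python
import math

def outer(v):
    """|v><v| as a nested-list matrix."""
    return [[a * b.conjugate() for b in v] for a in v]

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def trace(A):
    return sum(A[i][i] for i in range(len(A))).real

# |0> and |+> are normalized but NOT orthogonal (arbitrary example ensemble).
psi1 = [1.0, 0.0]
psi2 = [1 / math.sqrt(2), 1 / math.sqrt(2)]
p = [0.6, 0.4]

P1, P2 = outer(psi1), outer(psi2)
rho = [[p[0] * P1[i][j] + p[1] * P2[i][j] for j in range(2)] for i in range(2)]

purity = trace(mat_mul(rho, rho))
naive = sum(w * w for w in p)  # what the non-orthogonal shortcut would give

assert purity <= 1.0 + 1e-12
assert abs(purity - naive) > 0.1  # the shortcut is genuinely wrong here
print(round(purity, 4), round(naive, 4))  # 0.76 0.52
```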

5. Apr 27, 2010

### genneth

In the diagonal basis, the eigenvalues of $$\rho^2$$ are just $$p_i^2$$, where $$p_i$$ are the eigenvalues of $$\rho$$.

6. Apr 27, 2010

### barnflakes

OK so I get to $$\sum_{i,j} p_i p_j \langle \psi_j | \psi_i \rangle \langle \psi_i | \psi_j \rangle$$

and now I need to use Cauchy-Schwarz, you say?

$$\sum_{i,j} p_i p_j \langle \psi_j | \psi_i \rangle \langle \psi_i | \psi_j \rangle = \sum_{i,j} p_i p_j \frac{\langle \psi_j | \psi_i \rangle \langle \psi_i | \psi_j \rangle \langle \psi_i | \psi_i \rangle}{\langle \psi_i | \psi_i \rangle} \leq \sum_{i,j,n} p_i p_j \langle \psi_j | n \rangle \langle n | \psi_j \rangle \langle \psi_i | \psi_i \rangle = \sum_{i,j} p_i p_j \langle \psi_j | \psi_j \rangle \langle \psi_i | \psi_i \rangle$$

Is that correct? Where do I go from here?

7. Apr 27, 2010

### Fredrik

Staff Emeritus
You're making it more complicated than it needs to be. I'm not even sure what you're doing, but you're getting the right result. Now you just need to use that the states are normalized.

This is the easy way to get the result you've got already:

$$\langle \psi_j | \psi_i \rangle \langle \psi_i | \psi_j \rangle=|\langle\psi_i|\psi_j\rangle|^2\leq \big\||\psi_i\rangle\big\|^2\big\||\psi_j\rangle\big\|^2=1$$

You will of course also have to use what you know about the $p_i$.
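Combining the bound $|\langle\psi_i|\psi_j\rangle|^2\leq 1$ with $\sum_i p_i = 1$ closes the argument: $\operatorname{Tr}\rho^2 = \sum_{i,j} p_i p_j |\langle\psi_i|\psi_j\rangle|^2 \leq \sum_{i,j} p_i p_j = 1$. A quick numerical sketch (the states and weights are arbitrary examples):

```python
import math

def inner(u, v):
    """<u|v> with the physics convention (conjugate the bra)."""
    return sum(a.conjugate() * b for a, b in zip(u, v))

# Arbitrary example ensemble of normalized, non-orthogonal states.
psis = [[1.0, 0.0], [1 / math.sqrt(2), 1 / math.sqrt(2)]]
p = [0.6, 0.4]

# Tr(rho^2) written out as the double sum derived in the thread.
purity = sum(p[i] * p[j] * abs(inner(psis[i], psis[j])) ** 2
             for i in range(2) for j in range(2))

# Each |<psi_i|psi_j>|^2 <= 1 by Cauchy-Schwarz, so the double sum
# is bounded by (sum_i p_i)^2 = 1.
assert purity <= sum(p) ** 2 + 1e-12
print(round(purity, 4))  # 0.76
```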

I posted a statement and proof of the Cauchy-Schwarz inequality in the Science Advisor forum some time ago. You probably don't need it, but since it's a related topic, and since I have only posted it in a restricted forum before, I'm reposting it here.

Theorem:

If $x$ and $y$ are vectors in an inner product space $X$ over $\mathbb C$, then $$|\langle x,y\rangle| \leq \|x\|\|y\|$$

where the norm is the standard norm on an inner product space.

Proof:

Let t be an arbitrary complex number.

$$0 \leq \langle x+ty,x+ty\rangle=\|x\|^2+t\langle x,y\rangle+t^*\langle y,x\rangle+|t|^2\|y\|^2$$​

$$=\|x\|^2+2\operatorname{Re}(t\langle x,y\rangle)+|t|^2\|y\|^2$$​

The inequality is obviously satisfied when the real part of $t\langle x,y\rangle$ is non-negative, so we can only learn something interesting when it's negative. Let's choose $\operatorname{Arg} t$ so that it is.

$$=\|x\|^2-2|t||\langle x,y\rangle|+|t|^2\|y\|^2$$​

Now let's choose |t| so that it minimizes the sum of the last two terms. (This should give us the most interesting result).

$$s=|t|,\ A=\|y\|^2,\ B=2|\langle x,y\rangle|$$​

$$f(s)=As^2-Bs$$​

$$f'(s)=2As-B=0\ \Rightarrow\ s=\frac{B}{2A} = \frac{|\langle x,y\rangle|}{\|y\|^2}$$​

$$f''(s)=2A>0$$​

Continuing with this value of |t|...

$$=\|x\|^2-2\frac{|\langle x,y\rangle|}{\|y\|^2}|\langle x,y\rangle|+\frac{|\langle x,y\rangle|^2}{\|y\|^4}\|y\|^2$$​

$$=\|x\|^2-\frac{|\langle x,y\rangle|^2}{\|y\|^2}$$

Since the original expression $\langle x+ty,x+ty\rangle$ is $\geq 0$, this gives $|\langle x,y\rangle|^2 \leq \|x\|^2\|y\|^2$, which is the theorem. (If $\|y\|=0$ the inequality is trivial, so we may assume $\|y\|\neq 0$ when dividing.)
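The inequality, and the minimizing choice of $t$ used in the proof, can both be checked numerically. A short Python sketch; the two vectors in $\mathbb C^2$ are arbitrary examples:

```python
def inner(u, v):
    """<u,v>, conjugate-linear in the first argument."""
    return sum(a.conjugate() * b for a, b in zip(u, v))

# Two arbitrary example vectors in C^2.
x = [1 + 2j, 0.5 - 1j]
y = [2 - 1j, 1 + 1j]

# Cauchy-Schwarz: |<x,y>| <= ||x|| ||y||.
lhs = abs(inner(x, y))
rhs = (inner(x, x).real * inner(y, y).real) ** 0.5
assert lhs <= rhs + 1e-12

# The minimizing t from the proof, t = -<y,x>/||y||^2, makes
# <x+ty, x+ty> equal ||x||^2 - |<x,y>|^2/||y||^2 exactly.
t = -inner(y, x) / inner(y, y).real
z = [a + t * b for a, b in zip(x, y)]
value = inner(z, z).real
closed_form = inner(x, x).real - abs(inner(x, y)) ** 2 / inner(y, y).real
assert abs(value - closed_form) < 1e-9
print(round(lhs, 4), round(rhs, 4))  # 3.5355 6.6144
```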

8. Apr 27, 2010

### barnflakes

Thank you for the response, Fredrik. You'll have to excuse me, I'm really rather new to quantum mechanics/information, so when you say "use what I know about the $p_i$", I have to confess my ignorance as to what I know. As far as I'm aware, $p_i$ is the probability that a quantum system in the ensemble is in the state $|\psi_i\rangle$. So if the system is in a pure state, we know exactly the state of the system. I find this confusing. Does it mean we know the state of the system overall, or the state of each individual qubit/quantum system?

9. Apr 27, 2010

### Fredrik

Staff Emeritus
It's the probability that a given member of the ensemble has been prepared in state $|\psi_i\rangle$. And you know that the sum of the probabilities is 1. That's what I meant you should use.

The density operator $\rho=\sum_i p_i|\psi_i\rangle\langle\psi_i|$ is a mathematical tool that we can use to calculate the expected average result when we perform a measurement of some observable A on every member of a large ensemble of identical systems, with a fraction $p_i$ of the systems in state $|\psi_i\rangle$. (If the number of systems is small, we're going to have to repeat the procedure many times to get an accurate average. That's why I said that we're calculating the "expected" average). It doesn't matter if the members of the ensemble are different systems that all exist at the same time at different locations, or if they are states of a single system at a single location at different times, or if they are different possible states of a single system at a single time.
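The "calculational tool" reading can be made concrete: the expected average of an observable $A$ over the ensemble is $\operatorname{Tr}(\rho A) = \sum_i p_i \langle\psi_i|A|\psi_i\rangle$. A minimal Python sketch; the observable (Pauli-Z) and the ensemble are arbitrary example choices:

```python
import math

def inner(u, v):
    return sum(a.conjugate() * b for a, b in zip(u, v))

def apply_op(A, v):
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A))]

A = [[1, 0], [0, -1]]  # Pauli-Z, an arbitrary example observable
psis = [[1.0, 0.0], [1 / math.sqrt(2), 1 / math.sqrt(2)]]
p = [0.6, 0.4]

# Ensemble average computed state by state...
avg_states = sum(p[i] * inner(psis[i], apply_op(A, psis[i])).real
                 for i in range(2))

# ...equals Tr(rho A) with rho = sum_i p_i |psi_i><psi_i|.
rho = [[sum(p[k] * psis[k][i] * psis[k][j].conjugate() for k in range(2))
        for j in range(2)] for i in range(2)]
tr_rhoA = sum(sum(rho[i][k] * A[k][i] for k in range(2)) for i in range(2)).real

assert abs(avg_states - tr_rhoA) < 1e-12
print(round(avg_states, 4))  # 0.6
```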

A pure state has $p_i=\delta_{ij}$ for some fixed $j$. What that means is that every member of the ensemble has been prepared in state $|\psi_j\rangle$. When you know that, it doesn't matter if the ensemble consists of a single system or 10^50 systems. What the information $p_i=\delta_{ij}$ is telling you is just what the "expected average" result of a measurement will be. ("Expected average" isn't a real term as far as I know. I just thought it seemed appropriate.)

10. May 8, 2010

### barnflakes

Thank you Fredrik, I understand it much more now, one last thing about the proof. You say:

$$\operatorname{Tr}\rho^2=\sum_n\langle n|\rho^2|n\rangle$$

So I have $$\sum_{i,j,n} p_i p_j \langle n|\langle \psi_j | \psi_i \rangle \langle \psi_i | \psi_j \rangle |n \rangle$$

But since the expression $$\rho^2$$ is just a number, inserting that orthonormal basis makes no difference? In other words, I can rearrange the above as follows:

$$\sum_{i,j,n} p_i p_j \langle \psi_i | \psi_j \rangle |n \rangle \langle n|\langle \psi_j | \psi_i \rangle =\sum_{i,j} p_i p_j \langle \psi_i | \psi_j \rangle \langle \psi_j | \psi_i \rangle$$ and then use the Cauchy-Schwarz as above?

11. May 8, 2010

### Fredrik

Staff Emeritus
Yes, that's the definition of the trace. It's actually independent of the basis we use. (Proving that would be a good warm-up exercise.)

But the expression $$\sum_{i,j,n} p_i p_j \langle n|\langle \psi_j | \psi_i \rangle \langle \psi_i | \psi_j \rangle |n \rangle$$ is not what we get from $\rho = \sum_i p_i | \psi_i \rangle \langle \psi_i |$ and the definition of the trace.

And $\rho^2$ isn't just a number (but I see that it has magically turned into one in what you wrote above).
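The warm-up exercise (basis independence of the trace) can also be checked numerically. A sketch in Python; the matrix and the rotation angle are arbitrary example choices:

```python
import math

# Basis independence of the trace holds for any matrix; both M and the
# rotation angle below are arbitrary example choices.
M = [[0.8, 0.2], [0.2, 0.2]]

def trace_in_basis(M, basis):
    """sum_n <n|M|n> over an orthonormal basis given as a list of vectors."""
    def inner(u, v):
        return sum(a.conjugate() * b for a, b in zip(u, v))
    def apply_op(M, v):
        return [sum(M[i][j] * v[j] for j in range(len(v)))
                for i in range(len(M))]
    return sum(inner(n, apply_op(M, n)).real for n in basis)

std = [[1.0, 0.0], [0.0, 1.0]]
th = 0.7  # any rotation angle gives another orthonormal basis
rot = [[math.cos(th), math.sin(th)], [-math.sin(th), math.cos(th)]]

assert abs(trace_in_basis(M, std) - trace_in_basis(M, rot)) < 1e-12
print(round(trace_in_basis(M, std), 4))  # 1.0
```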

12. May 8, 2010

### barnflakes

Haha oops, I see what I've done, sorry I forgot to check over my working since last time:

$$\sum_{i,j} p_i p_j \langle \psi_j | \psi_i \rangle \langle \psi_i | \psi_j \rangle$$ This is the expression I obtain after taking the trace and using the resolution of the identity (the sum over $n$ disappears once $\sum_n |n\rangle\langle n| = I$ is used). I see now. Thank you Fredrik :)