# QM linear algebra true or false

1. Apr 4, 2015

### dyn

Apologies if this should be in the homework section, but I thought it best suited here. I've been revising past papers with no solutions. The following questions all require just a true or false answer. Any help or confirmation of my answers would be appreciated.

1 - every N x N matrix has N eigenvectors T/F ?
I think false, because there would be infinitely many: any eigenvector can be multiplied by any scalar

2 - Operator R is diagonalizable, where R = exp( i pi Sx /hbar) and Sx is the x spin operator T/F ?
I think false because the x spin operator is not diagonal

3 - product of 2 unitary operators is a unitary operator T/F ?
I have no idea

4 - the exponential of a hermitian operator is a unitary operator T/F ?
I have no idea

The following questions all relate to finite dim complex vector space V

5 - the matrix representation of the identity operator is basis dependent T/F ?
I think false

6 - An orthogonal projector ( to a lower dim subspace) is neither injective nor surjective in V T/F ?
I think it's not surjective; not sure about injective

2. Apr 4, 2015

### HallsofIvy

Staff Emeritus
I don't think that was what was intended! The set of all eigenvectors of an n by n matrix forms a vector space of dimension less than or equal to n. There may be n independent eigenvectors if the matrix is "diagonalizable". But a matrix that is NOT diagonalizable has fewer than n independent eigenvectors.

Do you understand the difference between "diagonal" and "diagonalizable"?

It looks like you need to look up the definitions of "hermitian" and "unitary".

Why do you think that?

You have this exactly backward. Are you clear on what "injective" and "surjective" mean?

3. Apr 4, 2015

### dyn

I have looked up all the definitions and tried to work things out. I admit I am not clear on the injective/surjective terms. As for the identity operator being basis dependent, I wasn't sure, but I don't remember seeing one that wasn't; and anyway, if it was basis dependent, how would you recognise it if it wasn't a diagonal matrix of ones? I have to admit I find linear algebra hard!

4. Apr 4, 2015

### micromass

Staff Emeritus
So can you show us what you tried?

Take an arbitrary basis and work out the representation of the identity operator.

5. Apr 4, 2015

### Fredrik

Staff Emeritus
A function $f:X\to Y$ is said to be injective if the implication
$$f(x)=f(y)\ \Rightarrow\ x=y$$ holds for all $x,y\in X$.

A function $f:X\to Y$ is said to be surjective if $f(X)=Y$.

The set $f(X)$ is by definition equal to $\{f(x)|x\in X\}$. If you don't understand that notation, perhaps you'd prefer the following alternative definition:

A function $f:X\to Y$ is said to be surjective if for all $y\in Y$, there's an $x\in X$ such that $f(x)=y$.

The matrix $[A]_E$ associated with a linear operator $A$ and an ordered basis $E=(e_1,\dots,e_n)$ is defined by $\big([A]_E\big)_{ij}= (Ae_j)_i$. The right-hand side denotes the $i$th component of $Ae_j$, i.e. the $a_k$ with $k=i$ on the right-hand side of $Ae_j=\sum_{k=1}^n a_k e_k$. If you don't know this formula, you can't check these things. See https://www.physicsforums.com/threads/matrix-representations-of-linear-transformations.694922/ for more information.
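The formula above can be spot-checked numerically. In this sketch (operator and basis are made-up examples, not from the thread), the $j$th column of $[A]_E$ is obtained by expanding $Ae_j$ in the basis $E$, and the result agrees with the familiar change-of-basis form $B^{-1}AB$:

```python
import numpy as np

# Sketch of ([A]_E)_{ij} = (A e_j)_i: column j of the matrix of A in the
# ordered basis E holds the components of A e_j expanded in E.

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])          # an operator, written in the standard basis

e1 = np.array([1.0, 1.0])           # a non-standard ordered basis E = (e1, e2)
e2 = np.array([1.0, -1.0])
B = np.column_stack([e1, e2])       # columns of B are the basis vectors e_j

# Components of A e_j in the basis E: solve B a = A e_j for a.
col1 = np.linalg.solve(B, A @ e1)
col2 = np.linalg.solve(B, A @ e2)
A_E = np.column_stack([col1, col2])

# Equivalent closed form: [A]_E = B^{-1} A B.
assert np.allclose(A_E, np.linalg.inv(B) @ A @ B)
```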

6. Apr 4, 2015

### dyn

Thank you for your replies. I understand that you try to help people find the answers themselves instead of just giving the answers but sometimes for someone who is self-studying and can't see the wood for the trees the latter method works best.

7. Apr 4, 2015

### Staff: Mentor

At this forum we don't believe that just giving answers is helpful in the long run for the student. This is our philosophy, whether you're in a regular class or are studying the material on your own.

8. Apr 4, 2015

### Fredrik

Staff Emeritus
I agree that it's useful to see a few solved examples, but we treat all textbook-style questions as homework, and the rules for homework say that we can't give you complete solutions. All we can do is give you hints. When we do, you need to show us your attempt to use those hints to make progress. Even if what you're doing is wrong, you will still learn something from it, because we will tell you what you did wrong. If you feel that the hints are useless, you need to tell us why.

The easiest problem here is probably problem 3. Can you at least show us an attempt at that?

9. Apr 4, 2015

### dyn

For Q3 my answer is that the product of 2 unitary operators is a unitary operator !
$(AB)^\dagger = B^\dagger A^\dagger = B^{-1}A^{-1} = (AB)^{-1}$

10. Apr 4, 2015

### micromass

Staff Emeritus
Correct

11. Apr 4, 2015

### Fredrik

Staff Emeritus
Here's a cool feature of our hint system. Once you have shown us your solution, I can show you mine. In this case, it's only very slightly different from yours. A unitary U satisfies $U^*U=I$. So if A and B are unitary, we have
$$(AB)^*(AB)= B^*A^*AB =B^*I B=B^*B=I.$$ I used the associativity of composition of functions, and the formula $(AB)^*=B^*A^*$.

The above is all you need to do to solve the problem. In case you're interested in the proof of the last formula above, it goes like this: For all x,y, we have
$$\langle (AB)^*x,y\rangle =\langle x,ABy\rangle=\langle A^*x,By\rangle =\langle B^*A^*x,y\rangle,$$ and therefore
$$\big\langle \big((AB)^*-B^*A^*\big)x,y\big\rangle=0.$$ This implies that for all x, we have
$$\big\langle \big((AB)^*-B^*A^*\big)x,\big((AB)^*-B^*A^*\big)x\big\rangle=0.$$ This implies that for all x, we have $\big((AB)^*-B^*A^*\big)x=0$. This implies that $(AB)^*-B^*A^*=0$.
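The identity $(AB)^*(AB)=I$ can also be spot-checked numerically. The two unitary matrices below are arbitrary examples chosen for illustration:

```python
import numpy as np

# Numerical spot-check: if A and B are unitary, so is AB, i.e. (AB)^* (AB) = I.

theta = 0.3
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # a rotation (real unitary)
B = np.array([[0, 1j],
              [1j, 0]])                           # i * sigma_x, unitary

AB = A @ B
assert np.allclose(AB.conj().T @ AB, np.eye(2))   # (AB)^* (AB) = I
```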

12. Apr 4, 2015

### Fredrik

Staff Emeritus
1. You're right, but they probably meant something different from what they said. I think they meant "there exist eigenvectors $x_1,\dots,x_N$ such that $\{x_1,\dots,x_N\}$ is a linearly independent set of cardinality N". (The comment about cardinality is just my way of saying that each $x_i$ is different from all the others). I think you should start by trying a few very simple matrices and then form an opinion about whether it seems likely that the claim is true. If you think it's true, you try to prove it. If you think it's false, you try to find a counterexample.

2. You will have to use the definition of "exp" and "diagonalizable". Is $S_x$ diagonalizable? Can you use the answer to that question?

3. Problem solved.

4. Suppose that H is hermitian. What is $(e^H)^*e^H$? Is it the identity operator?

5. Use the formula I posted.

6. What is the definition of orthogonal projection? Can you think of a simple but non-trivial example of an orthogonal projection? Is it injective? Is it surjective?
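For hint 1, a standard counterexample worth trying (my example, not from the thread) is a Jordan block: it has a repeated eigenvalue but only one independent eigenvector, so not every N x N matrix has N linearly independent eigenvectors:

```python
import numpy as np

J = np.array([[1.0, 1.0],
              [0.0, 1.0]])            # 2x2 Jordan block: not diagonalizable

vals, vecs = np.linalg.eig(J)
assert np.allclose(vals, [1.0, 1.0])  # eigenvalue 1, algebraic multiplicity 2

# Geometric multiplicity is dim ker(J - I) = 2 - rank(J - I) = 1,
# so there is only one linearly independent eigenvector.
assert np.linalg.matrix_rank(J - np.eye(2)) == 1
```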

13. Apr 4, 2015

### dyn

Q5 the identity operator is basis independent. Whatever vector I multiply by diag(1,1,1,...) gives me the original vector, so diag(1,1,1,...) must be the identity matrix in any basis.

Q4 the exponential of a hermitian operator is a unitary operator as $(e^H)^{-1}e^H = e^{-H}e^H = 1$ (identity matrix) but I have to admit my doubts about that one

14. Apr 4, 2015

### dyn

Q2 operator R is diagonalizable because Sx can be diagonalized: it has 2 eigenvectors that span the space

15. Apr 4, 2015

### dyn

Q6 an orthogonal projector to a lower dim subspace is not surjective because its range is not the full vector space, and not injective because the nullspace of the projector is not just the zero vector, although I admit I don't understand the last point; I just found it as a theorem

16. Apr 5, 2015

### Fredrik

Staff Emeritus
This is correct, but I wouldn't consider it a solution unless you also explain what matrix multiplication has to do with a change of basis.
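The connection between matrix multiplication and a change of basis can be sketched as follows (a minimal illustration, assuming the transformation rule $M \mapsto B^{-1}MB$ for an invertible basis-change matrix $B$):

```python
import numpy as np

# Under a change of basis with invertible matrix B, an operator's matrix
# transforms as B^{-1} M B. For the identity this gives B^{-1} I B = I for
# every basis, which is the sense in which I is basis-independent.
rng = np.random.default_rng(0)
B = rng.normal(size=(3, 3))     # a random (almost surely invertible) basis change
I = np.eye(3)
assert np.allclose(np.linalg.inv(B) @ I @ B, I)
```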

Here you seem to have assumed that $(e^H)^{-1}=e^{-H}$. This is equivalent to assuming that $e^{-H}e^H=I$. So what you have assumed is equivalent to the result of your calculation. Also, even if $(e^H)^{-1}=e^{-H}$ had been a theorem, you should have written $1=(e^H)^{-1}e^H=e^{-H}e^H$ rather than $(e^H)^{-1}e^H=e^{-H}e^H=1$, because it's $(e^H)^{-1}e^H$ that's obviously equal to 1, not $e^{-H}e^H$.

I would use the fact that a linear operator $U$ is unitary if and only if $U^*U=I$. You will have to use the definition of the exponential map and some known properties of the adjoint operation to see if this identity holds when $U=e^H$ and $H^*=H$.
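The check $U^*U=I$ can be carried out numerically for a sample hermitian $H$ (here $\sigma_x$, my choice of example). The exponential is computed from the eigendecomposition $H = V\,\mathrm{diag}(w)\,V^*$, so that $f(H) = V\,\mathrm{diag}(f(w))\,V^*$:

```python
import numpy as np

# Checking U^* U = I for U = e^H and for U = e^{iH}, with H hermitian.
H = np.array([[0.0, 1.0],
              [1.0, 0.0]])                         # sigma_x, hermitian

w, V = np.linalg.eigh(H)
exp_H  = V @ np.diag(np.exp(w)) @ V.conj().T       # e^H
exp_iH = V @ np.diag(np.exp(1j * w)) @ V.conj().T  # e^{iH}

assert not np.allclose(exp_H.conj().T @ exp_H, np.eye(2))  # e^H fails U^*U = I
assert np.allclose(exp_iH.conj().T @ exp_iH, np.eye(2))    # e^{iH} passes
```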

Do you mean that $S_x$ is diagonalizable because its columns are eigenvectors of (the matrix representation of) $S_z$? I don't understand this argument. I haven't thought it through to the point where I know if it's wrong or if you just need to elaborate a bit.

I would use a theorem about diagonalization of self-adjoint (=hermitian) linear operators to argue that $S_x$ is diagonalizable. Once you have completed that argument, you can't just say that the fact that $S_x$ is diagonalizable implies that $R$ is diagonalizable. You will have to use the definition of the exponential map to show that this is the case.

You have solved the part about surjectivity.

It's fine to use that theorem (I can explain it when you have completed the solution), but how did you determine that the null space isn't {0}? If P is the orthogonal projection onto a proper subspace M, then what is the null space of P? What is the definition of the orthogonal projection onto M?
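A simple non-trivial example along these lines (my own, not from the thread) is the orthogonal projection of $\mathbb{R}^3$ onto the xy-plane: its range is a proper subspace, and the z-axis lies in its null space:

```python
import numpy as np

# Orthogonal projector onto the xy-plane in R^3: idempotent and self-adjoint,
# with 2-dimensional range (not surjective) and nonzero null space (not injective).
P = np.diag([1.0, 1.0, 0.0])

assert np.allclose(P @ P, P)             # idempotent
assert np.allclose(P, P.T)               # self-adjoint
assert np.linalg.matrix_rank(P) == 2     # range is a proper subspace
z = np.array([0.0, 0.0, 1.0])
assert np.allclose(P @ z, 0)             # a nonzero vector in the null space
```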

17. Apr 5, 2015

### dyn

Thanks for your reply. I will have to move on now as it's almost time to do the real exam! But could you tell me what you mean by the definition of the exponential map?

18. Apr 5, 2015

### micromass

Staff Emeritus
$$\exp(A) = I + A + \frac{A^2}{2!} + \frac{A^3}{3!} + \dots + \frac{A^n}{n!} + \dots$$
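The series above can be evaluated by truncating after enough terms. A small sketch (the generator and closed form are my illustration): for the 2x2 rotation generator $K$, the series for $\exp(tK)$ reproduces the rotation matrix by angle $t$:

```python
import numpy as np

def exp_series(A, n_terms=30):
    """Matrix exponential via the truncated power series I + A + A^2/2! + ..."""
    result = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, n_terms):
        term = term @ A / k          # term is now A^k / k!
        result = result + term
    return result

K = np.array([[0.0, -1.0],
              [1.0,  0.0]])          # generator of rotations in 2D
t = 0.7
expected = np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t),  np.cos(t)]])
assert np.allclose(exp_series(t * K), expected)
```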