QM linear algebra true or false

  • #1
dyn
Apologies if this should be in the homework section, but I thought it best suited here. I've been revising past papers but have no solutions. The following questions all require just a true or false answer. Any help or confirmation of my answers would be appreciated.

1 - every N x N matrix has N eigenvectors T/F ?
I think false because there would be infinitely many as any eigenvector can be multiplied by any scalar

2 - Operator ##R## is diagonalizable, where ##R = \exp(i\pi S_x/\hbar)## and ##S_x## is the x spin operator T/F ?
I think false because the x spin operator is not diagonal

3 - product of 2 unitary operators is a unitary operator T/F ?
i have no idea

4 - the exponential of a hermitian operator is a unitary operator T/F ?
I have no idea

The following questions all relate to finite dim complex vector space V

5 - the matrix representation of the identity operator is basis dependent T/F ?
i think false

6 - An orthogonal projector ( to a lower dim subspace) is neither injective nor surjective in V T/F ?
i think its not surjective , not sure about injective
 
  • #2
dyn said:
Apologies if this should be in the homework section, but I thought it best suited here. I've been revising past papers but have no solutions. The following questions all require just a true or false answer. Any help or confirmation of my answers would be appreciated.

1 - every N x N matrix has N eigenvectors T/F ?
I think false because there would be infinitely many as any eigenvector can be multiplied by any scalar
I don't think that was what was intended! The eigenvectors of an ##n \times n## matrix span a subspace of dimension less than or equal to ##n##. There are ##n## linearly independent eigenvectors exactly when the matrix is "diagonalizable"; a matrix that is NOT diagonalizable has fewer than ##n## independent eigenvectors.
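A standard example of the non-diagonalizable case is the matrix
$$A=\begin{pmatrix}1 & 1\\ 0 & 1\end{pmatrix},$$
which has the single eigenvalue 1 but only a one-dimensional eigenspace, spanned by ##(1,0)^T##, so there is no basis of eigenvectors.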

2 - Operator ##R## is diagonalizable, where ##R = \exp(i\pi S_x/\hbar)## and ##S_x## is the x spin operator T/F ?
I think false because the x spin operator is not diagonal
Do you understand the difference between "diagonal" and "diagonalizable"?

3 - product of 2 unitary operators is a unitary operator T/F ?
i have no idea

4 - the exponential of a hermitian operator is a unitary operator T/F ?
I have no idea
It looks like you need to look up the definitions of "hermitian" and "unitary".

The following questions all relate to finite dim complex vector space V

5 - the matrix representation of the identity operator is basis dependent T/F ?
i think false
Why do you think that?

6 - An orthogonal projector ( to a lower dim subspace) is neither injective nor surjective in V T/F ?
i think its not surjective , not sure about injective
You have this exactly backward. Are you clear on what "injective" and "surjective" mean?
 
  • #3
I have looked up all the definitions and tried to work things out. I admit I am not clear on the injective/surjective terms. As for the identity operator being basis dependent, I wasn't sure, but I don't remember seeing one that wasn't, and anyway, if it was basis dependent, how would you recognise it if it wasn't a diagonal matrix of ones? I have to admit I find linear algebra hard!
 
  • #4
dyn said:
I have looked up all the definitions and tried to work things out.

So can you show us what you tried?

As for the identity operator being basis dependent, I wasn't sure, but I don't remember seeing one that wasn't, and anyway, if it was basis dependent, how would you recognise it if it wasn't a diagonal matrix of ones?

Take an arbitrary basis and work out the representation of the identity operator.
 
  • #5
dyn said:
I admit i am not clear on the injective/surjective terms.
A function ##f:X\to Y## is said to be injective if the implication
$$f(x)=f(y)\ \Rightarrow\ x=y$$ holds for all ##x,y\in X##.

A function ##f:X\to Y## is said to be surjective if ##f(X)=Y##.

The set ##f(X)## is by definition equal to ##\{f(x)|x\in X\}##. If you don't understand that notation, perhaps you'd prefer the following alternative definition:

A function ##f:X\to Y## is said to be surjective if for all ##y\in Y##, there's an ##x\in X## such that ##f(x)=y##.
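Two quick examples with real functions may help fix the ideas: ##f:\mathbb{R}\to\mathbb{R}## given by ##f(x)=x^3## is both injective and surjective, while ##f(x)=x^2## is neither, since ##f(-1)=f(1)## and no ##x## satisfies ##f(x)=-1##.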

dyn said:
As for the identity operator being basis dependent, I wasn't sure, but I don't remember seeing one that wasn't, and anyway, if it was basis dependent, how would you recognise it if it wasn't a diagonal matrix of ones? I have to admit I find linear algebra hard!
The matrix ##[A]_E## associated with a linear operator ##A## and an ordered basis ##E=(e_1,\dots,e_n)## is defined by ##\big([A]_E\big)_{ij}= (Ae_j)_i##. The right-hand side denotes the ##i##th component of ##Ae_j##, i.e. the ##a_k## with ##k=i## on the right-hand side of ##Ae_j=\sum_{k=1}^n a_k e_k##. If you don't know this formula, you can't check these things. See https://www.physicsforums.com/threads/matrix-representations-of-linear-transformations.694922/ for more information.
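As a small illustration of the formula (with an operator chosen only for this example): if ##A## acts on a two-dimensional space with ordered basis ##E=(e_1,e_2)## by ##Ae_1=e_1+2e_2## and ##Ae_2=3e_2##, then
$$[A]_E=\begin{pmatrix}1 & 0\\ 2 & 3\end{pmatrix},$$
i.e. the ##j##th column holds the components of ##Ae_j##.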
 
  • #6
Thank you for your replies. I understand that you try to help people find the answers themselves instead of just giving the answers, but sometimes, for someone who is self-studying and can't see the wood for the trees, the latter method works best.
 
  • #7
dyn said:
Thank you for your replies. I understand that you try to help people find the answers themselves instead of just giving the answers, but sometimes, for someone who is self-studying and can't see the wood for the trees, the latter method works best.
At this forum we don't believe that just giving answers is helpful in the long run for the student. This is our philosophy, whether you're in a regular class or are studying the material on your own.
 
  • #8
I agree that it's useful to see a few solved examples, but we treat all textbook-style questions as homework, and the rules for homework say that we can't give you complete solutions. All we can do is give you hints. When we do, you need to show us your attempt to use those hints to make progress. Even if what you're doing is wrong, you will still learn something from it, because we will tell you what you did wrong. If you feel that the hints are useless, you need to tell us why.

The easiest problem here is probably problem 3. Can you at least show us an attempt at that?
 
  • #9
For Q3 my answer is that the product of 2 unitary operators is a unitary operator !
##(AB)^\dagger = B^\dagger A^\dagger = B^{-1}A^{-1} = (AB)^{-1}##
 
  • #10
dyn said:
For Q3 my answer is that the product of 2 unitary operators is a unitary operator !
##(AB)^\dagger = B^\dagger A^\dagger = B^{-1}A^{-1} = (AB)^{-1}##

Correct
 
  • #11
Here's a cool feature of our hint system. Once you have shown us your solution, I can show you mine. In this case, it's only very slightly different from yours. A unitary U satisfies ##U^*U=I##. So if A and B are unitary, we have
$$(AB)^*(AB)= B^*A^*AB =B^*I B=B^*B=I.$$ I used the associativity of composition of functions, and the formula ##(AB)^*=B^*A^*##.

The above is all you need to do to solve the problem. In case you're interested in the proof of the last formula above, it goes like this: For all x,y, we have
$$\langle (AB)^*x,y\rangle =\langle x,ABy\rangle=\langle A^*x,By\rangle =\langle B^*A^*x,y\rangle,$$ and therefore
$$\big\langle \big((AB)^*-B^*A^*\big)x,y\big\rangle=0.$$ This implies that for all x, we have
$$\big\langle \big((AB)^*-B^*A^*\big)x,\big((AB)^*-B^*A^*\big)x\big\rangle=0.$$ This implies that for all x, we have ##\big((AB)^*-B^*A^*\big)x=0##. This implies that ##(AB)^*-B^*A^*=0##.
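As a quick numerical sanity check of the result (a minimal sketch, assuming NumPy is available; the random unitaries are built from QR decompositions of random complex matrices):

```python
import numpy as np

def random_unitary(n, rng):
    """Return an n x n unitary obtained from the QR decomposition of a random complex matrix."""
    z = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    q, r = np.linalg.qr(z)
    return q

rng = np.random.default_rng(0)
A = random_unitary(3, rng)
B = random_unitary(3, rng)
AB = A @ B

# (AB)^dagger (AB) should be the identity if the product is unitary.
print(np.allclose(AB.conj().T @ AB, np.eye(3)))  # True
```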
 
  • #12
1. You're right, but they probably meant something different from what they said. I think they meant "there exist eigenvectors ##x_1,\dots,x_N## such that ##\{x_1,\dots,x_N\}## is a linearly independent set of cardinality N". (The comment about cardinality is just my way of saying that each ##x_i## is different from all the others.) I think you should start by trying a few very simple matrices and then form an opinion about whether it seems likely that the claim is true. If you think it's true, you try to prove it. If you think it's false, you try to find a counterexample.

2. You will have to use the definition of "exp" and "diagonalizable". Is ##S_x## diagonalizable? Can you use the answer to that question?

3. Problem solved.

4. Suppose that H is hermitian. What is ##(e^H)^*e^H##? Is it the identity operator?

5. Use the formula I posted.

6. What is the definition of orthogonal projection? Can you think of a simple but non-trivial example of an orthogonal projection? Is it injective? Is it surjective?
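If a concrete case helps for item 6, one simple orthogonal projection on ##\mathbb{C}^2## (with the standard inner product) is the map
$$P\begin{pmatrix}x\\ y\end{pmatrix}=\begin{pmatrix}x\\ 0\end{pmatrix},$$
which projects onto the subspace spanned by ##e_1##; you can test it directly against the definitions of injective and surjective given earlier.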
 
  • #13
Q5 the identity operator is basis independent. Whatever vector I multiply by diag( 1,1,1...) gives me the original vector so diag(1,1,1,...) must be the identity matrix for any basis.

Q4 the exponential of a hermitian operator is a unitary operator, as ##(e^H)^{-1}e^H = e^{-H}e^H = 1## (identity matrix), but I have to admit my doubts about that one
 
  • #14
Q2 operator R is diagonalizable because Sx can be diagonalized because it consists of 2 eigenvectors that span the space
 
  • #15
Q6 an orthogonal projector to a lower dim subspace is not surjective, because its range is not the full vector space, and not injective, because the nullspace of the projector is not just the zero vector, although I admit I don't understand the last point; I just found it as a theorem
 
  • #16
dyn said:
Q5 the identity operator is basis independent. Whatever vector I multiply by diag( 1,1,1...) gives me the original vector so diag(1,1,1,...) must be the identity matrix for any basis.
This is correct, but I wouldn't consider it a solution unless you also explain what matrix multiplication has to do with a change of basis.

dyn said:
Q4 the exponential of a hermitian operator is a unitary operator, as ##(e^H)^{-1}e^H = e^{-H}e^H = 1## (identity matrix), but I have to admit my doubts about that one
Here you seem to have assumed that ##(e^H)^{-1}=e^{-H}##. This is equivalent to assuming that ##e^{-H}e^H=I##. So what you have assumed is equivalent to the result of your calculation. Also, even if ##(e^H)^{-1}=e^{-H}## had been a theorem, you should have written ##1=(e^H)^{-1}e^H=e^{-H}e^H## rather than ##(e^H)^{-1}e^H=e^{-H}e^H=1##, because it's ##(e^H)^{-1}e^H## that's obviously equal to 1, not ##e^{-H}e^H##.

I would use the fact that a linear operator ##U## is unitary if and only if ##U^*U=I##. You will have to use the definition of the exponential map and some known properties of the adjoint operation to see if this identity holds when ##U=e^H## and ##H^*=H##.
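A standard property of the adjoint that is useful here (in finite dimensions the adjoint can be taken term by term in the series) is
$$\big(e^{A}\big)^*=\Big(\sum_{n=0}^\infty \frac{A^n}{n!}\Big)^{\!*}=\sum_{n=0}^\infty \frac{(A^*)^n}{n!}=e^{A^*}.$$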

dyn said:
Q2 operator R is diagonalizable because Sx can be diagonalized because it consists of 2 eigenvectors that span the space
Do you mean that ##S_x## is diagonalizable because its columns are eigenvectors of (the matrix representation of) ##S_z##? I don't understand this argument. I haven't thought it through to the point where I know if it's wrong or if you just need to elaborate a bit.

I would use a theorem about diagonalization of self-adjoint (=hermitian) linear operators to argue that ##S_x## is diagonalizable. Once you have completed that argument, you can't just say that the fact that ##S_x## is diagonalizable implies that ##R## is diagonalizable. You will have to use the definition of the exponential map to show that this is the case.
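The identity that links the two (valid for any invertible ##T##, since ##(TDT^{-1})^n=TD^nT^{-1}## in every term of the series) is
$$\exp\!\big(TDT^{-1}\big)=\sum_{n=0}^\infty\frac{(TDT^{-1})^n}{n!}=T\Big(\sum_{n=0}^\infty\frac{D^n}{n!}\Big)T^{-1}=T\,e^{D}\,T^{-1}.$$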

dyn said:
Q6 an orthogonal projector to a lower dim subspace is not surjective, because its range is not the full vector space, and not injective, because the nullspace of the projector is not just the zero vector, although I admit I don't understand the last point; I just found it as a theorem
You have solved the part about surjectivity.

It's fine to use that theorem (I can explain it when you have completed the solution), but how did you determine that the null space isn't {0}? If P is the orthogonal projection onto a proper subspace M, then what is the null space of P? What is the definition of the orthogonal projection onto M?
 
  • #17
Thanks for your reply. I will have to move on now, as it's about time to do the real exam! But could you tell me what you mean by the definition of the exponential map?
 
  • #18
##\text{exp}(A) = I + A + \frac{A^2}{2!} + \frac{A^3}{3!} + ... + \frac{A^n}{n!} + ...##
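As a rough numerical illustration of this series (a minimal sketch, assuming NumPy; the example matrix satisfies ##A^2=-I##, so ##e^A=\cos(1)\,I+\sin(1)\,A## exactly):

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])   # A^2 = -I

# Partial sums of I + A + A^2/2! + ... up to the A^19/19! term.
partial = np.eye(2)           # the k = 0 term
term = np.eye(2)
for k in range(1, 20):
    term = term @ A / k       # A^k / k!
    partial = partial + term

exact = np.cos(1.0) * np.eye(2) + np.sin(1.0) * A
print(np.allclose(partial, exact))  # True
```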
 

1. Is linear algebra necessary for understanding quantum mechanics?

Yes, linear algebra is essential for understanding quantum mechanics. It provides the mathematical framework for describing the behavior of quantum systems, such as particles and waves, and their interactions.

2. Can quantum mechanics be described solely using classical linear algebra?

For finite-dimensional systems, yes: ordinary linear algebra over complex vector spaces, together with inner products and hermitian and unitary operators, is exactly the framework quantum mechanics uses. For infinite-dimensional systems one also needs functional analysis (Hilbert spaces and the operators on them). Features such as superposition and entanglement are described within this framework rather than by a separate kind of linear algebra.

3. Are all quantum states represented by vectors in linear algebra?

Pure quantum states are represented by vectors (state vectors, or kets) in a complex inner product space, and these can be manipulated using linear algebra operations. More general mixed states are described by density operators rather than by single vectors.

4. Can two quantum states be added or multiplied together in linear algebra?

State vectors can be added: a (normalized) linear combination of two state vectors is again a valid state, which is the superposition principle. The inner product of two state vectors is not a new state but a complex number, a probability amplitude, and combining two separate systems into one is done with the tensor product of their state spaces.

5. Is the concept of eigenvalues and eigenvectors applicable in quantum mechanics?

Yes, the concept of eigenvalues and eigenvectors is applicable in quantum mechanics. In fact, these concepts are crucial for understanding the behavior of quantum systems, such as the energy levels of a particle in a potential well.
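For example, the time-independent Schrödinger equation is precisely an eigenvalue problem for the Hamiltonian operator,
$$\hat H\,|\psi_n\rangle = E_n\,|\psi_n\rangle,$$
where the eigenvalues ##E_n## are the allowed energy levels and the eigenvectors ##|\psi_n\rangle## are the corresponding stationary states.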
