
QM linear algebra true or false

  1. Apr 4, 2015 #1

    dyn


    Apologies if this should be in the homework section, but I thought it best suited here. I've been revising past papers with no solutions. The following questions all require just a true or false answer. Any help or confirmation of my answers would be appreciated.

    1 - every N x N matrix has N eigenvectors T/F ?
    I think false, because there would be infinitely many: any eigenvector can be multiplied by any nonzero scalar

    2 - Operator R is diagonalizable where R = exp( i pi Sx /hbar) where Sx is the x spin operator
    I think false because the x spin operator is not diagonal

    3 - product of 2 unitary operators is a unitary operator T/F ?
    I have no idea

    4 - the exponential of a hermitian operator is a unitary operator T/F ?
    I have no idea

    The following questions all relate to finite dim complex vector space V

    5 - the matrix representation of the identity operator is basis dependent T/F ?
    I think false

    6 - An orthogonal projector ( to a lower dim subspace) is neither injective nor surjective in V T/F ?
    I think it's not surjective; not sure about injective
     
  3. Apr 4, 2015 #2

    HallsofIvy

    Staff Emeritus
    Science Advisor

    I don't think that was what was intended! The set of all eigenvectors of an n by n matrix forms a vector space of dimension less than or equal to n. There may be n independent eigenvectors if the matrix is "diagonalizable". But a matrix that is NOT diagonalizable has fewer than n independent eigenvectors.

    Do you understand the difference between "diagonal" and "diagonalizable"?

    It looks like you need to look up the definitions of "hermitian" and "unitary".

    Why do you think that?

    You have this exactly backward. Are you clear on what "injective" and "surjective" mean?
     
  4. Apr 4, 2015 #3

    dyn


    I have looked up all the definitions and tried to work things out. I admit I am not clear on the injective/surjective terms. As for the identity operator being basis dependent, I wasn't sure, but I don't remember seeing one that wasn't, and anyway, if it was basis dependent, how would you recognise it if it wasn't a diagonal matrix of ones? I have to admit I find linear algebra hard!
     
  5. Apr 4, 2015 #4

    micromass

    Staff Emeritus
    Science Advisor
    Education Advisor
    2016 Award

    So can you show us what you tried?

    Take an arbitrary basis and work out the representation of the identity operator.
     
  6. Apr 4, 2015 #5

    Fredrik

    Staff Emeritus
    Science Advisor
    Gold Member

    A function ##f:X\to Y## is said to be injective if the implication
    $$f(x)=f(y)\ \Rightarrow\ x=y$$ holds for all ##x,y\in X##.

    A function ##f:X\to Y## is said to be surjective if ##f(X)=Y##.

    The set ##f(X)## is by definition equal to ##\{f(x)|x\in X\}##. If you don't understand that notation, perhaps you'd prefer the following alternative definition:

    A function ##f:X\to Y## is said to be surjective if for all ##y\in Y##, there's an ##x\in X## such that ##f(x)=y##.

    The matrix ##[A]_E## associated with a linear operator ##A## and an ordered basis ##E=(e_1,\dots,e_n)## is defined by ##\big([A]_E\big)_{ij}= (Ae_j)_i##. The right-hand side denotes the ##i##th component of ##Ae_j##, i.e. the ##a_k## with ##k=i## on the right-hand side of ##Ae_j=\sum_{k=1}^n a_k e_k##. If you don't know this formula, you can't check these things. See https://www.physicsforums.com/threads/matrix-representations-of-linear-transformations.694922/ for more information.
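    If it helps to see the formula in action numerically, here is a small sketch in numpy. When the basis vectors are the columns of an invertible matrix ##E## (written in the standard basis), the matrix of ##A## in that basis works out to ##E^{-1}AE##; the basis chosen below is just an arbitrary illustration.

    ```python
    import numpy as np

    # An arbitrary (non-orthogonal) basis of C^2: the columns of E
    E = np.array([[1.0, 1.0],
                  [0.0, 2.0]])

    A = np.eye(2)  # the identity operator

    # Matrix of A in the basis E: [A]_E = E^{-1} A E
    A_E = np.linalg.inv(E) @ A @ E

    # The identity operator has the same matrix in every basis
    print(np.allclose(A_E, np.eye(2)))  # True
    ```

    Trying other invertible choices of E gives the same result, which is the point of question 5.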
     
    Last edited by a moderator: May 7, 2017
  7. Apr 4, 2015 #6

    dyn


    Thank you for your replies. I understand that you try to help people find the answers themselves instead of just giving the answers but sometimes for someone who is self-studying and can't see the wood for the trees the latter method works best.
     
  8. Apr 4, 2015 #7

    Mark44

    Staff: Mentor

    At this forum we don't believe that just giving answers is helpful in the long run for the student. This is our philosophy, whether you're in a regular class or are studying the material on your own.
     
  9. Apr 4, 2015 #8

    Fredrik

    Staff Emeritus
    Science Advisor
    Gold Member

    I agree that it's useful to see a few solved examples, but we treat all textbook-style questions as homework, and the rules for homework say that we can't give you complete solutions. All we can do is give you hints. When we do, you need to show us your attempt to use those hints to make progress. Even if what you're doing is wrong, you will still learn something from it, because we will tell you what you did wrong. If you feel that the hints are useless, you need to tell us why.

    The easiest problem here is probably problem 3. Can you at least show us an attempt at that?
     
  10. Apr 4, 2015 #9

    dyn


    For Q3 my answer is that the product of 2 unitary operators is a unitary operator!
    ##(AB)^\dagger = B^\dagger A^\dagger = B^{-1}A^{-1} = (AB)^{-1}##
     
  11. Apr 4, 2015 #10

    micromass

    Staff Emeritus
    Science Advisor
    Education Advisor
    2016 Award

    Correct
     
  12. Apr 4, 2015 #11

    Fredrik

    Staff Emeritus
    Science Advisor
    Gold Member

    Here's a cool feature of our hint system. Once you have shown us your solution, I can show you mine. In this case, it's only very slightly different from yours. A unitary U satisfies ##U^*U=I##. So if A and B are unitary, we have
    $$(AB)^*(AB)= B^*A^*AB =B^*I B=B^*B=I.$$ I used the associativity of composition of functions, and the formula ##(AB)^*=B^*A^*##.

    The above is all you need to do to solve the problem. In case you're interested in the proof of the last formula above, it goes like this: For all x,y, we have
    $$\langle (AB)^*x,y\rangle =\langle x,ABy\rangle=\langle A^*x,By\rangle =\langle B^*A^*x,y\rangle,$$ and therefore
    $$\big\langle \big((AB)^*-B^*A^*\big)x,y\big\rangle=0.$$ This implies that for all x, we have
    $$\big\langle \big((AB)^*-B^*A^*\big)x,\big((AB)^*-B^*A^*\big)x\big\rangle=0.$$ This implies that for all x, we have ##\big((AB)^*-B^*A^*\big)x=0##. This implies that ##(AB)^*-B^*A^*=0##.
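    Both the unitarity of the product and the formula ##(AB)^*=B^*A^*## are easy to confirm numerically. A minimal sketch, with two unitaries chosen arbitrarily for illustration (a real rotation and a diagonal phase matrix):

    ```python
    import numpy as np

    # Two unitary matrices: a 90-degree rotation and a matrix of phases
    A = np.array([[0, -1], [1, 0]], dtype=complex)
    B = np.diag([np.exp(1j * 0.3), np.exp(-1j * 0.7)])

    AB = A @ B

    # (AB)^* (AB) = I, so the product is unitary
    print(np.allclose(AB.conj().T @ AB, np.eye(2)))  # True

    # The adjoint formula (AB)^* = B^* A^*
    print(np.allclose(AB.conj().T, B.conj().T @ A.conj().T))  # True
    ```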
     
    Last edited: Apr 5, 2015
  13. Apr 4, 2015 #12

    Fredrik

    Staff Emeritus
    Science Advisor
    Gold Member

    1. You're right, but they probably meant something different from what they said. I think they meant "there exist eigenvectors ##x_1,\dots,x_N## such that ##\{x_1,\dots,x_N\}## is a linearly independent set of cardinality N". (The comment about cardinality is just my way of saying that each ##x_i## is different from all the others.) I think you should start by trying a few very simple matrices and then form an opinion about whether it seems likely that the claim is true. If you think it's true, you try to prove it. If you think it's false, you try to find a counterexample.

    2. You will have to use the definition of "exp" and "diagonalizable". Is ##S_x## diagonalizable? Can you use the answer to that question?

    3. Problem solved.

    4. Suppose that H is hermitian. What is ##(e^H)^*e^H##? Is it the identity operator?

    5. Use the formula I posted.

    6. What is the definition of orthogonal projection? Can you think of a simple but non-trivial example of an orthogonal projection? Is it injective? Is it surjective?
     
  14. Apr 4, 2015 #13

    dyn


    Q5 the identity operator is basis independent. Whatever vector I multiply by diag( 1,1,1...) gives me the original vector so diag(1,1,1,...) must be the identity matrix for any basis.

    Q4 the exponential of a hermitian operator is a unitary operator as ##(e^H)^{-1}e^H = e^{-H}e^H = 1## (identity matrix), but I have to admit my doubts about that one
     
  15. Apr 4, 2015 #14

    dyn


    Q2 operator R is diagonalizable because Sx can be diagonalized, because it has 2 eigenvectors that span the space
     
  16. Apr 4, 2015 #15

    dyn


    Q6 an orthogonal projector to a lower dim subspace is not surjective because its range is not the full vector space, and not injective because the null space of the projector is not just the zero vector, although I admit I don't understand the last point; I just found it as a theorem
     
  17. Apr 5, 2015 #16

    Fredrik

    Staff Emeritus
    Science Advisor
    Gold Member

    This is correct, but I wouldn't consider it a solution unless you also explain what matrix multiplication has to do with a change of basis.

    Here you seem to have assumed that ##(e^H)^{-1}=e^{-H}##. This is equivalent to assuming that ##e^{-H}e^H=I##. So what you have assumed is equivalent to the result of your calculation. Also, even if ##(e^H)^{-1}=e^{-H}## had been a theorem, you should have written ##1=(e^H)^{-1}e^H=e^{-H}e^H## rather than ##(e^H)^{-1}e^H=e^{-H}e^H=1##, because it's ##(e^H)^{-1}e^H## that's obviously equal to 1, not ##e^{-H}e^H##.

    I would use the fact that a linear operator ##U## is unitary if and only if ##U^*U=I##. You will have to use the definition of the exponential map and some known properties of the adjoint operation to see if this identity holds when ##U=e^H## and ##H^*=H##.
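    A quick numerical check makes the distinction concrete: with ##H## hermitian, ##e^H## fails the test ##U^*U=I##, while ##e^{iH}## passes it. The matrix ##H## below is an arbitrary hermitian example, and `scipy.linalg.expm` computes the matrix exponential.

    ```python
    import numpy as np
    from scipy.linalg import expm

    # An arbitrary hermitian (here real symmetric) matrix
    H = np.array([[1.0, 2.0], [2.0, -1.0]])

    U1 = expm(H)        # exponential of H itself
    U2 = expm(1j * H)   # exponential of iH

    # e^H is positive-definite but not unitary: (e^H)^* e^H = e^{2H} != I
    print(np.allclose(U1.conj().T @ U1, np.eye(2)))  # False

    # e^{iH} is unitary: (e^{iH})^* = e^{-iH}, and the product is I
    print(np.allclose(U2.conj().T @ U2, np.eye(2)))  # True
    ```

    This is consistent with the hint above: you have to actually compute ##(e^H)^*e^H## from the definitions rather than assume the answer.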

    Do you mean that ##S_x## is diagonalizable because its columns are eigenvectors of (the matrix representation of) ##S_z##? I don't understand this argument. I haven't thought it through to the point where I know if it's wrong or if you just need to elaborate a bit.

    I would use a theorem about diagonalization of self-adjoint (=hermitian) linear operators to argue that ##S_x## is diagonalizable. Once you have completed that argument, you can't just say that the fact that ##S_x## is diagonalizable implies that ##R## is diagonalizable. You will have to use the definition of the exponential map to show that this is the case.
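    The last step can also be seen numerically: writing ##S_x=\frac{\hbar}{2}\sigma_x## with ##\sigma_x## the Pauli matrix (so ##\hbar## cancels inside the exponent), the eigenbasis of ##S_x## also diagonalizes ##R##. This is only a sanity check, not a substitute for the argument from the definition of exp:

    ```python
    import numpy as np
    from scipy.linalg import expm

    sx = np.array([[0, 1], [1, 0]], dtype=complex)  # Pauli matrix; S_x = (hbar/2) sx
    R = expm(1j * np.pi * sx / 2)                   # R = exp(i pi S_x / hbar)

    # Diagonalize sx; its eigenvector matrix V also diagonalizes R
    vals, V = np.linalg.eigh(sx)
    D = np.linalg.inv(V) @ R @ V

    off_diagonal = D - np.diag(np.diag(D))
    print(np.allclose(off_diagonal, 0))  # True: R is diagonal in the S_x eigenbasis
    ```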

    You have solved the part about surjectivity.

    It's fine to use that theorem (I can explain it when you have completed the solution), but how did you determine that the null space isn't {0}? If P is the orthogonal projection onto a proper subspace M, then what is the null space of P? What is the definition of the orthogonal projection onto M?
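    For the injectivity question, a concrete example may help. Take the orthogonal projector onto the xy-plane inside a 3-dimensional space (an illustrative choice): two different vectors can have the same image, and nothing in the range has a nonzero third component.

    ```python
    import numpy as np

    # Orthogonal projector onto the xy-plane in C^3
    P = np.diag([1.0, 1.0, 0.0])

    # Not injective: two different vectors share the same image
    x = np.array([1.0, 2.0, 3.0])
    y = np.array([1.0, 2.0, -5.0])
    print(np.allclose(P @ x, P @ y))        # True, even though x != y

    # Not surjective: the z-axis is in the null space, so it is never reached
    z = np.array([0.0, 0.0, 1.0])
    print(np.allclose(P @ z, np.zeros(3)))  # True
    ```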
     
    Last edited: Apr 5, 2015
  18. Apr 5, 2015 #17

    dyn


    Thanks for your reply. I will have to move on now as it's about time to do the real exam! But could you tell me what you mean by the definition of the exponential map?
     
  19. Apr 5, 2015 #18

    micromass

    Staff Emeritus
    Science Advisor
    Education Advisor
    2016 Award

    ##\text{exp}(A) = I + A + \frac{A^2}{2!} + \frac{A^3}{3!} + ... + \frac{A^n}{n!} + ...##
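    This series converges for any matrix, and a truncated version of it already matches a library matrix exponential to numerical precision. A small sketch (the matrix A and the truncation point are arbitrary illustrations):

    ```python
    import numpy as np
    from scipy.linalg import expm

    A = np.array([[0.0, 1.0], [-1.0, 0.0]])

    # Truncated power series: I + A + A^2/2! + ... + A^19/19!
    S = np.eye(2)
    term = np.eye(2)
    for n in range(1, 20):
        term = term @ A / n   # term is now A^n / n!
        S = S + term

    print(np.allclose(S, expm(A)))  # True to numerical precision
    ```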
     