
Proof question for linear algebra

  Apr 23, 2013 #1
    1. The problem statement, all variables and given/known data


    I have a quick question about the proof below.

    Let A be an nxn matrix. Prove that A is singular if and only if λ=0

    I searched for the proof online, and they did it using Ax = 0.

    However, when I tried doing it on my own, my solution was this:

    If A is singular, then det(A) = 0.

    However, we know from the following relationship that
    λ1*λ2*...*λn = det(A)

    thus, since this product is 0, there must be at least one eigenvalue λi such that λi = 0. End of proof.

    Is my reasoning correct?

    Thank you




    2. Relevant equations



    3. The attempt at a solution
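
    A quick numerical spot-check of the product-of-eigenvalues reasoning above (an editorial sketch using numpy; the singular 3×3 matrix is just an illustrative choice, not from the book):

    ```python
    import numpy as np

    # A singular 3x3 matrix, chosen for illustration: the second row is
    # twice the first, so the rows are linearly dependent and det(A) = 0.
    A = np.array([[1.0, 2.0, 3.0],
                  [2.0, 4.0, 6.0],
                  [0.0, 1.0, 5.0]])

    eigenvalues = np.linalg.eigvals(A)

    # det(A) equals the product of the eigenvalues (both are ~0 here).
    print(np.linalg.det(A), np.prod(eigenvalues))

    # Since the product is 0, at least one eigenvalue must be 0.
    print(np.isclose(eigenvalues, 0).any())   # True
    ```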
     
  Apr 23, 2013 #2

    Dick (Science Advisor, Homework Helper)

    It's not very logical. I don't even know what "Let A be an nxn matrix. Prove that A is singular if and only if λ=0" means. Is that really the statement you have to prove? What's λ? If λ is supposed to be an eigenvalue, then a singular matrix can certainly have a nonzero eigenvalue.
     
  Apr 23, 2013 #3
    I literally just copied and pasted what the book was asking for, and yes, λ is an eigenvalue.
     
  Apr 23, 2013 #4

    Dick (Science Advisor, Homework Helper)

    Then your book is being sloppy. [[0,0],[0,3]] is singular, but it does have an eigenvalue of 3. It also has an eigenvalue of 0, which is what really matters.
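
    A small numpy check of that 2×2 example (an editorial addition, not part of the original reply):

    ```python
    import numpy as np

    # Dick's example: a singular matrix with a nonzero eigenvalue.
    A = np.array([[0.0, 0.0],
                  [0.0, 3.0]])

    print(np.linalg.det(A))      # 0.0 -> A is singular
    print(np.linalg.eigvals(A))  # [0. 3.] -> eigenvalues 0 and 3; the 0 is what matters
    ```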
     
  Apr 23, 2013 #5
    I understand your example, but does that mean that my reasoning in the proof is OK? Or do I need to justify it more?
     
  Apr 23, 2013 #6

    Dick (Science Advisor, Homework Helper)

    If the statement of what you are supposed to prove is not clear, then it's going to be hard to say whether any proof is right or not. If you want to show that any singular matrix has an eigenvalue of 0, and vice versa, then you should start from the definition of singular.
     
  Apr 23, 2013 #7
    Didn't I do that by stating that if the matrix A is singular, then det(A) = 0?
     
  Apr 23, 2013 #8

    Dick (Science Advisor, Homework Helper)

    If you are going to take as a given that det(A) = 0 iff A is singular, AND you know that det(A) is the product of the eigenvalues, then I suppose that's OK if you state it a little more clearly. But there's a much more economical proof that a singular matrix has a zero eigenvalue. I think you found it online.
     
  Apr 23, 2013 #9
    I'm a bit confused: would I have to prove that if a matrix is singular then det(A) = 0, or can I just quote the book? Also, there is a relationship in the text stating that the sum of the eigenvalues is equal to the trace, and that the product of the eigenvalues is equal to det(A).
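
    Those two relationships can be spot-checked numerically (an editorial sketch with numpy; the random 4×4 matrix is arbitrary):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 4))   # an arbitrary real 4x4 matrix
    lam = np.linalg.eigvals(A)        # eigenvalues (possibly complex)

    # Sum of the eigenvalues equals the trace; their product equals det(A).
    print(np.isclose(lam.sum().real, np.trace(A)))        # True
    print(np.isclose(lam.prod().real, np.linalg.det(A)))  # True
    ```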
     
  Apr 23, 2013 #10

    Dick (Science Advisor, Homework Helper)

    The original problem didn't say anything about det(A). You don't need it. As near as I can tell, it wants you to show that a singular matrix has a zero eigenvalue. The cheap proof uses the definition of singular and the rank-nullity theorem. A matrix mapping R^n to R^n is singular if it has a nontrivial kernel. So?
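
    For reference, a LaTeX sketch of the kernel argument being hinted at here (an editorial addition that fills in the "So?"):

    ```latex
    \[
    A \text{ is singular}
    \iff \ker A \neq \{0\}
    \iff \exists\, x \neq 0 \text{ such that } Ax = 0 = 0 \cdot x
    \iff 0 \text{ is an eigenvalue of } A .
    \]
    ```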
     