Symmetric Matrices Theorem

  1. I need to prove the following.
    1. If A is a symmetric matrix, then [itex]x^T A x = 0[/itex] for all [itex]x \in \mathbb{R}^n[/itex] if and only if A = 0.
    2. [itex]x^T A x = 0[/itex] for all [itex]x \in \mathbb{R}^n[/itex] if and only if A is skew-symmetric.
  3. For the second one, I can suggest transposing the quadratic form: since [itex]x^T A x[/itex] is a scalar, it equals its own transpose [itex]x^T A^T x[/itex].
    Using the skew-symmetry property [itex]A^T = -A[/itex], this gives
    [itex]x^T A x = -(x^T A x)[/itex],
    so the quadratic form must be 0 (a worked version of this step is sketched after the thread).
  4. For the first one, sufficiency is obvious. For necessity, either write out the explicit quadratic polynomial and plug in suitable values of x; the only possibility is the zero polynomial (see the basis-vector sketch after the thread). Alternatively, assume [itex]A=V\Lambda V^{-1}[/itex] with [itex]\Lambda[/itex] made up of Jordan blocks (possibly of size [itex]k \times k[/itex]) and use the structure of the Jordan blocks; since A is symmetric, the blocks are in fact [itex]1 \times 1[/itex], i.e. [itex]\Lambda[/itex] is diagonal.
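
A concrete way to carry out the "plug in values of x" step from post 4, using the standard basis vectors [itex]e_i[/itex] (my notation, not from the thread): single basis vectors recover the diagonal entries, and sums of two basis vectors recover the off-diagonal entries once symmetry is used.

[tex]e_i^T A e_i = a_{ii} = 0, \qquad (e_i + e_j)^T A (e_i + e_j) = a_{ii} + a_{jj} + a_{ij} + a_{ji} = 2 a_{ij} = 0,[/tex]

so every entry of A vanishes and A = 0.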
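
A worked version of the transpose step suggested in post 3, treating [itex]x^T A x[/itex] as a [itex]1 \times 1[/itex] matrix equal to its own transpose:

[tex]x^T A x = (x^T A x)^T = x^T A^T x = x^T (-A) x = -\,x^T A x \quad\Longrightarrow\quad x^T A x = 0.[/tex]

For the converse direction of part 2, one option (assuming part 1 has already been proved) is to note that [itex]x^T (A + A^T) x = 2\,x^T A x = 0[/itex] for all x and that [itex]A + A^T[/itex] is symmetric, so part 1 forces [itex]A + A^T = 0[/itex], i.e. A is skew-symmetric.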