
Matrices and determinants

  1. Mar 1, 2017 #1
    1. Problem statement: Suppose A is a Hermitian 3 x 3 matrix. Show that if ##\mathbf x^\dagger \mathbf A \mathbf x > 0## for every non-zero column vector ##\mathbf x##, then ##\det(\mathbf A) > 0##.

    I tried to prove this statement; my attempt is attached as an image. Could anyone guide me, step by step, through how to approach this problem? I am not confident that my approach is correct.

    P.S. I am new to the forum and I apologize if my post has flaws. I tried my best to state the problem and my attempt clearly, but I am not very proficient with the formatting yet. I wrote it in MS Word and pasted it here, but the whole text appeared struck out and was then removed.
    Last edited by a moderator: Mar 1, 2017
  3. Mar 1, 2017 #2


    Homework Helper
    Gold Member

    With the free or cheap app DocScan HD you can make your page look like this:

    In your case I don't know whether that's enough to get anyone to read it. :oldbiggrin:
  4. Mar 1, 2017 #3

    Stephen Tashi

    Science Advisor

    Attempted transliteration:

  5. Mar 2, 2017 #4


    Gold Member

    You're on the cusp of some very interesting stuff called quadratic forms. I'd suggest approaching this from a few different angles -- it connects quite a few different concepts in Linear Algebra. With that in mind, here are a few other ways to look at the problem:
    - - - -

    When you get down to it, these optimization problems turn out to be eigenvalue problems. (And the determinant is the product of all the eigenvalues.)
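That determinant-eigenvalue relationship is easy to check numerically. A minimal sketch (numpy, with an arbitrary randomly generated Hermitian matrix -- the specific values are illustrative only):

```python
import numpy as np

# B + B^H is always Hermitian, for any complex square B.
rng = np.random.default_rng(0)
B = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
A = B + B.conj().T

eigvals = np.linalg.eigvalsh(A)   # eigenvalues of a Hermitian matrix are real
det_A = np.linalg.det(A)

# det(A) equals the product of the eigenvalues,
# and is real because the eigenvalues are real.
assert np.isclose(det_A.real, np.prod(eigvals))
assert np.isclose(det_A.imag, 0, atol=1e-9)
```

So if you can show every eigenvalue is positive, positivity of the determinant follows immediately.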

    For Hermitian ##\mathbf A##, consider the optimization problem: minimize (and then maximize)

    ##\mathbf x^H \mathbf A \mathbf x##

    subject to ##\mathbf x^H \mathbf x = 1##
    (where ##^H## indicates conjugate transpose -- your post says Hermitian, so we really should be using ##^H##, not ##^T##. The unit-length constraint does the work that ##\mathbf x \neq \mathbf 0## alone cannot: without it the form can be scaled to any value, since ##(c\mathbf x)^H \mathbf A (c\mathbf x) = |c|^2 \, \mathbf x^H \mathbf A \mathbf x##.)

    How would you minimize this expression? How would you maximize it? (There are basically two approaches: one uses Lagrange multipliers, and the other diagonalizes ##\mathbf A## with a unitary basis formed by its eigenvectors. Both are worth working through in detail.)
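The punchline of either approach is that over unit vectors, the quadratic form is bounded between the smallest and largest eigenvalues, with both bounds attained at the corresponding eigenvectors. A numerical sanity check (numpy; the Hermitian matrix is randomly generated, for illustration only):

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
A = B + B.conj().T                    # Hermitian

eigvals, eigvecs = np.linalg.eigh(A)  # eigenvalues in ascending order

def quad_form(A, x):
    """Evaluate x^H A x; real-valued when A is Hermitian."""
    return (x.conj() @ A @ x).real

# The form attains the smallest/largest eigenvalue at the eigenvectors.
assert np.isclose(quad_form(A, eigvecs[:, 0]), eigvals[0])
assert np.isclose(quad_form(A, eigvecs[:, -1]), eigvals[-1])

# Random unit vectors never escape the eigenvalue bounds.
for _ in range(200):
    x = rng.standard_normal(3) + 1j * rng.standard_normal(3)
    x /= np.linalg.norm(x)
    q = quad_form(A, x)
    assert eigvals[0] - 1e-9 <= q <= eigvals[-1] + 1e-9
```

In particular, if ##\mathbf x^H \mathbf A \mathbf x > 0## for all non-zero ##\mathbf x##, the minimum over unit vectors is positive, hence the smallest eigenvalue is positive, hence all of them are.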

    I would spend some time thinking on quadratic forms.

    - - - - -
    A simpler, but less elegant approach is to not directly mention eigenvalues, and consider this:

    Start again from ##\mathbf x^H \mathbf A \mathbf x##, and assume that ##\mathbf A## has a nontrivial nullspace. That means ##\mathbf A## is singular, so I can find some ##\mathbf x \neq \mathbf 0## with ##\mathbf A \mathbf x = \mathbf 0##. In that case ##\mathbf x^H \mathbf A \mathbf x = \mathbf x^H \big(\mathbf A \mathbf x\big) = ?## What does this tell you?

    This approach is simpler, but feels a lot less constructive (and satisfying) to me than actually working through the quadratic form optimization.
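To see the nullspace argument concretely: a minimal numerical sketch (numpy), where the singular Hermitian matrix is built from hypothetical vectors chosen only so that the rank is at most 2:

```python
import numpy as np

# A Hermitian matrix of rank <= 2: a sum of two outer products u u^H + w w^H.
# The vectors u, w are arbitrary illustrative choices.
u = np.array([1.0, 2.0, 0.0 + 1.0j])
w = np.array([0.0, 1.0 - 1.0j, 3.0])
A = np.outer(u, u.conj()) + np.outer(w, w.conj())

# Rank-deficient, so the determinant vanishes.
assert np.isclose(np.linalg.det(A), 0, atol=1e-9)

# Find a non-zero nullspace vector via the SVD: the right singular
# vector for the (near-)zero singular value.
_, s, Vh = np.linalg.svd(A)
x = Vh[-1].conj()
assert np.linalg.norm(x) > 0.5            # x is genuinely non-zero
assert np.linalg.norm(A @ x) < 1e-9       # A x = 0

# So x^H (A x) = 0, contradicting x^H A x > 0 for all non-zero x.
assert np.isclose((x.conj() @ A @ x).real, 0, atol=1e-9)
```

This is exactly the contrapositive: positivity of the form for every non-zero vector rules out any such nullspace vector, so ##\mathbf A## cannot be singular.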
    - - - -
    Note there is a sticky on using LaTeX formatting here:


    Also you can right click my formatted text and show the math commands to see what I entered to get this result.
    - - - -
    By the way, this was posted in the 'precalculus' forum, so I'm not totally sure how to think about the use of the intermediate value theorem (or Lagrange multipliers) here.
