Negative determinants when calculating eigenvectors?

1. Jul 24, 2012

tamtam402

Let M be a transformation matrix. C is the matrix which diagonalizes M.

I'm trying to use the formula $D = C^{-1}MC$. I noticed that depending on how I arrange my eigenvectors in C, I can change the sign of the determinant. If I calculate D using an ordering of C that gives me a negative determinant, my matrix D ends up with a negative sign in front of the eigenvalues on its diagonal. (Note: the determinant is needed when calculating the inverse of C, so a negative determinant multiplies the $C^{-1}MC$ expression by $-1$.)

However, I've read that the matrix D should always have the eigenvalues on its diagonal, and I've also heard that it doesn't matter how you set up the matrix C, as long as all the eigenvectors are there.

What's going on? Should I always make sure to use a configuration of C that will get me a positive determinant?

Last edited: Jul 24, 2012
2. Jul 24, 2012

DonAntonio

No, you can't. The determinant is an invariant of matrices (linear operators), and it equals the product of all the matrix's eigenvalues (in some extension field, possibly).

DonAntonio

3. Jul 25, 2012

AlephZero

You must be making a mistake somewhere, but without a specific example (preferably a small one that is easy to check!) we can't really say much more than post #2 already said.

4. Jul 25, 2012

HallsofIvy

If you change the order of the vectors, you change the matrix, so it is NOT correct to say that you have changed the sign of an eigenvalue. You have an eigenvalue of a different matrix, and that eigenvalue has a different sign.

In particular, we can always represent an abstract linear operator on a vector space by a matrix by choosing a specific ordered basis for the vector space. But changing the basis, or just changing the order of the vectors in the basis, will represent the same linear operator by a different matrix.

5. Jul 25, 2012

tamtam402

Let's suppose I have 2 eigenvalues: 1 and 3.

The eigenvector for the first eigenvalue is (1,-1). The eigenvector for the second eigenvalue is (1,1).

I can write my matrix in the following form, which has determinant $-2$:
$$\begin{pmatrix}1&1\\1&-1\end{pmatrix}$$
I can also write it in the following form, which has determinant $2$:
$$\begin{pmatrix}1&1\\-1&1\end{pmatrix}$$

6. Jul 25, 2012

genericusrnme

I think I see what you're getting at here: you can write the eigenvector matrix with its columns in a different order, which will change the sign of its determinant.
BUT
you have $C^{-1}$ there too:
$det(C^{-1} M C) = det(C^{-1}) det(M) det(C) = \frac{1}{det(C)} det(M) det(C) = \frac{det(C)}{det(C)} det(M) = det(M)$
Is this what was confusing you?
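For anyone finding this thread later, the cancellation described above is easy to check numerically. A minimal NumPy sketch (the matrix M below is my own made-up example, built so it has the eigenvalues 1, 3 and eigenvectors (1,-1), (1,1) from post #5):

```python
import numpy as np

# Hypothetical example: M has eigenvalues 1 and 3 with
# eigenvectors (1, -1) and (1, 1) respectively.
M = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Two valid eigenvector matrices: same columns, opposite order.
C1 = np.array([[ 1.0, 1.0],   # columns: (1, -1) then (1, 1)
               [-1.0, 1.0]])
C2 = C1[:, ::-1]              # swap the two columns

# Swapping columns flips the sign of the determinant...
print(np.linalg.det(C1))      # approximately  2
print(np.linalg.det(C2))      # approximately -2

# ...but D = C^{-1} M C is diagonal either way; the eigenvalues
# simply appear in the order matching the columns of C.
D1 = np.linalg.inv(C1) @ M @ C1
D2 = np.linalg.inv(C2) @ M @ C2
print(np.round(D1, 10))       # approximately diag(1, 3)
print(np.round(D2, 10))       # approximately diag(3, 1)

# And det(C^{-1} M C) = det(M) regardless of the ordering.
print(np.linalg.det(D1), np.linalg.det(D2), np.linalg.det(M))
```

Note that reordering the eigenvectors only permutes the diagonal entries of D; it never flips their signs.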

7. Jul 25, 2012

DonAntonio

This "example" is like the example of a (plane) triangle (in Euclidean geometry) with sides 3, 6, 9: impossible.

You cannot have a 2x2 matrix with eigenvalues 1, 3 and a determinant different from 3: impossible. This is why, I suppose, you were already asked to give a specific example, not just to throw out numbers as eigenvalues and "let's say an eigenvalue is..." things.

DonAntonio

8. Jul 27, 2012

tamtam402

Yes, thank you. Somehow it was my first time getting a negative determinant, and I also messed up elsewhere in the process. That's what led me to believe that negative determinants were bad :)

9. Jul 27, 2012

tamtam402

Please see the post above yours, it seems like my example was specific enough.

10. Jul 27, 2012

micromass

Not really. We still don't know what you did wrong.

11. Jul 28, 2012

DonAntonio

No, it isn't, and I find it odd that you stubbornly insist on it. You must give us one specific matrix for which you have specifically calculated the eigenvalues and the eigenvectors, and for which specifically changing the order of the eigenvectors gets you a DIFFERENT determinant. Of course, you can't do this, but if you send us the above we'll be able, perhaps, to point out specifically where you went wrong.

BTW, my last post has a typo: of course the determinant of any 2x2 matrix with eigenvalues 1, 3 will ALWAYS be 3. I meant to write -3, referring to your claim that you can get the opposite sign.

DonAntonio

PS: If you refer, as "specific example", to the matrices
$$\begin{pmatrix}1&1\\1&-1\end{pmatrix}\,\,,\,\,\begin{pmatrix}1&1\\-1&1\end{pmatrix}$$
then this is a non-example, as these matrices have neither the same determinant nor the same trace, so they can't be similar. Please do present us the ORIGINAL matrix before you try to change it.
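The invariant check is quick to verify directly; a short NumPy sketch (the names A and B are mine) confirming that the two matrices from post #5 share neither trace nor determinant:

```python
import numpy as np

# The two "eigenvector matrices" from post #5, as written.
A = np.array([[1.0,  1.0],
              [1.0, -1.0]])
B = np.array([[ 1.0, 1.0],
              [-1.0, 1.0]])

# Similar matrices must have the same trace and the same
# determinant; these two agree on neither, so they cannot
# be similar to each other.
print(np.trace(A), np.linalg.det(A))  # trace 0, det approximately -2
print(np.trace(B), np.linalg.det(B))  # trace 2, det approximately  2
```

Of course, neither of these is the matrix M being diagonalized; they are two candidate eigenvector matrices C, which is exactly the confusion resolved in the next post.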

12. Jul 28, 2012

genericusrnme

micromass, donantonio, the problem the guy was having is this;

Let A be some matrix; then we can decompose it as $A = S D S^{-1}$. This decomposition isn't dependent on the order in which we write the eigenvalues and eigenvectors, so we can permute the columns of S (and correspondingly the diagonal of D) whilst leaving A unchanged. He knew that permuting the columns of a matrix changes the sign of its determinant, which led him to be confused, since he knows the determinant of the RHS should equal the determinant of the LHS. He was forgetting that there is an $S$ and an $S^{-1}$, which cancel any change caused by reordering the columns of S. He's not trying to claim that two matrices with permuted columns are the same, or have the same eigenvalues, or anything along those lines.

13. Jul 28, 2012

DonAntonio

Wonderful. Why then didn't he send a specific example of a matrix with that decomposition? That could have cleared things up much earlier.

DonAntonio

14. Jul 30, 2012

tamtam402

Sorry, I didn't check back on the thread after genericusrnme replied. I didn't post a specific example because I had trashed my work and couldn't find the exercise I had failed. When I came back to the thread, ready to redo the exercise from scratch and post my solution, I saw genericusrnme's post showing that the sign of the determinant should have no impact on the result, so I must have screwed up a calculation somewhere.