Eigenvalues and diagonalization of a matrix

Summary
When diagonalizing a matrix, the order of the eigenvalues on the diagonal must match the order of the corresponding eigenvectors in the matrix B, which consists of these eigenvectors as columns. The diagonal matrix D can be formed as D = B^{-1}AB, where A is the original matrix. The eigenvalues can be arranged in any order, provided the eigenvectors maintain the same sequence. Changing the order of the eigenvalues effectively alters the representation of the linear operator with respect to a different ordered basis. This ensures that the properties of the original matrix are preserved in the diagonalized form.
dyn
When you diagonalize a matrix, the diagonal elements are the eigenvalues, but how do you know which order to put the eigenvalues along the diagonal, since different orders give different matrices?
Thanks
 
The same order as the columns of your matrix of eigenvectors. If I recall correctly, so long as the eigenvalues and the eigenvector columns correspond with each other, it's fine; you'll transform your matrix to a diagonal one, then later on you'll transform back. If it's self-consistent, the properties you're looking for should be preserved.
(But I'm not an expert on this, so hopefully there will be more input!)
 
ModestyKing needn't be so modest; he is correct. :smile:
 
If an n×n matrix ##A## has n linearly independent eigenvectors, then there exists a matrix ##B## such that ##D = B^{-1}AB## is a diagonal matrix with the eigenvalues of ##A## on the diagonal. ##B## is the matrix having the corresponding eigenvectors as its columns.

What ModestyKing and DrClaude are saying is that the eigenvalues can appear in any order, as long as the eigenvectors forming the columns of ##B## are in the same order.
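As a concrete check, here is a minimal NumPy sketch (not from the thread; the matrix ##A## below is just an illustrative example with distinct eigenvalues) showing that the diagonal of ##B^{-1}AB## comes out in the same order as the eigenvector columns of ##B##, and that transforming back recovers ##A##:

```python
import numpy as np

# Illustrative matrix (made up for this example), eigenvalues 5 and 2.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix B whose j-th column is an
# eigenvector for eigvals[j], so the two orderings already correspond.
eigvals, B = np.linalg.eig(A)

# D = B^{-1} A B is diagonal, with the eigenvalues in the same order as the columns of B.
D = np.linalg.inv(B) @ A @ B
print(np.allclose(D, np.diag(eigvals)))          # True

# Transforming back recovers the original matrix, whatever order was chosen.
print(np.allclose(B @ D @ np.linalg.inv(B), A))  # True
```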
 
An ordered basis for an n-dimensional vector space ##V## is an n-tuple ##(e_1,\dots,e_n)## such that ##\{e_1,\dots,e_n\}## is a basis for ##V##.

The component n-tuple of a vector ##x## with respect to an ordered basis ##(e_1,\dots,e_n)## is the unique n-tuple of scalars ##(x_1,\dots,x_n)## such that ##x=\sum_{i=1}^n x_i e_i##.

The matrix of components of a linear operator ##A## with respect to an ordered basis ##(e_1,\dots,e_n)## is the n×n matrix [A] defined by ##[A]_{ij}=(Ae_j)_i##. (The right-hand side denotes the ##i##th component of ##Ae_j## with respect to ##(e_1,\dots,e_n)##). This matrix is diagonal if and only if the ##e_i## are eigenvectors of the linear operator ##A##.
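To make that definition concrete, here is a small sketch (same illustrative ##A## as in the earlier example; the variable names are my own) that builds ##[A]_{ij}=(Ae_j)_i## column by column for an eigenvector basis and confirms the result is diagonal:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigvals, B = np.linalg.eig(A)   # columns of B play the role of the ordered basis (e_1, ..., e_n)

# Column j of [A] holds the components of A e_j with respect to (e_1, ..., e_n),
# i.e. the solution x of B x = A e_j.
components = np.column_stack(
    [np.linalg.solve(B, A @ B[:, j]) for j in range(B.shape[1])]
)
print(np.allclose(components, np.diag(eigvals)))  # True: diagonal, with the eigenvalues as entries
```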

Every matrix is the matrix of components of some linear operator, with respect to some ordered basis. To change the order of the entries on the diagonal of a diagonal matrix is to change the order of the vectors in the ordered basis. You end up with a representation of the same linear operator, with respect to a different ordered basis.
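A quick numerical illustration of that last point (again with the made-up ##A## from the sketches above): permuting the eigenvector columns, i.e. reordering the basis, permutes the diagonal in the same way, while both diagonal matrices represent the same operator.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigvals, B = np.linalg.eig(A)

perm = [1, 0]                 # swap the two basis vectors (eigenvector columns)
B2 = B[:, perm]
D2 = np.linalg.inv(B2) @ A @ B2

# The diagonal entries are the same eigenvalues, now in the swapped order ...
print(np.allclose(D2, np.diag(eigvals[perm])))      # True

# ... and either ordered basis represents the same operator A.
print(np.allclose(B2 @ D2 @ np.linalg.inv(B2), A))  # True
```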
 