Eigenvalues / eigenvectors concept explanation please

SUMMARY

The discussion clarifies the concepts of eigenvalues and eigenvectors, specifically addressing the properties of complex eigenvalues and the conditions for matrix invertibility. It establishes that complex eigenvalues of a real matrix appear in conjugate pairs and that a matrix is invertible if and only if its column vectors are linearly independent. Furthermore, it concludes that a non-invertible matrix must have an eigenvector corresponding to the eigenvalue zero, since the determinant condition |A - λI| = 0 holds at λ = 0 precisely when A itself is not invertible.

PREREQUISITES
  • Understanding of linear algebra concepts, specifically eigenvalues and eigenvectors.
  • Familiarity with matrix operations and properties, including invertibility.
  • Knowledge of complex numbers and their properties.
  • Basic understanding of determinants and the characteristic polynomial.
NEXT STEPS
  • Study the properties of complex eigenvalues and their implications in linear transformations.
  • Learn about the relationship between eigenvalues, eigenvectors, and matrix diagonalization.
  • Explore the concept of linear independence in the context of vector spaces.
  • Investigate the characteristic polynomial and its role in finding eigenvalues.
USEFUL FOR

Students and professionals in mathematics, particularly those studying linear algebra, as well as educators seeking to explain the concepts of eigenvalues and eigenvectors effectively.

Ush
Hello
This is a concept question I do not understand. I'm just wondering why the answer is what it is. (The answer is written below the question; I just have no idea where it comes from.)

[Attachment: h1.PNG — image of the question and its answer]
For a, you have to know that complex eigenvalues of a real matrix come in conjugate pairs. That is, if a + bi is an eigenvalue of a real matrix A, then so is a - bi. The same goes for the eigenvectors: if v is an eigenvector for a + bi, then its entrywise complex conjugate is an eigenvector for a - bi. So if an eigenvector has entries (a, b + ci), there is another eigenvector with entries (a, b - ci).
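A quick numerical sketch of the conjugate-pair fact, using NumPy (the rotation matrix here is just an arbitrary real example, not the matrix from the attached question):

```python
import numpy as np

# A real matrix with no real eigenvalues: a 90-degree rotation.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

# The eigenvalues come out as the conjugate pair i and -i,
# and the two eigenvector columns are conjugates of each other
# (up to a scalar factor, since eigenvectors are only defined up to scale).
print(np.sort_complex(eigenvalues))
```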

For b, you have to know a couple of things.

A square matrix A is invertible if and only if its column vectors are linearly independent. This is equivalent to saying that a square matrix A is invertible if and only if there are no nontrivial solutions to the equation Ax = 0 (this is because Ax is a linear combination of the column vectors of A).

So a matrix that is not invertible must have a nontrivial solution x to Ax = 0. But Ax = 0 = 0·x says exactly that x is an eigenvector of A with eigenvalue 0. That means every matrix that is not invertible must have eigenvectors corresponding to the eigenvalue 0.
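As a concrete sketch (again NumPy, with an arbitrary singular matrix chosen for illustration):

```python
import numpy as np

# A singular matrix: the second column is twice the first,
# so the columns are linearly dependent and A is not invertible.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

# x = (2, -1) is a nontrivial solution of Ax = 0 ...
x = np.array([2.0, -1.0])
print(A @ x)        # the zero vector

# ... which is exactly an eigenvector of A with eigenvalue 0.
eigenvalues, _ = np.linalg.eig(A)
print(eigenvalues)  # one eigenvalue is (numerically) 0
```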

Another way to think about it: when you look for eigenvalues by solving |A - λI| = 0, you are finding the values of λ that make A - λI not invertible. But if A itself is not invertible, then clearly λ = 0 is a solution.
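This can be checked numerically with the same kind of singular example. Note that NumPy's `np.poly` returns the coefficients of det(λI - A), which has the same roots as det(A - λI):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])   # singular: det(A) = 0

# Evaluating det(A - lambda*I) at lambda = 0 just gives det(A),
# so lambda = 0 is a root of the characteristic equation.
print(np.linalg.det(A - 0.0 * np.eye(2)))

# Equivalently, the characteristic polynomial has 0 as a root.
coeffs = np.poly(A)              # coefficients of det(lambda*I - A)
print(np.polyval(coeffs, 0.0))
```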

Hope this helps.
 
