# How to Solve Complex Eigenvectors in Matrix Algebra

tophman

In summary, the poster is unsure how to find eigenvectors for complex eigenvalues of an n×n matrix. They know to compute the null space of A − λI but get stuck for n > 2, and they ask whether Gaussian elimination is the only method and whether, when each row turns out to be a multiple of another, any row may be used.
tophman
Hey,

I have a quick question that I cannot seem to find much of an answer to in my text. When working with an n×n matrix A, and you find eigenvalues that are complex, I'm confused about how to find the actual eigenvector. I know we compute the null space of A − λI, but that is where I get stuck. For a 2×2 it's easy enough and I can do it. The problem is when n > 2: Gaussian elimination becomes a ridiculous mess. Is that the only way to do it? When I do substitution I end up with 0 = 0, which makes me think that each row is just some multiple of the other. If this is the case, do I just use any row I want?

Basically, I'm completely stuck on how to solve for the complex eigenvectors.

Any help would be greatly appreciated!

tophman said:
When I do substitution I end up with 0 = 0, which makes me think that each row is just some multiple of the other. If this is the case, do I just use any row I want?
Well, of course, you get "0 = 0". For $\lambda$ to be an eigenvalue, the equations you get with $\lambda$ equal to that eigenvalue must be dependent, so that 0 is not the only solution. I don't know what problem you are doing, but Gaussian elimination is the best way to go; except, of course, to use a TI-93 calculator that will do eigenvectors for you!

Hi there,

Solving for complex eigenvectors in matrix algebra can be a bit tricky, but a few methods can make the process easier. One is to work from the characteristic polynomial of the matrix, det(A − λI). Setting this polynomial to zero and solving gives the eigenvalues; the corresponding eigenvectors are then found by substituting each eigenvalue into (A − λI)x = 0 and solving for x.
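As a concrete sketch of the characteristic-polynomial step: for a 2×2 matrix the polynomial is λ² − tr(A)λ + det(A), so its roots come straight from the quadratic formula, and Python's `cmath` keeps complex roots. (The rotation matrix below is a made-up example for illustration, not one from the thread.)

```python
import cmath

def eigenvalues_2x2(A):
    """Roots of lam^2 - tr(A)*lam + det(A) = 0 via the quadratic formula."""
    (a, b), (c, d) = A
    tr = a + d
    det = a * d - b * c
    disc = cmath.sqrt(tr * tr - 4 * det)  # cmath handles a negative discriminant
    return [(tr + disc) / 2, (tr - disc) / 2]

# The 90-degree rotation matrix has no real eigenvalues:
print(eigenvalues_2x2([[0, -1], [1, 0]]))  # [1j, -1j]
```

For n > 2 the characteristic polynomial has degree n and generally needs a numerical root-finder, which is where software takes over.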

Another approach is the Jordan canonical form, which involves finding an invertible matrix P such that P^-1AP is in Jordan form. This form is useful because its simple structure makes the eigenstructure explicit; note, though, that constructing P already requires the eigenvectors (and, for repeated eigenvalues, generalized eigenvectors), so it is more a way of organizing the answer than a shortcut to it.
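If SymPy is available, its exact `jordan_form` method shows this structure directly; a minimal sketch on a made-up 2×2 matrix, where the returned P and J satisfy P⁻¹AP = J:

```python
from sympy import Matrix

A = Matrix([[0, -1],
            [1,  0]])

# jordan_form returns P and the Jordan matrix J with P**-1 * A * P == J.
# Here the eigenvalues I and -I are distinct, so J is simply diagonal.
P, J = A.jordan_form()
print(J)  # diagonal, with the complex eigenvalues I and -I (order may vary)
```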

If you are using Gaussian elimination to solve for the eigenvectors, it can indeed become quite messy, especially for larger matrices. In this case, it may be helpful to use a computer program or calculator to assist with the computations.
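For larger matrices a numerical library is the practical route. A minimal sketch with NumPy (assuming it is installed); `numpy.linalg.eig` returns complex eigenvalues and eigenvectors automatically, even for a real input matrix:

```python
import numpy as np

A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

eigvals, eigvecs = np.linalg.eig(A)   # columns of eigvecs are the eigenvectors

# Check A v = lam v for each eigenpair:
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ v, lam * v)

print(eigvals)  # the two eigenvalues, approximately i and -i (order may vary)
```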

In terms of using any row you want: remember that when solving for eigenvectors you are looking for a non-zero vector satisfying (A − λI)x = 0. If a row reduces to 0 = 0 during substitution, that means the rows of A − λI are linearly dependent, which is exactly what must happen for a non-zero solution to exist. Discard the redundant equation, set the free variable to any non-zero value (1 is conventional), and solve for the remaining components; any non-zero scalar multiple of the result is an equally valid eigenvector.
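As a worked illustration of that free-variable step (a made-up 2×2 example, not from the thread): take A = [[0, −1], [1, 0]] with eigenvalue λ = i. The first row of A − iI reads −i·x₀ − x₁ = 0 and the second row is a multiple of it, so set the free variable x₀ = 1 and read off x₁ = −i. Plain complex arithmetic, no libraries needed:

```python
def matvec(A, v):
    """Multiply a 2x2 (possibly complex) matrix by a 2-vector."""
    return [A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1]]

A = [[0, -1], [1, 0]]
lam = 1j                 # one complex eigenvalue of this rotation matrix

# Row 1 of A - lam*I is (-1j, -1); "-1j*x0 - x1 = 0" gives x1 = -1j*x0.
# The free variable x0 can be any nonzero value; pick 1.
v = [1, -1j]

# Verify the eigenvector equation A v = lam v:
assert matvec(A, v) == [lam * c for c in v]
```

Picking x₀ = 2 (or any other non-zero value) would give 2v, which is the same eigenvector up to scaling.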

I hope this helps and provides some guidance for solving complex eigenvectors in matrix algebra. Don't hesitate to ask for further clarification or assistance if needed. Best of luck!

## 1. What are eigenvectors and eigenvalues?

Eigenvectors and eigenvalues are important concepts in linear algebra. Eigenvectors are special vectors that, when multiplied by a square matrix, result in a scalar multiple of themselves. Eigenvalues are the corresponding scalar multiples. In simpler terms, an eigenvector is a vector that does not change direction when multiplied by a matrix.

## 2. Why is it important to solve for eigenvectors?

Solving for eigenvectors allows us to understand how a matrix affects the direction of a vector. This is useful in many applications, such as understanding the behavior of systems in physics and engineering, or in data analysis and machine learning.

## 3. How can I find the eigenvectors of a matrix?

To find the eigenvectors of a matrix, we first find the eigenvalues by solving the characteristic equation det(A − λI) = 0. Once the eigenvalues are found, we substitute each one back into (A − λI)x = 0 and solve for the corresponding eigenvectors using Gaussian elimination or other methods.
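When exact (symbolic) answers are wanted, SymPy bundles both steps into one call; a sketch on a made-up matrix, assuming SymPy is installed:

```python
from sympy import Matrix

A = Matrix([[0, -1],
            [1,  0]])

# eigenvects() returns (eigenvalue, algebraic multiplicity, basis vectors)
# for each eigenvalue, with exact complex entries such as I and -I.
for lam, mult, vecs in A.eigenvects():
    print(lam, mult, vecs[0].T)
```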

## 4. What is the difference between real and complex eigenvectors?

A real eigenvector has real components, while a complex eigenvector has complex components. The process of solving for complex eigenvectors is the same as for real ones, except the arithmetic is carried out over the complex numbers.

## 5. Can a matrix have more than one eigenvector?

Yes, a matrix can have multiple eigenvectors, each corresponding to an eigenvalue (and every non-zero scalar multiple of an eigenvector is again an eigenvector, so eigenvectors are counted up to scaling). An n×n matrix has at most n linearly independent eigenvectors; if it has repeated eigenvalues, there may be fewer.
