Find the eigenvalues and eigenvectors of a 3x3 matrix


Discussion Overview

The discussion revolves around finding the eigenvalues and eigenvectors of a 3x3 matrix based on given equations. Participants explore the implications of the equations, the structure of the matrix, and the properties of eigenvalues and eigenvectors.

Discussion Character

  • Exploratory, Technical explanation, Debate/contested, Mathematical reasoning

Main Points Raised

  • Some participants suggest starting with the relation Av=λv or the characteristic equation det(A-λI)=0 to find the eigenvalues and eigenvectors.
  • There is a proposal to derive the matrix A explicitly from the given equations, though some participants note this may not be straightforward.
  • One participant initially states that the eigenspace corresponding to the eigenvalue λ=3 has dimension two, spanned by the vectors (1,-1,1) and (2,-1,0); this claim is corrected later in the thread.
  • Another participant asserts that all linear combinations of the identified eigenvectors are also eigenvectors for their respective eigenvalues.
  • Questions are raised regarding the invertibility and diagonalizability of the matrix A, with discussion of the implications of the eigenvalues and the linear independence of the eigenvectors.
  • One participant observes that a specific subtraction leads to the conclusion that λ=0, 3, and 6 could be eigenvalues.
  • There are corrections and clarifications regarding the interpretation of the equations and the nature of the eigenvectors.

Areas of Agreement / Disagreement

Participants generally agree on the eigenvalues and eigenvectors that are identified, but there are competing views on the implications of these findings, particularly regarding the structure of the matrix A, its invertibility, and its diagonalizability. The discussion remains unresolved on some aspects, particularly the explicit form of A and its properties.

Contextual Notes

Limitations include the lack of explicit definition for matrix A and the dependence on the interpretations of the given equations. The discussion also reflects uncertainty regarding the linear independence of the eigenvectors and the implications for the matrix's properties.

Michael_0039
Assume a 3x3 matrix A with the following:

A [ 1 2 1 ]^T = 6 [ 1 2 1 ]^T
A [ 1 -1 1 ]^T = 3 [ 1 -1 1 ]^T
A [ 2 -1 0]^T = 3 [ 1 -1 1]^T

Find the eigenvalues and eigenvectors:

I have in mind to start with Av=λv or det(A-λI)=0...

Also, the first 2 equations seem to have the form Av=λv:
So maybe u1 = [ 1 2 1 ] with λ=6 and u2 = [ 1 -1 1 ] with λ=3... but what about the 3rd?
 
Why don't you get A explicitly first?
 
anuttarasammyak said:
Why don't you get A explicitly first?
Matrix A is not given, only those equations.
 
Those equations are equivalent to a product relation of 3x3 matrices, say
##AB=C##
So explicitly
##A=CB^{-1}##
...But I see that getting an explicit A is not a straightforward way.
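
To make that route concrete, here is a minimal numpy sketch (a quick illustration, not taken from the thread; B and C are named as in the post above, and the eigen-decomposition call is only a cross-check):

```python
import numpy as np

# Columns of B are the given input vectors, columns of C the corresponding outputs.
B = np.array([[1,  1,  2],
              [2, -1, -1],
              [1,  1,  0]], dtype=float)          # (1,2,1), (1,-1,1), (2,-1,0)
C = np.array([[6,  3,  3],
              [12, -3, -3],
              [6,  3,  3]], dtype=float)          # 6(1,2,1), 3(1,-1,1), 3(1,-1,1)

# The three relations together say A B = C, so A = C B^{-1} (B is invertible, det B = 6).
A = C @ np.linalg.inv(B)

eigenvalues, eigenvectors = np.linalg.eig(A)
print(np.round(eigenvalues, 10))   # the three eigenvalues of A
```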
 
Michael_0039 said:
Assume a 3x3 matrix A with the following:

A [ 1 2 1 ]^T = 6 [ 1 2 1 ]^T
A [ 1 -1 1 ]^T = 3 [ 1 -1 1 ]^T
A [ 2 -1 0]^T = 3 [ 1 -1 1]^T

Find the eigenvalues and eigenvectors:

I have in mind to start with Av=λv or det(A-λI)=0...

Also, the first 2 equations seem to have the form Av=λv:
So maybe u1 = [ 1 2 1 ] with λ=6 and u2 = [ 1 -1 1 ] with λ=3... but what about the 3rd?
Correct.

The second and third equations mean that the eigenspace to the eigenvalue ##3## has dimension two. It is spanned by ##(1,-1,1)^\tau## and ##(2,-1,0)^\tau.##
 
So the answer will be:
λ1=6 with u1=[ 1 2 1 ]^T
λ2=3 with u2=[ 1 -1 1 ]^T and u3=[ 2 -1 0 ]^T

(?)
 
Michael_0039 said:
So the answer will be:
λ1=6 with u1=[ 1 2 1 ]^T
λ2=3 with u2=[ 1 -1 1 ]^T and u3=[ 2 -1 0 ]^T

(?)
Yes.

I mean, almost.
All linear combinations of ##(1,2,1)## (i.e. all nonzero multiples) are eigenvectors to the eigenvalue ##6,## too.
All nonzero linear combinations of ##(1,-1,1)## and ##(2,-1,0)## are eigenvectors to the eigenvalue ##3,## too.
 
@Michael_0039 note my edit. You've been too fast.

If ##v_1,\ldots,v_n## are all eigenvectors to the same eigenvalue ##\lambda ## then
$$
A\left(\sum_{k=1}^n \alpha_k v_k\right)=\sum_{k=1}^n\alpha_k A(v_k)=\sum_{k=1}^n\alpha_k \lambda v_k=\lambda \cdot \left(\sum_{k=1}^n\alpha_k v_k\right)
$$
so all nonzero linear combinations are eigenvectors to the eigenvalue ##\lambda,## too.
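
A quick numerical illustration of that closure property, using a toy matrix of my own choosing (not the ##A## of this problem):

```python
import numpy as np

# Toy example: a matrix whose eigenvalue 3 has a two-dimensional eigenspace.
M = np.diag([3.0, 3.0, 6.0])
v1 = np.array([1.0, 0.0, 0.0])      # eigenvector of M for eigenvalue 3
v2 = np.array([0.0, 1.0, 0.0])      # another eigenvector for eigenvalue 3
w = 2.5 * v1 - 4.0 * v2             # an arbitrary linear combination

print(np.allclose(M @ w, 3.0 * w))  # True: w is again an eigenvector for 3
```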
 
So for the 2nd question: Is A invertible and/or diagonalizable?
The answer: A^{-1} exists if λ1, λ2 ≠ 0, and A is non-diagonalizable because λ2 appears only once but has 2 eigenvectors (?)
 
Michael_0039 said:
So for the 2nd question: Is A invertible ...
What do you think? Hint: not invertible implies not injective. Now, what does not injective mean?

Michael_0039 said:
... and/or diagonalizable
Are those three vectors linearly independent? If so, they form a basis. What does ##A## look like with respect to that basis?
 
Michael_0039 said:
A [ 1 -1 1 ]^T = 3 [ 1 -1 1 ]^T
A [ 2 -1 0]^T = 3 [ 1 -1 1]^T
By subtraction I observe
##A[-1,0,1]^T=[0,0,0]^T=0\,[-1,0,1]^T##
which suggests ##\lambda=0,3,6## in all.
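
Written out, the subtraction uses the linearity of ##A##:
$$
A\begin{pmatrix}-1\\0\\1\end{pmatrix}
= A\begin{pmatrix}1\\-1\\1\end{pmatrix}-A\begin{pmatrix}2\\-1\\0\end{pmatrix}
= 3\begin{pmatrix}1\\-1\\1\end{pmatrix}-3\begin{pmatrix}1\\-1\\1\end{pmatrix}
= \begin{pmatrix}0\\0\\0\end{pmatrix}
$$
and since a ##3\times 3## matrix has at most three eigenvalues, the distinct values ##0, 3, 6## are all of them.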
 
fresh_42 said:
Correct.

The second and third equations mean that the eigenspace to the eigenvalue ##3## has dimension two. It is spanned by ##(1,-1,1)^\tau## and ##(2,-1,0)^\tau.##

The second and third equations tell us that ##(2, -1, 0)^T - (1, -1, 1)^T = (1,0,-1)^T \in \ker A.##
 
pasmith said:
The second and third equations tell us that ##(2, -1, 0)^T - (1, -1, 1)^T = (1,0,-1)^T \in \ker A.##
Thank you. My mistake! I mistook it for ##A(2,-1,0)^T=3\,(2,-1,0)^T.##

@Michael_0039 Forget my posts. I was thinking about a different situation.
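
To close the loop on the two follow-up questions, here is a small check of my own (same explicit ##A=CB^{-1}## as in the earlier sketch, so treat it as an illustration rather than part of the thread): ##(-1,0,1)^T## lies in ##\ker A##, so ##A## is not invertible, while the eigenvectors for ##6, 3, 0## are linearly independent, so ##A## is diagonalizable.

```python
import numpy as np

B = np.array([[1, 1, 2], [2, -1, -1], [1, 1, 0]], dtype=float)
C = np.array([[6, 3, 3], [12, -3, -3], [6, 3, 3]], dtype=float)
A = C @ np.linalg.inv(B)                      # the matrix defined by the three relations

# (-1,0,1) is in ker A, so 0 is an eigenvalue and A has no inverse.
print(np.allclose(A @ np.array([-1.0, 0.0, 1.0]), 0.0))   # True
print(np.isclose(np.linalg.det(A), 0.0))                  # True

# Eigenvectors for 6, 3, 0 as the columns of P; they form a basis, so P^{-1} A P is diagonal.
P = np.array([[1,  1, -1],
              [2, -1,  0],
              [1,  1,  1]], dtype=float)
print(np.round(np.linalg.inv(P) @ A @ P, 10))              # diag(6, 3, 0)
```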
 
