Describing Matrix/Transformation by Eigens.

  • Thread starter WWGD
In summary, the conversation discusses a linear map from ##\mathbb{R}^n## to ##\mathbb{R}^n## and whether a unique eigenvalue ##\lambda = 1## necessarily means that the associated matrix M is a rotation matrix about the eigenspace. The answer is no, as shown by the example given. The original poster is trying to show that the row operation ##kr_i + r_j## (adding k times row i to row j) preserves the solution subspace of a system of linear equations, and is seeking alternative ways to prove this. The conversation also mentions the fundamental theorem of linear algebra and its relevance to this problem.
  • #1
WWGD
Science Advisor
Gold Member
Hi, All:

Say ##T: \mathbb{R}^n \to \mathbb{R}^n## is a linear map, and that the associated matrix M has a unique eigenvalue ##\lambda = 1##. Is M necessarily a rotation matrix about the eigenspace?

Thanks.
 
  • #2
No - e.g. consider ##\left(\begin{smallmatrix} 1 & 1 \\ 0 & 1\end{smallmatrix}\right)##.
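The counterexample above can be checked numerically. A minimal sketch (using NumPy, my choice, not from the thread): the shear matrix has 1 as its only eigenvalue, yet it changes vector lengths, so it cannot be a rotation.

```python
import numpy as np

# The shear matrix from post #2: its only eigenvalue is 1,
# yet it is not a rotation, because it does not preserve lengths.
M = np.array([[1.0, 1.0],
              [0.0, 1.0]])

eigenvalues = np.linalg.eigvals(M)
print(eigenvalues)            # both eigenvalues equal 1

v = np.array([0.0, 1.0])      # a unit vector
print(np.linalg.norm(M @ v))  # its image has length sqrt(2), not 1
```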
 
  • #3
I see. I am trying to show that applying ##kr_i + r_j##, i.e., adding k times row i to row j, has the effect of rotating one of the hyperplanes about the solution subspace, since this is the only way I can conceive that the operation ##kr_i + r_j## preserves the solutions to the system. Can you see how else I can show this?
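The preservation claim itself can be verified numerically. A minimal sketch, assuming the system in question is a homogeneous one, ##Ax = 0## (the example matrix and the known solution vector are mine, not from the thread):

```python
import numpy as np

# Check that the row operation r_j -> r_j + k * r_i does not change
# the solution set of the homogeneous system Ax = 0.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
x = np.array([1.0, -2.0, 1.0])   # a known solution of Ax = 0
assert np.allclose(A @ x, 0)

k = 7.0
B = A.copy()
B[1] += k * A[0]                  # add k * (row 0) to row 1

# x still solves the new system, and the rank is unchanged,
# so the whole solution subspace is the same.
assert np.allclose(B @ x, 0)
assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(B)
print("solution subspace preserved")
```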
 
  • #4
Solution subspace of what?
 
  • #5
Well, I have a soluble, i.e., non-contradictory, homogeneous system of linear equations. The fundamental row operations--exchanging two rows, and adding a multiple of one row to another--preserve the solutions to the system. If we look at the solution set S of the system geometrically, it is a subspace, possibly trivial. I'm trying to show that the operation of adding a multiple of row i to row j has the effect of rotating the hyperplanes of the system about the solution space S.

I think the affine case, for non-homogeneous systems, is similar. I've been using the fundamental theorem of linear algebra that Bacle had mentioned in a similar problem, but I still can't prove this.
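The affine case mentioned above can be checked the same way. A sketch, assuming the row operation is applied to the full augmented matrix ##[A \mid b]##, i.e., to A and b together (the example system is illustrative, not from the thread):

```python
import numpy as np

# Check that a row operation on the augmented matrix [A | b]
# preserves the solution set of the non-homogeneous system Ax = b.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
b = np.array([5.0, 6.0])
x = np.linalg.solve(A, b)         # the unique solution of Ax = b

k = -2.0
A2, b2 = A.copy(), b.copy()
A2[0] += k * A[1]                 # the same operation applied to
b2[0] += k * b[1]                 # both A and b

assert np.allclose(A2 @ x, b2)    # x still solves the modified system
print("affine solution set preserved")
```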
 

FAQ: Describing Matrix/Transformation by Eigens.

1. What is a matrix transformation?

A matrix transformation is the map that sends each vector in a vector space to its product with a fixed matrix. Multiplying points by the matrix in this way allows them to be manipulated and repositioned within the vector space.

2. What is an eigenvalue?

An eigenvalue of a matrix M is a scalar ##\lambda## for which some nonzero vector v satisfies ##Mv = \lambda v##. It is the factor by which the corresponding eigenvectors are stretched or compressed by the transformation.
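As a small illustration (my example, not part of the original FAQ): for a diagonal matrix, the eigenvalues are exactly the stretch factors along the axes.

```python
import numpy as np

# A diagonal matrix stretches the x-axis by 3 and compresses
# the y-axis by 1/2; those factors are its eigenvalues.
M = np.array([[3.0, 0.0],
              [0.0, 0.5]])
vals, vecs = np.linalg.eig(M)
print(sorted(vals))               # [0.5, 3.0]

i = int(np.argmax(vals.real))     # index of the eigenvalue 3
v = vecs[:, i]
assert np.allclose(M @ v, 3.0 * v)   # M stretches v by a factor of 3
```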

3. How are eigenvectors and eigenvalues related?

Eigenvectors and eigenvalues are related in that eigenvectors are the nonzero vectors whose direction is preserved by the transformation, and each one is scaled by its corresponding eigenvalue.

4. What is the importance of eigendecomposition in describing a matrix?

Eigendecomposition is important in describing a matrix because it factors a diagonalizable matrix in terms of its eigenvalues and eigenvectors, writing ##M = PDP^{-1}## with D diagonal. This provides a clearer understanding of the transformation and how it affects the vector space.

5. Can a matrix have complex eigenvalues?

Yes, a real matrix can have complex eigenvalues; this happens whenever its characteristic polynomial has non-real roots. Real symmetric matrices always have real eigenvalues, but non-symmetric matrices--rotation matrices, for example--often do not. Complex eigenvalues are common in applications such as quantum mechanics and signal processing.
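For instance (an illustrative example, not from the original FAQ), a real 90-degree rotation matrix leaves no real direction fixed, so its eigenvalues are the complex-conjugate pair ±i:

```python
import numpy as np

# A 90-degree rotation in the plane has no real eigenvectors,
# so its eigenvalues come out as the complex pair +i, -i.
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
print(np.linalg.eigvals(R))   # a complex-conjugate pair, approximately +/- i
```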
