

- Thread starter WWGD

In summary, the conversation discusses a linear map from ##\mathbb{R}^n## to ##\mathbb{R}^n## and whether a unique eigenvalue ##\lambda = 1## necessarily means that the associated matrix M is a rotation matrix about the eigenspace. The answer is no, as the example given shows. The thread starter is also trying to show that applying the row operation ##kr_i + r_j## preserves the solution subspace of a system of linear equations, and is seeking alternative ways to prove this. The conversation also mentions the fundamental theorem of linear algebra and its relevance to this problem.

- #1


Say ##T:\mathbb{R}^n \to \mathbb{R}^n## is a linear map, and that the associated matrix ##M## has a unique eigenvalue ##\lambda = 1##. Is ##M## necessarily a rotation matrix about the eigenspace?

Thanks.


- #2

morphism

Science Advisor

Homework Helper


No: consider, e.g., the shear ##\left(\begin{smallmatrix} 1 & 1 \\ 0 & 1\end{smallmatrix}\right)##. Its only eigenvalue is 1, yet it is not a rotation.
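This counterexample can be verified numerically. Below is a minimal pure-Python sketch (the helper names `eigenvalues_2x2` and `is_orthogonal` are illustrative, not from the thread): the shear has 1 as its only eigenvalue, but it fails the orthogonality condition ##M^T M = I## that every real rotation matrix must satisfy.

```python
def eigenvalues_2x2(m):
    """Real eigenvalues of a 2x2 matrix via the characteristic polynomial."""
    (a, b), (c, d) = m
    tr, det = a + d, a * d - b * c
    disc = tr * tr - 4 * det
    if disc < 0:
        return []  # complex-conjugate pair, no real eigenvalues
    r = disc ** 0.5
    return sorted({(tr - r) / 2, (tr + r) / 2})

def is_orthogonal(m):
    """Check M^T M = I, a necessary condition for a real rotation matrix."""
    (a, b), (c, d) = m
    mtm = [[a * a + c * c, a * b + c * d],
           [a * b + c * d, b * b + d * d]]
    ident = [[1, 0], [0, 1]]
    return all(abs(mtm[i][j] - ident[i][j]) < 1e-12
               for i in range(2) for j in range(2))

shear = [[1, 1], [0, 1]]
print(eigenvalues_2x2(shear))  # [1.0] -- 1 is the only eigenvalue
print(is_orthogonal(shear))    # False -- so the shear is not a rotation
```

By contrast, a genuine rotation such as ##\left(\begin{smallmatrix} 0 & -1 \\ 1 & 0\end{smallmatrix}\right)## passes the orthogonality check.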

- #3


I'm trying to show that adding a multiple of one row to another row has the effect of rotating one of the k-planes about the solution subspace, since this is the only way I can conceive that the operation ##kr_i + r_j## preserves the solution to the system. Can you see how else I can show this?

- #4

morphism


Solution subspace of what?

- #5


The fundamental row operations, exchanging rows and adding a multiple of one row to another, preserve the solutions to the system. If we look at the solution set S of the system geometrically, it is a subspace, possibly trivial. I'm trying to show that the operation of adding a multiple of row i to row j has the effect of rotating the n-planes in the system of equations about the solution space S. I think the affine case, for non-homogeneous systems, is similar. I've been using the fundamental theorem of linear algebra that Bacle had mentioned in a similar problem, but I still can't prove this.
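Whatever the geometric picture turns out to be, the algebraic fact that ##kr_i + r_j## preserves the solution set is easy to check numerically. A minimal pure-Python sketch (the helper names `apply_row_op` and `satisfies` are illustrative, not from the thread): any ##x## solving the original system still solves it after row ##j## is replaced by row ##j## + ##k\,\cdot## row ##i##, applied to both the coefficient matrix and the right-hand side.

```python
def apply_row_op(A, b, i, j, k):
    """Replace row j by row j + k*row i, in both A and b (the op k*r_i + r_j)."""
    A2 = [row[:] for row in A]
    b2 = b[:]
    A2[j] = [A2[j][c] + k * A2[i][c] for c in range(len(A2[j]))]
    b2[j] = b2[j] + k * b2[i]
    return A2, b2

def satisfies(A, b, x, tol=1e-9):
    """Check that x solves the system A x = b, row by row."""
    return all(abs(sum(A[r][c] * x[c] for c in range(len(x))) - b[r]) < tol
               for r in range(len(A)))

A = [[1.0, 2.0], [3.0, 4.0]]
b = [5.0, 6.0]
x = [-4.0, 4.5]  # solves the system: -4 + 2*4.5 = 5 and 3*(-4) + 4*4.5 = 6
A2, b2 = apply_row_op(A, b, 0, 1, 7.0)
print(satisfies(A, b, x), satisfies(A2, b2, x))  # True True
```

The check works because row ##j## of the new system is a linear combination of equations that ##x## already satisfies, so the combined equation holds automatically; the operation is also invertible (apply ##-kr_i + r_j##), so no solutions are gained either.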

A matrix transformation is a mathematical operation in which a given matrix multiplies vectors (or another matrix), producing a transformed result. This allows points of a vector space to be manipulated and repositioned, for example rotated, scaled, or sheared.

An eigenvalue is a scalar that measures the scaling a matrix transformation applies along a particular direction: it is the factor by which the corresponding eigenvectors are stretched or compressed (and flipped, if the eigenvalue is negative).

Eigenvectors and eigenvalues are related by the equation ##Av = \lambda v##: an eigenvector is a nonzero vector that the matrix merely scales, and the corresponding eigenvalue is the scale factor. Together they describe the directions and magnitudes of the transformation.
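The defining relation ##Av = \lambda v## can be checked directly. A small pure-Python example, using an assumed symmetric matrix whose eigenvalues happen to be 3 and 1:

```python
# An eigenvector v of A satisfies A v = lam * v: A only rescales v by lam.
A = [[2.0, 1.0], [1.0, 2.0]]   # symmetric 2x2 matrix, eigenvalues 3 and 1
v = [1.0, 1.0]                 # eigenvector for the eigenvalue lam = 3
lam = 3.0
Av = [sum(A[r][c] * v[c] for c in range(2)) for r in range(2)]
print(Av)                      # [3.0, 3.0], i.e. exactly lam * v
```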

Eigendecomposition is important in describing a matrix because it factors a diagonalizable matrix into its eigenvalues and eigenvectors. This gives a clearer picture of the transformation and how it acts on the vector space.

Yes, a matrix can have complex eigenvalues. For real matrices this can happen only when the matrix is not symmetric (real symmetric matrices have only real eigenvalues), and it is common in applications such as quantum mechanics and signal processing.
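A rotation matrix is the standard example of a real, non-symmetric matrix with complex eigenvalues. The sketch below (pure Python; the helper name `eig_2x2` is illustrative) computes the conjugate pair ##\pm i## for a 90-degree rotation via the quadratic formula applied to the characteristic polynomial.

```python
import cmath

def eig_2x2(m):
    """Eigenvalues (possibly complex) of a 2x2 real matrix."""
    (a, b), (c, d) = m
    tr, det = a + d, a * d - b * c      # trace and determinant
    r = cmath.sqrt(tr * tr - 4 * det)   # complex sqrt handles disc < 0
    return ((tr - r) / 2, (tr + r) / 2)

rot90 = [[0.0, -1.0], [1.0, 0.0]]  # rotation by 90 degrees; not symmetric
print(eig_2x2(rot90))              # a conjugate pair -i, +i on the unit circle
```

Note that the eigenvalues lie on the unit circle, consistent with a rotation preserving lengths.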

