Eigenvectors and orthogonal basis

In summary, the thread discusses a linear transformation for which a basis of eigenvectors has already been found. The eigenvectors are given, and the exercise asks to show that there is no orthogonal basis of eigenvectors for this transformation. The implications of orthogonal matrices, such as preserving length and numerical stability, are also discussed, and suggestions are given for showing that no orthogonal basis exists.
  • #1
0kelvin

Homework Statement


I have a linear transformation ##\mathbb{R}^3 \rightarrow \mathbb{R}^3##. I've already solved the part that asks for a basis of eigenvectors. The possible eigenvectors are ##(1,-3,0)##, ##(1,0,3)##, ##(\frac{1}{2}, \frac{1}{2}, 1)##. Now the exercise wants me to show that there is no orthogonal basis of eigenvectors for this particular linear transformation.

How do I show it?

The exercise doesn't ask this, but what's the implication of eigenvectors forming an orthogonal basis?

Homework Equations

The Attempt at a Solution


With the euclidean inner product I can clearly see that the eigenvectors are not orthogonal to each other. But I'm not sure if calculating many pairs of dot products is the way to show it.

I thought about Gram-Schmidt but doing that would make the vectors not be eigenvectors anymore.
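The dot-product check can be written out directly. A sketch in Python (the `dot` helper is just the Euclidean inner product spelled out); note that if the eigenvalues are distinct, each eigenvector is unique up to scaling, so showing that every pair of the listed vectors has a nonzero dot product does settle the question:

```python
# Pairwise dot products of the three eigenvectors from the problem statement.
def dot(u, v):
    """Euclidean inner product of two same-length vectors."""
    return sum(a * b for a, b in zip(u, v))

v1 = (1, -3, 0)
v2 = (1, 0, 3)
v3 = (0.5, 0.5, 1)

pairs = {"v1.v2": dot(v1, v2), "v1.v3": dot(v1, v3), "v2.v3": dot(v2, v3)}
print(pairs)  # every value is nonzero, so no pair is orthogonal
```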
 
  • #2
If you have got three eigenvectors, then either they are mutually orthogonal or they are not.
 
  • #3
0kelvin said:
Now the exercise wants me to show that there is no orthogonal basis of eigenvectors for this particular linear transformation.

How do I show it?

I suppose you could do a Schur decomposition, writing the matrix as ##\mathbf{UTU^*}##, and show that ##\mathbf{T}## has the eigenvalues along its diagonal but is only upper triangular (i.e. it is not both upper triangular and lower triangular -- i.e. not diagonal). Out of curiosity, what are the eigenvalues here?

More importantly:

0kelvin said:
The exercise doesn't ask this, but what's the implication of eigenvectors forming an orthogonal basis?

Orthogonal (and unitary, if complex) matrices are extremely pleasant to work with. From a geometric standpoint, they are length preserving. Consider some real-valued vector ##\mathbf{x}## and an orthogonal matrix ##\mathbf{U}##: ##||\mathbf{Ux}||_2^2 = \mathbf{x^TU^TUx} = \mathbf{x^TIx} = \mathbf{x^Tx} = ||\mathbf{x}||_2^2##. Having mutually orthonormal eigenvectors is immensely useful in manipulating things like quadratic forms -- e.g. maximizing or minimizing ##\mathbf{x^TAx}##, which comes up all the time (e.g. the Hessian matrix of second derivatives in calculus).
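A quick numeric sanity check of the length-preservation property, using a 2x2 rotation matrix (a standard example of an orthogonal matrix; the angle here is arbitrary):

```python
import math

# U is a rotation by theta, hence orthogonal: U^T U = I.
theta = 0.7  # arbitrary angle
U = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]

def matvec(M, x):
    """Multiply matrix M by vector x."""
    return [sum(M[i][j] * x[j] for j in range(len(x))) for i in range(len(M))]

def norm(x):
    """Euclidean (2-)norm of a vector."""
    return math.sqrt(sum(c * c for c in x))

x = [3.0, 4.0]
print(norm(x), norm(matvec(U, x)))  # both equal 5.0 (up to rounding)
```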

On top of all this, orthogonal matrices are very nice for preserving numerical stability if you're doing serious computational work.
 

1. What is an eigenvector?

An eigenvector of a square matrix is a nonzero vector that, when multiplied by the matrix, results in a scalar multiple of itself. In other words, it is a vector whose direction is unchanged (or exactly reversed, for a negative eigenvalue) by the transformation.
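A minimal illustration of the definition, with a made-up 2x2 matrix: for the ##A## below, ##(1, 1)## is an eigenvector with eigenvalue 3, since applying ##A## just scales it by 3.

```python
# A maps (1, 1) to (3, 3) = 3 * (1, 1): scaled, direction unchanged.
A = [[2, 1],
     [1, 2]]
v = (1, 1)

Av = tuple(sum(A[i][j] * v[j] for j in range(2)) for i in range(2))
print(Av)  # (3, 3)
```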

2. What is an orthogonal basis?

An orthogonal basis is a set of basis vectors that are mutually perpendicular: the dot product of any two distinct vectors in the set is zero. Orthogonal bases are useful in linear algebra because they simplify calculations and allow for easier visualization of vector spaces.

3. How are eigenvectors and eigenvalues related?

Eigenvectors and eigenvalues come in pairs: an eigenvector gives a direction in which the matrix transformation acts purely by stretching or compressing, and the corresponding eigenvalue gives the factor of stretching or compression in that direction.

4. Can a matrix have more than one orthogonal basis?

Yes, a matrix can have multiple orthogonal bases: when an orthogonal basis of eigenvectors exists, it is not unique, since rescaling the vectors, or rotating within the eigenspace of a repeated eigenvalue, produces another one. However, not all matrices admit an orthogonal basis of eigenvectors; for real matrices, the spectral theorem says that the symmetric matrices are exactly the ones that do.

5. How are eigenvectors and orthogonal bases used in data analysis?

Eigenvectors and orthogonal bases are used in data analysis to simplify and reduce the dimensionality of data sets. By finding the eigenvalues and eigenvectors of a data matrix, we can identify the most important directions in which the data varies, and use those directions as a new basis for representing the data. This can help with visualization, clustering, and other data analysis techniques.
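The dimensionality-reduction idea can be sketched for the 2x2 case, where the eigenvalues of a symmetric matrix have a closed form. The covariance matrix below is an illustrative made-up example, not from any real data set; the dominant eigenvector is the principal direction of variation:

```python
import math

# Illustrative (made-up) 2x2 covariance matrix [[a, b], [b, c]].
C = [[2.0, 1.0],
     [1.0, 2.0]]
a, b, c = C[0][0], C[0][1], C[1][1]

# Closed-form eigenvalues of a symmetric 2x2 matrix; the larger one
# corresponds to the direction of greatest variance.
disc = math.sqrt((a - c) ** 2 + 4 * b * b)
lam_max = (a + c + disc) / 2

# An eigenvector for lam_max (valid when b != 0), normalized to unit length.
v = (b, lam_max - a)
n = math.hypot(*v)
principal = (v[0] / n, v[1] / n)
print(lam_max, principal)  # 3.0 and the direction (1, 1) / sqrt(2)
```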
