Linear Algebra: Eigenvectors and Orthonormal Bases

No, not necessarily. A counter-example is provided by the vectors (1, 0) and (1, 1) in R^2, which are linearly independent but not orthogonal. In summary, the conversation discusses the properties of the eigenvectors of a symmetric matrix, including their linear independence and their potential to form an orthonormal basis for the vector space R^n. The interlocutors also consider the relationship between linear independence and orthogonality, and provide a counter-example to show that the two properties are not equivalent.
  • #1
Niles

Homework Statement


Consider a symmetric (and hence diagonalizable) n x n matrix A. The matrix A has n linearly independent eigenvectors, and hence they span the whole space R^n.

Since the matrix A is symmetric, there exists an orthonormal basis consisting of eigenvectors.

My questions are:

1) Will this orthonormal basis of eigenvectors also span the same space R^n?

2) If two vectors are linearly independent, will they also be orthogonal?
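
A minimal numpy sketch of the setup (the 3 x 3 symmetric matrix below is an arbitrary example, not taken from the thread): for a symmetric matrix, np.linalg.eigh returns an orthonormal set of eigenvectors, and n of them have full rank, so they span R^n.

```python
import numpy as np

# An arbitrary symmetric 3 x 3 example matrix (for illustration only).
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# eigh is intended for symmetric/Hermitian matrices; the columns of Q
# are an orthonormal set of eigenvectors of A.
eigvals, Q = np.linalg.eigh(A)

print(np.allclose(Q.T @ Q, np.eye(3)))  # True: the eigenvectors are orthonormal
print(np.linalg.matrix_rank(Q))         # 3: n independent vectors, so they span R^3
```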
 
  • #2
For 1): When does a set of vectors span the vector space? Do the eigenvectors satisfy these conditions? [Actually, you already gave the answer yourself... do you see where?]

For 2): Consider (1, 0) and (1, 1) in R^2.
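
A quick numerical check of this hint, assuming numpy: the two vectors are linearly independent (nonzero determinant) but their dot product is nonzero, so they are not orthogonal.

```python
import numpy as np

u = np.array([1.0, 0.0])
v = np.array([1.0, 1.0])

# Nonzero determinant of the matrix with u and v as columns -> linearly independent.
print(np.linalg.det(np.column_stack([u, v])))  # 1.0

# Nonzero dot product -> u and v are not orthogonal.
print(np.dot(u, v))                            # 1.0
```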
 
  • #3
1) They are linearly independent, so yes - I guess I answered my own question there!

2) Great, a counter-example, so no. Thanks!
 
  • #4
1) Yep, it follows from the fact that "the eigenvectors are linearly independent" and that there are n of them. That is, they form a basis, as you said in the question. And of course a basis always spans the space (even a non-orthogonal and/or not normalized one).
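
To illustrate that last point with a sketch (reusing the vectors from post #2, purely as an example): a basis need not be orthogonal or normalized to span the space, since n linearly independent columns give an invertible matrix and therefore coordinates for every vector.

```python
import numpy as np

# (1, 0) and (1, 1): linearly independent, not orthogonal, (1, 1) not normalized.
P = np.column_stack([[1.0, 0.0], [1.0, 1.0]])

# Because P is invertible, every x in R^2 can be written as P @ c,
# i.e. as a linear combination of the two basis vectors.
x = np.array([3.0, -1.0])
c = np.linalg.solve(P, x)
print(np.allclose(P @ c, x))  # True: the non-orthogonal basis still spans R^2
```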
 

1. What is a vector space?

A vector space is a mathematical structure that consists of a set of vectors and two operations, vector addition and scalar multiplication, that satisfy certain properties. These properties include closure under both operations, associativity and commutativity of addition, the existence of an additive identity and additive inverses, and compatibility of scalar multiplication with addition, among others.
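
As a concrete illustration (a numpy sketch, with R^3 as the example space): the usual coordinate vectors satisfy these axioms under componentwise addition and scalar multiplication.

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([-1.0, 0.5, 4.0])
a = 2.0

print(np.allclose(u + v, v + u))             # commutativity of addition
print(np.allclose(a * (u + v), a*u + a*v))   # distributivity of scalar multiplication
print(np.allclose(u + (-u), np.zeros(3)))    # additive inverse gives the zero vector
```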

2. What is the difference between a vector and a scalar?

A vector is a quantity that has both magnitude and direction, while a scalar is a quantity that has only magnitude. Vectors are often pictured as arrows and can be added, subtracted, and multiplied by scalars; scalars are single numbers that combine through ordinary arithmetic and serve to scale vectors.
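
A small sketch of the distinction, assuming numpy (the vector (3, 4) is an arbitrary example): a vector has a magnitude (its norm) and a direction (its unit vector), while a scalar simply rescales it.

```python
import numpy as np

v = np.array([3.0, 4.0])       # a vector in R^2
s = 2.0                        # a scalar

print(np.linalg.norm(v))       # magnitude: 5.0
print(v / np.linalg.norm(v))   # direction: the unit vector [0.6, 0.8]
print(s * v)                   # scalar multiplication: [6.0, 8.0]
```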

3. How is linear independence related to vector spaces?

In linear algebra, a set of vectors is linearly independent if no vector in the set can be written as a linear combination of the other vectors. In vector spaces, linear independence is an important property as it allows for the unique representation of vectors and helps in the understanding of the structure of the space.
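
One common way to test this numerically (a sketch, assuming numpy): put the vectors as the columns of a matrix; they are linearly independent exactly when the rank equals the number of vectors.

```python
import numpy as np

independent = np.column_stack([[1.0, 0.0, 0.0],
                               [1.0, 1.0, 0.0]])
dependent   = np.column_stack([[1.0, 2.0, 3.0],
                               [2.0, 4.0, 6.0]])  # second column = 2 * first

print(np.linalg.matrix_rank(independent))  # 2 -> linearly independent
print(np.linalg.matrix_rank(dependent))    # 1 -> linearly dependent
```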

4. What is a basis in a vector space?

A basis is a set of linearly independent vectors that span the entire vector space. This means that any vector in the space can be written as a linear combination of the basis vectors. The number of basis vectors is known as the dimension of the vector space.
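
For example (a sketch, assuming numpy, with an arbitrary non-orthogonal basis of R^2): the coordinates of any vector in a basis are found by solving a linear system, and the number of basis vectors gives the dimension.

```python
import numpy as np

# The columns of B, (2, 1) and (1, 1), form a (non-orthogonal) basis of R^2.
B = np.array([[2.0, 1.0],
              [1.0, 1.0]])

x = np.array([3.0, 2.0])
c = np.linalg.solve(B, x)      # unique coordinates of x in this basis
print(c)                       # [1. 1.]  -> x = 1*(2, 1) + 1*(1, 1)
print(B.shape[1])              # 2: the dimension of R^2
```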

5. How is linear algebra used in real-world applications?

Linear algebra has various applications in fields such as engineering, physics, economics, and computer science. It is used to model and solve problems involving linear systems, optimization, data analysis, and more. Some common applications include image processing, machine learning, and financial analysis.
