Bases, operators and eigenvectors

In summary, a set of independent vectors in a vector space forms a basis regardless of whether the vectors are orthogonal, and for any such basis we can construct a linear operator that has those vectors as its eigenvectors, for example by assigning a distinct eigenvalue to each of them.
  • #1
fog37
Hello,

In the case of a 2D vector space, every vector in the space can be expressed as a linear combination of two independent vectors which together form a basis. There are infinitely many possible and valid bases, each containing two independent vectors (not necessarily orthogonal or orthonormal), that we can use. Likewise, vectors in a 3D vector space can be expressed in any one of the infinitely many possible bases of 3 independent vectors, and so on...

When talking about linear operators and their eigenvectors, we learn that the set of eigenvectors of an operator (each eigenvector with its own eigenvalue) forms a basis, since the eigenvectors are independent vectors.
That means that the basis formed by the eigenvectors of a certain linear operator is just one of those infinitely many possible bases, correct? In 2D, linear operators must therefore have only two eigenvectors (forming a 2D basis), while in 3D linear operators can only have 3 eigenvectors, and so on... is that true?

If we considered a generic basis for a 2D vector space, would the vectors in that basis necessarily be the two eigenvectors of some linear operator? Or is that not necessarily true?

Thanks,
Fog37
 
  • #2
fog37 said:
In 2D, linear operators must therefore have only two eigenvectors (forming a 2D basis), while in 3D linear operators can only have 3 eigenvectors, and so on... is that true?
Counter-example: ##\vec v\rightarrow\ 3\vec v##
(you want different eigenvalues)

even then: if ##\vec a## is an eigenvector, then any multiple of ##\vec a## is also an eigenvector
(you want independent eigenvectors)

fog37 said:
would the vectors in that basis necessarily be the two eigenvectors
You can always construct such an operator: assign (different) eigenvalues to these eigenvectors and there you are!
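For instance, here is a minimal numpy sketch of that construction (the vectors and eigenvalues are arbitrary illustrative choices, and the basis is deliberately not orthogonal):

Python:
import numpy as np

# Two independent, non-orthogonal basis vectors (arbitrary choices)
a = np.array([1.0, 0.0])
b = np.array([1.0, 1.0])

# Put them as the columns of P and assign distinct eigenvalues
P = np.column_stack([a, b])
D = np.diag([2.0, 5.0])

# The operator A = P D P^(-1) has a and b as eigenvectors
A = P @ D @ np.linalg.inv(P)

print(np.allclose(A @ a, 2.0 * a))   # True: a is an eigenvector with eigenvalue 2
print(np.allclose(A @ b, 5.0 * b))   # True: b is an eigenvector with eigenvalue 5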
 
  • #3
Be aware that a linear operator may not have any eigenvectors.
 
  • #4
Check out
$$
\begin{bmatrix}1&0\\0&1 \end{bmatrix}\; , \;\begin{bmatrix}1&0\\0&-1 \end{bmatrix}\; , \;\begin{bmatrix}1&1\\0&1 \end{bmatrix}\; , \;\begin{bmatrix}0&-1\\1&0 \end{bmatrix}\; , \;\begin{bmatrix}1&1\\0&0 \end{bmatrix}
$$
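Each of these illustrates a different eigenvector situation. A quick numerical look (a hedged numpy sketch; the labels are my own):

Python:
import numpy as np

mats = {
    "identity":        np.array([[1., 0.], [0., 1.]]),
    "diag(1, -1)":     np.array([[1., 0.], [0., -1.]]),
    "shear":           np.array([[1., 1.], [0., 1.]]),
    "90-deg rotation": np.array([[0., -1.], [1., 0.]]),
    "last one":        np.array([[1., 1.], [0., 0.]]),
}

for name, M in mats.items():
    vals, vecs = np.linalg.eig(M)
    print(name, "-> eigenvalues:", np.round(vals, 3))

# identity:        eigenvalue 1 (twice); every nonzero vector is an eigenvector
# diag(1, -1):     two distinct eigenvalues; the eigenvectors form a basis
# shear:           eigenvalue 1 (twice) but only one independent eigenvector
# 90-deg rotation: no real eigenvalues (numpy reports +i and -i), hence no real eigenvectors
# last one:        eigenvalues 1 and 0; eigenvectors (1,0) and (1,-1) are independent but not orthogonal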
 
  • #5
fog37 said:
When talking about linear operators and their eigenvectors, we learn that the set of eigenvectors of an operator (each eigenvector with its own eigenvalue) forms a basis, since the eigenvectors are independent vectors.

Perhaps what you have in mind is the situation where a linear operator that maps an n-dimensional space into itself has n distinct eigenvalues and n independent eigenvectors. This is not the situation for all linear operators, but there are particular examples of it that are important in physics. You hear more about these particular operators than about other operators, so you can get the impression that all operators are this way.

Think about an operator that projects 3D space onto a 1D subspace - for example ##T((x,y,z)) = (x,0,0)##.
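As a quick numerical check of that projection (my own numpy sketch):

Python:
import numpy as np

# Projection of 3D space onto the x-axis: T((x, y, z)) = (x, 0, 0)
T = np.array([[1., 0., 0.],
              [0., 0., 0.],
              [0., 0., 0.]])

vals, vecs = np.linalg.eig(T)
print(vals)   # [1. 0. 0.] -- the eigenvalue 0 is repeated, so the eigenvalues are not all distinct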
 
  • #6
Thanks everyone. All very helpful. Thank you for reminding me that any scaled version of the same eigenvector is also an eigenvector. So a basis can be formed by eigenvectors having different eigenvalues, which makes them independent (orthogonal).

In summary:
In 2D, for example, we can form infinitely many new bases from one particular basis of two independent vectors by scaling each vector in the basis by arbitrary amounts. BvU mentioned that, given an arbitrary basis of two orthogonal vectors, we can find/construct a linear operator that has the two orthogonal vectors as its eigenvectors.

Of course, an acceptable basis is also one whose two vectors are independent but not orthogonal. In this case, I don't think we can arrive at a linear operator having those independent vectors as eigenvectors. Are bases with non-orthogonal independent vectors very useful? If so, do you have an example?
 
  • #7
fog37 said:
independent (orthogonal).
Note that independent ##\ne## orthogonal at all!
 
  • #8
Yes, two vectors, graphically, are independent as long as they are not along the same line of action. Their mutual angle can be any angle other than 0 or 180 degrees.

Still, independent vectors form a basis...
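A small numerical illustration of the difference (hedged sketch; the vectors are arbitrary):

Python:
import numpy as np

a = np.array([1.0, 0.0])
b = np.array([1.0, 1.0])   # 45 degrees away from a

# Independent: the determinant of the matrix with a and b as columns is nonzero
print(np.linalg.det(np.column_stack([a, b])))   # 1.0 -> independent, so {a, b} is a basis
# ...but not orthogonal: the dot product is nonzero
print(np.dot(a, b))                             # 1.0 -> not orthogonal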
 

1. What is a basis?

A basis is a set of linearly independent vectors that spans a vector space. This means that any vector in the space can be written as a linear combination of the basis vectors.
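For example, the coefficients of a vector with respect to a (possibly non-orthogonal) basis can be found by solving a small linear system; a hedged numpy sketch with arbitrary numbers:

Python:
import numpy as np

# Columns of B are the basis vectors of R^2 (independent, not orthogonal)
B = np.array([[1., 1.],
              [0., 1.]])
v = np.array([3., 2.])

# Solve B c = v, so that v = c[0]*B[:,0] + c[1]*B[:,1]
c = np.linalg.solve(B, v)
print(c)        # [1. 2.]
print(B @ c)    # [3. 2.] -- reproduces v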

2. What are operators?

Operators are functions that act on a vector to produce another vector. Linear operators additionally preserve addition and scalar multiplication; familiar examples act by matrix multiplication or by differentiation.
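For a linear operator written as a matrix, linearity is easy to verify numerically (illustrative sketch, arbitrary numbers):

Python:
import numpy as np

A = np.array([[1., 2.],
              [0., 3.]])          # a linear operator on R^2, written as a matrix
u = np.array([1., -1.])
v = np.array([4., 0.5])

# A respects addition and scalar multiplication
print(np.allclose(A @ (u + v), A @ u + A @ v))    # True
print(np.allclose(A @ (2.5 * u), 2.5 * (A @ u)))  # True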

3. What is an eigenvector?

An eigenvector of an operator is a nonzero vector that, when operated on by the operator, results in a scalar multiple of the original vector. In other words, the line along which the vector points is not changed; the vector is only scaled.
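For instance (a hedged numpy sketch with an arbitrary symmetric matrix):

Python:
import numpy as np

A = np.array([[2., 1.],
              [1., 2.]])
vals, vecs = np.linalg.eig(A)

v = vecs[:, 0]      # one eigenvector
lam = vals[0]       # its eigenvalue
print(np.allclose(A @ v, lam * v))   # True: A only scales v, it does not change its direction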

4. How are bases, operators, and eigenvectors related?

Bases and operators are closely related: once a basis is chosen, an operator can be represented by a matrix. Eigenvectors are also important in the study of operators, as they give the directions along which the operator only scales a vector.
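In particular, rewriting an operator in the basis of its own eigenvectors (when such a basis exists) makes its matrix diagonal; a hedged numpy sketch with an arbitrarily chosen matrix:

Python:
import numpy as np

A = np.array([[2., 1.],
              [1., 2.]])
vals, P = np.linalg.eig(A)              # columns of P form an eigenvector basis

A_in_eigenbasis = np.linalg.inv(P) @ A @ P
print(np.round(A_in_eigenbasis, 10))    # diagonal matrix with the eigenvalues (3 and 1) on the diagonal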

5. How are eigenvectors useful in linear algebra and other fields?

Eigenvectors are useful in many areas of mathematics and science, including linear algebra, physics, and engineering. They are often used to solve systems of linear differential equations, analyze the behavior of linear transformations, and understand the properties of quantum systems.
