# Linear Algebra Eigenspace Question

## Homework Statement

Let T: C∞(R)→C∞(R) be given by T(f) = f'''', i.e., T sends a function to its fourth derivative.

a) Find a basis for the 0-eigenspace.
b) Find a basis for the 1-eigenspace.

## The Attempt at a Solution

I just want to verify my thought process for this problem. For a), finding the basis for the 0-eigenspace, essentially I need to find a basis for the vectors v in V such that T(v) = 0v.

So, would the basis for this 0-eigenspace be all polynomials in P3? If you take the fourth derivative of any polynomial in P3, you get 0.

As for b), when finding the basis for the 1-eigenspace, we need to find a basis for the vectors v in V such that T(v) = 1v, or in other words, after taking the fourth derivative you get a function equal to 1? Is this the correct logic? So would the basis for the 1-eigenspace be any polynomial in P4?

The 1-eigenspace consists of the vectors that map to themselves, not to the constant function 1.
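As a quick numerical sanity check of this point (my own illustration, not part of the original thread): functions such as ##e^x## and ##\cos x## satisfy ##f'''' = f## and so lie in the 1-eigenspace, while a polynomial like ##x^2## does not. Here a central finite difference approximates the fourth derivative:

```python
import math

def fourth_derivative(f, x, h=0.05):
    """Central finite-difference approximation of f''''(x)."""
    return (f(x - 2*h) - 4*f(x - h) + 6*f(x) - 4*f(x + h) + f(x + 2*h)) / h**4

# e^x and cos x are in the 1-eigenspace of T(f) = f'''': T(f) equals f itself.
for f in (math.exp, math.cos):
    assert abs(fourth_derivative(f, 1.0) - f(1.0)) < 1e-2

# x^2 is not: its fourth derivative is 0, which does not equal x^2 at x = 1.
g = lambda x: x**2
assert abs(fourth_derivative(g, 1.0) - g(1.0)) > 0.5
```

(The full 1-eigenspace of f ↦ f'''' is spanned by ##e^x, e^{-x}, \cos x, \sin x##, the solutions of the ODE ##f'''' = f##.)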

If by ##C^∞## you mean the space of all infinitely differentiable functions, then there are a lot more functions around than just polynomials.

Let ##f \in C^∞##. Look at the power series: ##f(x) = ∑_{i=0}^{∞} a_i x^i##. If the fourth derivative of ##f## is 0, then you have that ##a_i = 0## for ##i \geq 4##. Thus the choice of ##a_0, a_1, a_2, a_3## determines ##f## in the 0-eigenspace. Using this, can you come up with a basis? (It will have 4 functions in it).
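The hint above can be verified mechanically; here is a minimal sketch (my own illustration, not from the thread) that represents a polynomial by its coefficient list ##[a_0, a_1, a_2, \dots]## and differentiates four times:

```python
def derivative(coeffs):
    """Differentiate a polynomial given as coefficients [a_0, a_1, a_2, ...]."""
    return [i * c for i, c in enumerate(coeffs)][1:] or [0]

def fourth(coeffs):
    """Apply the map T(f) = f'''' to a polynomial."""
    for _ in range(4):
        coeffs = derivative(coeffs)
    return coeffs

# The candidate basis {1, x, x^2, x^3} of the 0-eigenspace:
basis = [[1], [0, 1], [0, 0, 1], [0, 0, 0, 1]]
assert all(fourth(p) == [0] for p in basis)

# x^4 is not in the 0-eigenspace: its fourth derivative is the constant 24.
assert fourth([0, 0, 0, 0, 1]) == [24]
```

These four functions are linearly independent and, by the argument above, span the kernel of T, so they form a basis of the 0-eigenspace.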

## 1. What is a linear algebra eigenspace?

A linear algebra eigenspace is a vector subspace of a vector space that consists of all the eigenvectors corresponding to a particular eigenvalue of a given linear transformation.

## 2. How is an eigenspace different from an eigenvalue?

An eigenvalue is a scalar value that represents how a linear transformation stretches or compresses a vector, while an eigenspace is the set of all vectors that are only scaled by the corresponding eigenvalue when transformed by the linear transformation.

## 3. What is the significance of eigenspaces in linear algebra?

Eigenspaces are important in linear algebra because, when a matrix has enough linearly independent eigenvectors, they allow it to be diagonalized, i.e., decomposed into simpler components. They also provide insight into the behavior of linear transformations and can be used to solve systems of linear differential equations.

## 4. How do you find the eigenspace of a given matrix?

To find the eigenspace of a given matrix, first find the eigenvalues by solving the characteristic equation. Then, for each eigenvalue, find the corresponding eigenvectors by solving the system of equations (A - λI)x = 0, where A is the given matrix, λ is the eigenvalue, and x is the eigenvector. The set of all eigenvectors corresponding to a particular eigenvalue forms the eigenspace.
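The procedure above can be carried out by hand for a small matrix. Here is a sketch for a hypothetical 2×2 example (the matrix and helper `eigenvector` are my own illustration): the characteristic equation of a 2×2 matrix is λ² − (trace)λ + det = 0, and for each root λ a null vector of the singular matrix A − λI gives an eigenvector.

```python
import math

# Hypothetical 2x2 example: A = [[4, 1], [2, 3]]
A = [[4.0, 1.0], [2.0, 3.0]]

# Step 1: eigenvalues from the characteristic equation λ² - (trace)λ + det = 0.
trace = A[0][0] + A[1][1]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
disc = math.sqrt(trace**2 - 4 * det)
eigenvalues = [(trace + disc) / 2, (trace - disc) / 2]  # 5.0 and 2.0

# Step 2: for each λ, solve (A - λI)x = 0.  For a singular 2x2 matrix with
# first row (a, b) != (0, 0), the vector (b, -a) is a nonzero null vector.
def eigenvector(A, lam):
    a, b = A[0][0] - lam, A[0][1]
    if a == 0 and b == 0:          # first row vanished; use the second row
        a, b = A[1][0], A[1][1] - lam
    return (b, -a)

for lam in eigenvalues:
    x, y = eigenvector(A, lam)
    # Check A x = λ x componentwise.
    assert math.isclose(A[0][0]*x + A[0][1]*y, lam * x, abs_tol=1e-9)
    assert math.isclose(A[1][0]*x + A[1][1]*y, lam * y, abs_tol=1e-9)
```

Each eigenspace here is the span of the returned vector; in general the eigenspace for λ is the full null space of A − λI, which may be more than one-dimensional.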

## 5. Can an eigenspace be empty?

The eigenspace of an actual eigenvalue λ is never trivial: by definition, λ is an eigenvalue only if (A - λI)x = 0 has a nonzero solution, so the eigenspace contains at least one nonzero vector (along with the zero vector, since it is a subspace). Note, however, that a real matrix may have no real eigenvalues at all, for example a rotation of the plane; over the complex numbers, every square matrix has at least one eigenvalue.
