Vector space, basis, linear operator


Homework Help Overview

The discussion revolves around a problem in linear algebra concerning vector spaces, linear operators, and the concept of linear independence. The original poster is tasked with proving the existence of a vector in a vector space such that a specific set of vectors formed by applying linear operators to this vector spans the space.

Discussion Character

  • Exploratory, Conceptual clarification, Assumption checking

Approaches and Questions Raised

  • Participants explore the implications of the linear independence of the operators and the conditions under which a vector can be found to satisfy the spanning requirement. There are discussions about the minimal polynomial and its degree, as well as the kernel of certain linear combinations of operators.

Discussion Status

Participants are actively engaging with the problem, offering hints and questioning assumptions. Some have suggested considering the rational canonical form and the implications of linear independence in the context of linear operators. There is a recognition of the complexity involved in finding a suitable vector and the need for further exploration of the concepts discussed.

Contextual Notes

There are references to the definitions of linear independence and the structure of the vector space of linear operators, as well as the potential complications arising from considering countably or uncountably many subspaces. The original poster's assumption that the vector space is over the complex field is also noted.

boombaby

Homework Statement


Let V be a vector space of dimension n, and suppose the linear operators E = A^0, A^1, A^2, ..., A^(n-1) are linearly independent. Prove that there exists a v in V such that V = <v, Av, A^2v, ..., A^(n-1)v>.


Homework Equations





The Attempt at a Solution


Here is something I tried:
The degree of the minimal polynomial p(t) with p(A)=0 is larger than n-1, since otherwise E, A, ..., A^(n-1) would satisfy a nontrivial linear relation. I wanted to start the proof from here but have no idea how to proceed.
Assume V is over the complex field C, so that there is an eigenvector. However, it seems that an eigenvector is also not the desired v that makes the set independent.
Any hint? Thanks a lot
 
Are you using <,> to denote span? If so, then it suffices to find a v such that {v, Av, ..., A^(n-1)v} is linearly independent. Stated differently, we want to make sure that whenever k_0, k_1, ..., k_(n-1) are scalars that are not all zero, then

[tex]k_0 v + k_1 Av + \cdots + k_{n-1} A^{n-1}v \neq 0[/tex].

Or, equivalently,

[tex](k_0 + k_1 A + \cdots + k_{n-1} A^{n-1})v \neq 0 \iff v \not\in \ker(k_0 + k_1 A + \cdots + k_{n-1} A^{n-1})[/tex].

But what can you say about [tex]\ker(k_0 + k_1 A + \cdots + k_{n-1} A^{n-1})[/tex]?
 
Thanks for the hints! Here's what I think:

so k_0 E + k_1 A + ... + k_(n-1) A^(n-1) (call it [tex]B_{k_{0},k_{1},\dots,k_{n-1}}[/tex]) cannot be the 0 operator (since E, A, ..., A^(n-1) are linearly independent), which implies that [tex]\dim(\ker B_{k_{0},k_{1},\dots,k_{n-1}}) \leq n-1[/tex]

(**) Suppose V_1, V_2 are two proper subspaces of V; then we can find a vector in V which lies in neither V_1 nor V_2.
(Assume we cannot; then find two vectors such that [tex]x\in V_{1},\; x\notin V_{2},\; y\in V_{2},\; y\notin V_{1}[/tex]; the vector x+y gives a contradiction.)
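Written out in full, the argument behind (**) looks like this (the same contradiction as above, just completed):

```latex
\textbf{Lemma (**).} If $V_1, V_2$ are proper subspaces of $V$,
then $V_1 \cup V_2 \neq V$.

\emph{Sketch.} If one subspace contains the other, then
$V_1 \cup V_2$ equals the larger one, which is proper, and we are
done. Otherwise choose $x \in V_1 \setminus V_2$ and
$y \in V_2 \setminus V_1$, and suppose $V = V_1 \cup V_2$. Then
$x + y \in V_1$ or $x + y \in V_2$. If $x + y \in V_1$, then
$y = (x + y) - x \in V_1$, a contradiction; the case
$x + y \in V_2$ is symmetric. Hence $V_1 \cup V_2 \neq V$.
```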

If there were only finitely many [tex]B_{k_{0},k_{1},\dots,k_{n-1}}[/tex], I could find a v such that [tex]B_{k_{0},k_{1},\dots,k_{n-1}}v\neq 0[/tex] for all [tex]B_{k_{0},k_{1},\dots,k_{n-1}}[/tex], by (**). Does this remain true if there are countably many or uncountably many subspaces? I've no idea...

Am I thinking right?...

Thanks
 
What is your definition of "linearly independent" for linear operators?
 
Presumably linear independence refers to independence in the vector space L(V) of linear operators on V.

boombaby said:
If there were only finitely many [tex]B_{k_{0},k_{1},\dots,k_{n-1}}[/tex], I could find a v such that [tex]B_{k_{0},k_{1},\dots,k_{n-1}}v\neq 0[/tex] for all of them, by (**). Does this remain true if there are countably many or uncountably many subspaces?
No, but fortunately we only need to consider finitely many [itex]B_{k_{0},k_{1},\dots,k_{n-1}}[/itex]. Consider the irreducible factors of the minimal polynomial of A.
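One way this hint can play out (a sketch, assuming the standard primary decomposition; not necessarily the route intended in the hint):

```latex
Since $E, A, \dots, A^{n-1}$ are linearly independent, the minimal
polynomial $m_A(t)$ has degree $n$, so it equals the characteristic
polynomial. Factor $m_A = p_1^{e_1} \cdots p_r^{e_r}$ into distinct
irreducibles. The primary decomposition gives
\[
  V = W_1 \oplus \cdots \oplus W_r, \qquad W_i = \ker p_i(A)^{e_i},
\]
and on each $W_i$ one can pick $v_i$ with $p_i(A)^{e_i-1} v_i \neq 0$,
so the annihilator of $v_i$ is exactly $p_i^{e_i}$. Set
$v = v_1 + \cdots + v_r$. If $q(A)v = 0$, then $q(A)v_i = 0$ for each
$i$ (the components lie in independent summands), so $q$ is divisible
by every $p_i^{e_i}$, hence by $m_A$. Thus no nonzero polynomial of
degree $< n$ annihilates $v$, i.e.\ $v, Av, \dots, A^{n-1}v$ are
linearly independent and therefore span $V$.
```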
 
It's the same as in other settings, I think... All the linear operators on V over a field K form a vector space, called L(V), and A_1, ..., A_s in L(V) are linearly independent iff k_1 A_1 + ... + k_s A_s = 0 implies k_1 = ... = k_s = 0...

Edit:
I'll have a look at the irreducible factors...Thanks
 
Can you explain it a little more? I cannot figure it out...
Thanks
 
You're going to want to use the rational canonical decomposition from here.
 
Ah... it seems that there is a lot of material under the topic "rational canonical form" that I cannot work through quickly right now. I'll come back to this question later. Anyway, thanks a lot for your help!
 
boombaby said:
it's the same as other things I think...All the linear operators on V over field K forms a vector space, called L(V), and A_1,...,A_s in L(V) are linearly independent iff k_1*A_1+...k_s*A_s=0 implies k_1=...=k_s=0...

Which is the same, then, as saying that if v is any non-zero vector, {A_1v, A_2v, ..., A_s v} are independent in V.
 
HallsofIvy said:
Which is the same, then, as saying that if v is any non-zero vector, {A_1v, A_2v, ..., A_s v} are independent in V.

No, this is not true: if some A_i v = 0, then they are not independent.
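As a numerical sanity check of the claim in the problem (not part of the proof; the matrix below, the companion matrix of t^4 - 1, and the candidate v = e_1 are illustrative choices, not from the thread):

```python
import numpy as np

n = 4
# Companion matrix of t^4 - 1: a cyclic shift of the basis vectors.
# Its minimal polynomial has degree n, so E, A, ..., A^(n-1) should
# be linearly independent in L(V).
A = np.zeros((n, n))
A[1:, :-1] = np.eye(n - 1)   # subdiagonal of ones: A e_i = e_(i+1)
A[0, -1] = 1.0               # last column: A e_n = e_1

# Check that I, A, A^2, A^3 are linearly independent in L(V):
# flatten each power into a vector and compute the rank.
powers = np.stack([np.linalg.matrix_power(A, k).ravel() for k in range(n)])
assert np.linalg.matrix_rank(powers) == n

# Candidate cyclic vector: v = e_1. Here {v, Av, A^2 v, A^3 v}
# is the standard basis, so it spans V.
v = np.eye(n)[0]
krylov = np.column_stack([np.linalg.matrix_power(A, k) @ v for k in range(n)])
assert np.linalg.matrix_rank(krylov) == n
print("cyclic vector found:", np.linalg.matrix_rank(krylov) == n)
```

Note that v = e_1 happens to work for this particular A; the theorem only guarantees that *some* v works whenever the powers of A are independent.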
 
