A question about linear algebra

Artusartos
Homework Statement



Let A \in M_n(F) and v \in F^n. Let k be the smallest positive integer such that v, Av, A^2v, ..., A^kv are linearly dependent.
a) Show that we can find a_0, ... , a_{k-1} \in F with

a_0v + a_1Av + ... + a_{k-1}A^{k-1}v + A^kv = 0

(note that the coefficient of A^kv is 1). Write

f(x) = x^k + a_{k-1}x^{k-1} + ... + a_0.

Note that f(A)(v)=0.


b) Show that f is a monic polynomial of lowest degree with f(A)(v)=0.

I tried to answer these questions, but I'm not sure if my answers are correct...can anybody please check them for me?


Homework Equations





The Attempt at a Solution



a) Since k is the smallest positive integer such that v, Av, A^2v, ..., A^kv are linearly dependent, we know that v, Av, A^2v, ..., A^{k-1}v are linearly independent. So for v, Av, A^2v, ..., A^kv, there exist coefficients, not all zero, whose linear combination equals zero... In other words:

b_0v + b_1Av + ... + b_{k-1}A^{k-1}v + b_kA^kv = 0 for b_0, ... , b_k not all zero.

Now if we divide the whole equation by b_k, we get

\frac{b_0}{b_k} v + \frac{b_1}{b_k} Av + ... + \frac{b_{k-1}}{b_k} A^{k-1}v + A^kv = 0.

If we denote \frac{b_m}{b_k} by a_m (for m between 0 and k-1), we get...

a_0v + a_1Av + ... + a_{k-1}A^{k-1}v + A^kv = 0

Now to show that f(A)(v)=0 ...

f(A)(v) = (A^k + a_{k-1}A^{k-1} + ... + a_0I)(v) = a_0v + a_1Av + ... + a_{k-1}A^{k-1}v + A^kv = 0

b) f(A)(v) = (A^k + a_{k-1}A^{k-1} + ... + a_0I)(v) = a_0v + a_1Av + ... + a_{k-1}A^{k-1}v + A^kv = 0

Can anybody check if I'm right? I'm also not sure whether I understood the questions correctly.
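For a concrete sanity check of part (a), here is a short NumPy sketch (not part of the original problem; the matrix A and vector v are made-up illustrative values): it builds the vectors v, Av, A^2v, ... until they become dependent, solves for the coefficients a_0, ..., a_{k-1}, and verifies that f(A)(v) = 0.

```python
import numpy as np

# Illustrative example: a 3x3 cyclic permutation matrix and v = e_1.
A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0]])
v = np.array([1.0, 0.0, 0.0])

# Build v, Av, A^2 v, ... until the collection becomes linearly dependent;
# k is the first power at which the rank stops growing.
vectors = [v]
while True:
    vectors.append(A @ vectors[-1])
    M = np.column_stack(vectors)
    if np.linalg.matrix_rank(M) < len(vectors):
        break
k = len(vectors) - 1

# Solve a_0 v + a_1 Av + ... + a_{k-1} A^{k-1} v = -A^k v for the a_m;
# the columns v, Av, ..., A^{k-1} v are independent, so this is solvable.
basis = np.column_stack(vectors[:k])
coeffs, *_ = np.linalg.lstsq(basis, -vectors[k], rcond=None)

# Check f(A)(v) = A^k v + a_{k-1} A^{k-1} v + ... + a_0 v = 0.
residual = vectors[k] + basis @ coeffs
print(k, np.allclose(residual, 0))
```

For this particular A and v we have A^3v = v, so the sketch finds k = 3 and f(x) = x^3 - 1.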
 
For part (a), you should explain why you know that b_k is nonzero, so you can divide by it.

For part (b), you have shown that f(A)(v) = 0. Obviously f is monic. Why is it a lowest degree polynomial with f(A)(v) = 0, i.e. why can't this be true for a lower degree polynomial?
 
jbunniii said:
For part (a), you should explain why you know that b_k is nonzero, so you can divide by it.

For part (b), you have shown that f(A)(v) = 0. Obviously f is monic. Why is it a lowest degree polynomial with f(A)(v) = 0, i.e. why can't this be true for a lower degree polynomial?

For part a), b_k is nonzero because v, Av, ... , A^kv are linearly dependent... so when the equation equals zero, not all of the coefficients are zero...

For part b), this cannot be true for a polynomial of lower degree, because k is the smallest integer such that v, Av, ... , A^kv are linearly dependent. So if we had v, Av, ... , A^{k-1}v, then the set would be independent, so all the coefficients would need to equal zero when f(A)(v) = 0. So the polynomial wouldn't be monic anymore.

Do you think my answers are correct?
 
Artusartos said:
For part a), b_k is nonzero because v, Av, ... , A^kv are linearly dependent... so when the equation equals zero, not all of the coefficients are zero...
They're not all zero, but some of them could be. How do you know that b_k in particular is not zero?

For part b), this cannot be true for a polynomial of lower degree, because k is the smallest integer such that v, Av, ... , A^kv are linearly dependent. So if we had v, Av, ... , A^{k-1}v, then the set would be independent, so all the coefficients would need to equal zero when f(A)(v) = 0. So the polynomial wouldn't be monic anymore.
Yes, correct.
 
jbunniii said:
They're not all zero, but some of them could be. How do you know that b_k in particular is not zero? Yes, correct.

Because I know that the rest form the linearly independent set. The extra one is b_kA^kv.
 
jbunniii said:
They're not all zero, but some of them could be. How do you know that b_k in particular is not zero? Yes, correct.
Thanks a lot.

I also have two other questions, if you don't mind...

1) Let V=Span(v, Av, A^2v, ... , A^{k-1}v ). Show that V is the smallest A-invariant subspace containing v. We denote this fact by writing

V=F[x]v

This corresponds to the F[x]-module structure on F^n induced by multiplication by A.

2) Show that v, Av, A^2v, ... , A^{k-1}v is a basis, B, for V.

My answers:

1) We know that V is T-invariant because,

T_A(v) = Av = vx_1 + Avx_2 + ... + A^{k-1}vx_k \in V

since Av = vx_1 + Avx_2 + ... + A^{k-1}vx_k is a linear combination of the vectors that span V.

We know that it is the smallest one, because if V = Span(v, Av, ... , A^{k-2}v), for example, then not every A^wv (for w a positive integer) would be a linear combination of v, Av, ... , A^{k-2}v (we also know that [v|Av|...|A^{k-2}v] is a non-square matrix, so its columns cannot form a basis), since we know from the first question that I asked that v, Av, ... , A^{k-1}v is a basis for V.

2) Since V=Span(v, Av, A^2v, ... , A^{k-1}v), we know that [v|Av|...|A^{k-1}v] has a pivot position in every row. We also know that it is a square matrix, from the very first question that I asked... so it must also have a pivot position in every column. Thus, it is one-to-one and onto... and its columns must form a basis for V.

Do you think my answers are correct?

Thanks in advance
 
Artusartos said:
Thanks a lot.

I also have two other questions, if you don't mind...

1) Let V=Span(v, Av, A^2v, ... , A^{k-1}v ). Show that V is the smallest A-invariant subspace containing v. We denote this fact by writing

V=F[x]v
Does k still have the same meaning as in the first problem? i.e. it's the smallest power such that v,Av,A^2v,\ldots,A^{k-1}v,A^{k}v are linearly dependent? I will assume so, because the statement isn't necessarily true for an arbitrary k.

My answers:

1) We know that V is T-invariant because,

T_A(v) = Av = vx_1 + Avx_2 + ... + A^{k-1}vx_k \in V

since Av = vx_1 + Avx_2 + ... + A^{k-1}vx_k is a linear combination of the vectors that span V.
This doesn't seem quite right. An arbitrary element w of V can be written as follows:
w = vx_1 + Avx_2 + \ldots + A^{k-1}vx_k
Therefore
Aw = Avx_1 + A^2vx_2 + \ldots + A^{k-1}vx_{k-1} + A^{k}vx_k
Now you have to use the fact that A^{k}v is a linear combination of v,Av,A^2v,\ldots,A^{k-1}v.

We know that it is the smallest one, because if V = Span (v, Av, ... ,A^{k-2}v), for example,
OK, that's one particular subspace with lower dimension than V. But you need to prove that NO subspace with lower dimension than V can be an A-invariant subspace containing v. Hint: to do this, you need to show that any such subspace must contain all of the vectors v, Av, ... ,A^{k-1}v.

2) Since V=Span(v, Av, A^2v, ... , A^{k-1}v ), we know that [v|Av|...|A^{k-1}v] has a pivot position in every row. We also know that it is a square matrix, from the very first question that I asked...so it must also have a pivot position in every column. Thus, it is one-to-one and onto...and must be a basis for V.
I don't think you need to talk about rows and columns here. A basis is a linearly independent set of vectors which spans the space. Well, by definition, v, Av, A^2v, ... , A^{k-1}v spans V, so all you need is that this is a linearly independent set. But that's also given, isn't it?
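The invariance argument above can also be checked numerically. A minimal sketch, using an illustrative A and v and a k assumed to match them (none of these values come from the thread): V is A-invariant exactly when A applied to each spanning vector stays in the span, i.e. appending A times a spanning vector never increases the rank.

```python
import numpy as np

# Illustrative values (assumptions, not from the thread): a cyclic
# permutation matrix, v = e_1, and k = 3, the smallest power for which
# v, Av, ..., A^k v are linearly dependent for this choice of A and v.
A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0]])
v = np.array([1.0, 0.0, 0.0])
k = 3

# Columns v, Av, ..., A^{k-1} v span V.
basis = np.column_stack([np.linalg.matrix_power(A, i) @ v for i in range(k)])

# V is A-invariant iff A times each spanning vector already lies in the
# span, i.e. adjoining it to the basis does not raise the rank above k.
invariant = all(
    np.linalg.matrix_rank(np.column_stack([basis, A @ basis[:, i]])) == k
    for i in range(k)
)
print(invariant)
```

Only the last spanning vector needs a real check: A maps A^{m}v to A^{m+1}v, which is already in the list for m < k-1, and A^{k}v lands back in the span by the dependence relation.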
 
jbunniii said:
Does k still have the same meaning as in the first problem? i.e. it's the smallest power such that v,Av,A^2v,\ldots,A^{k-1}v,A^{k}v are linearly dependent? I will assume so, because the statement isn't necessarily true for an arbitrary k.

Yes it does have the same meaning.



jbunniii said:
This doesn't seem quite right. An arbitrary element w of V can be written as follows:
w = vx_1 + Avx_2 + \ldots + A^{k-1}vx_k
Therefore
Aw = Avx_1 + A^2vx_2 + \ldots + A^{k-1}vx_{k-1} + A^{k}vx_k
Now you have to use the fact that A^{k}v is a linear combination of v,Av,A^2v,\ldots,A^{k-1}v.

So, can I answer it like this:

From the very first question that I asked, we know that the equation a_0v + a_1Av + ... + a_{k-1}A^{k-1}v + A^kv = 0 is true. So

A^kv = -a_0v - a_1Av - ... - a_{k-1}A^{k-1}v

So if we let y_m = -a_m for m = 0, 1, ... , k-1,

A^kv = y_0v + y_1Av + ... + y_{k-1}A^{k-1}v

From here, we can see that A^kv can be written as a linear combination of v, Av, ..., A^{k-1}v. Since V is spanned by these elements, A^kv is also in V; then A applied to any element of V again lands in V, so V is A-invariant.

jbunniii said:
OK, that's one particular subspace with lower dimension than V. But you need to prove that NO subspace with lower dimension than V can be an A-invariant subspace containing v. Hint: to do this, you need to show that any such subspace must contain all of the vectors v, Av, ... ,A^{k-1}v.

So if we look at this equation again:

A^kv = y_0v + y_1Av + ... + y_{k-1}A^{k-1}v, we will see that A^kv can be written as a linear combination of v, Av, ... , A^{k-1}v. Since v, Av, ... , A^{k-1}v are linearly independent, we know that we cannot write A^kv as a linear combination of these elements if we take any of them away. So, V must be the smallest.





jbunniii said:
I don't think you need to talk about rows and columns here. A basis is a linearly independent set of vectors which spans the space. Well, by definition, v, Av, A^2v, ... , A^{k-1}v spans V, so all you need is that this is a linearly independent set. But that's also given, isn't it?

Yes it is given.
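To close the loop on question 2, the rank check the thread converges on can be written in two lines (again with illustrative A, v, and k, which are assumptions rather than values from the thread): v, Av, ..., A^{k-1}v spans V by definition, so it is a basis exactly when the vectors are linearly independent, i.e. when the Krylov matrix has rank k.

```python
import numpy as np

# Illustrative A, v, k as before (assumed values, not from the thread).
A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0]])
v = np.array([1.0, 0.0, 0.0])
k = 3

# Stack v, Av, ..., A^{k-1} v as columns; full column rank k means the
# spanning set is independent, hence a basis for V.
krylov = np.column_stack([np.linalg.matrix_power(A, i) @ v for i in range(k)])
is_basis = np.linalg.matrix_rank(krylov) == k
print(is_basis)
```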
 