A question about linear algebra

In summary: Your answers seem to be mostly correct, but there are a few things that could be clarified or expanded upon. 1) You say that V is T-invariant, but the question is asking about A-invariant subspaces. You should also explain why V is the smallest A-invariant subspace containing v, and how this corresponds to the F[x]-module structure on F^n induced by multiplication by A. 2) You are correct that [v|Av|...|A^{k-1}v] has a pivot position in every row and column, but you should also explain why this means the vectors form a basis for V: V = Span(v, Av, ..., A^{k-1}v) by definition, and these vectors are linearly independent.
  • #1
Artusartos

Homework Statement



Let [itex]A \in M_n(F)[/itex] and [itex]v \in F^n[/itex]. Let k be the smallest positive integer such that [itex]v, Av, A^2v, ..., A^kv[/itex] are linearly dependent.
a) Show that we can find [itex]a_0, ... , a_{k-1} \in F[/itex] with

[itex]a_0v + a_1Av + ... + a_{k-1}A^{k-1}v + A^kv = 0[/itex]

(note that the coefficient of A^kv is 1). Write

[itex]f(x) = x^k + a_{k-1}x^{k-1} + ... + a_0[/itex].

Note that f(A)(v)=0.


b) Show that f is a monic polynomial of lowest degree with f(A)(v)=0.

I tried to answer these questions, but I'm not sure if my answers are correct...can anybody please check them for me?


Homework Equations





The Attempt at a Solution



a) Since k is the smallest positive integer such that [itex]v, Av, A^2v, ..., A^kv[/itex] are linearly dependent, we know that [itex]v, Av, A^2v, ..., A^{k-1}v[/itex] are linearly independent. So there is a linear combination of [itex]v, Av, A^2v, ..., A^kv[/itex] that equals zero with at least one nonzero coefficient... In other words:

[itex]b_0v + b_1Av + ... + b_{k-1}A^{k-1}v + b_kA^kv = 0[/itex] for b_0, ... , b_k not all zero.

Now if we divide the whole equation by b_k, we get

[itex] \frac{b_0}{b_k} v + \frac{b_1}{b_k} Av + ... + \frac{b_{k-1}}{b_k} A^{k-1}v + A^kv = 0[/itex].

If we denote [itex] \frac{b_m}{b_k}[/itex] by [itex]a_m[/itex] (where m is between 0 and k-1), we get...

[itex]a_0v + a_1Av + ... + a_{k-1}A^{k-1}v + A^kv = 0[/itex]

Now to show that f(A)(v)=0 ...

[itex]f(A)(v) = (A^k + a_{k-1}A^{k-1} + ... + a_1A + a_0I)(v) = a_0v + a_1Av + ... + a_{k-1}A^{k-1}v + A^kv = 0[/itex]

b) [itex]f(A)(v) = (A^k + a_{k-1}A^{k-1} + ... + a_1A + a_0I)(v) = a_0v + a_1Av + ... + a_{k-1}A^{k-1}v + A^kv = 0 [/itex]
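
As an aside, I also tried checking this numerically, just to convince myself (this isn't part of the proof, and the matrix and vector below are an arbitrary example I made up): the sketch finds the smallest k with [itex]v, Av, ..., A^kv[/itex] linearly dependent, solves for coefficients [itex]a_0, ..., a_{k-1}[/itex] so that the coefficient of [itex]A^kv[/itex] is 1, and verifies that f(A)(v) = 0.

[code]
import numpy as np

# Arbitrary example matrix and vector (any nonzero v should work the same way).
A = np.array([[0., 1., 0.],
              [0., 0., 1.],
              [2., -5., 4.]])
v = np.array([1., 0., 0.])

# Build v, Av, A^2 v, ... until they become linearly dependent.
vecs = [v]
while np.linalg.matrix_rank(np.column_stack(vecs + [A @ vecs[-1]])) == len(vecs) + 1:
    vecs.append(A @ vecs[-1])
k = len(vecs)  # smallest k with v, Av, ..., A^k v linearly dependent

# Solve a_0 v + a_1 Av + ... + a_{k-1} A^{k-1} v = -A^k v,
# so that the coefficient of A^k v is 1, as in the problem.
M = np.column_stack(vecs)   # [v | Av | ... | A^{k-1} v], full column rank
rhs = -(A @ vecs[-1])       # -A^k v
a, *_ = np.linalg.lstsq(M, rhs, rcond=None)

# f(A) v = A^k v + a_{k-1} A^{k-1} v + ... + a_0 v should be (numerically) zero.
fAv = (A @ vecs[-1]) + sum(a[i] * vecs[i] for i in range(k))
print(k, a, np.allclose(fAv, 0))
[/code]

Since [itex][v|Av|...|A^{k-1}v][/itex] has full column rank, the linear system has an exact solution and the last line prints True.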

Can anybody check if I'm right? I'm also not sure if I understood the questions correctly.
 
  • #2
For part (a), you should explain why you know that [itex]b_k[/itex] is nonzero, so you can divide by it.

For part (b), you have shown that f(A)(v) = 0. Obviously f is monic. Why is it a lowest degree polynomial with f(A)(v) = 0, i.e. why can't this be true for a lower degree polynomial?
 
  • #3
jbunniii said:
For part (a), you should explain why you know that [itex]b_k[/itex] is nonzero, so you can divide by it.

For part (b), you have shown that f(A)(v) = 0. Obviously f is monic. Why is it a lowest degree polynomial with f(A)(v) = 0, i.e. why can't this be true for a lower degree polynomial?

For part a), [itex]b_k[/itex] is nonzero, because [itex]v, Av, ... , A^kv [/itex] are linearly dependent...so when the equation is zero, not all of the coefficients are zero...

For part b), this cannot be true for a polynomial of a lower degree, because k is the smallest integer such that [itex]v, Av, ... ,A^kv[/itex] are linearly dependent. So if we only had [itex]v, Av, ... ,A^{k-1}v[/itex], that set would be linearly independent, so all the coefficients would need to equal zero when f(A)(v) = 0. So the polynomial couldn't be monic anymore.

Do you think my answers are correct?
 
  • #4
Artusartos said:
For part a), [itex]b_k[/itex] is nonzero, because [itex]v, Av, ... , A^kv [/itex] are linearly dependent...so when the equation is zero, not all of the coefficients are zero...
They're not all zero, but some of them could be. How do you know that [itex]b_k[/itex] in particular is not zero?

For part b), this cannot be true for a polynomial of a lower degree, because k is the smallest integer such that [itex]v, Av, ... ,A^kv[/itex] are linearly dependent. So if we only had [itex]v, Av, ... ,A^{k-1}v[/itex], that set would be linearly independent, so all the coefficients would need to equal zero when f(A)(v) = 0. So the polynomial couldn't be monic anymore.
Yes, correct.
 
  • #5
jbunniii said:
They're not all zero, but some of them could be. How do you know that [itex]b_k[/itex] in particular is not zero?
Yes, correct.

Because I know that the rest, [itex]v, Av, ... , A^{k-1}v[/itex], are linearly independent. The extra one is [itex]b_kA^kv[/itex]
 
  • #6
jbunniii said:
They're not all zero, but some of them could be. How do you know that [itex]b_k[/itex] in particular is not zero?
Yes, correct.
Thanks a lot.

I also have two other questions, if you don't mind...

1) Let [itex]V=Span(v, Av, A^2v, ... , A^{k-1}v )[/itex]. Show that V is the smallest A-invariant subspace containing v. We denote this fact by writing

V=F[x]v

This corresponds to the F[x]-module structure on [itex]F^n[/itex] induced by multiplication by A.

2) Show that [itex]v, Av, A^2v, ... , A^{k-1}v [/itex] is a basis, B, for V.

My answers:

1) We know that V is T-invariant because,

[itex]T_A(v) = Av = vx_1 + Avx_2 + ... + A^{k-1}vx_k \in V[/itex]

since [itex]Av = vx_1 + Avx_2 + ... + A^{k-1}vx_k[/itex] is a linear combination of the vectors that span V.

We know that it is the smallest one, because if [itex]V = Span (v, Av, ... ,A^{k-2}v)[/itex], for example, then not every [itex]A^wv[/itex] would be a linear combination of [itex](v, Av, ... , A^{k-2}v)[/itex] (we also know that [itex][v| Av| ... | A^{k-2}v][/itex] is a non-square matrix, so its columns cannot be a basis), since we know from the first question that I asked that [itex](v, Av, ... , A^{k-1}v)[/itex] is a basis for [itex]F^n[/itex].

2) Since [itex]V=Span(v, Av, A^2v, ... , A^{k-1}v )[/itex], we know that [itex][v|Av|...|A^{k-1}v][/itex] has a pivot position in every row. We also know that it is a square matrix, from the very first question that I asked...so it must also have a pivot position in every column. Thus, it is one-to-one and onto...so its columns [itex]v, Av, ... , A^{k-1}v[/itex] must form a basis for V.

Do you think my answers are correct?

Thanks in advance
 
  • #7
Artusartos said:
Thanks a lot.

I also have two other questions, if you don't mind...

1) Let [itex]V=Span(v, Av, A^2v, ... , A^{k-1}v )[/itex]. Show that V is the smallest A-invariant subspace containing v. We denote this fact by writing

V=F[x]v
Does [itex]k[/itex] still have the same meaning as in the first problem? i.e. it's the smallest power such that [itex]v,Av,A^2v,\ldots,A^{k-1}v,A^{k}v[/itex] are linearly dependent? I will assume so, because the statement isn't necessarily true for an arbitrary [itex]k[/itex].

My answers:

1) We know that V is T-invariant because,

[itex]T_A(v) = Av = vx_1 + Avx_2 + ... + A^{k-1}vx_k \in V[/itex]

since [itex]Av = vx_1 + Avx_2 + ... + A^{k-1}vx_k[/itex] is a linear combination of the vectors that span V.
This doesn't seem quite right. An arbitrary element [itex]w[/itex] of [itex]V[/itex] can be written as follows:
[tex]w = vx_1 + Avx_2 + \ldots + A^{k-1}vx_k[/tex]
Therefore
[tex]Aw = Avx_1 + A^2vx_2 + \ldots + A^{k-1}vx_{k-1} + A^{k}vx_k[/tex]
Now you have to use the fact that [itex]A^{k}v[/itex] is a linear combination of [itex]v,Av,A^2v,\ldots,A^{k-1}v[/itex].

We know that it is the smallest one, because if [itex]V = Span (v, Av, ... ,A^{k-2}v)[/itex], for example,
OK, that's one particular subspace with lower dimension than [itex]V[/itex]. But you need to prove that NO subspace with lower dimension than [itex]V[/itex] can be an [itex]A[/itex]-invariant subspace containing [itex]v[/itex]. Hint: to do this, you need to show that any such subspace must contain all of the vectors [itex]v, Av, ... ,A^{k-1}v[/itex].

2) Since [itex]V=Span(v, Av, A^2v, ... , A^{k-1}v )[/itex], we know that [itex][v|Av|...|A^{k-1}v][/itex] has a pivot position in every row. We also know that it is a square matrix, from the very first question that I asked...so it must also have a pivot position in every column. Thus, it is one-to-one and onto...so its columns [itex]v, Av, ... , A^{k-1}v[/itex] must form a basis for V.
I don't think you need to talk about rows and columns here. A basis is a linearly independent set of vectors which spans the space. Well, by definition, [itex]v, Av, A^2v, ... , A^{k-1}v[/itex] spans [itex]V[/itex], so all you need is that this is a linearly independent set. But that's also given, isn't it?
 
  • #8
jbunniii said:
Does [itex]k[/itex] still have the same meaning as in the first problem? i.e. it's the smallest power such that [itex]v,Av,A^2v,\ldots,A^{k-1}v,A^{k}v[/itex] are linearly dependent? I will assume so, because the statement isn't necessarily true for an arbitrary [itex]k[/itex].

Yes it does have the same meaning.



jbunniii said:
This doesn't seem quite right. An arbitrary element [itex]w[/itex] of [itex]V[/itex] can be written as follows:
[tex]w = vx_1 + Avx_2 + \ldots + A^{k-1}vx_k[/tex]
Therefore
[tex]Aw = Avx_1 + A^2vx_2 + \ldots + A^{k-1}vx_{k-1} + A^{k}vx_k[/tex]
Now you have to use the fact that [itex]A^{k}v[/itex] is a linear combination of [itex]v,Av,A^2v,\ldots,A^{k-1}v[/itex].

So, can I answer it like this:

From the very first question that I asked, we know that the equation [itex]a_0v + a_1Av + ... + a_{k-1}A^{k-1}v + A^kv = 0[/itex] is true. So

[itex]A^kv = -a_0v - a_1Av - ... - a_{k-1}A^{k-1}v [/itex]

So if we let [itex]y_m = -a_m[/itex] for m = 0, 1, ... , k-1,

[itex]A^kv = y_0v + y_1Av + ... + y_{k-1}A^{k-1}v [/itex]

From here, we can see that [itex]A^kv[/itex] can be written as a linear combination of [itex] v, Av, ..., A^{k-1}v [/itex]. Since V is spanned by these elements, [itex]A^kv[/itex] is also in V. Then every term in your expression for [itex]Aw[/itex] is in V, so [itex]Aw \in V[/itex] for any [itex]w \in V[/itex]. So V is A-invariant.
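
(Again just a numerical aside, not part of the proof, using the same made-up example as my earlier check: the sketch below verifies that applying A to each of [itex]v, Av, ..., A^{k-1}v[/itex] stays inside their span, which is what the A-invariance of V says.)

[code]
import numpy as np

# Same arbitrary example matrix and vector as in my earlier check.
A = np.array([[0., 1., 0.],
              [0., 0., 1.],
              [2., -5., 4.]])
v = np.array([1., 0., 0.])

# Columns of M span V = Span(v, Av, ..., A^{k-1} v).
vecs = [v]
while np.linalg.matrix_rank(np.column_stack(vecs + [A @ vecs[-1]])) == len(vecs) + 1:
    vecs.append(A @ vecs[-1])
M = np.column_stack(vecs)

# V is A-invariant iff appending the columns of A*M does not increase the rank,
# i.e. A times each spanning vector already lies in the column space of M.
print(np.linalg.matrix_rank(np.column_stack([M, A @ M])) == np.linalg.matrix_rank(M))
[/code]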

jbunniii said:
OK, that's one particular subspace with lower dimension than [itex]V[/itex]. But you need to prove that NO subspace with lower dimension than [itex]V[/itex] can be an [itex]A[/itex]-invariant subspace containing [itex]v[/itex]. Hint: to do this, you need to show that any such subspace must contain all of the vectors [itex]v, Av, ... ,A^{k-1}v[/itex].

So if we look at this equation again:

[itex]A^kv = y_0v + y_1Av + ... + y_{k-1}A^{k-1}v [/itex], we will see that [itex]A^kv[/itex] can be written as a linear combination of [itex]v, Av, ... , A^{k-1}v[/itex]. Since [itex]v, Av, ... ,A^{k-1}v[/itex] are linearly independent, we cannot write [itex]A^kv[/itex] as a linear combination of these elements if we take any of them away. Also, following your hint: any A-invariant subspace W containing v must contain Av, then [itex]A(Av) = A^2v[/itex], and so on, so W contains all of [itex]v, Av, ... , A^{k-1}v[/itex] and therefore contains their span V. So, V must be the smallest.





jbunniii said:
I don't think you need to talk about rows and columns here. A basis is a linearly independent set of vectors which spans the space. Well, by definition, [itex]v, Av, A^2v, ... , A^{k-1}v[/itex] spans [itex]V[/itex], so all you need is that this is a linearly independent set. But that's also given, isn't it?

Yes it is given.
 
