Proving the Independence and Span of Matrix Columns | Theorem Help

  • Thread starter: stunner5000pt
  • Tags: Matrices, Theorem

Homework Help Overview

The discussion revolves around proving the independence and span of the columns of a matrix A, specifically focusing on the set {A^T C1, ..., A^T Cn} and its role as a basis of Rn when the rank of A equals n. Participants explore the implications of linear independence and the relationship between the rank of a matrix and its transpose.

Discussion Character

  • Exploratory, Conceptual clarification, Assumption checking

Approaches and Questions Raised

  • Participants question whether the linear independence of the columns of A implies the linear independence of the transformed set {A^T C1, ..., A^T Cn}. There is discussion about the nature of matrix multiplication and its effects on vector independence. Some participants suggest manipulating equations to derive contradictions, while others highlight the importance of understanding the null space of A^T.

Discussion Status

The discussion is active, with multiple interpretations being explored regarding the linear independence of the transformed vectors. Some participants have provided guidance on how to approach proving linear independence, while others have raised concerns about assumptions made in the reasoning. There is no explicit consensus yet, as participants continue to refine their arguments and explore different aspects of the problem.

Contextual Notes

Participants note that A is an mxn matrix, which introduces specific constraints on the rank and the dimensions of the vectors involved. The implications of the rank being equal to n are also under scrutiny, particularly in relation to the properties of the null space of A^T.

stunner5000pt
Let A be an m×n matrix with columns C1, ..., Cn. If rank A = n, show that
[itex]\{A^T C_{1},...,A^T C_{n}\}[/itex] is a basis of Rn.


since rank A = n, the columns are linearly independent

so does that automatically mean that multiplying by something, like A transpose for example, will keep the independence of the columns?

A theorem also tells us that if rank A = n, then the columns span Rn. So the columns span Rn in this case.

is this adequate for a proof?
 
No, it is not. Multiplication by A is not just multiplication by a scalar. It could also rotate the vectors.

You need to show that the set {A^T C1, ..., A^T Cn} is linearly independent, which implies it is a basis of Rn since it has n elements. Assume otherwise. Then you have a1 A^T C1 + ... + an A^T Cn = 0 for a1, ..., an not all zero. Can you manipulate this to get a contradiction?
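Before attempting the proof, the claim can be sanity-checked numerically on a toy example (a made-up 3×2 matrix of my own, not from the thread): with rank A = 2, the vectors A^T C_1 and A^T C_2 come out linearly independent in R^2.

```python
# Toy 3x2 matrix A of rank 2 (hypothetical example values).
A = [[1, 0],
     [0, 1],
     [1, 1]]

def transpose(M):
    # Rows of the transpose are the columns of M.
    return [list(row) for row in zip(*M)]

def matvec(M, v):
    # Matrix-vector product M v.
    return [sum(M[i][k] * v[k] for k in range(len(v))) for i in range(len(M))]

At = transpose(A)          # A^T; its rows are the columns C_1, C_2 of A
w1 = matvec(At, At[0])     # A^T C_1
w2 = matvec(At, At[1])     # A^T C_2

# Two vectors in R^2 are independent iff this 2x2 determinant is nonzero.
det = w1[0] * w2[1] - w1[1] * w2[0]
print(w1, w2, det)         # [2, 1] [1, 2] 3
```

With det = 3 ≠ 0 the transformed vectors are independent for this particular A; the thread, of course, is about proving it for every rank-n A.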
 
The answer is below. Highlight the first white line if you need a hint. Every time you need a hint, highlight the next line. For your own good, highlight as few lines as possible, i.e. try to do most of it yourself.

{A^T C1, ..., A^T Cn} is a basis of Rn

iff {A^T C1, ..., A^T Cn} is linearly independent

iff the columns of the matrix (A^T C1 ... A^T Cn) are linearly independent

iff det[(A^T C1 ... A^T Cn)] is non-zero

iff det[A^T (C1 ... Cn)] is non-zero

iff det[A^T A] is non-zero

iff det[A^T] det[A] is non-zero

iff det[A] det[A] is non-zero

iff det[A] is non-zero

iff rank[A] = n
 
AKG said:
The answer is below. Highlight the first white line if you need a hint. Every time you need a hint, highlight the next line. For your own good, highlight as few lines as possible, i.e. try to do most of it yourself.

{A^T C1, ..., A^T Cn} is a basis of Rn

iff {A^T C1, ..., A^T Cn} is linearly independent

iff the columns of the matrix (A^T C1 ... A^T Cn) are linearly independent

iff det[(A^T C1 ... A^T Cn)] is non-zero

iff det[A^T (C1 ... Cn)] is non-zero

iff det[A^T A] is non-zero

iff det[A^T] det[A] is non-zero

iff det[A] det[A] is non-zero

iff det[A] is non-zero

iff rank[A] = n

This would be true if A were an n×n matrix, but the question specifies that A is an m×n matrix.
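To see the objection concretely (a toy 3×2 example of my own): det[A] simply does not exist for a non-square A, but the chain can stop at det[A^T A], which is an honest n×n determinant.

```python
# Non-square toy example: A is 3x2, so det[A] is undefined,
# but A^T A is 2x2 and its determinant makes sense.
A = [[1, 0],
     [0, 1],
     [1, 1]]

def transpose(M):
    return [list(row) for row in zip(*M)]

def matmul(X, Y):
    # Standard matrix product.
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

G = matmul(transpose(A), A)                      # A^T A, a 2x2 matrix
det_G = G[0][0] * G[1][1] - G[0][1] * G[1][0]
print(G, det_G)                                  # [[2, 1], [1, 2]] 3
```

The step det[A^T] det[A] in the original chain is exactly where squareness was silently assumed.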
 
0rthodontist said:
No, it is not. Multiplication by A is not just multiplication by a scalar. It could also rotate the vectors.

You need to show that the set {A^T C1, ... A^T Cn} is linearly independent, which implies it is a basis in Rn since it has n elements. Assume otherwise. Then you have a1A^T C1 + ... + anA^T Cn = 0 for a1 ... an not all zero. Can you manipulate this to get a contradiction?

ok suppose that
[tex]a_{1}A^T C_{1} + ... + a_{n}A^T C_{n} = 0[/tex]
where not all a_i are zero

we know that A is not a zero matrix because rank A = n <= m
so A transpose is not zero either
then the columns must all be zero
But again A is not zero. Thus by contradiction we get that all the a_i must be zero

Is that good?
 
nocturnal said:
This would be true if A were a nxn matrix, but the question specifies that A is a mxn matrix.
Oops, I overlooked that.
 
stunner5000pt said:
ok suppose that
[tex]a_{1}A^T C_{1} + ... + a_{n}A^T C_{n} = 0[/tex]
where not all a_i are zero

we know that A is not a zero matrix because rank A = n <= m
so A transpose is not zero either
then the columns must all be zero
But again A is not zero. Thus by contradiction we get that all the a_i must be zero

Is that good?
No. Why is the underlined stuff true? Try this:

{A^T C1, ..., A^T Cn} is a basis
iff {A^T C1, ..., A^T Cn} is linearly independent
iff c1 A^T C1 + ... + cn A^T Cn = 0 implies c1 = ... = cn = 0
iff A^T (c1 C1 + ... + cn Cn) = 0 implies c1 = ... = cn = 0
iff A^T (c1 C1 + ... + cn Cn) = 0 implies c1 C1 + ... + cn Cn = 0 [since the C_i are linearly independent because rank[A] = n, so [itex]\sum c_iC_i = 0 \Leftrightarrow \forall i,\, c_i=0[/itex]]
iff A^T X = 0 implies X = 0

But rank[A^T] = rank[A] = n. Is there some theorem which says that, given this, that last line must be true?
 
A^T has all independent columns since it has rank n. Therefore, A^T X = 0 can only be true if X is 0. Otherwise you would have a linear combination of the columns of A^T, with coefficients not all zero, yielding 0, which we know cannot happen.
 
A^T has all independent columns since it has rank n
You mean rows.


Anyways, I think everyone's overlooking something important!

[tex]a_{1}A^T C_{1} + ... + a_{n}A^T C_{n} = 0[/tex]

This means that the vector [itex]a_1 C_1 + \cdots + a_n C_n[/itex] is an element of the null space of [itex]A^T[/itex]. People seem to be arguing that the null space is trivial, and therefore the vector must be zero.

But that's wrong!

[itex]A^T[/itex] is a rank-n n×m matrix. Since it's operating on an m-dimensional space, it must have a null space of dimension (m - n).


You have to use the fact that these m-long column vectors are special -- each of the [itex]C_i[/itex] is the transpose of one of the rows of [itex]A^T[/itex].


In fact, if we place the [itex]A^T C_i[/itex] into a matrix, we get:

[tex] [ A^T C_1 \mid A^T C_2 \mid \cdots \mid A^T C_n ] = A^T A[/tex]

There's actually a 1-line proof that this (square) matrix is nonsingular, but I think the approach you guys are using ought to work, as long as you start using the fact that the [itex]C_i[/itex] are special m-long vectors, and not arbitrary m-long vectors.
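Hurkyl's identity can be checked column by column on a toy matrix (hypothetical values of my own): stacking the vectors A^T C_i as columns reproduces A^T A exactly.

```python
# Check [A^T C_1 | A^T C_2] == A^T A on a toy 3x2 matrix.
A = [[1, 2],
     [0, 1],
     [1, 0]]

def transpose(M):
    return [list(row) for row in zip(*M)]

def matvec(M, v):
    return [sum(M[i][k] * v[k] for k in range(len(v))) for i in range(len(M))]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

At = transpose(A)                                 # rows of A^T = columns C_i of A
stacked = transpose([matvec(At, c) for c in At])  # place each A^T C_i as a column
assert stacked == matmul(At, A)                   # the identity A^T A
print(stacked)                                    # [[2, 2], [2, 5]]
```

The result is symmetric, as A^T A always is, which is a quick sanity check on the computation.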
 
Hurkyl said:
You mean rows.

:redface: Oh yeah... I was picturing it wrong.
 
Hurkyl said:
There's actually a 1-line proof that this (square) matrix is nonsingular
When m=n, this is easy, and that's what I did in my first post because I assumed we were dealing with square matrices. If m is not n, what is the 1-line proof?
 
[tex]A^T A \vec{v} = 0 \implies \vec{v}^T A^T A \vec{v} = 0 \implies (A\vec{v})^T (A \vec{v}) = 0 \implies A \vec{v} = 0 \implies \vec{v} = 0[/tex]
where the last implication follows because A has rank n and is operating on R^n.
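The middle step is the norm identity [itex]\vec{v}^T A^T A \vec{v} = (A\vec{v})^T (A\vec{v})[/itex], the squared length of Av. A quick numeric spot-check on a toy matrix and vector (my own values):

```python
# Spot-check v^T (A^T A) v == (Av)^T (Av) = ||Av||^2.
A = [[1, 0],
     [0, 1],
     [1, 1]]
v = [3, -2]                # arbitrary test vector in R^2

def transpose(M):
    return [list(row) for row in zip(*M)]

def matvec(M, u):
    return [sum(M[i][k] * u[k] for k in range(len(u))) for i in range(len(M))]

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

Av = matvec(A, v)                    # A v, a vector in R^3
AtAv = matvec(transpose(A), Av)      # A^T A v, back in R^2
assert dot(v, AtAv) == dot(Av, Av)   # both equal ||Av||^2
print(dot(Av, Av))                   # 14
```

Over the reals a sum of squares vanishes only when every term does, which is exactly why (Av)^T (Av) = 0 forces Av = 0 in the chain above.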
 
Very nice, thanks Hurkyl.
 
