
Linear Algebra Proof involving Linear Independence

  • #1
RJLiberator
Gold Member

Homework Statement


Prove that if [itex]\{A_1, A_2, \dots, A_k\}[/itex] is a linearly independent subset of [itex]M_{n\times n}(F)[/itex], then [itex]\{A_1^T, A_2^T, \dots, A_k^T\}[/itex] is also linearly independent.

Homework Equations




The Attempt at a Solution



Have: [itex]a_1A_1^T+a_2A_2^T+...+a_kA_k^T=0[/itex] implies [itex]a_1A_1+a_2A_2+...+a_kA_k=0[/itex]

So [itex]a_1 = a_2 = \dots = a_k = 0[/itex]

^^ This was the answer in the back of the book, but I'm not sure what it means.

I guess I have to assume that the T means transpose here. Is it safe to assume that since it's linearly independent, the transpose is also linearly independent?
 

Answers and Replies

  • #2
nuuskur
##A_i## denotes an ##n \times n## matrix, as I understand it, and the set in question is a subset of ##M_{n\times n}(F)##.

Only the trivial linear combination of the matrices ##A_i## produces the zero matrix.

EDIT: I think that was too much information. In general, what can you say about the sums of the corresponding entries, given that the initial set is linearly independent?
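One way to read that hint (a gloss added for illustration, not nuuskur's wording): since ##(A^T)_{pq} = A_{qp}##, we have $$\Big(\sum_{i=1}^k a_i A_i^T\Big)_{pq} = \sum_{i=1}^k a_i (A_i)_{qp},$$ so a linear combination of the transposes is the zero matrix exactly when the same combination of the original matrices is, entry by entry.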
 
  • #3

RJLiberator said:
Have: [itex]a_1A_1^T+a_2A_2^T+...+a_kA_k^T=0[/itex] implies [itex]a_1A_1+a_2A_2+...+a_kA_k=0[/itex]

So [itex]a_1 = a_2 = \dots = a_k = 0[/itex]
There is a subtlety in the definition of linear independence that escapes many students in linear algebra. Given any set of vectors ##\{v_1, v_2, \dots, v_n\}##, the equation ##c_1v_1 + c_2v_2 + \dots + c_nv_n = 0## always has ##c_1 = c_2 = \dots = c_n = 0## as a solution. The difference between the vectors being linearly independent versus linearly dependent is whether the solution for the constants ##c_i## is unique. For a set of linearly independent vectors, ##c_1 = c_2 = \dots = c_n = 0## is the only solution (often called the trivial solution). For a set of linearly dependent vectors, there are other, nontrivial solutions as well.

Here's an example. Consider the vectors ##v_1 = \langle 1, 0\rangle##, ##v_2 = \langle 0, 1\rangle##, ##v_3 = \langle 1, 1\rangle##. The equation ##c_1v_1 + c_2v_2 + c_3v_3 = 0## is obviously true when ##c_1 = c_2 = c_3 = 0##. That alone isn't enough for us to conclude that the three vectors are linearly independent. With a bit of work we can see that ##c_1 = 1, c_2 = 1, c_3 = -1## is another solution. In fact, this is only one of an infinite number of alternative solutions, so we conclude that the three vectors here are linearly dependent.
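A quick numerical check of this example (an illustrative Python/NumPy sketch, not part of the original post):

[code]
import numpy as np

v1 = np.array([1, 0])
v2 = np.array([0, 1])
v3 = np.array([1, 1])

# The trivial choice of coefficients always gives the zero vector...
print(0 * v1 + 0 * v2 + 0 * v3)      # [0 0]

# ...but so does c1 = 1, c2 = 1, c3 = -1, so the set is linearly dependent.
print(1 * v1 + 1 * v2 + (-1) * v3)   # [0 0]
[/code]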

What I've written about vectors here applies to any member of a vector space, including the matrices of the problem posted in this thread.
RJLiberator said:
^^ This was the answer in the back of the book, but I'm not sure what it means.

I guess I have to assume that the T means transpose here. Is it safe to assume that since it's linearly independent, the transpose is also linearly independent?
Yes, T means transpose. No, you can't assume that because the set of vectors (matrices in this case) is linearly independent, the set of transposes is also linearly independent. You have to show that this is the case.
 
  • #4
nuuskur
You gave 3 vectors in your example in a two-dimensional space. Any three vectors in a two-dimensional space are linearly dependent, so at least one of them is a linear combination of the others.

The objective in the problem is to use the fact that the set of matrices is linearly independent, which means that the only linear combination producing the zero matrix is the trivial one.

Multiplying a matrix by a scalar, however, means each individual entry of the matrix is multiplied by that scalar.
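For instance, here is a small check (an illustrative Python/NumPy sketch, not from the thread) that transposition passes scalar multiples and sums straight through, which is the fact the proof leans on:

[code]
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[0, 1], [5, 2]])
a, b = 2, -3

# The transpose is linear: (a*A + b*B)^T equals a*A^T + b*B^T, entry by entry.
print(np.array_equal((a * A + b * B).T, a * A.T + b * B.T))  # True
[/code]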
 
  • #5
Fredrik
Staff Emeritus
Science Advisor
Gold Member
When you're asked to prove that a set ##\{v_1,\dots,v_n\}## is linearly independent, you should almost always start the proof with "Let ##a_1,\dots,a_n## be numbers such that ##\sum_{i=1}^n a_i v_i=0##."

This is the straightforward way to begin because the definition of "linearly independent" tells you that now it's sufficient to prove that ##a_i=0## for all ##i\in\{1,\dots,n\}##. Use the equality ##\sum_{i=1}^n a_iv_i=0## and the assumptions that were included in the problem statement.

So in your case, you start by saying this: Let ##a_1,\dots,a_k\in\mathbb F## be such that ##\sum_{i=1}^k a_i (A_i)^T=0##.

Then you use the assumptions to prove that this equality implies that ##a_i=0## for all ##i\in\{1,\dots,k\}##.
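Spelled out (a sketch following this template and using that the transpose is linear, not Fredrik's exact wording): if ##\sum_{i=1}^k a_i (A_i)^T = 0##, then ##\big(\sum_{i=1}^k a_i A_i\big)^T = 0##; transposing both sides gives ##\sum_{i=1}^k a_i A_i = 0##, and the assumed linear independence of ##\{A_1,\dots,A_k\}## then forces ##a_1 = \dots = a_k = 0##.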

RJLiberator said:
Is it safe to assume that since it's linearly independent, the transpose is also linearly independent?
I don't know what you mean exactly, but you can't assume anything that wasn't included as an assumption in the problem statement. If you mean that it's safe to assume that since ##\{A_1,\dots,A_k\}## is linearly independent, ##\{(A_1)^T,\dots,(A_k)^T\}## is too, then the answer is an extra strong "no", because then you have made the statement that you want to prove into one of your assumptions.
 
  • #6
nuuskur said:
You gave 3 vectors in your example in a two-dimensional space. Any three vectors in a two-dimensional space are linearly dependent, so at least one of them is a linear combination of the others.
I did this on purpose, to provide a simple example of a set of linearly dependent vectors. To show that this set was linearly dependent, I used only the definition of linear dependence. Of course you could use other concepts to show that there are too many vectors in my set to form a basis, which makes the set linearly dependent, but my point was that many beginning students of Linear Algebra don't get the fine point that distinguishes linear independence from linear dependence; namely, the business about the equation having only the trivial solution.
nuuskur said:
The objective in the problem is to use the fact that the set of matrices is linearly independent, which means that the only linear combination producing the zero matrix is the trivial one.

Multiplying a matrix by a scalar, however, means each individual entry of the matrix is multiplied by that scalar.
 
  • #7
Ray Vickson
Science Advisor
Homework Helper
Dearly Missed

RJLiberator said:
Prove that if [itex]\{A_1, A_2, \dots, A_k\}[/itex] is a linearly independent subset of [itex]M_{n\times n}(F)[/itex], then [itex]\{A_1^T, A_2^T, \dots, A_k^T\}[/itex] is also linearly independent.

Is it safe to assume that since it's linearly independent, the transpose is also linearly independent?
Are you sure you have copied the question correctly? As stated, it is essentially trivial. A more important---and not nearly as easy---version would be: if the columns of an ##n \times n## matrix are linearly independent, then the rows are linearly independent as well. (Your version of the problem is that if a bunch of ##n \times n## matrices are linearly independent, then so are their transposes. That seems a pointless exercise to me!)
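As a quick illustration of that harder version (a hypothetical Python/NumPy sketch, not from the thread): a square matrix has linearly independent columns exactly when it has full rank, and the rank is unchanged by transposition, so its rows are then independent as well.

[code]
import numpy as np

A = np.array([[2, 1, 0],
              [0, 1, 3],
              [1, 0, 1]])

# Full rank (rank == n) means the columns are linearly independent;
# rank(A) == rank(A.T), so the rows are linearly independent too.
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(A.T))  # 3 3
[/code]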
 
  • #8
RJLiberator
Gold Member
Yeah, the question is pretty trivial after reading the responses here. I suppose that's why I was a bit mixed up on it. I felt I didn't have enough for the answer.

But after reading the discussion here and adding a few things, I feel confident with this.

Thanks, and a shout out to Fredrik for the extreme clarity.
 
