Linear Algebra Proof involving Linear Independence

In summary, the problem asks us to prove that if a linearly independent subset of matrices is given, then the set of their transposes is also linearly independent; that is, the only linear combination of the transposes that produces the zero matrix is the one in which every coefficient is zero. The T in the problem denotes transpose, and the conclusion cannot simply be assumed from the hypothesis. The key is to start from an arbitrary linear combination of the transposed matrices that equals zero, then use the linearity of the transpose together with the given independence to show that all the coefficients must be zero.
  • #1
RJLiberator
Gold Member

Homework Statement


Prove that if [itex]\{A_1, A_2, \dots, A_k\}[/itex] is a linearly independent subset of [itex]M_{n\times n}(F)[/itex], then [itex]\{A_1^T, A_2^T, \dots, A_k^T\}[/itex] is also linearly independent.

Homework Equations

The Attempt at a Solution



Have: [itex]a_1A_1^T+a_2A_2^T+...+a_kA_k^T=0[/itex] implies [itex]a_1A_1+a_2A_2+...+a_kA_k=0[/itex]

So [itex]a_1=a_2=\dots=a_k=0[/itex]

^^ This was the answer in the back of the book, but I'm not sure what it means.

I guess I have to assume that the T means transpose here. Is it safe to assume that since the set is linearly independent, the set of transposes is also linearly independent?
 
  • #2
##A_i## denotes an n x n matrix, as I understand it, and the set in question is a subset of ##Mat_n (F)##.

Only the trivial linear combination of the matrices ##A_i## produces the 0 matrix.

EDIT: I think that was too much information. In general, what can you say about the individual sums of the elements with respective indices given that the initial system is linearly independent?
 
  • #3
RJLiberator said:

Homework Statement


Prove that if [itex]\{A_1, A_2, \dots, A_k\}[/itex] is a linearly independent subset of [itex]M_{n\times n}(F)[/itex], then [itex]\{A_1^T, A_2^T, \dots, A_k^T\}[/itex] is also linearly independent.

Homework Equations

The Attempt at a Solution



Have: [itex]a_1A_1^T+a_2A_2^T+...+a_kA_k^T=0[/itex] implies [itex]a_1A_1+a_2A_2+...+a_kA_k=0[/itex]

So [itex]a_1=a_2=\dots=a_k=0[/itex]
There is a subtlety in the definition of linear independence that escapes many students in linear algebra. Given any set of vectors ##\{v_1, v_2, \dots, v_n\}##, the equation ##c_1v_1 + c_2v_2 + \dots + c_nv_n = 0## always has ##c_1 = c_2 = \dots = c_n = 0## as a solution. The difference between the vectors being linearly independent versus linearly dependent is whether the solution for the constants ##c_i## is unique. For a set of linearly independent vectors, ##c_1 = c_2 = \dots = c_n = 0## is the only solution (often called the trivial solution). For a set of linearly dependent vectors, there will also be an infinite number of other solutions.

Here's an example. Consider the vectors ##v_1 = \langle 1, 0\rangle, v_2 = \langle 0, 1\rangle, v_3 = \langle 1, 1\rangle##. The equation ##c_1v_1 + c_2v_2 + c_3v_3 = 0## is obviously true when ##c_1 = c_2 = c_3 = 0##. That alone isn't enough for us to conclude that the three vectors are linearly independent. With a bit of work we can see that ##c_1 = 1, c_2 = 1, c_3 = -1## is another solution. In fact, this is only one of an infinite number of alternative solutions, so we conclude that the three vectors here are linearly dependent.
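This example is easy to verify numerically. A minimal sketch in plain Python (the vector and coefficient names are just the ones from the example above):

```python
# Check the nontrivial solution from the example: with c1 = 1, c2 = 1,
# c3 = -1, the combination c1*v1 + c2*v2 + c3*v3 collapses to the zero
# vector, so {v1, v2, v3} is linearly dependent.
v1, v2, v3 = (1, 0), (0, 1), (1, 1)
c1, c2, c3 = 1, 1, -1

combo = tuple(c1 * a + c2 * b + c3 * c for a, b, c in zip(v1, v2, v3))
print(combo)  # (0, 0) -- a nontrivial solution, hence linear dependence
```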

What I've written about vectors here applies to any member of a vector space, including the matrices of the problem posted in this thread.
RJLiberator said:
^^ This was the answer in the back of the book, but I'm not sure what it means.

I guess I have to assume that the T means transpose here. Is it safe to assume that since the set is linearly independent, the set of transposes is also linearly independent?
Yes, T means transpose. No, you can't assume that since the set of vectors (matrices in this case) is linearly independent, then the set of transposes is also linearly independent. You have to show that this is the case.
 
  • #4
You gave 3 vectors in your example in a two-dimensional space. Among any three vectors in a two-dimensional space, at least one is always a linear combination of the others.

The objective in the problem is to use the fact that the system of matrices is linearly independent. It means that the only linear combination producing the 0 matrix is the trivial one.

Multiplying a matrix by a scalar, however, means each individual entry of the matrix is multiplied by that same scalar.
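The hint above comes down to the fact that scalar multiplication and transposition both act entrywise, so transposition is linear. A small sketch in plain Python (the 2x2 matrices and scalars here are made up for illustration):

```python
# Illustrate that transposition is linear on 2x2 matrices (lists of rows):
# a*A^T + b*B^T equals (a*A + b*B)^T, because both scalar multiplication
# and transposition act entry by entry.

def transpose(M):
    return [list(row) for row in zip(*M)]

def scale(c, M):
    return [[c * x for x in row] for row in M]

def add(M, N):
    return [[x + y for x, y in zip(r, s)] for r, s in zip(M, N)]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
a, b = 2, -3

lhs = add(scale(a, transpose(A)), scale(b, transpose(B)))
rhs = transpose(add(scale(a, A), scale(b, B)))
print(lhs == rhs)  # True
```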
 
  • #5
When you're asked to prove that a set ##\{v_1,\dots,v_n\}## is linearly independent, you should almost always start the proof with "Let ##a_1,\dots,a_n## be numbers such that ##\sum_{i=1}^n a_i v_i=0##."

This is the straightforward way to begin because the definition of "linearly independent" tells you that now it's sufficient to prove that ##a_i=0## for all ##i\in\{1,\dots,n\}##. Use the equality ##\sum_{i=1}^n a_iv_i=0## and the assumptions that were included in the problem statement.

So in your case, you start by saying this: Let ##a_1,\dots,a_k\in\mathbb F## be such that ##\sum_{i=1}^k a_i (A_i)^T=0##.

Then you use the assumptions to prove that this equality implies that ##a_i=0## for all ##i\in\{1,\dots,k\}##.
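Putting these steps together, the whole argument fits in a few lines. A sketch, using only the linearity of the transpose and the given independence of the ##A_i##:

```latex
% Let a_1,\dots,a_k \in F satisfy \sum_i a_i (A_i)^T = 0.
% Transposition is linear, so
\sum_{i=1}^k a_i (A_i)^T = \Big(\sum_{i=1}^k a_i A_i\Big)^T = 0.
% A matrix whose transpose is 0 is itself 0 (transpose again), hence
\sum_{i=1}^k a_i A_i = 0.
% Since \{A_1,\dots,A_k\} is linearly independent, it follows that
a_1 = a_2 = \dots = a_k = 0,
% which is exactly the linear independence of \{(A_1)^T,\dots,(A_k)^T\}.
```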

RJLiberator said:
Is it safe to assume that since the set is linearly independent, the set of transposes is also linearly independent?
I don't know what you mean exactly, but you can't assume anything that wasn't included as an assumption in the problem statement. If you mean that it's safe to assume that since ##\{A_1,\dots,A_k\}## is linearly independent, ##\{(A_1)^T,\dots,(A_k)^T\}## is too, then the answer is an extra strong "no", because you have made the statement that you want to prove one of your assumptions.
 
  • #6
nuuskur said:
You gave 3 vectors in your example in a two dimensional space. There is always one vector that is a linear combination of the two others, provided the set of vectors span the space.
I did this on purpose, to provide a simple example of a set of linearly dependent vectors. To show that this set was linearly dependent, I used only the definition of linear dependence. Of course you could use other concepts to show that there are too many vectors in my set to form a basis, which makes the set linearly dependent, but my point was that many beginning students of Linear Algebra don't get the fine point that distinguishes linear independence from linear dependence; namely, the business about the equation having only the trivial solution.
nuuskur said:
The objective in the problem is to use the fact that the system of matrices is linearly independent. It means that the linear combination to produce a 0 matrix is trivial.

Multiplying a matrix with a scalar, however, means each individual element of the matrix is multiplied with the same scalar.
 
  • #7
RJLiberator said:

Homework Statement


Prove that if [itex]\{A_1, A_2, \dots, A_k\}[/itex] is a linearly independent subset of [itex]M_{n\times n}(F)[/itex], then [itex]\{A_1^T, A_2^T, \dots, A_k^T\}[/itex] is also linearly independent.

Homework Equations

The Attempt at a Solution



Have: [itex]a_1A_1^T+a_2A_2^T+...+a_kA_k^T=0[/itex] implies [itex]a_1A_1+a_2A_2+...+a_kA_k=0[/itex]

So [itex]a_1=a_2=\dots=a_k=0[/itex]

^^ This was the answer in the back of the book, but I'm not sure what it means.

I guess I have to assume that the T means transpose here. Is it safe to assume that since the set is linearly independent, the set of transposes is also linearly independent?

Are you sure you have copied the question correctly? As stated, it is essentially trivial. A more important---and not nearly as easy---version would be: if the columns of an ##n \times n## matrix are linearly independent, then the rows are linearly independent as well. (Your version of the problem is that if a bunch of ##n \times n## matrices are linearly independent, then so are their transposes. That seems a pointless exercise to me!)
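The harder variant mentioned here can be checked numerically: a square matrix has linearly independent columns exactly when it has full rank, and rank(A) = rank(A^T), so the rows are then independent as well. A minimal sketch in plain Python, using exact rational arithmetic (the example matrix is made up):

```python
from fractions import Fraction

def rank(M):
    """Rank of a matrix (list of rows) via Gaussian elimination over Q."""
    M = [[Fraction(x) for x in row] for row in M]
    rows, cols, r = len(M), len(M[0]), 0
    for c in range(cols):
        # Find a pivot in column c at or below row r.
        pivot = next((i for i in range(r, rows) if M[i][c] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        # Eliminate the entries below the pivot.
        for i in range(r + 1, rows):
            f = M[i][c] / M[r][c]
            M[i] = [x - f * y for x, y in zip(M[i], M[r])]
        r += 1
    return r

A = [[1, 2], [3, 5]]            # columns are linearly independent (rank 2)
At = [list(col) for col in zip(*A)]
print(rank(A), rank(At))        # 2 2 -- the rows of A are independent too
```

Of course this only checks one matrix; the general statement needs the rank argument itself.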
 
  • #8
Yeah, the question is pretty trivial after reading the responses here. I suppose that's why I was a bit mixed up on it. I felt I didn't have enough for the answer.

But after reading the discussion here and adding a few things, I feel confident with this.

Thanks, and a shout out to Fredrik for the extreme clarity.
 

1. What is linear independence?

Linear independence is a property that describes a set of vectors in a vector space. It means that no vector in the set can be written as a linear combination of the other vectors in the set.

2. How do you prove linear independence?

To prove linear independence, you must show that the only solution to the equation ##a_1v_1 + a_2v_2 + \dots + a_nv_n = 0## is ##a_1 = a_2 = \dots = a_n = 0##, where ##v_1, v_2, \dots, v_n## are the vectors in the set and ##a_1, a_2, \dots, a_n## are scalars.
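The definition can be seen in action on the standard basis of the plane. A sketch in plain Python (brute force over a small grid of integer coefficients is only illustrative, not a proof, which requires the algebraic argument):

```python
# For v1 = (1, 0) and v2 = (0, 1): c1*v1 + c2*v2 = (c1, c2), so the
# equation (c1, c2) = (0, 0) forces c1 = 0 and c2 = 0.  A brute-force
# search over small integer coefficients finds only the trivial solution.
from itertools import product

v1, v2 = (1, 0), (0, 1)
solutions = [(c1, c2)
             for c1, c2 in product(range(-2, 3), repeat=2)
             if all(c1 * a + c2 * b == 0 for a, b in zip(v1, v2))]
print(solutions)  # [(0, 0)] -- only the trivial solution
```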

3. What is the significance of linear independence?

Linear independence is important because it allows us to form a basis for a vector space. A basis is a set of linearly independent vectors that spans the entire vector space, meaning that any vector in the space can be written as a linear combination of the basis vectors.

4. How does linear independence relate to linear transformations?

An injective (one-to-one) linear transformation preserves linear independence: if a set of vectors is linearly independent, their images under an injective linear transformation are also linearly independent. A non-injective transformation can destroy independence; for example, the zero map sends every vector to 0. Transposition is an injective linear map, which is why it preserves the independence of the matrices in this problem.

5. Can a set of more than n vectors be linearly independent in an n-dimensional space?

No, a set of more than n vectors cannot be linearly independent in an n-dimensional space. This is because in an n-dimensional space, there are only n linearly independent directions, and any additional vectors can be written as linear combinations of the existing ones.
