Linear independence question

In summary, the conversation discusses the relationship between the linear independence of the columns of a matrix ##A## and the rows of its transpose ##A^T##: if ##A## has linearly independent columns, then ##A^T## has linearly independent rows. The conversation then turns to the properties of ##A^TA##, concluding that the rank of ##A## equals the rank of ##A^TA##. This holds even when ##A## is not square and not symmetric. The conversation ends with a clarification of the property and a thank you for the help.
  • #1
Karnage1993
Say I have a matrix ##A## that has linearly independent columns. Then clearly ##A^T## has lin. indep. rows. So what can we say about ##A^TA##? Specifically, is there anything we can say about the rows/columns of ##A^TA##? I'm thinking there has to be some sort of relation but I don't know what that is (if there is indeed any).
 
  • #2
We have [tex]\textrm{rank}(A) = \textrm{rank}(A^T A)[/tex] So the number of linearly independent columns/rows of ##A## is the same as the number of linearly independent columns/rows of ##A^T A##.
 
  • #3
Isn't ##\textrm{rank}(A) = \textrm{rank}(A^TA)## only true if ##A## is symmetric? Also, I forgot to mention that ##A## is not necessarily a square matrix. Let ##A## be an ##n \times k## matrix. Does your conclusion still hold under these conditions?
 
  • #4
Yes, it is true in general. Indeed, by rank-nullity it suffices to show that the nullity of ##A## equals the nullity of ##A^T A##.

Take ##Ax = 0##; then obviously ##A^T A x = 0##.
Conversely, if ##A^T A x = 0##, then ##x^T A^T A x = 0##, that is, ##\|Ax\|^2 = 0##. Thus ##Ax = 0##.

So the nullspace of ##A## equals the nullspace of ##A^T A##.
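The rank identity above is easy to check numerically. Here is a minimal sketch using NumPy; the random ##5 \times 3## matrix is an illustrative choice, not one from the thread.

```python
import numpy as np

# For a non-square A, rank(A) should equal rank(A^T A).
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))  # a generic 5 x 3 matrix, full column rank

rank_A = np.linalg.matrix_rank(A)
rank_AtA = np.linalg.matrix_rank(A.T @ A)  # A^T A is 3 x 3 and symmetric
print(rank_A, rank_AtA)
```

Note that ##A^T A## is always square (here ##3 \times 3##) even when ##A## is not, which is why the equality of ranks is useful in least-squares problems.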
 
  • #5
I was under the impression that ##\textrm{rank}(A) = \textrm{rank}(A^TA)## is only true if ##A## is symmetric, but you are right, and Wikipedia confirms it: it is indeed true for any ##A##. I must have misread it somewhere. Thanks for the help!
 

Related to Linear independence question

1. What is the concept of linear independence?

The concept of linear independence refers to the relationship between a set of vectors in a vector space. It means that none of the vectors in the set can be expressed as a linear combination of the others. In other words, the vectors are not redundant and each vector contributes a unique component to the space.

2. How do you determine if a set of vectors is linearly independent?

To determine if a set of vectors is linearly independent, you can apply the definition directly: if no vector can be written as a linear combination of the others, the set is linearly independent. Alternatively, for ##n## vectors in an ##n##-dimensional space, you can use the determinant method: form the square matrix with the vectors as columns and compute its determinant. If the determinant is nonzero, the vectors are linearly independent. For a non-square matrix of column vectors, check instead whether the matrix has full column rank.
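Both checks can be sketched in a few lines of NumPy. The vectors below are a made-up example, chosen so the answer is easy to see by hand.

```python
import numpy as np

# Three vectors in R^3, stacked as the columns of a matrix.
V = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])

# Square case: a nonzero determinant implies linear independence.
det_V = np.linalg.det(V)  # upper triangular, so det = 1*1*1 = 1

# General case (works for non-square matrices too): full column rank.
full_rank = np.linalg.matrix_rank(V) == V.shape[1]
print(det_V, full_rank)
```

In floating point, comparing the determinant against exactly zero is fragile for near-singular matrices; the rank test with a tolerance is usually the more robust check.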

3. What is the difference between linear independence and linear dependence?

The difference between linear independence and linear dependence lies in the relationships between vectors in a set. Linear independence means that no vector in a set can be expressed as a linear combination of the others, while linear dependence means that at least one vector in a set can be written as a linear combination of the others.
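A dependent set is easy to construct and detect; here is a small illustrative example (the vectors are hypothetical, not from the thread).

```python
import numpy as np

u = np.array([1.0, 2.0])
v = np.array([3.0, 4.0])
w = 2 * u + v                   # w is, by construction, a combination of u and v

M = np.column_stack([u, v, w])  # 2 x 3 matrix with u, v, w as columns
rank_M = np.linalg.matrix_rank(M)
print(rank_M)                   # rank 2 < 3 columns, so {u, v, w} is dependent
```

More generally, any set of more than ##n## vectors in an ##n##-dimensional space is automatically linearly dependent.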

4. How is linear independence useful in linear algebra?

Linear independence is a fundamental concept in linear algebra and has many applications. It allows us to determine the dimension of a vector space, solve systems of linear equations, and find a basis for a vector space. It also helps us understand the behavior of linear transformations and solve problems in fields like physics, engineering, and computer science.

5. Can a set of vectors be both linearly independent and linearly dependent?

No, a set of vectors cannot be both linearly independent and linearly dependent. The two concepts are mutually exclusive by definition: if at least one vector in the set can be expressed as a linear combination of the others, the set is linearly dependent; if no vector can be, it is linearly independent. Every set of vectors falls into exactly one of these two cases.
