Are Non-Singular Column Vectors of X Linearly Independent?

  • Thread starter IniquiTrance
  • Start date
In summary: if X is square, then det(X^TX) = det(X)^2, so X^TX is non-singular exactly when det(X) ≠ 0, i.e. when the columns of X are linearly independent. If X is not square but is real, a QR decomposition reduces the problem to the square case. Alternatively, by contraposition: if the columns of X are linearly dependent, there is a non-zero vector y with Xy = 0, hence X^TXy = 0, so X^TX is singular. Combined with the converse, the columns of X are linearly independent if and only if X^TX is non-singular.
  • #1
IniquiTrance
Is it true to say that if [itex]X^T X[/itex] is non-singular, then the column vectors of X must be linearly independent? I know how to prove that if the columns of X are linearly independent, then [itex]X^T X[/itex] is non-singular. Just not sure about the other way around. Thanks!
 
  • #2
Is X a square matrix? If so, use
[tex]\det(X^TX) = \det(X^T)\det(X) = \det(X)^2,[/tex]
which is non-zero exactly when [itex]\det(X) \neq 0[/itex], i.e. when the columns of X are linearly independent.

If X is not square, but is real, then a QR decomposition should reduce the problem to that of square matrices (something simpler may suffice, but this is the simplest approach I can think of right now).
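
As a quick numerical sanity check of the determinant identity, here is a minimal NumPy sketch with a made-up 3×3 matrix whose third column is the sum of the first two:

[code]
import numpy as np

# Made-up square matrix with dependent columns: third column = first + second.
X = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 9.0],
              [7.0, 8.0, 15.0]])

# det(X^T X) = det(X)^2, so the two determinants vanish together.
print(np.linalg.det(X))        # ~0, since the columns are dependent
print(np.linalg.det(X.T @ X))  # ~0 as well
[/code]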
 
  • #3
Prove it by contradiction. If the columns of X are linearly dependent, there is a non-zero vector ##y## such that ##Xy = 0##, and hence ##X^TXy = 0##, so ##X^TX## is singular.
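
To see this concretely, here is a minimal NumPy sketch with a made-up 3×2 matrix whose second column is twice the first:

[code]
import numpy as np

# Made-up matrix whose second column is twice the first: columns are dependent.
X = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])

# y encodes the dependence 2*(column 1) - (column 2) = 0, so Xy = 0.
y = np.array([2.0, -1.0])

print(X @ y)        # the zero vector
print(X.T @ X @ y)  # also zero, so X^T X cannot be invertible
[/code]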
 
  • #4
Thank you. I suspected it was true, but couldn't prove it to myself.
 
  • #5
AlephZero said:
Prove it by contradiction. If the columns of X are linearly dependent, there is a non-zero vector ##y## such that ##Xy = 0##, and hence ##X^TXy = 0##, so ##X^TX## is singular.

That's a very nice proof, especially because it establishes necessary and sufficient conditions. The OP said he could already prove the converse, and anyone who knows this proof should have no trouble with the remaining direction either.

Just one remark: instead of "contradiction", maybe you should have used "contraposition".
 
  • #6
Dickfore said:
Just one remark, instead of "contradiction", maybe you should have used "contraposition".

Well, I used to know what "contrapositive" meant when I was a student, but these days I find understanding the concepts is more useful than remembering their names.
 

What is the definition of "non-singularity" in the context of A^T*A?

In linear algebra, a square matrix is called "non-singular" if it is invertible, which is equivalent to the equation Ax = b having a unique solution for every b, and to the determinant being non-zero. Since A^T*A is always square, non-singularity of A^T*A means det(A^T*A) ≠ 0.
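
A minimal NumPy sketch of this test (the matrix here is just a made-up example):

[code]
import numpy as np

# Made-up 3x2 matrix A; A^T A is the square 2x2 Gram matrix.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

G = A.T @ A
print(np.linalg.det(G))  # 3.0, non-zero, so A^T A is non-singular
print(np.linalg.inv(G))  # the inverse exists
[/code]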

Why is the non-singularity of A^T*A important?

The non-singularity of A^T*A is important because it is equivalent to A having full column rank, i.e. linearly independent columns. This guarantees that the normal equations A^T*A x = A^T*b have a unique solution.
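
For example, a minimal NumPy sketch of solving the normal equations for a made-up full-column-rank A:

[code]
import numpy as np

# Made-up full-column-rank A and right-hand side b.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

# A^T A is invertible, so the normal equations have exactly one solution.
x = np.linalg.solve(A.T @ A, A.T @ b)
print(x)  # the unique least-squares solution
[/code]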

Can A^T*A be non-singular if A is not?

Not if A is square: in that case det(A^T*A) = det(A)^2 = 0 whenever A is singular, so A^T*A is singular as well. What can happen is that A is not square, and therefore has no inverse at all, yet its columns are linearly independent; in that case A^T*A is non-singular even though A itself is not invertible.
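
A minimal NumPy sketch contrasting the two cases (both matrices are made-up examples):

[code]
import numpy as np

# Tall 3x2 matrix: A has no inverse (it is not square), yet its columns
# are independent, so A^T A is non-singular.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
print(np.linalg.det(A.T @ A))  # 24.0, non-zero

# Square singular B: det(B^T B) = det(B)^2 = 0, so B^T B is singular too.
B = np.array([[1.0, 2.0],
              [2.0, 4.0]])
print(np.linalg.det(B.T @ B))  # ~0
[/code]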

How does the size of A affect the non-singularity of A^T*A?

The size of A does matter. If A is m×n, then A^T*A is n×n, and it is non-singular exactly when A has rank n (full column rank). That is only possible when m ≥ n: a "wide" matrix with more columns than rows always yields a singular A^T*A, while a square or "tall" matrix yields a non-singular A^T*A precisely when its columns are linearly independent.
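
A minimal NumPy sketch of the "wide" case (a made-up 2×3 matrix):

[code]
import numpy as np

# Wide 2x3 matrix: rank(A) <= 2 < 3, so the 3x3 matrix A^T A must be singular.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

G = A.T @ A
print(G.shape)                   # (3, 3)
print(np.linalg.matrix_rank(G))  # 2, less than 3, so G is singular
[/code]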

What are some applications of A^T*A and its non-singularity?

One application is in data analysis, where A is the design matrix of independent variables and A^T*A appears in the normal equations for the regression coefficients. The non-singularity of A^T*A ensures that the least-squares problem is well-posed and has a unique solution. It also appears in optimization problems and in solving systems of equations in engineering and physics.
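
As a toy illustration of the regression use, here is a minimal NumPy sketch fitting a line y = c0 + c1*t to made-up data points:

[code]
import numpy as np

# Made-up data for fitting the line y = c0 + c1*t.
t = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 1.9, 3.2, 3.9])

# Design matrix: a column of ones (intercept) and a column of t values.
A = np.column_stack([np.ones_like(t), t])

# The columns of A are independent, so A^T A is non-singular and the
# normal equations yield the unique least-squares coefficients.
coef = np.linalg.solve(A.T @ A, A.T @ y)
print(coef)  # approximately [1.07, 0.97]
[/code]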
