Null space of A = null space of A'A?

  • #1
Hihi

I've been working on this problem for some time: if A is an (m x n) matrix, and A' denotes its transpose, then the null space of A is equal to the null space of A'A. Is this always true? I thought of a proof for the special case where A is given in reduced row echelon form, but I fail to see how the general case can be proved or disproved. My proof runs like this:

1) Suppose A is in reduced row echelon form. Then every column of A either i) contains a single 1, with all other entries 0, or ii) has its nonzero entries (which need not be 1) confined to its first i rows, where i is less than the column number.

2) Now take A' and multiply it by A. Then we have i) every row of A is contained in A'A, and ii) every row of A'A is a linear combination of rows from A.

3) Thus if A'A(X) = 0, then A(X) = 0. Similarly, if A(X) = 0, then A'A(X) = 0. Hence the null spaces of A and of A'A are the same.

There is also the weaker general result: the null space of A is a subset of the null space of A'A, because if A(X) = 0, then A'A(X) = A'(0) = 0.

I think the solution to the problem lies in the column space of A, but I'm not sure exactly how. I'll be really grateful to anyone who can share their insight on this problem. Thanks. :)
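Before hunting for a proof, it's worth checking the claim numerically. Below is a minimal sketch (my own, not from the thread) using numpy: a hypothetical rank-deficient 4 x 3 matrix whose third column is the sum of the first two, so the null space is nontrivial, and an SVD-based null-space basis to compare null(A) with null(A'A).

```python
import numpy as np

# Hypothetical rank-deficient matrix: third column = first + second column,
# so the null space is one-dimensional.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 9.0],
              [7.0, 8.0, 15.0],
              [1.0, 0.0, 1.0]])

AtA = A.T @ A

def null_basis(M, tol=1e-10):
    """Columns spanning the null space of M, via the SVD:
    right singular vectors belonging to (near-)zero singular values."""
    _, s, Vt = np.linalg.svd(M)
    rank = int(np.sum(s > tol * s.max()))
    return Vt[rank:].T

NA = null_basis(A)      # basis of null(A)
NAtA = null_basis(AtA)  # basis of null(A'A)

# Same dimension, and each basis lies inside the other null space.
print(NA.shape[1] == NAtA.shape[1])
print(np.allclose(A @ NAtA, 0))    # null(A'A) contained in null(A)
print(np.allclose(AtA @ NA, 0))    # null(A) contained in null(A'A)
```

All three checks come out true for this example, which is consistent with the claim (though of course not a proof).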
 
  • #2
OK, it seems that I've answered the question myself, but I'm still not quite satisfied with the solution.

My proof that null space of A = null space of A'A runs roughly as follows:

1) Suppose we have reduced A to its reduced row echelon form. Call this reduced row echelon form matrix E. Then, every row of A can be expressed as a linear combination of the rows of E.

2) Since the rows of A'A are in fact linear combinations of the rows of A, they are also linear combinations of the rows of E. Further, since the rows of E are independent, they form a basis for the row space of A'A. Thus the reduced row echelon form of A'A is equal to E. Hence if A'A(X) = 0, it must necessarily follow that A(X) = 0.

I sort of think that's the way to prove the result, yet it doesn't seem quite elegant.

I was hoping to demonstrate the result that the null space of A'A is a subset of the null space of A by writing the equations for A'A(X) = 0, equating each row to zero, and performing row operations on the equations to obtain the result AX = 0. it's pretty difficult for me.
 
  • #3
Well, there is a simpler way. What you are trying to show (the second direction of the subset symbol) is the same as showing that A^T never maps Ax to 0 when Ax is nonzero.

Use the fact that
[tex](\mathrm{col}\,A)^\perp = \mathrm{nul}(A^T)[/tex]


Edit: I see a problem with your proof.
2) Since the rows of A'A are in fact linear combinations of the rows of A, they are also linear combinations of the rows of E. Further, since the rows of E are independent, they form a basis for the row space of A'A. Thus the reduced row echelon form of A'A is equal to E. Hence if A'A(X) = 0, it must necessarily follow that A(X) = 0.
The rows of E are not necessarily independent--for example what if A is square but not invertible? Then E will have a zero row. Anyway, to be a basis of the row space of A^TA the rows of E have to be more than independent; they have to span a space of the same dimension as the row space of A^TA, which you haven't shown.
 
  • #4
Further, since the rows of E are independent, they form a basis for the row space of A'A.
0rthodontist already pointed out that the rows of E don't have to be independent.

There's another problem -- the rows of E might not even be in the row space of A'A. (At least, you haven't yet given a reason to think they would be)

Your argument is sufficient to show that the rowspace of A'A is a subspace of the row space of A, but you certainly have not shown that they are equal.


Now... if A'Ax = 0, then what is x'A'Ax? Does that give you any ideas?
 
  • #5
hi

Thanks for taking the time to look at my question.

I'll just try to fill in the gaps of my original proof and correct the errors:

1) E is not necessarily independent. Yes, it was a misstatement - I actually meant to say that the non-zero rows of E are independent, unless every row of E is zero.

2) Hence the rows of A'A are linear combinations of the non-zero rows of E. I didn't find the dimension of A'A, because every row of A'A is already a linear combination of the independent vectors in the rows of E, so that would seem to imply that the dimension of the row space of A'A is equal to the dimension of the row space of E.

3) In sum I tried to deduce from 1) and 2) that the row space of A'A is equal to the row space of E; this would mean that the null space of A'A and of A are also the same, or that the reduced row echelon form of A'A is equal to the reduced row echelon form of A.

Also, I showed that the null space of A is a subspace of the null space of A'A in my first post; my second post was done in a bit of a rush. ;)

As to Hurkyl's suggestion, many thanks.
I don't know if I interpreted your hint correctly...but does it imply that the
elements of the original matrix A and the solution space of X are all real numbers?

thanks once again
 
  • #6
2) Hence the rows of A'A are linear combinations of the non-zero rows of E. I didn't find the dimension of A'A, because every row of A'A is already a linear combination of the independent vectors in the rows of E, so that would seem to imply that the dimension of the row space of A'A is equal to the dimension of the row space of E.
Not necessarily. The rows of the zero matrix are linear combinations of the rows in the identity matrix. The only thing you can say from this is that dim row(A^TA) is less than or equal to dim row E.

Using [tex](\mathrm{col}\,A)^\perp = \mathrm{nul}(A^T)[/tex], what can you say about whether A^T maps a nonzero vector in col A to 0?
 
  • #7
hi...been quite a while since I last posted.

To 0rthodontist...hope I got your hint right. As I've only just started studying orthogonality, I'm not totally familiar with the concept. Here's how I reasoned:

1) Since null space of A' = space of vectors orthogonal to column space
of A, AX must be orthogonal to every column of A, and every vector formed by linearly combining the columns of A.

2) But AX lies in the column space of A, i.e. it is a linear combination of the columns of A.

3) Hence AX must be orthogonal to itself, which is impossible unless AX = 0.

4) Thus X must lie in the null space of A.

QED?
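For the record, steps 1) to 4) can be written compactly (assuming real entries throughout):

[tex]A^TAx = 0 \;\Rightarrow\; Ax \in \mathrm{nul}(A^T) = (\mathrm{col}\,A)^\perp[/tex]

But [tex]Ax \in \mathrm{col}\,A[/tex], so Ax is orthogonal to itself:

[tex]\|Ax\|^2 = (Ax)^T(Ax) = x^TA^TAx = 0 \;\Rightarrow\; Ax = 0[/tex]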

Anyhow, I looked through the 'proof' I presented in the second post and realized my mistake. Is it the (unwarranted) assumption that all of the non-zero rows of E will occur at least once in A'A? If so, I hope the following steps will correct it:

1) If every non-zero row of E occurs at least once in the rows of A'A, then it means that the row spaces of E and of A'A are spanned by the same set
of independent vectors, and have therefore a common basis.

2) Suppose A is a (m x n) matrix.

3) Call the non-zero rows of E {E1, E2, ..., Ei}, in order of their row number.

4) Let the column numbers of columns of E in which a single 1 occurs as an element be k1, k2, ...ki, where k1 < k2 < ... < ki.

5) Represent the element of A in the ith row and jth column as a(i)(j).

6) Then the coefficient of E1 in the first row of A'A is equal to

a(1)(1)a(1)(k1) + a(2)(1)a(2)(k1) + ... + a(m)(1)a(m)(k1)

In general, the coefficient of E1 in the rth row of A'A is given by

a(1)(r)a(1)(k1) + a(2)(r)a(2)(k1) + ... + a(m)(r)a(m)(k1).

Consider the (k1)th row of A'A:

The coefficient of E1 is then given by

C1 = [a(1)(k1)]^2 + [a(2)(k1)]^2 + ... + [a(m)(k1)]^2

a(1)(k1), a(2)(k1), ... a(m)(k1) cannot be all zero, for otherwise all the elements of the (k1)th column of E would be zero, which contradicts (4).

Thus a sufficient condition for E1 to occur at least once in the rows of A'A is that, if k1 be the column number of the first '1' entry in the reduced row echelon form matrix of A, a(1)(k1), a(2)(k1), ... a(m)(k1) are all real.

7) Repeating this argument for E2, E3, ... Ei, we conclude that a sufficient condition for the null space of A to be the same as that of A'A is that, given that the leading '1' entries in the reduced row echelon form of A are in the columns k1, k2, ...ki, where k1 < k2 < k3 < ... < ki, a(1)(kr), a(2)(kr), ... a(m)(kr), where 1=< r =< i, are all real entries.

thanks for helping so far...hope there are no errors this time...
 
  • #8
1) Since null space of A' = space of vectors orthogonal to column space
of A, AX must be orthogonal to every column of A, and every vector formed by linearly combining the columns of A.

2) But AX lies in the column space of A, i.e. it is a linear combination of the columns of A.

3) Hence AX must be orthogonal to itself, which is impossible unless AX = 0.

4) Thus X must lie in the null space of A.

QED?
Yes, that's correct. I think you should explicitly state in the first step the assumption that A^TAx = 0.

I'm not sure about your other proof. What do you mean by "If every non-zero row of E occurs at least once in the rows of A'A"?
 
  • #9
0rthodontist said:
Yes, that's correct. I think you should explicitly state in the first step the assumption that A^TAx = 0.

I'm not sure about your other proof. What do you mean by "If every non-zero row of E occurs at least once in the rows of A'A"?

i meant it like this:
Since every row of A can be expressed as a linear combination of the non-zero rows of E, and every row of A'A can be expressed as a linear combination of the rows of A, every row of A'A can also be expressed as a linear combination of the non-zero rows of E. By a non-zero row of E, say Ei, 'occurring' at least once in the rows of A'A I mean there is at least one row of A'A where the coefficient of Ei is not zero, after expressing that row as a linear combination of all the non-zero rows of E.
 
  • #10
Oh, I forgot to follow up my hint. :frown:

x'A'Ax = (Ax)'(Ax)
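That identity is the whole proof in one line. A small numerical illustration (my own sketch, with an arbitrary random matrix and vector) of the identity and its consequence:

```python
import numpy as np

# Arbitrary (hypothetical) real matrix and vector.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))
x = rng.standard_normal(3)

lhs = x @ (A.T @ A) @ x      # x'A'Ax
rhs = np.dot(A @ x, A @ x)   # (Ax)'(Ax) = squared norm of Ax
print(np.isclose(lhs, rhs))

# Consequence: if A'Ax = 0, then the squared norm of Ax is x'(A'Ax) = 0,
# which over the reals forces Ax = 0 -- exactly the inclusion
# null(A'A) contained in null(A).
```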
 

1. What is the null space of a matrix?

The null space of a matrix, also known as the kernel, is the set of all vectors that when multiplied by the matrix result in the zero vector. In other words, it is the set of all possible solutions to the homogeneous equation Ax = 0, where A is the matrix and x is a vector.

2. What is the relationship between the null space of A and the null space of A'A?

For real matrices, the null space of A is equal to the null space of A'A. One inclusion is immediate: if Ax = 0, then A'Ax = A'(0) = 0. For the other, if A'Ax = 0, then x'A'Ax = (Ax)'(Ax) = 0, so Ax has zero norm and must be zero. Thus neither null space contains vectors that the other lacks.

3. How are the dimensions of the null space of A and the null space of A'A related?

By the rank-nullity theorem, the dimension of the null space of A is n - rank(A), where n is the number of columns of A. Since rank(A'A) = rank(A), the null spaces of A and of A'A have the same dimension; indeed, they are the same subspace.

4. What does it mean if the null space of A is equal to the null space of A'A?

If the null space of A is equal to the null space of A'A, it means that every vector in the null space of A is also in the null space of A'A, and vice versa. For real matrices this always holds. In the special case where A has full column rank (its columns are linearly independent), both null spaces contain only the zero vector.
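A quick numerical sketch of the full-column-rank case (a hypothetical tall matrix with independent columns):

```python
import numpy as np

# Hypothetical 3x2 matrix with linearly independent columns.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

# rank(A) = rank(A'A) = number of columns, so by rank-nullity both
# null spaces have dimension 0, i.e. they contain only the zero vector.
print(np.linalg.matrix_rank(A))
print(np.linalg.matrix_rank(A.T @ A))
```

Both ranks come out to 2, the number of columns, confirming trivial null spaces.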

5. How can the null space of A = null space of A'A be used in practical applications?

This relationship can be used to simplify calculations and reduce the computational complexity in various applications. For example, in linear regression, the least squares solution can be found by solving the equation A'A x = A'b, where b is the response vector. Knowing that the null space of A'A is equal to the null space of A allows for a more efficient solution to this equation.
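The least-squares connection can be illustrated with a small sketch (my own, using a hypothetical random problem): solving the normal equations A'A x = A'b directly gives the same answer as numpy's built-in least-squares routine when A has full column rank.

```python
import numpy as np

# Hypothetical overdetermined least-squares problem: fit b by A x.
rng = np.random.default_rng(1)
A = rng.standard_normal((20, 3))
b = rng.standard_normal(20)

# Solve the normal equations A'A x = A'b directly
# (A'A is invertible here since A has full column rank)...
x_normal = np.linalg.solve(A.T @ A, A.T @ b)

# ...and compare with the library's least-squares routine.
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)

print(np.allclose(x_normal, x_lstsq))
```

The two solutions agree; in practice `lstsq` (based on the SVD or QR) is preferred numerically, but the normal-equations route shows why rank(A'A) = rank(A) matters.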
