Proof concerning the Four Fundamental Spaces

In summary, the conversation discusses the orthogonality of the row space to the null space and of the column space to the left null space. It references G. Strang's book, which contains a helpful picture illustrating the orthogonality, and works through two matrix proofs: one for the orthogonality of the row space to the null space, and one for the column space to the left null space. The thread also touches on the definition of the left null space and on how useful it is to visualize matrices when thinking about this concept.
  • #1
Peter_Newman
Hello all,

I am currently working on the four fundamental spaces of a matrix. I have a question about the orthogonality of the
  1. row space to the null space
  2. column space to the left null space
------------------------------------------------

In G. Strang's book there is a nice picture of the matrices that clarifies the orthogonality. This is what I mean:

[Figure: FundamentalSpaces.png]

With the help of these expressions, this is then argued:

$$Ax = \begin{pmatrix}\text{row 1} \\ \vdots \\ \text{row m}\end{pmatrix}\begin{pmatrix} \vdots \\ x \\ \vdots \end{pmatrix} = 0$$
$$A^Ty = \begin{pmatrix}\text{transposed column 1 of A} \\ \vdots \\ \text{transposed column n of A}\end{pmatrix}\begin{pmatrix} \vdots \\ y \\ \vdots \end{pmatrix} = 0$$
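
Not from the book, but both statements are easy to sanity-check numerically. A minimal sketch with NumPy/SciPy (the rank-deficient example matrix is an arbitrary choice):

```python
import numpy as np
from scipy.linalg import null_space

# Any rectangular matrix works; this 3x4 example is rank-deficient on purpose
# so that both null spaces are nontrivial.
A = np.array([[1., 2., 3., 4.],
              [2., 4., 6., 8.],
              [1., 0., 1., 0.]])

N  = null_space(A)      # columns span N(A),   the null space
LN = null_space(A.T)    # columns span N(A^T), the left null space

# Every row of A is orthogonal to every null-space vector ...
print(np.allclose(A @ N, 0))      # True
# ... and every column of A is orthogonal to every left-null-space vector.
print(np.allclose(A.T @ LN, 0))   # True
```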

------------------------------------------------

There is also a matrix proof, and this is what I am very interested in. You can read my thoughts about it in the explanations below.

I have shown the whole thing in the picture above; using it, I try to carry out the matrix proof.

1. Proof of the orthogonality of the row space to the null space:
  • The row space vectors are combinations ##A^Ty## of the rows, so the dot product of ##A^Ty## with any ##x## in the null space must be zero:

$$\langle\,A^Ty,x\rangle = x^T(A^Ty) = (Ax)^Ty = 0^Ty = 0$$

2. Proof of the orthogonality of the column space to the left null space:
  • The column space vectors are combinations ##Ax## of the columns, so the dot product of ##Ax## with any ##y## in the left null space must be zero:

$$\langle\, Ax, y\rangle = (Ax)^Ty = x^T(A^Ty) = x^T0 = 0$$
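
Both chains rest on the identity ##(Ax)^Ty = x^T(A^Ty)##, valid for all ##x## and ##y##. As a numerical sanity check (not from the book; the rank-2 matrix is an arbitrary choice so that both null spaces are nontrivial):

```python
import numpy as np
from scipy.linalg import null_space

rng = np.random.default_rng(0)
# A rank-2 4x4 matrix, so that both N(A) and N(A^T) are nontrivial.
A = rng.standard_normal((4, 2)) @ rng.standard_normal((2, 4))

# The identity both chains rest on: (Ax)^T y = x^T (A^T y) for ALL x and y.
x, y = rng.standard_normal(4), rng.standard_normal(4)
print(np.isclose((A @ x) @ y, x @ (A.T @ y)))   # True

# Proof 1: for x in N(A), <A^T y, x> = 0 whatever y is.
xn = null_space(A)[:, 0]
print(np.isclose((A.T @ y) @ xn, 0.0))          # True

# Proof 2: for y in N(A^T), <A x, y> = 0 whatever x is.
yn = null_space(A.T)[:, 0]
print(np.isclose((A @ x) @ yn, 0.0))            # True
```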

I think this is correct so far, but I am not quite sure. Therefore I would be very happy about a confirmation or a counter-argument :)

Note 1: To be honest, I have also asked this question HERE. I did not receive an answer there, but I want to be upfront about it. If someone here gives an answer, I will refer to it there. I hope that's OK, because I can no longer delete my question in the other forum, but I expect more adequate help here (at least that was always the case in the past :) )...

Note 2: This is not a homework assignment. I'm just reading the book in parallel and am interested in this question.
 
  • #2
I think the easiest way to see that the null space is orthogonal to the row space is to just notice that if ##Ax = 0##, then the first entry of ##Ax## is the dot product of ##x## with the first row of ##A##, and it must be zero. Similarly for the other rows of ##A##.
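
A minimal NumPy sketch of that observation (the numbers are arbitrary):

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [4., 5., 6.]])
x = np.array([1., 1., -1.])

# Entry i of A @ x is exactly the dot product of row i of A with x,
# so A @ x = 0 forces x to be orthogonal to every row.
for i in range(A.shape[0]):
    print(np.isclose((A @ x)[i], A[i] @ x))   # True, True
```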
 
  • #3
Hello and thank you for your answer. Yes, it is similar to what is in Strang's book (see the matrix part between the dashed lines in my first post). ##Ax = 0## is almost by definition where this comes from. But how do you argue for the left null space and the column space?

I am especially interested in the proof with the help of the dot product.
 
  • #4
What is your definition of left null space?
 
  • #5
It is ##y \in N(A^T)##, and ##Ax## is in the column space of the matrix ##A##. The left null space is defined by ##A^Ty = 0##.
 
  • #6
The picture looks like a graphic formulation of the fact that every short exact sequence in ##\operatorname{Vec}## splits. In other words, we can find a basis of the kernel (or the image) and extend it to a basis of the entire space. Matrices require a choice of bases anyway, so why not choose one that makes the matrices especially easy?
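
In symbols: the short exact sequence
$$0 \longrightarrow N(A) \longrightarrow \mathbb{R}^n \xrightarrow{\;A\;} C(A) \longrightarrow 0$$
splits, giving ##\mathbb{R}^n = N(A) \oplus C(A^T)## and, analogously, ##\mathbb{R}^m = C(A) \oplus N(A^T)##. With bases adapted to these decompositions, the matrix of ##A## takes the especially simple block form
$$\begin{pmatrix} I_r & 0 \\ 0 & 0 \end{pmatrix}, \qquad r = \operatorname{rank} A.$$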
 
  • #7
Peter, ##A^Ty = 0## looks a lot like the ##Ax=0## formula we used to show the null space is orthogonal to the row space. What are the rows of ##A^T##?
 
  • #8
@Office_Shredder, the rows of ##A^T## are the columns of ##A##, I guess. With this you probably want to point out that this behaves almost like ##Ax = 0## (which we know gives orthogonality), only "rotated"?

@fresh_42, the picture more or less summarizes the "action" of a matrix. This is in my opinion a nice way to visualize it (Strang's idea :) )
-----
What I want to point out:

This ##Ax = \begin{pmatrix}\text{row 1} \\ \vdots \\ \text{row m}\end{pmatrix}\begin{pmatrix} \vdots \\ x \\ \vdots \end{pmatrix} = 0## is clear to me. I understand that the dot product being zero means they are orthogonal.

This ##A^Ty = \begin{pmatrix}\text{transposed column 1 of A} \\ \vdots \\ \text{transposed column n of A}\end{pmatrix}\begin{pmatrix} \vdots \\ y \\ \vdots \end{pmatrix} = 0## is also understandable.

BUT:

I am interested in whether this is right:
Let ##y \in N(A^T)## and let ##Ax## be an arbitrary vector in the column space of the matrix ##A##. I want to prove that every such ##y## is orthogonal to ##Ax##. Then ##\langle\, Ax, y\rangle = (Ax)^Ty = x^T(A^Ty) = x^T0 = 0##
 
  • #9
I think the proof at the end of your post is correct.
 
  • #10
@Office_Shredder, thanks a lot!

I have one last question:

I have seen another proof (not mine, but also interesting :) ) that derives this fact starting from the statement ##A^Ty = 0##. I will quote the steps used to derive the orthogonality ##y^T Ax = 0##, to show what is unclear to me...

Step 1: ##A^Ty = 0##
Step 2: ##A^Ty = 0|x## multiply equation from left
Step 3: ##x^TA^Ty = 0|^T## transpose equation
Step 4: ##y^TAx = 0##

In this proof the step from 2 to 3 is unclear to me. What does "multiply equation from left" mean? Personally I like my proof better, but this one says the same thing as mine, just with a different derivation.
 
  • #11
I think it's saying to multiply both sides on the left by ##x^T##; then obviously ##x^T 0 = 0##, so they don't even write it out on the right-hand side, I think.
 
  • Like
Likes Peter_Newman
  • #12
Thanks for your answer! I also see it that way; to be precise, step 2 should then read as follows:
Step 2: ##A^Ty = 0\quad|\cdot x^T##, multiply by ##x^T## from the left, (##x^TA^Ty = x^T0 = 0##)
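
Written out without the ##|## shorthand, the whole chain is:
$$A^Ty = 0 \;\Longrightarrow\; x^TA^Ty = x^T0 = 0 \;\Longrightarrow\; y^TAx = \left(x^TA^Ty\right)^T = 0^T = 0.$$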
 
  • #13
Peter_Newman said:
Thanks for your answer! I also see it that way; to be precise, step 2 should then read as follows:
Step 2: ##A^Ty = 0\quad|\cdot x^T##, multiply by ##x^T## from the left, (##x^TA^Ty = x^T0 = 0##)
If you are so picky, then you must write: ##A^Ty = 0\quad| x^T\cdot ##
 
  • Haha
Likes Peter_Newman
  • #14
If you like conceptual linear algebra you might think about this formulation:

If A:V-->W is a linear map from one vector space V to another W, and f:W-->k is a linear "functional" on W, i.e. a linear map to the scalars, then their composition fA = A*(f):V-->k is a linear functional on V. Thus a linear map A from V to W yields by composition a linear map A* in the other direction, i.e. from functionals on W to functionals on V.

The obvious subspaces associated to A are its kernel in V and its image in W. Then we can also ask for the kernel and image of A*. As to the kernel, i.e. what functionals f map to zero under A*, we ask when is fA = 0? This happens when f vanishes on the image of A, so the kernel of A* is orthogonal to the image of A.

As to the image of A*, we ask what functionals g on V have the form g = fA for some f on W? I.e. which functionals on V factor through A? This happens exactly when g vanishes on the kernel of A, (note at least if Ax=0, then certainly gx = fAx = 0). Thus the image of A* is orthogonal to the kernel of A.

The connection with your version is that the matrix of A* is the transpose of the matrix of A. Note e.g. that the ith row of A defines the scalar-valued map f_i A on V, which equals the composition of A followed by the scalar-valued map f_i on W, where f_i is projection onto the ith "axis" in W defined by the given basis of W. Thus the rows of A span the image of A*, just as the columns of A span the image of A.

This may seem lengthy when written out, but it is all summarized in the diagram fA:V-->W-->k, preferably drawn as a triangle with sides A:V-->W, f:W-->k, fA:V-->k. Then you just look at it and ask yourself, given A, for which f is fA = 0, and which maps V-->k can occur as fA for some f? The answers almost force themselves on you.
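
In coordinates this is easy to test; a minimal sketch, identifying functionals with vectors via the dot product (the matrix is an arbitrary example):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 4))   # a linear map A : R^4 -> R^3 in coordinates

# Identify a functional f : R^3 -> R with a vector f_vec via f(w) = f_vec @ w.
f_vec = rng.standard_normal(3)
v = rng.standard_normal(4)

# The pullback A*(f) = f o A is then represented by A.T @ f_vec, i.e. the
# matrix of A* really is the transpose of the matrix of A:
print(np.isclose(f_vec @ (A @ v), (A.T @ f_vec) @ v))   # True
```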
 
  • #15
Thanks for the good and helpful answers here. I think my question is now well answered!

@mathwonk, thank you for this different perspective on things!
 

1. What are the four fundamental spaces?

The four fundamental spaces refer to the column space, null space, row space, and left null space of a matrix. These spaces are important in linear algebra and are used to describe the properties and relationships of matrices.
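
For concreteness, all four spaces can be computed numerically. A minimal sketch with SciPy (the rank-1 example matrix is arbitrary):

```python
import numpy as np
from scipy.linalg import orth, null_space

A = np.array([[1., 2., 3.],
              [2., 4., 6.]])        # rank-1 example, m = 2, n = 3

col   = orth(A)                     # column space    C(A)    in R^m
row   = orth(A.T)                   # row space       C(A^T)  in R^n
null  = null_space(A)               # null space      N(A)    in R^n
lnull = null_space(A.T)             # left null space N(A^T)  in R^m

# Dimensions are r, r, n - r, m - r with r = rank(A) = 1 here.
print([B.shape[1] for B in (col, row, null, lnull)])   # [1, 1, 2, 1]
```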

2. What is the importance of understanding the four fundamental spaces?

Understanding the four fundamental spaces allows us to gain insights into the properties of matrices and their applications in various fields such as physics, engineering, and computer science. It also helps in solving systems of linear equations and finding solutions to optimization problems.

3. How are the four fundamental spaces related to each other?

The row space and the null space are orthogonal complements of each other, as are the column space and the left null space. The row space and the column space are related through the rank: both have dimension equal to the rank of the matrix.

4. How can one prove the properties of the four fundamental spaces?

The properties of the four fundamental spaces can be proven using various mathematical techniques, such as vector operations, matrix transformations, and linear algebra concepts. These proofs involve showing that the properties hold true for all possible matrices.

5. What are the real-world applications of the four fundamental spaces?

The four fundamental spaces have various real-world applications, such as in image and signal processing, machine learning, data compression, and network analysis. They are also used in solving problems related to linear transformations, eigenvalues and eigenvectors, and differential equations.
