How to verify that the nullspace is orthogonal to the row space?

In summary, the conversation discusses verifying that the nullspace of B is orthogonal to its row space. The first part finds the nullspace of B and expresses the special solution as a multiple of the free variable. The second part finds the left nullspace, i.e. the nullspace of B^T. The third part finds the row space of B and uses it to verify that the nullspace of B is orthogonal to the row space. The dimensions and ranks of the spaces are also mentioned.
  • #1
DryRun

Homework Statement


How to verify that the nullspace is orthogonal to the row space of B?
I have inserted the screen-shot of the problem below:
[URL]http://i29.fastpic.ru/big/2011/0918/10/ca341692cc37b831143f5fe32351db10.jpg[/URL]

Homework Equations


Nullspace and orthogonality.

The Attempt at a Solution


I have self-studied that entire chapter on orthogonality with much difficulty and confusion, but I have done the question to the best of my current abilities. Can someone please check my answers and help with the third part of the question? Please let me know if I have presented my work/answers in the correct/best way. The lecturer is very strict about even slight deviations from the perfect answer(s).

So, for part (i):
I set up an augmented matrix for Bx = 0 and then reduced it to reduced row echelon form. The pivot variables are x1, x2 and x3.
The free variable is x4.

The answer for part (i) is x4(1, -1, 1, 1)^T
or should it be just (1, -1, 1, 1)^T? Which would be the special solution? But since it's asking for the nullspace, I suppose I should give the special solution as a multiple of the free variable?
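As a quick sanity check (not part of the original post; the entries of B are taken from the reply further down in the thread), one can multiply B by the candidate special solution in pure Python:

```python
# Multiply B by the candidate special solution and confirm Bx = 0.
# B is taken from the reply later in the thread.
B = [
    [ 1, 0, -2, 1],
    [ 1, 2, -2, 3],
    [-2, 1,  3, 0],
]
x = [1, -1, 1, 1]  # candidate nullspace vector from part (i)

Bx = [sum(row[j] * x[j] for j in range(4)) for row in B]
print(Bx)  # [0, 0, 0] -> x is in the nullspace of B
```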

For part (ii):
From what I understand, the left nullspace (quite a revelation to me) is just another name for the nullspace of B^T.
So, I wrote the matrix B^T by writing the rows as columns.
Again, I wrote it as an augmented matrix, with B^T x = 0.
Now, the reduced matrix I obtained is a bit unusual, so I'm not sure about it, but it has to be an upper triangular matrix, right?
Here it is below. Did I do the right thing for the last row?

1 1 -2
0 2 1
0 0 -1
0 0 0

The pivot variables are x1, x2 and x3
The free variable is x4.

The answer is (not sure if it's correct?)

x4(0, 0, 0, 1)^T

For part (iii):

If I'm right, then,
The row space of B = column space of B^T

So, I write the transpose matrix of B. Then I proceed to find the general solution of B^T x = 0 in terms of the free variables.

I write the augmented matrix and then reduce it, and this is what I get:

1 1 -2 0
0 2 1 0
0 0 -1 0
0 0 0 0
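The echelon form above can be reproduced with a small elimination routine (a pure-Python sketch, not from the thread, using exact Fraction arithmetic; the input is B^T augmented with the zero column):

```python
from fractions import Fraction

def forward_eliminate(M):
    """Reduce M to row echelon form by forward elimination."""
    M = [[Fraction(v) for v in row] for row in M]
    rows, cols = len(M), len(M[0])
    r = 0
    for c in range(cols):
        # Find a row at or below r with a nonzero entry in column c.
        piv = next((i for i in range(r, rows) if M[i][c] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(r + 1, rows):
            f = M[i][c] / M[r][c]
            M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return M

# B^T (rows of B written as columns) augmented with the zero column.
BTa = [[ 1,  1, -2, 0],
       [ 0,  2,  1, 0],
       [-2, -2,  3, 0],
       [ 1,  3,  0, 0]]
for row in forward_eliminate(BTa):
    print([int(v) for v in row])
```

The printed rows match the matrix above. Note this is row echelon form (the pivots 1, 2, -1 are not yet scaled to 1), not fully reduced row echelon form.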

So, the pivot variables are x1, x2 and x3
The free variable is x4
The pivots are 1, 2 and -1

To verify, I write:

The row space of B = column space of B^T is a subspace of R^m (where m = 3)?
I think m denotes the number of rows containing pivots, right? And n would then denote the number of pivot columns? I'm sorry for asking so many questions, but I am doing self-study and this chapter is quite difficult to grasp.

Since the nullspace of B and the row space of B are both subspaces of R^3, therefore the nullspace of B is orthogonal to the row space of B. (Verified? Is the wording right?)

There is something mentioned about dimensions/ranks, but I don't know if that's the way, or an alternative way, of verifying this last part of the question?
 
  • #2
So for (i): you can see the matrix rows are linearly independent, and hopefully this was obvious in your row reduction. This means the dimension of the row space is 3, so we expect the dimension of the nullspace to be 1.

Check the vector you found, we see
[tex] \begin{pmatrix}
1 & 0 & -2 & 1 \\
1 & 2 & -2 & 3 \\
-2 & 1 & 3 & 0 \\
\end{pmatrix}
\begin{pmatrix}
1 \\
-1 \\
1 \\
1 \\
\end{pmatrix}
=
\begin{pmatrix}
0 \\
0 \\
0 \\
0 \\
\end{pmatrix}
[/tex]

So the nullspace is spanned by (1, -1, 1, 1)^T. Also note that this check has in fact answered (iii) as well: performing the matrix multiplication is essentially taking the dot product of (1, -1, 1, 1)^T with each row vector of the matrix.

so for part ii) as you mention it is actually solving the problem
[tex] x^TB = 0^T[/tex]

taking the transpose gives
[tex] B^Tx = 0[/tex]

so it looks like you're heading in the right direction
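For what it's worth, the transpose step can also be checked numerically: the row vector x^T B is entrywise equal to the (transposed) column vector B^T x, so the two equations have exactly the same solutions. A pure-Python sketch with an arbitrary test vector:

```python
B = [[1, 0, -2, 1], [1, 2, -2, 3], [-2, 1, 3, 0]]
BT = [list(col) for col in zip(*B)]  # transpose of B

y = [3, -1, 2]  # arbitrary test vector in R^3
yTB = [sum(y[i] * B[i][j] for i in range(3)) for j in range(4)]
BTy = [sum(BT[j][i] * y[i] for i in range(3)) for j in range(4)]
print(yTB == BTy)  # True: x^T B = 0^T and B^T x = 0 are the same system
```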
 

1. What is the nullspace and row space?

The nullspace and row space are both vector spaces that can be formed from a matrix. The nullspace contains all vectors that, when multiplied by the matrix, result in the zero vector. The row space contains all linear combinations of the rows of the matrix.

2. Why is it important for the nullspace to be orthogonal to the row space?

If the nullspace and row space are orthogonal, every vector in the nullspace is perpendicular to every vector in the row space. This matters because it splits R^n cleanly: any solution of Ax = b decomposes into a particular solution from the row space plus an arbitrary nullspace component, which is what makes the structure of the solution set easy to describe.

3. How can I verify that the nullspace is orthogonal to the row space?

To verify that the nullspace is orthogonal to the row space, use the dot product. By linearity it suffices to check basis vectors: take each basis vector of the nullspace and each row of the matrix (the rows span the row space) and compute their dot product. If every result is zero, the two spaces are orthogonal.
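A minimal sketch of that check (the names here are illustrative, not from the thread):

```python
def is_orthogonal(u, v):
    """Return True when the dot product of u and v is zero."""
    return sum(a * b for a, b in zip(u, v)) == 0

# Example: one row of the B discussed above and its nullspace vector.
row = [1, 2, -2, 3]
null_vec = [1, -1, 1, 1]
print(is_orthogonal(row, null_vec))  # True
```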

4. What if the nullspace and row space are not orthogonal?

In fact, for any matrix the nullspace and row space are always orthogonal: each entry of Bx is the dot product of a row of B with x, so Bx = 0 forces x to be perpendicular to every row, and hence to every linear combination of the rows. If a dot product comes out nonzero, it signals an arithmetic mistake in computing the nullspace or the row space, not a genuine failure of orthogonality.

5. How does the concept of orthogonal nullspace and row space relate to linear independence?

The two ideas are closely related. Nonzero orthogonal vectors are automatically linearly independent, so a basis of the row space together with a basis of the nullspace forms a linearly independent set; since their dimensions add up to n (rank plus nullity), the combined set is in fact a basis of R^n.
