Linear Algebra - Left Null Space

In summary, the left null space here is a plane in ##\mathbb{R}^5##, and any two linearly independent vectors in that plane will serve as a basis for it.
  • #1
YoshiMoshi

Homework Statement



I am given the following graph and asked to find the left null space.

[Attached image "Capture.png": the given graph]


Homework Equations

The Attempt at a Solution



First I start by transposing A, because I know that the left null space is the null space of the incidence matrix transposed. I then reduce it to reduced row echelon form and use the following relations:
##R = \begin{bmatrix} I & F \\ 0 & 0 \end{bmatrix}##, ##N = \begin{bmatrix} -F & I \end{bmatrix}^T##
This gives me the left null space as
##y_N = \begin{bmatrix} 1 & -1 & 0 & 1 & 0 \end{bmatrix}^T, \begin{bmatrix} -1 & 1 & -1 & 0 & 1 \end{bmatrix}^T##

Below you'll find my work.

[Attached image "IMG_20160303_242503426.jpg": handwritten work]


Apparently I did something wrong, because the given answer is

[Attached image "Capture.png": the answer from the answer key]


I don't see what I did wrong in my computation. I believe my error may be conceptual: I am not interpreting something correctly or am missing a step. Thanks for any help someone can provide.
 
  • #2
A vector in the left nullspace is a vector in ##\mathbb{R}^5##, not ##\mathbb{R}^4##. What you're doing is solving this equation:
##\begin{bmatrix} x_1 & x_2 & x_3 & x_4 & x_5 \end{bmatrix} \begin{bmatrix} -1 & 1 & 0 & 0 \\ -1 & 0 & 1 & 0 \\ 0 & -1 & 1 & 0 \\ 0 & -1 & 0 & 1 \\ 0 & 0 & -1 & 1 \end{bmatrix} = \vec{0}##,
where ##\vec{0}## is the zero vector in ##\mathbb{R}^4##.
That works out to ##x_1## times row 1, ##x_2## times row 2, and so on, up to ##x_5## times row 5.

If you take the transpose of the matrix, the multiplications become column 1 (of the transpose) times ##x_1##, column 2 times ##x_2##, ..., column 5 times ##x_5##. You can row reduce the transposed matrix to find the solutions.
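For example, here is that row reduction done with SymPy (a sketch of mine; the matrix is the one displayed above):

[code=python]
import sympy as sp

# The incidence matrix from the problem: rows = edges, columns = nodes
A = sp.Matrix([[-1, 1, 0, 0], [-1, 0, 1, 0], [0, -1, 1, 0],
               [0, -1, 0, 1], [0, 0, -1, 1]])

# Row reduce the transpose; the pivots land in the columns for x1, x2, x4,
# leaving x3 and x5 as the free variables.
R, pivot_cols = A.T.rref()
print(R)           # rows: [1 0 -1 0 1], [0 1 1 0 -1], [0 0 0 1 1], [0 0 0 0 0]
print(pivot_cols)  # (0, 1, 3)
[/code]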

The last matrix you show agrees with what I got, so I think you have misinterpreted what it means.
Row 1 of your reduced matrix means ##x_1 - x_3 + x_5 = 0##, Row 2 means ##x_2 + x_3 - x_5 = 0##, and Row 3 means ##x_4 + x_5 = 0##.

So,
x1 =  x3 - x5
x2 = -x3 + x5
x3 =  x3
x4 =       -x5
x5 =        x5
If you stare at this a bit, you'll see it's linear combinations of two vectors.
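Spelled out (my own reading of the parametrization above), that is

[tex]
\begin{bmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \\ x_5 \end{bmatrix}
= x_3 \begin{bmatrix} 1 \\ -1 \\ 1 \\ 0 \\ 0 \end{bmatrix}
+ x_5 \begin{bmatrix} -1 \\ 1 \\ 0 \\ -1 \\ 1 \end{bmatrix},
[/tex]

with ##x_3## and ##x_5## free.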

BTW, I get the vector you show as ##y_{N, 1}##, but I get a different vector than what you show for ##y_{N, 2}##. (I don't understand what the subscript notation indicates here.)

Both my vectors work, as do the two that you show as the answer. Geometrically, the left nullspace is a plane in ##\mathbb{R}^5##, so any two linearly independent vectors in that plane will serve as a basis for the left nullspace.
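A quick way to confirm that (a SymPy sketch of mine, using the matrix written out above; the second pair of vectors is the posted answer quoted later in the thread):

[code=python]
import sympy as sp

# Incidence matrix written out above: rows = edges, columns = nodes
A = sp.Matrix([[-1, 1, 0, 0], [-1, 0, 1, 0], [0, -1, 1, 0],
               [0, -1, 0, 1], [0, 0, -1, 1]])

# The basis from the row reduction, and the posted answer's basis
mine   = [sp.Matrix([1, -1, 1, 0, 0]), sp.Matrix([-1, 1, 0, -1, 1])]
answer = [sp.Matrix([1, -1, 1, 0, 0]), sp.Matrix([0, 0, 1, -1, 1])]

for x in mine + answer:
    print(x.T * A)   # the 1x4 zero row vector in every case
[/code]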
 
  • #3
I see. So when I get it down to row echelon form, it is apparent that there are two free columns in the matrix, so there are two vectors that make up the null space of A transpose. So the null space is the span of two independent vectors, i.e. a plane in R^5, and any two linearly independent vectors in it will do.

I never thought of it in this way. In the answer key they state that

The vectors in the left null space may be found by any loop in the element space, where +1 follows the path direction and -1 reverses the path direction.

So graphically, a vector in the left null space is ANY (?) closed loop around the graphical representation of the matrix? Can it really be any closed loop?

Thanks for your help.
 
  • #4
YoshiMoshi said:
I see. So when I get it down to row echelon form, it is apparent that there are two free columns in the matrix, so there are two vectors that make up the null space of A transpose. So the null space is the span of two independent vectors, i.e. a plane in R^5, and any two linearly independent vectors in it will do.

I never thought of it in this way. In the answer key they state that

The vectors in the left null space may be found by any loop in the element space, where +1 follows the path direction and -1 reverses the path direction.

So graphically, a vector in the left null space is ANY (?) closed loop around the graphical representation of the matrix? Can it really be any closed loop?
I'm not following this explanation. I couldn't remember how to get the incidence matrix (too many years since I did anything like that). I just used the basic definition of the nullspace of a matrix. For the left nullspace you're looking for all vectors x such that xA = 0. The work I did used that idea.

How you're describing the vectors in the left nullspace ("by any loop in the element space...") sounds more like how you get the entries in the incidence matrix. I don't see how the vector <1 -1 1 0 0>^T represents some closed loop in the graph.
 
  • #5
YoshiMoshi said:

I am given the following graph and asked to find the left null space. ...

First I start by transposing A, because I know that the left null space is the null space of the incidence matrix transposed. I then reduce it to reduced row echelon form and use the following relations:
##R = \begin{bmatrix} I & F \\ 0 & 0 \end{bmatrix}##, ##N = \begin{bmatrix} -F & I \end{bmatrix}^T##
This gives me the left null space as
##y_N = \begin{bmatrix} 1 & -1 & 0 & 1 & 0 \end{bmatrix}^T, \begin{bmatrix} -1 & 1 & -1 & 0 & 1 \end{bmatrix}^T##

... I don't see what I did wrong in my computation.

Is there some reason why your ##A## has row ↔ arc and column ↔ node? Almost any source I could cite would have it the other way, so your ##A## would be the transpose of the incidence matrix for almost everybody else.
 
  • #6
That's weird; that's what we were told: the rows are the edges and the columns are the nodes. It does make a difference if you do it the other way around. Have I been doing it wrong?

I think for this one, <1 -1 1 0 0>^T: if you start at node 1 and go to node 2 you have <1 X X X X>^T, then go from node 2 to node 3 you have <X X 1 X X>^T, and then go from node 3 to node 1 (the opposite direction in the graph) you have <X -1 X X X>^T, giving <1 -1 1 0 0>^T, which is the upper triangle loop. The other answer they give for the left null space is <0 0 1 -1 1>^T, which is the closed loop around the bottom triangle. I just don't understand why the closed loops around the upper and lower triangles are considered the left null space of the graph.
 
  • #7
YoshiMoshi said:
That's weird; that's what we were told: the rows are the edges and the columns are the nodes. It does make a difference if you do it the other way around. Have I been doing it wrong?

... I just don't understand why the closed loops around the upper and lower triangles are considered the left null space of the graph.

No, it should not make a difference. However, every matrix has a left and a right null space, so you need to be sure about which one you really want. A vector ##x## in the left null space of your ##A## corresponds to taking a linear combination of the columns of ##A^T## (one column per arc) and getting the zero vector, so it is a solution of
[tex] \sum_{\text{arc}\ e} a_{e\,i}\, x_{e} = 0 \quad \text{for every node } i. [/tex]
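As a concrete check of that condition (a sketch of mine, using the incidence matrix from earlier in the thread and the upper-triangle loop vector as an example):

[code=python]
import sympy as sp

# rows = arcs, columns = nodes
A = sp.Matrix([[-1, 1, 0, 0], [-1, 0, 1, 0], [0, -1, 1, 0],
               [0, -1, 0, 1], [0, 0, -1, 1]])
x = sp.Matrix([1, -1, 1, 0, 0])   # the upper-triangle loop vector

# For each node i, sum a[e, i] * x[e] over all arcs e
for i in range(A.cols):
    print(sum(A[e, i] * x[e] for e in range(A.rows)))   # 0 at every node
[/code]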
 
  • #8
OK, so that makes sense, but I don't understand the graphical interpretation of the left null space. Why are the closed loops around the upper and lower triangles of the graphical representation of the system in the left null space? I see that it's like you start at one node and return to that same node, so the net displacement around the graph is zero, and you use a combination of the edges/arcs to get back to where you started. But with this graphical approach, why is that the left null space and not the null space?

Could I have just
start at node 1 and walk along edge 1 in the positive direction to get to node 2 <1 X X X X>^T
start at node 2 and walk along edge 4 in the positive direction to get to node 4 <X X X 1 X>^T
start at node 4 and walk along edge 5 in the negative direction to get to node 3 <X X X X -1>^T
start at node 3 and walk along edge 2 in the negative direction to get to node 1 <X -1 X X X>^T
This allows me to traverse a complete closed loop, ending up where I started, giving me
<1 -1 0 1 -1>?

I saw this vector before when I was storing the elimination matrix and saw the free rows to be
<1 -1 0 1 -1>^T, <1 -1 1 0 0>^T
[Attached image "IMG_20160303_115116480.jpg": handwritten work]

Thanks for your help!
 
  • #11
So in summary for the left null space:

Upper Loop
start at node 1 and walk along edge 1 in the positive direction to get to node 2 <1 X X X X>^T
start at node 2 and walk along edge 3 in the positive direction to get to node 3 <X X 1 X X>^T
start at node 3 and walk along edge 2 in the negative direction to get to node 1 <X -1 X X X>^T
<1 -1 1 0 0>^T
confirmation: http://www.wolframalpha.com/input/?...,-1,0,1},{0,0,-1,1}})*transpose{{1,-1,1,0,0}}

Lower Loop
start at node 4 and walk along edge 5 in the negative direction to get to node 3 <X X X X -1>^T
start at node 3 and walk along edge 3 in the negative direction to get to node 2 <X X -1 X X>^T
start at node 2 and walk along edge 4 in the positive direction to get to node 4 <X X X 1 X>^T
<0 0 -1 1 -1>^T
confirmation: http://www.wolframalpha.com/input/?...-1,0,1},{0,0,-1,1}})*transpose{{0,0,-1,1,-1}}

Outer Loop
start at node 1 and walk along edge 1 in the positive direction to get to node 2 <1 X X X X>^T
start at node 2 and walk along edge 4 in the positive direction to get to node 4 <X X X 1 X>^T
start at node 4 and walk along edge 5 in the negative direction to get to node 3 <X X X X -1>^T
start at node 3 and walk along edge 2 in the negative direction to get to node 1 <X -1 X X X>^T
<1 -1 0 1 -1>^T
confirmation: http://www.wolframalpha.com/input/?...-1,0,1},{0,0,-1,1}})*transpose{{1,-1,0,1,-1}}

So I get three vectors:
A=<1 -1 1 0 0>^T
B=<0 0 -1 1 -1>^T
C=<1 -1 0 1 -1>^T
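A quick cross-check of all three (my own SymPy sketch; it reproduces the WolframAlpha confirmations above):

[code=python]
import sympy as sp

A = sp.Matrix([[-1, 1, 0, 0], [-1, 0, 1, 0], [0, -1, 1, 0],
               [0, -1, 0, 1], [0, 0, -1, 1]])

loops = {"upper": sp.Matrix([1, -1, 1, 0, 0]),
         "lower": sp.Matrix([0, 0, -1, 1, -1]),
         "outer": sp.Matrix([1, -1, 0, 1, -1])}

for name, x in loops.items():
    print(name, x.T * A)   # each product is the 1x4 zero row vector
[/code]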

I'm kind of confused. I get the two vectors they provide plus the one from the outer loop. I don't exactly see why all of these lie in the same plane in R^5.
 
  • #12
YoshiMoshi said:
So I get three vectors:
A=<1 -1 1 0 0>^T
B=<0 0 -1 1 -1>^T
C=<1 -1 0 1 -1>^T

I'm kind of confused. I get the two vectors they provide plus the one from the outer loop. I don't exactly see why all of these lie in the same plane in R^5.
Because they are linearly dependent.

Note that 1 * <1 -1 1 0 0>^T + 1 * <0 0 -1 1 -1>^T + 1 * <1 -1 0 1 -1>^T = <0 0 0 0 0>^T
Here, ##c_1 = c_2 = c_3 = 1##, but any three equal values for the constants would also result in the zero vector.

A set of vectors ##v_1, v_2, \dots, v_n## is linearly dependent if the equation ##c_1v_1 + c_2v_2 + \dots + c_nv_n = 0## has a nontrivial solution; i.e., a solution where at least one of the constants ##c_i## is not zero.

Since no vector of the three above is a multiple of any other, any two of them form a linearly independent set. Any two of the vectors A, B, and C would therefore define a plane in ##\mathbb{R}^5##.
 
  • #13
But can't we just add the components together?
1 * <1 -1 1 0 0>^T + 1 * <0 0 -1 1 -1>^T + 1 * <1 -1 0 1 -1>^T = <(1+0+1), (-1+0-1), (1-1+0), (0+1+1), (0-1-1)>^T = <2, -2, 0, 2, -2>^T?
 
  • #14
My mistake -- I used the vector I found, which is the negative of the third one in your list.

1 * <1 -1 1 0 0>^T + 1 * <0 0 -1 1 -1>^T + (-1) * <1 -1 0 1 -1>^T = <0 0 0 0 0>^T. So here is a solution where the constants (the scalars multiplying the vectors) are not all zero. The three vectors are linearly dependent. Everything else I said still applies.
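Written out componentwise (just the arithmetic of that corrected combination):

[tex]
\begin{bmatrix}1\\-1\\1\\0\\0\end{bmatrix}
+ \begin{bmatrix}0\\0\\-1\\1\\-1\end{bmatrix}
- \begin{bmatrix}1\\-1\\0\\1\\-1\end{bmatrix}
= \begin{bmatrix}1+0-1\\-1+0+1\\1-1-0\\0+1-1\\0-1+1\end{bmatrix}
= \begin{bmatrix}0\\0\\0\\0\\0\end{bmatrix}.
[/tex]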
 
  • #15
I should add that my approach in this problem is strictly one from a linear algebra perspective. I am not looking at this from the perspective of graph theory or incidence matrices or loops.
 
  • #16
I see, so that's why they only provided the first two: the outer loop is dependent on the first two.
 

1. What is the left null space in linear algebra?

The left null space, also known as the null space of the transpose, is the set of all vectors that give the zero vector when they multiply the matrix on the left (as row vectors). In other words, it is the set of all solutions to the equation ##A^T x = 0##, where ##A^T## is the transpose of the matrix ##A##.
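As a concrete illustration (a minimal NumPy/SciPy sketch using the 5×4 incidence matrix from this thread; scipy.linalg.null_space returns an orthonormal basis):

[code=python]
import numpy as np
from scipy.linalg import null_space

# Incidence matrix from the thread: rows = edges, columns = nodes
A = np.array([[-1, 1, 0, 0], [-1, 0, 1, 0], [0, -1, 1, 0],
              [0, -1, 0, 1], [0, 0, -1, 1]], dtype=float)

# Left null space of A = null space of A^T: all x with A^T x = 0
basis = null_space(A.T)              # columns form an orthonormal basis
print(basis.shape)                   # (5, 2): two basis vectors in R^5
print(np.allclose(A.T @ basis, 0))   # True
[/code]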

2. How is the left null space related to the column space?

The left null space and the column space of a matrix are orthogonal complements of each other in ##\mathbb{R}^m##, where m is the number of rows. This means that the left null space contains all the vectors that are orthogonal to the column space, and vice versa. (By the same token, the ordinary null space is the orthogonal complement of the row space.)
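A quick numerical check of this orthogonality on the thread's incidence matrix (a SymPy sketch):

[code=python]
import sympy as sp

A = sp.Matrix([[-1, 1, 0, 0], [-1, 0, 1, 0], [0, -1, 1, 0],
               [0, -1, 0, 1], [0, 0, -1, 1]])

# Every left-null-space vector is orthogonal to every column of A
for x in A.T.nullspace():
    print([x.dot(A.col(j)) for j in range(A.cols)])   # [0, 0, 0, 0] each time
[/code]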

3. How do you find the basis for the left null space?

To find the basis for the left null space, you can use the same method as finding the basis for the null space. This involves solving the system of equations ##A^T x = 0## and then using the solutions to form a basis. Alternatively, you can use the reduced row echelon form of ##A^T## to find the basis.

4. Can the left null space be empty?

The left null space always contains the zero vector, so it is never literally empty, but it can be trivial, i.e. contain only the zero vector. This occurs when the matrix A has full row rank, meaning that its rows are linearly independent. In that case the only vector that satisfies ##A^T x = 0## is the zero vector.

5. What is the dimension of the left null space?

The dimension of the left null space is equal to the number of rows minus the rank of the matrix. In other words, if A has m rows and rank r, then the dimension of the left null space is m - r. This is because the left null space is the orthogonal complement of the column space inside ##\mathbb{R}^m##, and the column space has dimension r.
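Checking the count on the matrix from this thread (m = 5 rows, rank r = 3, so the left null space has dimension 5 - 3 = 2):

[code=python]
import sympy as sp

A = sp.Matrix([[-1, 1, 0, 0], [-1, 0, 1, 0], [0, -1, 1, 0],
               [0, -1, 0, 1], [0, 0, -1, 1]])

m, r = A.rows, A.rank()
print(m, r, m - r)             # 5 3 2
print(len(A.T.nullspace()))    # 2, matching m - r
[/code]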
