# Linear transformation given a nullspace and a solution space.

## Homework Statement

Find, if possible, a linear transformation R^4 --> R^3 whose nullspace is spanned by (1,2,3,4) and (0,1,2,3) and whose range is the set of solutions to x_1 + x_2 + x_3 = 0.


## The Attempt at a Solution

So I thought I should start by trying to find what kind of matrix we have from the information about the nullspace. I start by interpreting it as:
x_1 = t, x_2 = 2t + s, x_3 = 3t + 2s, x_4 = 4t + 3s. Then I'm completely stuck. Any help would be nice!
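As an editorial aside (NumPy is my choice here, not part of the thread): a general nullspace vector is t*(1,2,3,4) + s*(0,1,2,3), which gives the parametrization x_1 = t, x_2 = 2t + s, x_3 = 3t + 2s, x_4 = 4t + 3s. A quick sketch to sanity-check that:

```python
import numpy as np

# The nullspace is spanned by these two vectors, so a general
# nullspace vector is t*v1 + s*v2.
v1 = np.array([1, 2, 3, 4])
v2 = np.array([0, 1, 2, 3])

t, s = 2.0, -1.0  # arbitrary sample parameters
x = t * v1 + s * v2

# Componentwise this is exactly (t, 2t+s, 3t+2s, 4t+3s).
expected = np.array([t, 2*t + s, 3*t + 2*s, 4*t + 3*s])
print(x)  # [ 2.  3.  4.  5.]
```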

Let's simplify the problem for the time being. Suppose that your nullspace is spanned by (1,0,0,0) and (0,1,0,0). Can you construct a matrix for this problem? (Hint: how do the entries of a matrix relate to the image of the linear transformation?)

owlpride: I do know how to change bases, but I'm not that good at it. What do you mean by the image?

The image of a function is just its range. Sorry for the confusion.

Forget about changing bases for a minute. We will do this in the end.

For the time being, we only want a linear map that maps (1,0,0,0) and (0,1,0,0) to (0,0,0). Where could you map the remaining two basis vectors (0,0,1,0) and (0,0,0,1) to in order to generate the correct range?

(0,0,0) ?

If you sent them to (0,0,0), then all of R^4 would be in the kernel of your map. But you want the kernel to be a two-dimensional subspace (we temporarily picked (1,0,0,0) and (0,1,0,0) as generators for the kernel instead of (1,2,3,4) and (0,1,2,3)).

Can you name some vectors that lie in the plane x_1 + x_2 + x_3 = 0? Because that's the plane that the range is supposed to cover, right?

yeah, you're right.
I guess:
(1,-1,0) would be a vector there, and so would (1,0,-1), right?

Good choices, because they are linearly independent!

What does the matrix look like for a linear map that does the following?
(1,0,0,0) -> (0,0,0)
(0,1,0,0) -> (0,0,0)
(0,0,1,0) -> (1,-1,0)
(0,0,0,1) -> (1,0,-1)
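Editorial sketch (not part of the thread): the i-th column of the matrix is the image of the i-th standard basis vector, so the recipe above can be written down and checked directly:

```python
import numpy as np

# Columns are the images of e1, e2, e3, e4 listed above.
A = np.column_stack([
    [0, 0, 0],    # image of e1
    [0, 0, 0],    # image of e2
    [1, -1, 0],   # image of e3
    [1, 0, -1],   # image of e4
])

e3 = np.array([0, 0, 1, 0])
print(A @ e3)  # [ 1 -1  0]
```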

Isn't it simply the matrix whose columns are those image vectors?

It is! That's why it is easier to solve the problem with the standard basis first. And the best part is that we can use this matrix to compute the matrix you are really interested in!

Suppose you found a second matrix that sends (1,2,3,4) to (1,0,0,0) and (0,1,2,3) to (0,1,0,0). Then the product of this matrix with the one we already have gives you the matrix you were initially looking for! In other words, you can construct the map you are looking for as the composition of two maps:

(1,2,3,4) -> (1,0,0,0) -> (0,0,0)
(0,1,2,3) -> (0,1,0,0) -> (0,0,0)
v3 -> (0,0,1,0) -> (1,-1,0)
v4 -> (0,0,0,1) -> (1,0,-1)

There are two things you still need to do: find appropriate vectors v3 and v4 (what is important about them?), and find a matrix for the first map (what does its inverse look like?).

owlpride: Hmm, regarding the vectors v3 and v4, should they simply be (0,0,1,0) and (0,0,0,1)? I think not, but I can't come up with any better criterion... That they're in the plane?

(0,0,1,0) and (0,0,0,1) would work. They only have to form a basis for R^4 together with (1,2,3,4) and (0,1,2,3). If you used linearly dependent vectors, you would run into trouble constructing the map; linear maps always send dependent vectors to dependent vectors, and the four standard basis vectors are linearly independent.
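Editorial sketch (not part of the thread): the basis condition is easy to test numerically, since four vectors form a basis of R^4 exactly when the matrix with those columns has nonzero determinant:

```python
import numpy as np

# Columns: the two given nullspace generators completed by e3 and e4.
B = np.column_stack([
    [1, 2, 3, 4],
    [0, 1, 2, 3],
    [0, 0, 1, 0],
    [0, 0, 0, 1],
])

# Nonzero determinant <=> the columns are linearly independent,
# i.e. they form a basis of R^4 and B is invertible.
print(np.linalg.det(B))  # 1.0
```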

If your problem were more restrictive, it could determine exactly what v3 and v4 should be. But the current statement only specifies the range, not which vectors in the domain should be sent to which vectors in the range. So you get to pick!

OK, so I have to find the product of the inverse of the matrix with columns (1,2,3,4), (0,1,2,3), (0,0,1,0), (0,0,0,1) and the matrix we found in the beginning? :)

Yes!!! :) Just pay attention to the order in which you multiply the matrices.

Call the matrix that we found last B, and the one that we found first A.
Then:
inv(B)*A would give it, no?

The other way round :)

A* inv(B) * (1,2,3,4) = A* (1,0,0,0) = (0,0,0).

Ah, how come we take it the other way around, though? And why the (1,2,3,4)? :)

(1,2,3,4) was just an example vector. I wanted to show you why you had to multiply the matrices the other way round. The matrix you are looking for (call it M) should satisfy M*(1,2,3,4) = (0,0,0), among other things.

If M = inv(B)*A, then M*(1,2,3,4) = inv(B)*A*(1,2,3,4). But A does not know what to do with the vector (1,2,3,4) - well, it does, but it does not do anything "nice" with it. We constructed A so that it sent the standard basis vectors to vectors that we controlled.

inv(B), on the other hand, knows exactly what to do with (1,2,3,4). It was constructed so that inv(B)*(1,2,3,4) = (1,0,0,0). When we compose the matrices, we get

A* inv(B) * (1,2,3,4) = A* (1,0,0,0) = (0,0,0).

That's exactly what we want M to do!
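Editorial sketch (not part of the thread): the two-step computation above, inv(B) first and then A, can be traced numerically:

```python
import numpy as np

# A: e1, e2 -> 0; e3 -> (1,-1,0); e4 -> (1,0,-1) (columns are images).
A = np.column_stack([[0, 0, 0], [0, 0, 0], [1, -1, 0], [1, 0, -1]])
# B: e1 -> (1,2,3,4); e2 -> (0,1,2,3); e3, e4 fixed.
B = np.column_stack([[1, 2, 3, 4], [0, 1, 2, 3], [0, 0, 1, 0], [0, 0, 0, 1]])
invB = np.linalg.inv(B)

v = np.array([1, 2, 3, 4])
print(invB @ v)      # (1, 0, 0, 0): inv(B) "undoes" B
print(A @ invB @ v)  # (0, 0, 0): the composition kills the nullspace vector
```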

Is there any general rule regarding which way to multiply composed matrices like this? :) Thanks for all your help!

There are rules but I keep confusing them. The safest bet seems to be to think about what each matrix does individually and test the outcome of the different orders of compositions, like I did in the previous post.

In this example, you could consider inv(B) a change of basis, and those usually come first. Another good thing to look out for is dimensions. inv(B) sends R^4 -> R^4, and A sends R^4 -> R^3. Hence inv(B) has to act first (i.e. inv(B) is the right-most matrix in the product) so that you get maps R^4 -> R^4 -> R^3. Applying A first would leave inv(B) with a vector in R^3, which makes no sense...
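Editorial sketch (not part of the thread): the dimension argument is exactly what matrix shapes enforce, which stand-in arrays of the right sizes make concrete:

```python
import numpy as np

# Stand-ins with the right shapes: A maps R^4 -> R^3 (a 3x4 matrix),
# inv(B) maps R^4 -> R^4 (a 4x4 matrix).
A = np.zeros((3, 4))
invB = np.eye(4)

print((A @ invB).shape)  # (3, 4): R^4 -> R^4 -> R^3 is well defined
try:
    invB @ A  # a 4x4 times a 3x4: shapes don't line up
except ValueError:
    print("inv(B)*A is not even defined")
```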

When I do as we said, I get the matrix:
3 -1 2
-5 2 -3
1 -1 0
1 0 1
But my matrix should be the transpose of that. Anyone willing to help?

A is
(0,0,1,1)
(0,0,-1,0)
(0,0,0,-1)

B is
(1,0,0,0)
(2,1,0,0)
(3,2,1,0)
(4,3,0,1)

A*inv(B) gives me the correct matrix.
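To close the loop, an editorial check with NumPy (not part of the thread), reading the tuples listed above as the rows of A and B: the product A*inv(B) really does have the required nullspace and range.

```python
import numpy as np

# Rows as listed above.
A = np.array([[0, 0, 1, 1],
              [0, 0, -1, 0],
              [0, 0, 0, -1]])
B = np.array([[1, 0, 0, 0],
              [2, 1, 0, 0],
              [3, 2, 1, 0],
              [4, 3, 0, 1]])

M = A @ np.linalg.inv(B)
print(np.round(M).astype(int))
# [[ 3 -5  1  1]
#  [-1  2 -1  0]
#  [-2  3  0 -1]]

# Both generators of the intended nullspace are sent to zero...
assert np.allclose(M @ [1, 2, 3, 4], 0)
assert np.allclose(M @ [0, 1, 2, 3], 0)
# ...every column of M satisfies x_1 + x_2 + x_3 = 0, so the image
# lies in the plane, and rank 2 means it covers the whole plane.
assert np.allclose(M.sum(axis=0), 0)
assert np.linalg.matrix_rank(M) == 2
```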