
Linear Transformation (Image, Kernel, Basis, Dimension)

  1. Dec 6, 2015 #1
    Mod note: Moved from Precalc section
    1. The problem statement, all variables and given/known data

    Given l : ℝ³ → ℝ³, l(x1, x2, x3) = (x1 + 2x2 + 3x3, 4x1 + 5x2 + 6x3, x1 + x2 + x3), find Ker(l), Im(l), their bases and dimensions.

    My language in explaining my steps is a little sloppy, but I'm trying to understand the process and put it in terms that I understand.

    2. Relevant equations
    ker(l): Ax = 0

    3. The attempt at a solution
    Step 1: Find the matrix associated to this transformation using the standard basis.

    l(1,0,0) = (1,4,1)
    l(0,1,0) = (2,5,1)
    l(0,0,1) = (3,6,1)

    Im(l) =

    [1] [4] [1]
    [2] , [5] , [1]
    [3] [6] [1]

    To find Ker(l), solve Ax = 0. Put Im(l) into an augmented matrix and set it equal to 0.

    [1 4 1 |0]
    [2 5 1 |0]
    [3 6 1 |0]

    Reduced row echelon form =

    [1 0 -1/3 |0]
    [0 1 1/3 |0]
    [0 0 0 |0]

    X1= X1 [1] X2 [0] X3 [ -1/3]
    X2= X1 [0] + X2 [1] + X3 [ 1/3 ]
    X3= X1 [0] X2 [0] X3 [ 0 ]

    ker(l) =

    [1] [0] [-1/3]
    [0] , [1] , [1/3]
    [0] [0] [0]

    Basis(l) = minimum number of vectors that span the subspace

    [1] [0]
    [0] , [1]
    [0] [0]

    Dimension = number of vectors that span the subspace = 2

    I hope I've done this right.
     
    Last edited by a moderator: Dec 6, 2015
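    For anyone checking this first step symbolically, here is a minimal sympy sketch (not part of the post): the images of the standard basis vectors above are correct, and they form the *columns* of the matrix of l, which becomes important later in the thread.
[code]
# Sketch: build the matrix of l in the standard basis with sympy.
import sympy as sp

def l(x1, x2, x3):
    return sp.Matrix([x1 + 2*x2 + 3*x3,
                      4*x1 + 5*x2 + 6*x3,
                      x1 + x2 + x3])

# l(e1), l(e2), l(e3) are the columns of the matrix A of the transformation.
A = sp.Matrix.hstack(l(1, 0, 0), l(0, 1, 0), l(0, 0, 1))
print(A)  # columns (1,4,1), (2,5,1), (3,6,1), i.e. [[1,2,3],[4,5,6],[1,1,1]]
[/code]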
  3. Dec 6, 2015 #2

    Mark44

    Staff: Mentor

    ker(L) = {x in R3 such that Ax = 0}
    It doesn't look like you actually did anything here. The image of L, Image(L), is the set of all possible outputs of the transformation. IOW, Image(L) = {y in R3 such that y = Ax, for some x in R3}
    When you're finding the kernel, you don't need an augmented matrix. The rightmost column starts off with all zeroes, and can't possibly change, so there's no point in dragging it along. It doesn't hurt to have it, but it isn't necessary here (in finding the kernel).
    I get something different.
    This isn't how it works. Based on your last augmented matrix (which is incorrect), you have
    x1 = 1/3 x3
    x2 = -1/3 x3
    x3 = x3

    As a check on your work, if x = <1/3, -1/3, 1>^T, is Ax = 0?
    No.
    This is not a basis for the kernel. As it turns out, dim(ker(L)) = 1.
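    The sanity check suggested here is easy to run; a minimal sympy sketch (not from the post), with A read directly off the definition of l:
[code]
# Sketch: is <1/3, -1/3, 1> in ker(l)? Multiply by A and see if we get zero.
import sympy as sp

A = sp.Matrix([[1, 2, 3], [4, 5, 6], [1, 1, 1]])
x = sp.Matrix([sp.Rational(1, 3), sp.Rational(-1, 3), 1])
print(A * x)  # components (8/3, 17/3, 1) -- not the zero vector
[/code]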
     
  4. Dec 6, 2015 #3
    I'm a bit confused on how I am supposed to find the Im(l)
     
  5. Dec 6, 2015 #4

    Ray Vickson

    Science Advisor
    Homework Helper

    Trying to use matrices and matrix methods is almost a waste of time in this problem. To find the kernel, you just need to determine the dimensionality of the solution space to the linear system
    [tex]\begin{array}{l} x_1 + 2x_2 + 3x_3 = 0\\
    4x_1 + 5x_2 + 6x_3 = 0\\
    x_1 + x_2 + x_3 = 0
    \end{array} [/tex]
    More generally (since it also helps with other parts of the problem) you can consider the general system
    [tex]\begin{array}{l} x_1 + 2x_2 + 3x_3 = r_1\\
    4x_1 + 5x_2 + 6x_3 = r_2\\
    x_1 + x_2 + x_3 = r_3
    \end{array} [/tex]

    It is just as easy to deal with these systems directly and forget about matrices. For example, in the second system the third equation implies that ##x_3 = r_3 - x_1 - x_2##, and substituting that into the first two equations yields two equations in the two unknowns ##x_1, x_2##. When ##r_1 = r_2 = r_3 = 0## you can continue the process, using the (new) first equation to solve for ##x_2 ## in terms of ##x_1##, then see what that gives you for the (new) second equation. When the ##r## are not all zero, you will get a result for the third equation that makes sense only if ##r_1, r_2, r_3## are related in some way that you can determine from the resulting formulas. Figuring out what ##r##-values are allowed will tell you the image of ##l##. Figuring out what the solution looks like when all ##r##'s are zero will tell you what is the kernel of ##l##.
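    For concreteness, here is how that substitution plays out in the homogeneous case (a sketch following the outline above, not part of the post). From the third equation, ##x_3 = -x_1 - x_2##, so the first two equations become
    [tex]\begin{array}{l} x_1 + 2x_2 + 3(-x_1 - x_2) = -2x_1 - x_2 = 0\\
    4x_1 + 5x_2 + 6(-x_1 - x_2) = -2x_1 - x_2 = 0
    \end{array} [/tex]
    Both say ##x_2 = -2x_1##, which gives ##x_3 = -x_1 + 2x_1 = x_1##, so every solution of the homogeneous system has the form ##(x_1, x_2, x_3) = t(1, -2, 1)## for some scalar ##t##.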
     
  6. Dec 6, 2015 #5
    l(x1, x2, x3) = (x1 + 2x2 + 3x3, 4x1 + 5x2 + 6x3, x1 + x2 + x3)

    x1, x2, x3 are vectors?

    l(x1) = x1 + 2x2 + 3x3
    l(x2) = 4x1 + 5x2 + 6x3
    l(x3) = x1 + x2 + x3

    x1 + 2x2 + 3x3 = 0
    4x1 + 5x2 + 6x3 = 0
    x1 + x2 + x3 = 0

    x1 = x3
    x2 = -2x3
    x3 = free variable

    reduced row echelon form:

    [1 0 -1]
    [0 1 2 ]
    [0 0 0 ]

    Dimension = 2

    ker(l) =

    [1] [0]
    [0] , [1]
    [0] [0]
     
  7. Dec 6, 2015 #6

    Mark44

    Staff: Mentor

    No, they are components of a vector in R3.
    The above is not helpful.
    Edit: No, you're slightly off. This is correct; see my comment below your reduced matrix.
    Also, this says that ##\begin{bmatrix} x_1 \\ x_2 \\ x_3\end{bmatrix} = x_3\begin{bmatrix} 1 \\ -2 \\ 1\end{bmatrix}##
    Is A times the last vector above equal to the zero vector? That's a sanity check you should always do in this sort of work.
    Edit: Check your arithmetic. You have an error in the first row.
    This is correct.

    Also, the equations you have above should come from this row-reduced matrix, so I don't understand why you wrote the equations first, and then the reduced matrix.
    No. The equations you have above suggest that dim(Ker(L)) = 1, not 2.
     
    Last edited: Dec 6, 2015
  8. Dec 6, 2015 #7
    I learn much faster and better if I know the process before definitions because I get really bogged down with terminology and new definitions. I think in my rush to get the process down pat I've got a bit lost with this problem.

    A =
    x1 + 2x2 + 3x3
    4x1 + 5x2 + 6x3
    x1 + x2 + x3

    l (x1,x2,x3) = A (x1,x2,x3) =

    [ 1 2 3 ] [x1]
    [ 4 5 6 ] [x2]
    [ 1 1 1 ] [x3]

    [1] [2] [3]
    [4] x1 + [5] x2 + [6] x3
    [1] [1] [1]

    Does this mean Im(l) =

    [1] [2] [3]
    [4] , [5] , [6]
    [1] [1] [1]

    It looks similar to my first answer, only with a different orientation.
     
  9. Dec 6, 2015 #8

    Mark44

    Staff: Mentor

    This is really the wrong order. How can a process make any sense if you don't understand the basic definitions behind the process?
    If you don't understand the basic terms, such as ker(L), Im(L), and so on, you're just blindly applying some steps.
    Yes. Instead of trying to use rote memorization of some process, focus on understanding what the terms mean.
    That's why I corrected what you wrote about Ker(L) and Im(L) in a previous post.

    This is tied into what Ray Vickson said, as well, that for this problem you don't really need to use matrices.
    No.
    I would start by finding Ker(L), which means solving the equations
    x1 + 2x2 + 3x3 = 0
    4x1 + 5x2 + 6x3 = 0
    x1 + x2 + x3 = 0

    You can do this by directly working with the equations, or you can row-reduce this matrix:
    ##\begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 1 & 1 & 1\end{bmatrix}##
    You almost had it in a previous post. Reread my post where I talk about this, and see if you can find Ker(L).
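    The row reduction can also be checked with sympy (a sketch, not part of the post):
[code]
# Sketch: reduced row echelon form of the matrix of l.
import sympy as sp

A = sp.Matrix([[1, 2, 3], [4, 5, 6], [1, 1, 1]])
R, pivots = A.rref()
print(R)       # [[1, 0, -1], [0, 1, 2], [0, 0, 0]]
print(pivots)  # (0, 1) -- two pivot columns, so one free variable
[/code]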
     
  10. Dec 6, 2015 #9
    I get a reduced row echelon form matrix:

    [ 1 0 -1]
    [ 0 1 2]
    [ 0 0 0]


    so ker(l) =
    [1] [0] [-1]
    [0] , [1] , [2]
    [0] [0] [0]
     
  11. Dec 6, 2015 #10

    Mark44

    Staff: Mentor

    Yes, this is correct (and I had a sign error in my work earlier).

    The equations that this matrix represents are just as you wrote before:
    In other words,
    ##\begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = x_3 \begin{bmatrix} 1 \\ -2 \\ 1\end{bmatrix}##
    This should be a pretty good hint about Ker(L) and its size (dimension).
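    The hint can be confirmed directly (a sympy sketch, not part of the post): the kernel is spanned by a single vector, so dim(Ker(L)) = 1.
[code]
# Sketch: nullspace of A, and a check that A times the spanning vector is zero.
import sympy as sp

A = sp.Matrix([[1, 2, 3], [4, 5, 6], [1, 1, 1]])
print(A.nullspace())              # a single basis vector: (1, -2, 1)
print(A * sp.Matrix([1, -2, 1]))  # the zero vector
[/code]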

     
  12. Dec 6, 2015 #11
    Yep!

    X1=
    [1]
    [0]
    [1]

    X2=
    [0]
    [1]
    [-2]

    X3 =
    [1]
    [-2]
    [1]
     
  13. Dec 6, 2015 #12
    ker(l)=

    [1] [0] [1]
    [0] , [1] , [-2]
    [1] [-2] [1]
     
  14. Dec 6, 2015 #13
    Only one free variable, so only one basis vector. That is done by setting x3 = 1.

    ker(l) =

    [1]
    [-2]
    [1]

    Checked it by plugging the values into the original transformation, and I get 0s
     
  15. Dec 6, 2015 #14

    Mark44

    Staff: Mentor

    No, I don't know what you're doing here.
    Or here, either.
    Yes! The kernel is the one-dimensional vector space spanned by this vector. IOW, the kernel is a line through the origin in the direction of <1, -2, 1>.

    Next up is figuring out what the Im(L) is.
    By definition, which I'm hoping you're starting to realize is something important, Im(L) = {y in R3, such that Ax = y, for some x in R3}.
    To find Im(L) you want to find out if there is some vector x that produces an arbitrary y (denoted as <a, b, c> below).

    ##\begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 1 & 1 & 1\end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3\end{bmatrix} = \begin{bmatrix} a \\ b \\ c \end{bmatrix}##
    You can set this up as an augmented matrix whose fourth column is <a, b, c>^T. Your textbook should have an example or two of how this works.
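    This augmented-matrix computation can also be run symbolically (a sketch, not part of the post). sympy's rref() is avoided here because it would treat the symbolic last column as a pivot column, so the eliminations are written out explicitly:
[code]
# Sketch: row-reduce [A | (a, b, c)] with a, b, c left symbolic.
import sympy as sp

a, b, c = sp.symbols('a b c')
M = sp.Matrix([[1, 2, 3, a],
               [4, 5, 6, b],
               [1, 1, 1, c]])   # the augmented matrix [A | y]

r1, r2, r3 = M[0, :], M[1, :], M[2, :]
r2 = r2 - 4*r1   # clear column 1 in row 2
r3 = r3 - r1     # clear column 1 in row 3
r2 = r2 / -3     # make the pivot in column 2 equal to 1
r1 = r1 - 2*r2   # clear column 2 in row 1
r3 = r3 + r2     # clear column 2 in row 3

print(r3.applyfunc(sp.simplify))  # row [0, 0, 0, a/3 - b/3 + c]
# So the system is solvable only when a/3 - b/3 + c = 0, the same condition
# as -(1/3)a + (1/3)b - c = 0, just multiplied by -1.
[/code]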
     
  16. Dec 6, 2015 #15
    My textbook doesn't even use augmented matrices -- it's easily the worst formatted piece of text I've ever read, but I digress...

    The definition you gave of Im(l) is a little vague to me. If Ax = y, and I multiply the matrix A by (x1, x2, x3) and that equals (a, b, c), then I get the three vectors below, which was one of my original answers.

    [1] [4] [1]
    [2] , [5] , [1]
    [3] [6] [1]
     
  17. Dec 6, 2015 #16

    Mark44

    Staff: Mentor

    In the first post of this thread you set up an augmented matrix, so you must have at least heard about them, even if your textbook doesn't use them.

    What part of the definition I gave are you uncertain about?
    I don't know what you're doing here.
    ##\begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 1 & 1 & 1\end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3\end{bmatrix} = \begin{bmatrix} a \\ b \\ c \end{bmatrix}##

    The augmented matrix looks like this:
    ##\begin{bmatrix} 1 & 2 & 3 & | & a \\ 4 & 5 & 6 & | & b\\ 1 & 1 & 1 & | & c\end{bmatrix}##
    Row reduce this augmented matrix, and the final matrix gives you a set of equations that show which vectors ##<x_1, x_2, x_3>## have <a, b, c> as their image. Again, your textbook must have at least one example of how this works.
     
  18. Dec 6, 2015 #17
    I row reduced, though I'm not sure if I did it right. I got a set of expressions on the right-hand side of the augmented matrix, but I'm not entirely sure what they mean.
    I got this reduced row echelon form matrix on the LHS:
    [1 0 -1]
    [0 1 2]
    [0 0 0 ]

    The right-hand side entries (from top to bottom) of this matrix are as follows:

    (2b-5a)/3
    (-b+4a)/3
    c-a-(b/3)+(4a/3)
     
  19. Dec 6, 2015 #18

    Mark44

    Staff: Mentor

    So combining what you showed, the final augmented matrix looks like this:
    ##\begin{bmatrix} 1 & 0 & -1 & | & -(5/3)a + (2/3)b\\ 0 & 1 & 2 & | & (4/3)a - (1/3)b\\ 0 & 0 & 0 & | & -(1/3)a + (1/3)b -c\end{bmatrix}##

    Since the third row on the left side of the augmented matrix is all zeroes, the corresponding equation is
    ##0x_1 + 0x_2 + 0x_3 = -(1/3)a + (1/3)b -c##
    That equation gives you a condition on the possible values of a, b, and c. And if you play with it a bit, it will give you a basis for Im(L).
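    As a quick sanity check on that condition (an example, not part of the post): ##l(1,0,0) = (1, 4, 1)## is certainly in Im(L), and indeed ##-(1/3)(1) + (1/3)(4) - 1 = 0##.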
     
  20. Dec 6, 2015 #19
    If I set it up in a similar way to what I did for solving the kernel:

    [X1] [-5/3 ] [2/3] [0]
    [X2] a [ 4/3 ] + b [-1/3] + c [0]
    [X3] [ 1/3 ] [-1/3] [1]

    c = -(1/3)a + (1/3)b

    setting c to 1

    -(1/3)6 + (1/3)21 = 1

    -(5/3)6+(2/3)21 + c = -10+14= 4

    (4/3)6 - (1/3)21 = 8-7 = 1

    Basis for Im(l) =
    [4]
    [1]
    [1]
     
  21. Dec 6, 2015 #20

    Mark44

    Staff: Mentor

    No.

    As it turns out, and you might not know this yet, a basis for Im(L) will have to have two vectors.

    This equation will tell you everything you need to know about Im(L):
    ##0x_1 + 0x_2 + 0x_3 = -(1/3)a + (1/3)b -c##

    What conditions does it place on a, b, and c? These conditions can give you an idea of the possible outputs of Ax.
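    For reference, where this hint leads can be checked with sympy (a sketch, not part of any post): columnspace() returns a basis for Im(L) consisting of two vectors, and rank plus nullity comes out to 3, as it must.
[code]
# Sketch: a basis for Im(l), plus a check against the condition Mark44 derived.
import sympy as sp

A = sp.Matrix([[1, 2, 3], [4, 5, 6], [1, 1, 1]])
basis = A.columnspace()              # the two pivot columns: (1, 4, 1) and (2, 5, 1)
print(basis)
print(A.rank(), len(A.nullspace()))  # 2 1

# Each basis vector (a, b, c) satisfies -(1/3)a + (1/3)b - c = 0.
for v in basis:
    a, b, c = v
    print(sp.Rational(-1, 3)*a + sp.Rational(1, 3)*b - c)  # 0, then 0
[/code]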
     