
Linear Algebra- Onto and One to One Linear Transformations

  1. Nov 3, 2012 #1
    Hey guys, I'm studying these concepts in linear algebra right now, and I wanted to confirm that my interpretation of them is correct.

    One-to-one in algebra means that for every y value there is only one x value mapping to it; in other words, a function must pass the horizontal line test. (Even functions and trig functions would fail (not 1-1), for example, but odd functions would pass (1-1).)

    Onto means that every single y value is used by the function, so again, trig and even functions would fail, but odd functions would pass. Any kind of function with a vertical asymptote would pass.

    So I tried to put these concepts in the context of linear transformations, and this is what I'm thinking:


    Since transformations are represented by matrices:

    Linearly independent transformation matrices would be considered one-to-one, because they give a unique solution. Linearly dependent transformations would not be one-to-one, because they have multiple solutions for each y (= b) value, so you could have multiple x values for a single b.

    Now for onto: I feel like if a linear transformation spans the codomain it maps into, then all b values are used, so it is onto.

    Examples:

    1-1 but not onto

    A linearly independent transformation from R3->R4 that ends up spanning only a plane in R4

    Onto but not 1-1

    A linearly dependent transformation from R3->R2 that spans R2

    1-1 AND onto

    A linearly independent transformation from R3->R3 that spans R3

    Neither 1-1 nor onto

    A linearly dependent transformation from R2->R2 that spans a line


    Is this interpretation correct?
     
  3. Nov 3, 2012 #2

    micromass


    So an odd function such as sin(x) would be one-to-one?

    So sin(x) is onto?

    So |tan(x)| is onto??

    And what exactly is a "linearly independent matrix" or a "linearly dependent transformation"??

    How does a linear transformation span the codomain exactly?? What does that mean?

     
  4. Nov 3, 2012 #3
    What are you getting at? If I'm incorrect, then just tell me
     
  5. Nov 3, 2012 #4

    micromass


    I just gave feedback on your post and I gave you some things to think about.

    Furthermore, I honestly do not understand what you mean by a linearly dependent matrix or by a transformation spanning the codomain. So you should really say what you mean by that.
     
  6. Nov 3, 2012 #5
    A linearly dependent matrix is a matrix that is linearly dependent... I don't know how else I can explain this... you understand what linear dependence means, right?

    Also, when I say spans the codomain, I mean that the b in T(x)=b could be any vector in the codomain...
     
  7. Nov 3, 2012 #6

    micromass


    I know very well what linear dependence means, but not in the context you are talking about. To me, a set [itex]\{v_1,\dots,v_n\}[/itex] is linearly independent if for all [itex]\alpha_1,\dots,\alpha_n\in \mathbb{R}[/itex] the following holds:
    [tex]\alpha_1v_1+\dots+\alpha_nv_n=0~\Rightarrow~\alpha_1=\dots=\alpha_n=0[/tex]

    So we are talking here about a set of vectors being linearly independent. What you mean by a linearly independent matrix is a mystery to me. How does your book define it?

    What is b, what is x?? Can you look the definition up in your textbook and quote it here?
     
  8. Nov 3, 2012 #7
    OK, well, linear dependence in the context of a matrix is just like linear dependence for a set of vectors...

    for example

    [1 4 8 3]
    [2 4 1 7]
    [3 2 6 7]

    If this^ matrix is linearly dependent, then that is equivalent to saying that these vectors:

    [1]  [4]  [8]  [3]
    [2], [4], [1], [7]
    [3]  [2]  [6]  [7]

    are linearly dependent.
    (The formal definition of independence is: if the only solution of Ax = 0 is x = 0, then the matrix / set of column vectors is linearly independent.)
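
    (As a quick sanity check of my own, here is how that plays out numerically, say with NumPy; since the matrix above only has 3 rows, its rank is at most 3, so its 4 columns are automatically dependent:)

    [code]
    import numpy as np

    # The 3x4 example matrix from above
    A = np.array([[1, 4, 8, 3],
                  [2, 4, 1, 7],
                  [3, 2, 6, 7]], dtype=float)

    rank = np.linalg.matrix_rank(A)
    print(rank)                  # prints 3: less than the 4 columns
    print(rank == A.shape[1])    # False -> Ax = 0 has nonzero solutions, so the columns are dependent
    [/code]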



    Here is what I mean by b

    [A|b] is the augmented matrix

    so that

    Ax=b

    where
    A is the coefficient matrix,
    x is the solution vector (or the set of solutions),
    and b is the right-hand-side vector in question.

    It's analogous to regular algebra: Ax = b plays the same role as mx = y.
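
    (For what it's worth, here's a tiny made-up numeric version of that Ax = b setup, just to illustrate the analogy, using NumPy:)

    [code]
    import numpy as np

    # A made-up square system A x = b, playing the role of m*x = y from ordinary algebra
    A = np.array([[2, 1],
                  [1, 3]], dtype=float)
    b = np.array([5, 10], dtype=float)

    x = np.linalg.solve(A, b)    # the x with A @ x == b (A is invertible here)
    print(x)                     # [1. 3.]
    [/code]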
     
  9. Nov 3, 2012 #8

    micromass


    OK, so you define a matrix to be linearly (in)dependent if its column vectors are linearly (in)dependent? I've never really seen this definition before, but OK.

    This is indeed equivalent to

    [tex]Ax=0~\Rightarrow x=0[/tex]

    for all x.

    And indeed, if a matrix satisfies that, then it is one-to-one. So any "linearly independent matrix" is one-to-one.

    With "A spanning the codomain" you seem to mean that Ax=b has a solution for every b in the codomain. This is indeed equivalent to onto.

    About your examples:

    I don't think it is possible for a transformation to be both "linearly independent" and only spanning a plane. So while your example does imply that the transformation is 1-1 but not onto, I fear that there are no such transformations.

    OK

    Also ok.
    A nice fact in linear algebra is the following: let T be a transformation from [itex]\mathbb{R}^n\rightarrow \mathbb{R}^n[/itex] (so, very important: the domain and codomain must have the same dimension). If T is 1-1, then it is onto. And if T is onto, then it is 1-1.
    This is sometimes called the alternative theorem.
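
    (A minimal numeric illustration of why that works, with a made-up 3x3 matrix: for a square matrix, full column rank and full row rank are the same condition, so the two properties stand or fall together.)

    [code]
    import numpy as np

    # Made-up 3x3 matrix: domain and codomain are both R^3
    A = np.array([[2, 0, 1],
                  [0, 1, 3],
                  [1, 0, 1]], dtype=float)

    rank = np.linalg.matrix_rank(A)
    n = A.shape[0]
    # For an n x n matrix, rank == n is simultaneously the 1-1 test (independent columns)
    # and the onto test (columns span R^n), which is why 1-1 and onto go together here.
    print(rank == n)    # True: this particular A is both 1-1 and onto
    [/code]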
     
  10. Nov 3, 2012 #9
    OK cool, and that alternative theorem is very convenient and makes sense... thanks for the help!
     
  11. Nov 3, 2012 #10
    Well about that 1-1 but not onto thing, I just did an example-

    If you want to transform

    [x1]
    [x2]

    ->

    [3*x1]
    [x1+4*x2]
    [x1+5]


    You'd get a transformation matrix:

    [3 0]            [0]
    [1 4] * [x1]  +  [0]
    [1 0]   [x2]     [5]

    right?

    In this case it'd be linearly independent, but it'd only span a plane in R3.
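
    Here's a quick rank check of that claim for the matrix part of the map (my own sketch, ignoring the constant [0 0 5] shift):

    [code]
    import numpy as np

    # The matrix part of the map above, sending R^2 into R^3
    A = np.array([[3, 0],
                  [1, 4],
                  [1, 0]], dtype=float)

    rank = np.linalg.matrix_rank(A)
    print(rank == A.shape[1])    # True: the 2 columns are independent, so the map is 1-1
    print(rank == A.shape[0])    # False: rank 2 < 3, so the image is only a plane in R^3
    [/code]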
     
  12. Nov 3, 2012 #11

    micromass


    Right, but the domain of that is not [itex]\mathbb{R}^3[/itex], as you claimed before.
     
  13. Nov 3, 2012 #12
    Well, the domain of the matrix is indeed R3...

    [3 0]
    [1 4]
    [1 0]
     
  14. Nov 3, 2012 #13

    micromass


    Really? So what is the image of (1,1,1) then?
     
  15. Nov 3, 2012 #14
    Oh wait, never mind... the domain is R2... but either way:

    it wouldn't be onto, because you're still only spanning a plane in R3,

    but if you look at the transformation matrix, it's linearly independent.
     