
Linear transformation representation with a matrix

  1. May 19, 2016 #1
    1. The problem statement, all variables and given/known data

    For the linear transformation T: R2 --> R2 defined by T(x1, x2) = (x1 + x2, 2x1 - x2), use the matrix A to find T(v), where v = (2, 1). B = {(1, 2), (-1, 1)} and B' = {(1, 0), (0, 1)}.


    2. Relevant equations

    T(v) is given: T(x1, x2) = (x1 + x2, 2x1 - x2)

    3. The attempt at a solution

    Okay, I see that T(v) is simply (2+1, 2*2-1) --> (3, 3), but this matrix business has me a bit confused. From my textbook:

    Using the basis B = {(1, 2), (-1, 1)}, you find that v = (2, 1) = 1(1, 2) - 1(-1, 1), which implies [v]B = [1 -1]T.

    Now, where did this seemingly 'magical' 1 and -1 come from? The matrix relative to B and B' works out to be {(3, 0), (0, -3)}, but I have no idea where this [1 -1]T came from, nor what it is. I see that this is the coordinate matrix for v relative to B, because A*[v]B in this case gives (3, 3), but I have no idea how it was derived. Perhaps there's some small gem or concept here that I'm missing?
     
  3. May 19, 2016 #2

    Mark44

    Staff: Mentor

    The coordinates of a vector represent the constants that multiply the vectors in a basis.
    In the standard basis for ##\mathbb{R}^2##, the vector ##\begin{bmatrix} 3 \\ 2\end{bmatrix}## means ##3\begin{bmatrix} 1 \\ 0\end{bmatrix} + 2\begin{bmatrix} 0\\ 1\end{bmatrix}##.
    What you show as ##[v]_B## are the coordinates of v relative to the basis B; that is, the coefficients that combine the vectors of B to produce ##\begin{bmatrix} 2 \\ 1 \end{bmatrix}##.
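
    To make the point above concrete: the coordinates ##[v]_B## are the solution of the linear system whose coefficient columns are the B basis vectors. A minimal NumPy sketch (my own illustration; variable names are not from the thread):

    ```python
    import numpy as np

    # Columns are the basis vectors of B = {(1, 2), (-1, 1)}
    B = np.array([[1, -1],
                  [2,  1]], dtype=float)
    v = np.array([2, 1], dtype=float)

    # Solve B @ c = v for the coordinate vector c = [v]_B
    c = np.linalg.solve(B, v)
    print(c)  # [ 1. -1.]
    ```

    So the 'magical' 1 and -1 are just the unique solution of c1*(1, 2) + c2*(-1, 1) = (2, 1).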
     
  4. May 19, 2016 #3
    Okay, so

    v = (2, 1) = 1(1, 2) - 1(-1, 1)

    because
    1*1 - 1*(-1) = 2
    1*2 - 1*1 = 1

    So, the book simply skipped this piece of the explanation?
     
  5. May 19, 2016 #4

    Ray Vickson

    Science Advisor
    Homework Helper

    Yes.
     
  6. May 20, 2016 #5
    Thanks gents, this got me past this head-scratcher, and then I had an on-the-fly tutor session yesterday and I get it now. To find the matrix of a transformation relative to two nonstandard bases, and apply it to a vector given in the standard basis:

    - To get the matrix relative to the bases, apply T to each vector of the first nonstandard basis, say B1, and express each resulting vector as a linear combination of the second nonstandard basis, say B2. These coordinate vectors are the columns of A.
    - To get v expressed in terms of B1, write v as a linear combination of the vectors of B1; the coefficients are [v]B1.
    - We then multiply by the matrix: A[v]B1 gives T(v) expressed in coordinates relative to B2.
     