# Homework Help: Linear transformation representation with a matrix

1. May 19, 2016

### patricio2626

1. The problem statement, all variables and given/known data

For the linear transformation T: R2 --> R2 defined by T(x1, x2) = (x1 + x2, 2x1 - x2), use the matrix A to find T(v), where v = (2, 1), B = {(1, 2), (-1, 1)}, and B' = {(1, 0), (0, 1)}.

2. Relevant equations

T(v) is given, (x1+x2, 2x1-x2)

3. The attempt at a solution

Okay, I see that T(v) is simply (2+1, 2*2-1) --> (3, 3), but this matrix business has me a bit confused. From my textbook:

Using the basis B = {(1, 2), (-1, 1)}, you find that v = (2, 1) = 1(1, 2) - 1(-1, 1), which implies [v]_B = [1, -1]^T.

Now, where did this seemingly 'magical' 1 and -1 come from? The matrix relative to B and B' has columns T(1, 2) = (3, 0) and T(-1, 1) = (0, -3), i.e. A = {(3, 0), (0, -3)}, but I have no idea where this [1, -1]^T came from, nor what it is. I can check that A[v]_B in this case gives (3, 3), so [1, -1]^T must be the coordinate matrix of v relative to B, but I have no idea how it was derived. Perhaps there's some small gem or concept here that I'm missing?
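(As a numerical sanity check, not from the textbook: the 1 and -1 are just the solution of the linear system B c = v, with the basis vectors of B as the columns of a matrix. A minimal numpy sketch, assuming floating-point arithmetic is acceptable here:)

```python
import numpy as np

# Basis B as columns of a matrix: b1 = (1, 2), b2 = (-1, 1)
B = np.array([[1.0, -1.0],
              [2.0,  1.0]])
v = np.array([2.0, 1.0])

# The coordinates [v]_B are the c solving B @ c = v,
# i.e. c1*(1, 2) + c2*(-1, 1) = (2, 1)
c = np.linalg.solve(B, v)
print(c)  # c is (1, -1), matching [v]_B = [1, -1]^T
```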

Last edited: May 19, 2016
2. May 19, 2016

### Staff: Mentor

The coordinates of a vector represent the constants that multiply the vectors in a basis.
In the standard basis for $\mathbb{R}^2$, the vector $\begin{bmatrix} 3 \\ 2\end{bmatrix}$ means $3\begin{bmatrix} 1 \\ 0\end{bmatrix} + 2\begin{bmatrix} 0\\ 1\end{bmatrix}$.
What you show as $[v]_B$ are the coefficients on the vectors of basis B that combine to give $\begin{bmatrix} 2 \\ 1 \end{bmatrix}$.
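(To make that concrete in code, a tiny sketch of the standard-basis case described above; the names e1, e2 are just illustrative:)

```python
import numpy as np

e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])

# The coordinates (3, 2) in the standard basis mean 3*e1 + 2*e2
w = 3 * e1 + 2 * e2
print(w)  # w is (3, 2)
```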

3. May 19, 2016

### patricio2626

Okay, so

v = (2, 1) = 1(1, 2) - 1(-1, 1)

because
1(1) - 1(-1) = 2
1(2) - 1(1) = 1

So, the book simply skipped this piece of the explanation?

4. May 19, 2016

### Ray Vickson

Yes.

5. May 20, 2016

### patricio2626

Thanks gents, this got me past this headscratcher, and then I had an on-the-fly tutor session yesterday and I get it now. To find the matrix of a transformation relative to nonstandard bases and apply it to a vector expressed in the standard basis:

- To get the matrix A relative to the bases, apply T to each vector of the first nonstandard basis, say B1, and express each image as a linear combination of the second basis, say B2; those coefficient columns form A.
- To get v expressed in terms of B1, write v as a linear combination of the B1 vectors; the coefficients are [v]_B1.
- We then multiply by the matrix: A[v]_B1 gives T(v) expressed in terms of coefficients on B2.
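(The whole procedure above can be sketched numerically; this is my own check, not from the thread, and B2 happens to be the standard basis in this problem, so the B2-coordinate step is trivial:)

```python
import numpy as np

def T(x):
    # The transformation from the problem: T(x1, x2) = (x1 + x2, 2*x1 - x2)
    x1, x2 = x
    return np.array([x1 + x2, 2 * x1 - x2])

# B1 is the nonstandard basis; B2 is the standard basis (as columns)
B1 = [np.array([1.0, 2.0]), np.array([-1.0, 1.0])]
B2 = np.eye(2)

# Columns of A are the B2-coordinates of T applied to each B1 vector
A = np.column_stack([np.linalg.solve(B2, T(b)) for b in B1])

# [v]_B1: express v as a linear combination of the B1 vectors
v = np.array([2.0, 1.0])
v_B1 = np.linalg.solve(np.column_stack(B1), v)

# A @ [v]_B1 gives T(v) in B2-coordinates
result = A @ v_B1
print(A)       # columns (3, 0) and (0, -3)
print(v_B1)    # (1, -1)
print(result)  # (3, 3), agreeing with T(2, 1) computed directly
```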