# Linear algebra: linear transformation

1. Mar 17, 2013

### Mdhiggenz

1. The problem statement, all variables and given/known data

Let $b_1 = (1,1,0)^T$, $b_2 = (1,0,1)^T$, $b_3 = (0,1,1)^T$,

and let L be the linear transformation from $\mathbb{R}^2$ into $\mathbb{R}^3$ defined by

$L(x) = x_1 b_1 + x_2 b_2 + (x_1 + x_2) b_3$

Find the matrix A representing L with respect to the bases $(e_1, e_2)$ and $(b_1, b_2, b_3)$.

2. Relevant equations

3. The attempt at a solution

The first thing I did was write out my $e_1$ and $e_2$:

$e_1 = (1,0)^T$

$e_2 = (0,1)^T$

$L(e_1) = b_1 + b_3 = (1,2,1)^T$

$L(e_2) = b_2 + b_3 = (1,1,1)^T$

So I would assume my A to be $(L(e_1), L(e_2))^T$. However, that is incorrect.

I'm not sure what I'm doing incorrectly; the book does the same steps but gets a different answer.
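The vector sums in the attempt above can be checked numerically. A minimal sketch with numpy (the variable names are mine, not from the thread):

```python
import numpy as np

# basis vectors from the problem statement
b1 = np.array([1, 1, 0])
b2 = np.array([1, 0, 1])
b3 = np.array([0, 1, 1])

# L(e1) = b1 + b3 and L(e2) = b2 + b3, in standard coordinates
print(b1 + b3)  # [1 2 1]
print(b2 + b3)  # [1 1 2]
```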

Last edited by a moderator: Mar 17, 2013
2. Mar 17, 2013

### Fredrik

Staff Emeritus
Row i, column j of A is $(Le_j)_i$ (i.e. the ith component of $Le_j$ in the given ordered basis). This makes $Le_1$ the first column, but you made it the first row.

Also, you computed $b_2+b_3$ wrong.
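The two corrections above can be checked with a short numpy sketch (not from the thread; names are illustrative). Each column of A is the coordinate vector of $L(e_j)$ in the basis $(b_1, b_2, b_3)$, which can be recovered by solving $Bc = L(e_j)$ where the columns of B are the basis vectors:

```python
import numpy as np

b1 = np.array([1, 1, 0])
b2 = np.array([1, 0, 1])
b3 = np.array([0, 1, 1])
B = np.column_stack([b1, b2, b3])  # columns are b1, b2, b3

# L(e1) and L(e2) in standard coordinates
Le1 = b1 + b3   # (1, 2, 1)^T
Le2 = b2 + b3   # (1, 1, 2)^T

# coordinates of L(e_j) in the (b1, b2, b3) basis: solve B c = L(e_j)
c1 = np.linalg.solve(B, Le1)   # [1, 0, 1]
c2 = np.linalg.solve(B, Le2)   # [0, 1, 1]

# the coordinate vectors are the COLUMNS of A, not the rows
A = np.column_stack([c1, c2])
print(A)
# [[1. 0.]
#  [0. 1.]
#  [1. 1.]]
```

Note that here the coordinates could also be read off directly, since $L(e_1) = 1 \cdot b_1 + 0 \cdot b_2 + 1 \cdot b_3$ and $L(e_2) = 0 \cdot b_1 + 1 \cdot b_2 + 1 \cdot b_3$; the solve is just the general-purpose way to get them.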