1. The problem statement, all variables and given/known data

Let b1 = (1, 1, 0)^T, b2 = (1, 0, 1)^T, and b3 = (0, 1, 1)^T, and let L be the linear transformation from R^2 into R^3 defined by

L(x) = x1*b1 + x2*b2 + (x1 + x2)*b3.

Find the matrix A representing L with respect to the bases (e1, e2) and (b1, b2, b3).

2. Relevant equations

3. The attempt at a solution

First I wrote out e1 and e2:

e1 = (1, 0), e2 = (0, 1)

L(e1) = b1 + b3 = (1, 2, 1)^T
L(e2) = b2 + b3 = (1, 1, 1)^T

So I assumed A would be the matrix with columns L(e1) and L(e2). However, that is incorrect. I'm not sure what I'm doing wrong; the book does the same steps but gets a different answer.