Linear algebra, linear transformation

Mdhiggenz

Homework Statement



Let ##b_1 = (1, 1, 0)^T##, ##b_2 = (1, 0, 1)^T##, ##b_3 = (0, 1, 1)^T##, and let L be the linear transformation from ##\mathbb{R}^2## into ##\mathbb{R}^3## defined by

##L(x) = x_1 b_1 + x_2 b_2 + (x_1 + x_2) b_3.##

Find the matrix A representing L with respect to the bases ##(e_1, e_2)## and ##(b_1, b_2, b_3)##.

Homework Equations


The Attempt at a Solution



The first thing I did was write out my ##e_1## and ##e_2##:

##e_1 = (1, 0)##

##e_2 = (0, 1)##

##L(e_1) = b_1 + b_3 = (1, 2, 1)^T##

##L(e_2) = b_2 + b_3 = (1, 1, 1)^T##

So I would assume my A to be ##(L(e_1), L(e_2))^T##. However, that is incorrect.

I'm not sure what I'm doing incorrectly; the book does the same steps but gets a different answer.
 
Row i, column j of A is ##(Le_j)_i## (i.e. the ith component of ##Le_j## in the given ordered basis). This makes ##Le_1## the first column, but you made it the first row.

Also, you computed ##b_2+b_3## wrong.
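
If it helps to see it numerically, here is a minimal sketch in numpy checking both points. The names ``B``, ``c1``, ``c2`` are my own, not from the problem, and I'm reading "the given ordered basis" as ##(b_1, b_2, b_3)##:

```python
import numpy as np

# Basis vectors from the problem statement.
b1 = np.array([1, 1, 0])
b2 = np.array([1, 0, 1])
b3 = np.array([0, 1, 1])

# L(x) = x1*b1 + x2*b2 + (x1 + x2)*b3, applied to e1 = (1, 0) and e2 = (0, 1).
Le1 = b1 + b3                    # -> [1 2 1]
Le2 = b2 + b3                    # -> [1 1 2], not (1, 1, 1)^T

# Coordinates of L(e_j) in the ordered basis (b1, b2, b3):
# solve B c = L(e_j), where B has b1, b2, b3 as its columns.
B = np.column_stack((b1, b2, b3))
c1 = np.linalg.solve(B, Le1)     # -> [1. 0. 1.]
c2 = np.linalg.solve(B, Le2)     # -> [0. 1. 1.]

# L(e_j) supplies the j-th COLUMN of A, so stack column-wise, not row-wise.
A = np.column_stack((c1, c2))
print(A)
# [[1. 0.]
#  [0. 1.]
#  [1. 1.]]
```

Note that the coordinate columns can also be read directly off the definition of L: the coefficients of ##b_1, b_2, b_3## at ##x = e_1## are ##(1, 0, 1)## and at ##x = e_2## are ##(0, 1, 1)##; the solve step just confirms this.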
 