Mdhiggenz

## Homework Statement

Let b1 = (1, 1, 0)^T, b2 = (1, 0, 1)^T, b3 = (0, 1, 1)^T, and let L be the linear transformation from R^2 into R^3 defined by

L(x) = x1 b1 + x2 b2 + (x1 + x2) b3

Find the matrix A representing L with respect to the bases (e1, e2) and (b1, b2, b3).

## Homework Equations

## The Attempt at a Solution

First thing I did was write out e1 and e2:

e1 = (1, 0)^T, e2 = (0, 1)^T

L(e1) = b1 + b3 = (1, 2, 1)^T

L(e2) = b2 + b3 = (1, 1, 1)^T

So I would assume my A to be (L(e1), L(e2))^T. However, that is incorrect.

I'm not sure what I'm doing wrong; the book does the same steps but gets a different answer.
