Linear algebra Matrix with respect to basis

Technique101

Homework Statement



Find the matrix of the linear operator with respect to the given basis B.

D: P_2 -> P_2 defined by D(ax^2 + bx + c) = 2ax + b, B = { 3x^2 + 2x + 1, x^2 - 2x, x^2 + x + 1 }

Homework Equations



None.

The Attempt at a Solution



I set the basis B = { (3,2,1), (1,-2,0), (1,1,1) } by writing each polynomial in coordinates relative to the standard basis {x^2, x, 1},

then I did
D(3,2,1) = 6x + 2 = (0,6,2)
D(1,-2,0) = 2x - 2 = (0,2,-2)
D(1,1,1) = 2x + 1 = (0,2,1)

I believe those are the right first steps, but I'm not sure where to go from here.

Thanks!
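A quick numerical check of these images (a minimal Python sketch; on e-coordinates the operator acts as (a, b, c) -> (0, 2a, b)):

```python
# D(ax^2 + bx + c) = 2ax + b, written on e = {x^2, x, 1} coordinates.
def D(v):
    a, b, c = v
    return (0, 2 * a, b)

# The basis B in e-coordinates.
b1, b2, b3 = (3, 2, 1), (1, -2, 0), (1, 1, 1)

print(D(b1))  # (0, 6, 2)  i.e. 6x + 2
print(D(b2))  # (0, 2, -2) i.e. 2x - 2
print(D(b3))  # (0, 2, 1)  i.e. 2x + 1
```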
 
I would probably start by finding the matrix of D in the {x^2, x, 1} basis, then use the change-of-basis matrix built from the B vectors you found to transform it into the new basis
 
now for clarity let the bases be given by
e = {e_1, e_2, e_3} = {x^2,x,1}

b = {b_1, b_2, b_3} = { 3x^2 + 2x + 1, x^2 - 2x, x^2 + x + 1 }

as another way: you already have the effect of D on the b basis vectors, so you could rewrite each resulting vector in terms of the b basis to find D relative to b (call it Db)

though the first method outlined is probably better
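The second method can be sketched numerically (assuming NumPy; the columns of Db are the b-coordinates of each D(b_i), found by solving B c = D(b_i)):

```python
import numpy as np

# Columns of B are the basis polynomials in e = {x^2, x, 1} coordinates.
B = np.array([[3.0,  1.0, 1.0],
              [2.0, -2.0, 1.0],
              [1.0,  0.0, 1.0]])

# Images D(b_i) in e-coordinates, as columns:
# D(b1) = 6x + 2, D(b2) = 2x - 2, D(b3) = 2x + 1.
images = np.array([[0.0,  0.0, 0.0],
                   [6.0,  2.0, 2.0],
                   [2.0, -2.0, 1.0]])

# Solve B @ Db = images; Db is the matrix of D relative to the b basis.
Db = np.linalg.solve(B, images)
print(Db)
```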
 
Hmm, I don't know if I quite follow. So:

D(1,0,0) = 2x = (0,2,0)
D(0,1,0) = 1 = (0,0,1)
D(0,0,1) = 0 = (0,0,0)

So what would be the next step?
 
ok, so given a column vector in the e basis, u^e = (a, b, c)^T, representing ax^2 + bx + c, what is the matrix D^e such that the result v^e is also in the e basis

v^e = D^e.u^e

you've got all the values, just put them in matrix form
D^e =
[ ? ? ? ]
[ ? ? ? ]
[ ? ? ? ]

then you need to find the matrix T that transforms from the b basis to the e basis; so given a vector u^b, what matrix T takes it to u^e

u^e = T.u^b

u^b & u^e are the same vector just written in different bases. The matrix T will be very closely related to the set of vectors B you gave.
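Putting the first method together (a sketch, assuming NumPy): the columns of D^e are D(x^2) = 2x, D(x) = 1, D(1) = 0 in e-coordinates, T has the b vectors as its columns, and the change of basis is Db = T^{-1} D^e T:

```python
import numpy as np

# De: matrix of D in the e = {x^2, x, 1} basis; columns are D(e_i).
De = np.array([[0.0, 0.0, 0.0],
               [2.0, 0.0, 0.0],
               [0.0, 1.0, 0.0]])

# T sends b-coordinates to e-coordinates; its columns are the b vectors.
T = np.array([[3.0,  1.0, 1.0],
              [2.0, -2.0, 1.0],
              [1.0,  0.0, 1.0]])

# Change of basis: Db = T^{-1} De T.
Db = np.linalg.inv(T) @ De @ T
print(Db)
```

Either route gives the same Db, which is a useful consistency check.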
 