# Proof concerning similarity between matrices of Linear Transformations

1. Jun 19, 2009

### WiFO215

1. The problem statement, all variables and given/known data
Let V be a finite dimensional vector space over the field F and let S and T be linear operators on V. We ask: When do there exist ordered bases B and B' for V such that [S]_B = [T]_{B'}? Prove that such bases exist only if there is an invertible linear operator U on V such that T = USU^{-1}.

3. The attempt at a solution

I have a hand-waving argument and am not too sure of what to do.

Assume that T = USU^{-1}.

Multiply both sides on the left by U^{-1}:

U^{-1}T = SU^{-1}.

Now, since S, U, and T are functions, let's "plug in" a vector whose coordinates are given with respect to B', say Y:

U^{-1}T(Y) = SU^{-1}(Y)

On the left-hand side, T "acts on" Y using its matrix with respect to B', since Y is given in B', and then U^{-1} takes the result from B' to B. On the right-hand side, Y first gets converted to a vector in B by being "acted on" by U^{-1}, and is then transformed by S. The two vectors on either side are equal, so their coordinate matrices with respect to B and B' are equal, and hence [T]_{B'} = [S]_B.
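As a numerical sanity check of the claim (not part of the proof; the matrices and names here are my own illustration), one can pick a random S and an invertible U, set T = USU^{-1}, and verify that the matrix of T in the basis B' = {Ue_1, ..., Ue_n} equals the matrix of S in the standard basis B = {e_1, ..., e_n}:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
S = rng.standard_normal((n, n))   # matrix of S in the standard basis B
U = rng.standard_normal((n, n))   # a random U; invertible with probability 1
T = U @ S @ np.linalg.inv(U)      # define T = U S U^{-1}

# The basis B' consists of the columns of U (b'_i = U e_i), so U is the
# change-of-basis matrix from B'-coordinates to B-coordinates, and the
# matrix of T in B' is U^{-1} T U.
T_in_Bprime = np.linalg.inv(U) @ T @ U

print(np.allclose(T_in_Bprime, S))  # True: [T]_{B'} = [S]_B
```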

Apart from this vague line of thinking, I don't know where to start on a formal proof. Please and thank you.

Last edited: Jun 20, 2009
2. Jun 20, 2009

### Dick

It's probably not going to be clear to everyone what [S]_B = [T]_{B'} means. You do mean the matrix of S in the basis B is the same as the matrix of T in the basis B', right?

3. Jun 20, 2009

### WiFO215

I'm sorry. Yes I do mean the matrix of S in basis B and that of T in B'. I see that notation used in Hoffman and Kunze so I thought it was universal notation. My mistake.

4. Jun 20, 2009

### HallsofIvy

There is no such thing as "universal" notation! Always explain your notation.

Remember that a linear transformation may have many different matrix representations for different bases. In fact, two matrices are "similar" if and only if they represent the same linear transformation in different bases. Suppose v is a vector; let $v_B$ represent its column-matrix representation in basis B and $v_{B'}$ represent its column-matrix representation in basis B'. Can you show that $T(v)_{B'}$ and $S(v)_{B}$ represent the same vector?
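For reference, this is the standard change-of-basis identity (my notation, not from the thread): if P is the invertible matrix whose columns are the B'-basis vectors written in B-coordinates, then coordinates and matrix representations transform as

```latex
v_{B'} = P^{-1} v_B, \qquad [T]_{B'} = P^{-1}\,[T]_B\,P
```

which is precisely what "similar matrices represent the same linear transformation in different bases" means.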

5. Jun 20, 2009

### WiFO215

Oh I see. You are saying S and T represent the same transformation in different bases since T = USU^{-1}? So then I suppose I could show [T]_{B'}[v]_{B'} = [S]_B[v]_B.

So this would be the proof:

S and T do pretty much the same thing; the only catch is that they do it with respect to different bases. So when you plug the vector v into T with respect to B', or into S with respect to B, you land up with the same vector. Since I land up with the same vector on either side, I can argue that the coordinate matrix of v with respect to B' multiplied by the matrix of T in B' is the same as the coordinate matrix of v with respect to B multiplied by the matrix of S in B.

Am I correct?

6. Jun 21, 2009

### WiFO215

Anyone?

7. Jun 21, 2009

### Dick

Just make your 'vague' line of thinking less vague. Pick a basis B = <b1, ..., bn>, so Sb_i = s_{1i}b_1 + s_{2i}b_2 + ... + s_{ni}b_n. If S = U^{-1}TU, can you show the matrix of T is the same in the basis B' = <Ub_1, ..., Ub_n>?
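To make this hint concrete, here is the one-line computation it is pointing at (same symbols as the post above, with T = USU^{-1}): applying T to the i-th vector of B' gives

```latex
T(Ub_i) = (USU^{-1})(Ub_i) = U(Sb_i)
        = U\Bigl(\sum_{j=1}^{n} s_{ji}\, b_j\Bigr)
        = \sum_{j=1}^{n} s_{ji}\,(Ub_j)
```

so the i-th column of [T]_{B'} has entries s_{1i}, ..., s_{ni}, which is exactly the i-th column of [S]_B.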

8. Jun 21, 2009

### WiFO215

Hurray! Yes I can! Yes I can! Thanks Dick and Halls!!