Linear algebra, basis, linear transformation and matrix representation

fluidistic
Homework Statement


In a given basis \{ e_i \} of a vector space, a linear transformation and a given vector of this vector space are respectively determined by \begin{pmatrix} 2 & 1 & 0 \\ 1 & 2 & 0\\ 0&0&5\\ \end{pmatrix} and \begin{pmatrix} 1 \\ 2 \\3 \end{pmatrix}.
Find the matrix representations of the transformation and the vector in a new basis such that the old one is represented by e_1 =\begin{pmatrix} 1 \\ 1 \\0 \end{pmatrix}, e_2=\begin{pmatrix} 1 \\ -1 \\0 \end{pmatrix} and e_3=\begin{pmatrix} 0 \\ 0 \\1 \end{pmatrix}.

Homework Equations


Not even sure.

The Attempt at a Solution


I suspect it has to do with eigenvalues, since this problem comes right after an eigenvalue problem in the assignment; I would not have thought of it otherwise.
It reminds me of similar matrices: if A and B are similar, then A=P^{-1}BP, but nothing more than that.
I've found that the given matrix has 3 distinct eigenvalues, hence 3 eigenvectors (nonzero by definition) that are linearly independent (since the eigenvalues are all distinct), and thus the matrix is similar to a diagonal one with the eigenvalues 1, 3 and 5 on the main diagonal.
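Writing out the characteristic polynomial of the given matrix to check: \det \begin{pmatrix} 2-\lambda & 1 & 0 \\ 1 & 2-\lambda & 0\\ 0&0&5-\lambda\\ \end{pmatrix} = (5-\lambda)\left[(2-\lambda)^2-1\right]=(5-\lambda)(\lambda-1)(\lambda-3), which indeed gives the eigenvalues 1, 3 and 5.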
So now I know that there exists an invertible matrix P such that A=P^{-1}BP, but I'm really stuck on seeing how this can help me.
Any tip is welcome.
 
First you'll need to find the change-of-basis matrix. That is, a matrix A such that if x are coordinates in the old basis, then Ax are coordinates in the new basis.

You know already what Ae_1,~Ae_2,~Ae_3 are. So can you use this to find the matrix A?
 
micromass said:
First you'll need to find the change-of-basis matrix. That is, a matrix A such that if x are coordinates in the old basis, then Ax are coordinates in the new basis.

You know already what Ae_1,~Ae_2,~Ae_3 are. So can you use this to find the matrix A?

Thanks for your help. I'm confused; I didn't know that I knew what the Ae_i are.
Are they the column vectors of the first matrix in my first post?
 
fluidistic said:
Thanks for your help. I'm confused; I didn't know that I knew what the Ae_i are.
Are they the column vectors of the first matrix in my first post?

Oh, no, sorry. I was thinking of the e_i as the original basis.

So, if I rephrase it: you know what the images of (1,0,0), (0,1,0) and (0,0,1) under A are?
 
micromass said:
Oh, no, sorry. I was thinking of the e_i as the original basis.

So, if I rephrase it: you know what the images of (1,0,0), (0,1,0) and (0,0,1) under A are?
Ah, OK. :)
Hmm, not that either. \{ e_i \} isn't necessarily the canonical basis, as far as I know...
 
The coordinates of the basis vectors with respect to their own basis are always (1,0,0), (0,1,0), (0,0,1), regardless of whether it is the canonical basis.

By definition, the coordinates of a vector v with respect to \{e_1,e_2,e_3\} are (\alpha_1,\alpha_2,\alpha_3) such that

v=\alpha_1e_1+\alpha_2e_2+\alpha_3e_3.

From this, we see that the coordinates of e_1 are always (1,0,0).
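For example, take the basis consisting of the three vectors (1,1,0), (1,-1,0), (0,0,1) from your problem. The first of them, expressed in coordinates with respect to this basis itself, is (1,0,0) even though its entries are (1,1,0): it equals 1 times the first basis vector plus 0 times each of the other two.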
 
Hi fluidistic! :smile:

First let's define our symbols.
Let's call your linear transformation matrix A, and your vector v.

What you need is a matrix B that defines a basis transformation.
B is defined by the fact that, for instance, the unit vector (1,0,0) is transformed to your e_1. That is: B(1,0,0) = e_1.

Can you deduce what the matrix B is?

If you apply this B to your vector v, you'll find the representation of v in the alternative basis.
 
I like Serena said:
Hi fluidistic! :smile:

First let's define our symbols.
Let's call your linear transformation matrix A, and your vector v.

What you need is a matrix B that defines a basis transformation.
B is defined by the fact that, for instance, the unit vector (1,0,0) is transformed to your e_1. That is: B(1,0,0) = e_1.

Can you deduce what the matrix B is?

If you apply this B to your vector v, you'll find the representation of v in the alternative basis.
Hi!
Is B= \begin{pmatrix} 1 & 1 & 0 \\ 1& -1 & 0\\ 0& 0 & 1 \end{pmatrix}? It seems to work for me. Its column vectors are the ones given in the problem statement.
 
fluidistic said:
Hi!
Is B= \begin{pmatrix} 1 & 1 & 0 \\ 1& -1 & 0\\ 0& 0 & 1 \end{pmatrix}? It seems to work for me. Its column vectors are the ones given in the problem statement.

Yep, that's it! :smile:
It's simply the given basis vectors as the columns of the matrix B.
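This works because multiplying a matrix by (1,0,0) picks out its first column: \begin{pmatrix} b_{11} & b_{12} & b_{13} \\ b_{21} & b_{22} & b_{23} \\ b_{31} & b_{32} & b_{33} \end{pmatrix} \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix} = \begin{pmatrix} b_{11} \\ b_{21} \\ b_{31} \end{pmatrix}, so the condition B(1,0,0) = e_1 forces the first column of B to be e_1, and likewise for the other two columns.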

Calculate Bv and you have your representation of v in the basis defined by B.

For your matrix A, you need B^{-1}AB.
That is: B first transforms a vector to the basis with respect to which A has been defined, then A is applied, and then B^{-1} transforms the result back.
 
  • #10
I like Serena said:
Yep, that's it! :smile:
It's simply the given basis vectors as the columns of the matrix B.

Calculate Bv and you have your representation of v in the basis defined by B.

For your matrix A, you need B^{-1}AB.
That is: B first transforms a vector to the basis with respect to which A has been defined, then A is applied, and then B^{-1} transforms the result back.
I reached v'= \begin{pmatrix} 3 \\ -1 \\ 3 \end{pmatrix}. And A' is a diagonal matrix with the eigenvalues of A as entries... (3, 1 and 5).
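A quick numerical sanity check of that arithmetic (a minimal NumPy sketch; NumPy is just a convenient tool here, not part of the problem):

import numpy as np

A = np.array([[2, 1, 0],
              [1, 2, 0],
              [0, 0, 5]])        # the given transformation
v = np.array([1, 2, 3])          # the given vector
B = np.array([[1, 1, 0],
              [1, -1, 0],
              [0, 0, 1]])        # the given basis vectors as columns

print(B @ v)                     # [ 3 -1  3]
print(np.linalg.inv(B) @ A @ B)  # diagonal, with 3, 1, 5 on the diagonal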
 
  • #11
That's it! :smile:

I'm surprised myself that A' is diagonal, but I see that it is.

In retrospect it is clear that it is, since the given basis consists of exactly the eigenvectors of A.
Note that A is diagonalized by a change of basis consisting of its eigenvectors.
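Concretely, with the matrices from the problem: \begin{pmatrix} 2 & 1 & 0 \\ 1 & 2 & 0\\ 0&0&5\\ \end{pmatrix} \begin{pmatrix} 1 \\ 1 \\0 \end{pmatrix} = 3\begin{pmatrix} 1 \\ 1 \\0 \end{pmatrix}, \qquad \begin{pmatrix} 2 & 1 & 0 \\ 1 & 2 & 0\\ 0&0&5\\ \end{pmatrix} \begin{pmatrix} 1 \\ -1 \\0 \end{pmatrix} = 1\begin{pmatrix} 1 \\ -1 \\0 \end{pmatrix}, \qquad \begin{pmatrix} 2 & 1 & 0 \\ 1 & 2 & 0\\ 0&0&5\\ \end{pmatrix} \begin{pmatrix} 0 \\ 0 \\1 \end{pmatrix} = 5\begin{pmatrix} 0 \\ 0 \\1 \end{pmatrix}, so the columns of B are eigenvectors of A with eigenvalues 3, 1 and 5, which is exactly why B^{-1}AB comes out diagonal with those numbers on the diagonal.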

I suspect this problem was intended to demonstrate this feature of eigenvectors.
 
  • #12
I like Serena said:
That's it! :smile:

I'm surprised myself that A' is diagonal, but I see that it is.

In retrospect it is clear that it is, since the given basis consists of exactly the eigenvectors of A.
Note that A is diagonalized by a change of basis consisting of its eigenvectors.

I suspect this problem was intended to demonstrate this feature of eigenvectors.

Yes it is. In my first post I saw that A was diagonalizable.
So A=P^{-1}A'P. I just hadn't realized that this P is essentially the change-of-basis matrix B (up to an inverse, depending on the convention).
I had calculated the eigenvalues of A, so I think I had effectively already found A' (up to the order of the eigenvalues on the main diagonal, but I don't think that's relevant).
Is there a faster way I could have found the change-of-basis matrix than the one I used?
In other words, knowing A' from the start, is there a quick way to solve the problem?
 
  • #13
Well, this problem is actually not about eigenvectors.
It's about a basis transformation.
You can't assume that the new basis consists of the eigenvectors!
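For instance, if the new basis had instead been, say, (1,0,0), (1,1,0), (0,0,1) (a made-up basis, not from the problem), the same recipe B^{-1}AB would give \begin{pmatrix} 1 & 0 & 0 \\ 1 & 3 & 0\\ 0&0&5\\ \end{pmatrix}, which is not diagonal.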
 
  • #14
I like Serena said:
Well, this problem is actually not about eigenvectors.
It's about a basis transformation.
You can't assume that the new basis consists of the eigenvectors!

Ok. I think I understand. :smile:
Thanks guys!
 