# Matrix of a linear transformation

1. Feb 20, 2013

### eaglesmath15

1. The question
Let $V$ be a vector space with the ordered basis $\beta = \{v_1, v_2, \ldots, v_n\}$. Define $v_0 = 0$. Then there exists a linear transformation $T:V \to V$ such that $T(v_j) = v_j + v_{j-1}$ for $j = 1, 2, \ldots, n$. Compute $[T]_\beta$.

2. Relevant equations
$[T]_\beta^\gamma = (a_{ij})$, $1 \le i \le m$, $1 \le j \le n$ (where $m$ is the dimension of $\gamma$ and $n$ is the dimension of $\beta$, I think).

3. The attempt at a solution
Basically, I know that $T(v_1) = v_1$ and $T(v_2) = v_2 + v_1$, all the way up to $T(v_n) = v_n + v_{n-1}$, but I'm not sure how this helps me form the matrix. Also, I know that the matrix is generally the dimension of the range by the dimension of the domain, which would make this matrix $n \times n$, but I'm just not sure how to get it.

2. Feb 20, 2013

### jbunniii

Let's start with the first equation, $T(v_1) = v_1$. What is $[v_1]_{\beta}$?

3. Feb 21, 2013

### eaglesmath15

Do you multiply T(v1) by beta?

4. Feb 21, 2013

### jbunniii

No, $\beta$ isn't a number, so you can't multiply by it. It is an ordered list of the elements of a basis for the vector space $V$, just like you said: $\beta = (v_1, v_2, \ldots v_n)$.

If $A$ is a linear transformation from $V$ to $V$, the notation $[A]_\beta$ means the matrix of $A$ with respect to the basis $\beta$. Similarly, if $v$ is a vector in $V$, then $[v]_\beta$ means the matrix (in this case it will be a column vector) of $v$ with respect to $\beta$.
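[Editor's note: as an illustrative aside, the coordinate vector $[v]_\beta$ can be computed numerically by solving a linear system whose columns are the basis vectors. A minimal NumPy sketch, with an assumed example basis of $\mathbb{R}^2$ (not from the thread):]

```python
import numpy as np

# Assumed example basis of R^2: the columns of B are the basis vectors v1, v2
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])

v = np.array([3.0, 2.0])

# [v]_beta is the unique coefficient vector with B @ coords = v
coords = np.linalg.solve(B, v)
print(coords)  # the coordinates of v with respect to beta
```

Here $v = 1 \cdot v_1 + 2 \cdot v_2$, so the coordinate vector is $(1, 2)$.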

5. Feb 21, 2013

### eaglesmath15

Right, I should have clarified. Do you multiply v1 by the components of beta? So the column vector would be (v1v1 v1v2 ... v1vn) (except going down of course, not across)?

6. Feb 21, 2013

### SqueeSpleen

Do you know what the coordinates of a vector with respect to a basis are?
$[v_{1}]_\beta$ is just the vector of coordinates of $v_1$ in $\beta$.

Let $\beta = \{v_1, v_2, \ldots, v_n\}$.
If $\beta$ is a basis of a vector space $V$, then every $v \in V$ can be written as
$v = \sum_{i=1}^{n} \alpha_i v_i$,
and the scalars $\alpha_1, \alpha_2, \ldots, \alpha_n$ are unique.
Then:
$(\alpha_1, \alpha_2, \ldots, \alpha_n)$ are the coordinates of $v$ in $\beta$.

Then:
If $[T]_{β}=\begin{pmatrix} a_{11} &... &a_{1n} \\ \vdots & &\vdots \\ a_{n1} &... & a_{nn} \end{pmatrix}$
and
$[v]_{β}=(\alpha_1, \alpha_2,...,\alpha_n)$
Then
$[T]_{β} \cdot [v]_{β} = \begin{pmatrix} a_{11} &... &a_{1n} \\ \vdots & &\vdots \\ a_{n1} &... & a_{nn} \end{pmatrix} \cdot \begin{pmatrix} \alpha_{1} \\ \vdots \\ \alpha_{n} \end{pmatrix}$

Note: with the convention I'm using (the one I learned in my Linear Algebra course), we shorten $(A \cdot x^t)^t$ to $A \cdot x$; that's why I write the coordinates as a row vector even though they appear as a column vector in the matrix product.
If anything in my post is confusing, feel free to ask me to clarify it.
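[Editor's note: the product $[T]_\beta \cdot [v]_\beta$ above, which yields the coordinates of $T(v)$, can be checked numerically. A small NumPy sketch with a made-up $3 \times 3$ matrix and coordinate vector (both are assumptions for illustration, not values from the problem):]

```python
import numpy as np

# Hypothetical matrix [T]_beta and coordinate vector [v]_beta
T_beta = np.array([[1.0, 2.0, 0.0],
                   [0.0, 1.0, 3.0],
                   [4.0, 0.0, 1.0]])
alpha = np.array([1.0, 1.0, 2.0])  # coordinates of v in beta

# [T(v)]_beta = [T]_beta @ [v]_beta
result = T_beta @ alpha
print(result)
```

The resulting column holds the coordinates of $T(v)$ in the same basis $\beta$, which is exactly what makes the matrix representation useful.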

Last edited: Feb 21, 2013
7. Feb 21, 2013

### HallsofIvy

Have you looked at simple two- and three-dimensional examples?
In two dimensions, $T(v_1) = v_1$ and $T(v_2) = v_2 + v_1$. Using the coordinate vectors of these as columns, the matrix representation of $T$ is
$$\begin{bmatrix}1 & 1 \\ 0 & 1 \end{bmatrix}$$

In three dimensions, $T(v_1) = v_1$, $T(v_2) = v_2 + v_1$, and $T(v_3) = v_3 + v_2$. Using those as columns, the matrix representation of $T$ is
$$\begin{bmatrix}1 & 1 & 0 \\ 0 & 1 & 1 \\ 0 & 0 & 1\end{bmatrix}$$
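[Editor's note: the pattern generalizes directly to $n$ dimensions. Since $T(v_j) = v_j + v_{j-1}$ (with $v_0 = 0$), column $j$ of $[T]_\beta$ is $e_j + e_{j-1}$, giving 1s on the main diagonal and the superdiagonal. A NumPy sketch that builds the matrix and verifies its action on each basis coordinate vector:]

```python
import numpy as np

n = 5
# [T]_beta for T(v_j) = v_j + v_{j-1}: ones on the main diagonal
# (from the v_j term) and on the superdiagonal (from the v_{j-1} term)
T_beta = np.eye(n, dtype=int) + np.eye(n, k=1, dtype=int)

# Check: applying T_beta to e_j (the coordinates of v_j) gives
# e_j + e_{j-1}, matching T(v_j) = v_j + v_{j-1} with v_0 = 0
for j in range(n):
    e = np.zeros(n, dtype=int)
    e[j] = 1
    expected = e.copy()
    if j > 0:
        expected[j - 1] += 1  # the v_{j-1} contribution (absent when j = 0)
    assert np.array_equal(T_beta @ e, expected)

print(T_beta)
```

For $n = 2$ and $n = 3$ this reproduces the matrices in the examples above.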