Matrices of a linear transformation

eaglesmath15
1. The question
Let ##V## be a vector space with the ordered basis ##\beta = \{v_1, v_2, \ldots, v_n\}##. Define ##v_0 = 0##. Then there exists a linear transformation ##T : V \to V## such that ##T(v_j) = v_j + v_{j-1}## for ##j = 1, 2, \ldots, n##. Compute ##[T]_\beta##.


Homework Equations


##[T]_\beta^\gamma = (a_{ij})##, ##1 \le i \le m##, ##1 \le j \le n## (where ##m## is the dimension of ##\gamma## and ##n## is the dimension of ##\beta##, I think).


The Attempt at a Solution


Basically, I know that ##T(v_1) = v_1## and ##T(v_2) = v_2 + v_1##, all the way up to ##T(v_n) = v_n + v_{n-1}##, but I'm not sure how this helps me form the matrix. Also, I know that the matrix is generally the dimension of the range by the dimension of the domain, which would make this matrix ##n \times n##, but I'm just not sure how to get it.
 
Let's start with the first equation, ##T(v_1) = v_1##. What is ##[v_1]_{\beta}##?
 
jbunniii said:
Let's start with the first equation, ##T(v_1) = v_1##. What is ##[v_1]_{\beta}##?
Do you multiply ##T(v_1)## by ##\beta##?
 
eaglesmath15 said:
Do you multiply ##T(v_1)## by ##\beta##?
No, ##\beta## isn't a number, so you can't multiply by it. It is an ordered list of the elements of a basis for the vector space ##V##, just like you said: ##\beta = (v_1, v_2, \ldots v_n)##.

If ##A## is a linear transformation from ##V## to ##V##, the notation ##[A]_\beta## means the matrix of ##A## with respect to the basis ##\beta##. Similarly, if ##v## is a vector in ##V##, then ##[v]_\beta## means the matrix (in this case it will be a column vector) of ##v## with respect to ##\beta##.
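
As a concrete way to see what ##[v]_\beta## means, here is a quick numerical sketch (my own illustration, not from the thread, assuming NumPy): pick a concrete basis of ##\mathbb{R}^3##, store the basis vectors as the columns of a matrix ##B##; then ##[v]_\beta## is the unique solution ##\alpha## of ##B\alpha = v##.

```python
import numpy as np

# Hypothetical example: V = R^3 with basis vectors b1, b2, b3 stored as the
# columns of B. The coordinate vector [v]_beta is the unique alpha solving
# B @ alpha = v, so a linear solve recovers it.
B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])  # columns: b1, b2, b3

v = B[:, 0]                       # v = b1, the first basis vector
alpha = np.linalg.solve(B, v)     # alpha = [v]_beta
print(alpha)                      # [1. 0. 0.] -- so [b1]_beta = e_1
```

That is the point of jbunniii's hint: ##[v_1]_\beta## is the first standard unit vector, no matter what ##v_1## itself looks like.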
 
jbunniii said:
No, ##\beta## isn't a number, so you can't multiply by it. It is an ordered list of the elements of a basis for the vector space ##V##, just like you said: ##\beta = (v_1, v_2, \ldots v_n)##.

If ##A## is a linear transformation from ##V## to ##V##, the notation ##[A]_\beta## means the matrix of ##A## with respect to the basis ##\beta##. Similarly, if ##v## is a vector in ##V##, then ##[v]_\beta## means the matrix (in this case it will be a column vector) of ##v## with respect to ##\beta##.
Right, I should have clarified. Do you multiply ##v_1## by the components of ##\beta##? So the column vector would be ##(v_1 v_1,\ v_1 v_2,\ \ldots,\ v_1 v_n)## (except going down, of course, not across)?
 
Do you know what the coordinates of a vector in a basis are?
##[v_1]_\beta## is just the coordinates of ##v_1## in ##\beta##.

Let ##\beta = \{v_1, v_2, \ldots, v_n\}##. If ##\beta## is a basis of a vector space ##V##, then every ##v \in V## can be written as
$$v = \sum_{i=1}^{n} \alpha_i v_i$$
and ##\alpha_1, \alpha_2, \ldots, \alpha_n## are unique. Then ##(\alpha_1, \alpha_2, \ldots, \alpha_n)## are the coordinates of ##v## in ##\beta##.

Then: if
$$[T]_\beta = \begin{pmatrix} a_{11} & \cdots & a_{1n} \\ \vdots & & \vdots \\ a_{n1} & \cdots & a_{nn} \end{pmatrix}$$
and ##[v]_\beta = (\alpha_1, \alpha_2, \ldots, \alpha_n)##, then
$$[T]_\beta \cdot [v]_\beta = \begin{pmatrix} a_{11} & \cdots & a_{1n} \\ \vdots & & \vdots \\ a_{n1} & \cdots & a_{nn} \end{pmatrix} \cdot \begin{pmatrix} \alpha_1 \\ \vdots \\ \alpha_n \end{pmatrix}$$

Note: With the definition I'm using (the one I learned in my linear algebra course), we shorten ##(A \cdot x^T)^T## to ##A \cdot x##; that's why I talk about a row vector but use a column vector in the matrix product.
If anything in this post is confusing, feel free to ask me to clarify it.
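
To make the matrix-times-coordinates rule concrete, here is a small numerical check (my addition, not from the thread, assuming NumPy, using the map from the original problem with ##n = 3##): the ##j##-th column of ##[T]_\beta## is ##[T(v_j)]_\beta##, and multiplying by ##[v]_\beta## reproduces the coordinates of ##T(v)## computed directly from linearity.

```python
import numpy as np

# For T(v1) = v1, T(v2) = v2 + v1, T(v3) = v3 + v2, the j-th column of
# [T]_beta is [T(v_j)]_beta:
T_beta = np.array([[1.0, 1.0, 0.0],
                   [0.0, 1.0, 1.0],
                   [0.0, 0.0, 1.0]])

# Take v = 2*v1 + 3*v2 + 5*v3, i.e. [v]_beta = (2, 3, 5).
alpha = np.array([2.0, 3.0, 5.0])

# By linearity, T(v) = 2*T(v1) + 3*T(v2) + 5*T(v3)
#                    = (2+3)*v1 + (3+5)*v2 + 5*v3,
# so [T(v)]_beta should be (5, 8, 5).
print(T_beta @ alpha)  # [5. 8. 5.] -- matches the hand computation
```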
 
Have you looked at simple two- and three-dimensional examples?
In two dimensions, ##T(v_1) = v_1## and ##T(v_2) = v_2 + v_1##. Using ##[T(v_1)]_\beta## and ##[T(v_2)]_\beta## as columns, the matrix representation of ##T## is
$$\begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}$$

In three dimensions, ##T(v_1) = v_1##, ##T(v_2) = v_2 + v_1##, and ##T(v_3) = v_3 + v_2##. Using those coordinate vectors as columns, the matrix representation of ##T## is
$$\begin{bmatrix} 1 & 1 & 0 \\ 0 & 1 & 1 \\ 0 & 0 & 1 \end{bmatrix}$$
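
For general ##n## the same pattern continues: 1s on the diagonal and 1s on the first superdiagonal. Here is a short sketch (my addition, assuming NumPy) that builds ##[T]_\beta## for any ##n## and prints the ##n = 4## case:

```python
import numpy as np

def t_matrix(n):
    """[T]_beta for T(v_j) = v_j + v_{j-1} with v_0 = 0: following the 2x2
    and 3x3 examples above, 1s on the diagonal and the first superdiagonal."""
    return np.eye(n) + np.diag(np.ones(n - 1), k=1)

# Column j is [T(v_j)]_beta = e_j + e_{j-1} (just e_1 for j = 1).
print(t_matrix(4))
# [[1. 1. 0. 0.]
#  [0. 1. 1. 0.]
#  [0. 0. 1. 1.]
#  [0. 0. 0. 1.]]
```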
 