Matrices of linear transformations

In summary, we are given a vector space V with an ordered basis β = {v1, v2, ..., vn}. Defining v0 = 0, we are asked to compute the matrix representation [T]β of the linear transformation T: V → V satisfying T(vj) = vj + vj−1 for j = 1, 2, ..., n. In general, [T]γβ = (aij) is m×n, where m is the dimension of γ and n is the dimension of β; here γ = β, so the matrix is n×n. Column j of [T]β holds the coordinates of T(vj) in β, and simple two- and three-dimensional examples make the pattern easy to see.
  • #1
eaglesmath15
1. The question
Let V be a vector space with the ordered basis ##\beta = \{v_1, v_2, \ldots, v_n\}##. Define ##v_0 = 0##. Then there exists a linear transformation ##T: V \to V## such that ##T(v_j) = v_j + v_{j-1}## for ##j = 1, 2, \ldots, n##. Compute ##[T]_\beta##.


Homework Equations


##[T]_\beta^\gamma = (a_{ij})##, ##1 \le i \le m##, ##1 \le j \le n##, where m is the dimension of γ and n is the dimension of β (I think).


The Attempt at a Solution


Basically, I know that ##T(v_1) = v_1## and ##T(v_2) = v_2 + v_1##, all the way up to ##T(v_n) = v_n + v_{n-1}##, but I'm not sure how this helps me form the matrix. Also, I know that the matrix is generally the dimension of the range by the dimension of the domain, which would make this matrix n×n, but I'm just not sure how to get it.
 
  • #2
Let's start with the first equation, ##T(v_1) = v_1##. What is ##[v_1]_{\beta}##?
 
  • #3
jbunniii said:
Let's start with the first equation, ##T(v_1) = v_1##. What is ##[v_1]_{\beta}##?
Do you multiply T(v1) by beta?
 
  • #4
eaglesmath15 said:
Do you multiply T(v1) by beta?
No, ##\beta## isn't a number, so you can't multiply by it. It is an ordered list of the elements of a basis for the vector space ##V##, just like you said: ##\beta = (v_1, v_2, \ldots v_n)##.

If ##A## is a linear transformation from ##V## to ##V##, the notation ##[A]_\beta## means the matrix of ##A## with respect to the basis ##\beta##. Similarly, if ##v## is a vector in ##V##, then ##[v]_\beta## means the matrix (in this case it will be a column vector) of ##v## with respect to ##\beta##.
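As a concrete illustration of ##[v]_\beta## (not from the thread; the basis and vector below are made-up numbers): finding the coordinates of a vector in a basis of ##\mathbb{R}^2## amounts to solving a small linear system, which in the 2×2 case can be done with Cramer's rule.

```python
# Sketch (hypothetical numbers): finding [v]_beta for a basis of R^2.
# [v]_beta is the unique pair (a1, a2) with a1*b1 + a2*b2 = v; for a
# 2x2 system we can use Cramer's rule directly.
def coords_in_basis_2d(b1, b2, v):
    det = b1[0] * b2[1] - b2[0] * b1[1]       # determinant of [b1 b2]
    a1 = (v[0] * b2[1] - b2[0] * v[1]) / det  # column 1 replaced by v
    a2 = (b1[0] * v[1] - v[0] * b1[1]) / det  # column 2 replaced by v
    return (a1, a2)

b1, b2 = (1.0, 1.0), (1.0, -1.0)              # assumed basis of R^2
v = (3.0, 1.0)
print(coords_in_basis_2d(b1, b2, v))          # -> (2.0, 1.0), since v = 2*b1 + b2
```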
 
  • #5
jbunniii said:
No, ##\beta## isn't a number, so you can't multiply by it. It is an ordered list of the elements of a basis for the vector space ##V##, just like you said: ##\beta = (v_1, v_2, \ldots v_n)##.

If ##A## is a linear transformation from ##V## to ##V##, the notation ##[A]_\beta## means the matrix of ##A## with respect to the basis ##\beta##. Similarly, if ##v## is a vector in ##V##, then ##[v]_\beta## means the matrix (in this case it will be a column vector) of ##v## with respect to ##\beta##.
Right, I should have clarified. Do you multiply v1 by the components of beta? So the column vector would be (v1v1 v1v2 ... v1vn) (except going down of course, not across)?
 
  • #6
Do you know what the coordinates of a vector in a basis are?
[itex][v_{1}]_β[/itex] is just the coordinate vector of [itex]v_1[/itex] in β.

Let [itex]β=\{v_1, v_2,...,v_n\}[/itex].
If β is a basis of a vector space V, then every v[itex]\in[/itex]V can be written as
[itex]v=\sum _{i=1}^{n}\alpha _{i}v_i[/itex],
and the [itex]\alpha_1, \alpha_2,...,\alpha_n[/itex] are unique.
Then [itex](\alpha_1, \alpha_2,...,\alpha_n)[/itex] is the coordinate vector of v in β.

Then:
If [itex][T]_{β}=\begin{pmatrix}
a_{11} &... &a_{1n} \\
\vdots & &\vdots \\
a_{n1} &... & a_{nn}
\end{pmatrix}[/itex]
and
[itex][v]_{β}=(\alpha_1, \alpha_2,...,\alpha_n)[/itex]
Then
[itex]
[T]_{β} \cdot [v]_{β}
=
\begin{pmatrix}
a_{11} &... &a_{1n} \\
\vdots & &\vdots \\
a_{n1} &... & a_{nn}
\end{pmatrix}
\cdot
\begin{pmatrix}
\alpha_{1} \\
\vdots \\
\alpha_{n}
\end{pmatrix}
[/itex]

Note: with the definition I'm using (the one from my linear algebra course), we abbreviate [itex](A.x^t)^t[/itex] as [itex]A.x[/itex]; that's why I talk about a row vector but use a column vector in the matrix product.
If something on my post is confusing feel free to ask me to clarify it.
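The matrix-vector product above can be checked numerically. A minimal sketch in Python (the 3×3 matrix is the pattern for this thread's transformation ##T(v_j) = v_j + v_{j-1}## with ##v_0 = 0##; the coordinate vector is an assumed example):

```python
# Sketch: applying [T]_beta to a coordinate vector, (A x)_i = sum_j a_ij * x_j.
def mat_vec(A, x):
    return [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) for row in A]

# [T]_beta for T(v_j) = v_j + v_{j-1} (v_0 = 0) in dimension 3
T = [[1, 1, 0],
     [0, 1, 1],
     [0, 0, 1]]

print(mat_vec(T, [1, 0, 0]))  # -> [1, 0, 0]: T(v_1) = v_1
print(mat_vec(T, [0, 1, 0]))  # -> [1, 1, 0]: T(v_2) = v_2 + v_1
```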
 
  • #7
Have you looked at simple two and three dimensional examples?
In two dimensions, [itex]T(v_1)= v_1[/itex] and [itex]T(v_2)= v_2+ v_1[/itex]. Using the coordinate vectors of those as columns, the matrix representation of T is
[tex]\begin{bmatrix}1 & 1 \\ 0 & 1 \end{bmatrix}[/tex]

In three dimensions, [itex]T(v_1)= v_1[/itex], [itex]T(v_2)= v_2+ v_1[/itex] and [itex]T(v_3)= v_3+ v_2[/itex]. Using those as columns, the matrix representation of T is
[tex]\begin{bmatrix}1 & 1 & 0 \\ 0 & 1 & 1 \\ 0 & 0 & 1\end{bmatrix}[/tex]
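For general ##n##, since the problem states ##T(v_j) = v_j + v_{j-1}## with ##v_0 = 0##, column ##j## of ##[T]_\beta## has a 1 in row ##j## and, for ##j > 1##, a 1 in row ##j - 1##. A sketch in Python of how the matrix is built for any ##n##:

```python
# Sketch: build [T]_beta for T(v_j) = v_j + v_{j-1} (with v_0 = 0).
# Column j holds the beta-coordinates of T(v_j): a 1 in row j and,
# for j > 1, a 1 in row j-1.
def matrix_of_T(n):
    A = [[0] * n for _ in range(n)]
    for j in range(n):        # 0-based column index
        A[j][j] = 1           # coefficient of v_{j+1} in T(v_{j+1})
        if j > 0:
            A[j - 1][j] = 1   # coefficient of the previous basis vector v_j
    return A

for row in matrix_of_T(3):
    print(row)
# -> [1, 1, 0]
#    [0, 1, 1]
#    [0, 0, 1]
```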
 

1. What is a matrix of linear transformation?

A matrix of linear transformation is a rectangular array of numbers that represents a linear transformation between two vector spaces. It is used to describe how a set of coordinates in one vector space is transformed into another set of coordinates in a different vector space.

2. How is a matrix of linear transformation related to linear algebra?

Linear algebra is the branch of mathematics that deals with vector spaces and linear transformations. Matrices of linear transformation are an important tool in linear algebra as they allow us to represent and manipulate linear transformations in a concise and efficient manner.

3. How do you multiply two matrices of linear transformation?

To multiply two matrices of linear transformation, the number of columns in the first matrix must be equal to the number of rows in the second matrix. The resulting matrix will have the same number of rows as the first matrix and the same number of columns as the second matrix. The individual elements of the resulting matrix are calculated by multiplying corresponding elements from the first and second matrix and then summing them up.
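The rule above can be written out directly. A minimal sketch in Python (the 2×2 matrices are made-up examples, not from the thread):

```python
# Sketch of the rule described above: C[i][k] = sum_j A[i][j] * B[j][k].
def mat_mul(A, B):
    rows, inner, cols = len(A), len(B), len(B[0])
    assert len(A[0]) == inner, "columns of A must equal rows of B"
    return [[sum(A[i][j] * B[j][k] for j in range(inner)) for k in range(cols)]
            for i in range(rows)]

A = [[1, 2], [3, 4]]     # hypothetical 2x2 example
B = [[0, 1], [1, 0]]     # swaps the columns of A
print(mat_mul(A, B))     # -> [[2, 1], [4, 3]]
```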

4. What are some real-world applications of matrices of linear transformation?

Matrices of linear transformation have various applications in fields such as computer graphics, data analysis, and engineering. They are used to rotate, scale, and translate objects in computer graphics, to analyze and manipulate data in data science, and to model and solve engineering problems.
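As one small illustration of the computer-graphics use mentioned above, here is a sketch in Python of the standard 2D rotation matrix applied to a point (the angle and point are made-up values for illustration):

```python
# Sketch: rotating a point by theta radians about the origin using the
# rotation matrix [[cos t, -sin t], [sin t, cos t]].
import math

def rotate(point, theta):
    x, y = point
    c, s = math.cos(theta), math.sin(theta)
    return (c * x - s * y, s * x + c * y)

print(rotate((1.0, 0.0), math.pi / 2))  # approximately (0.0, 1.0): a quarter turn
```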

5. Can a matrix of linear transformation have an inverse?

Not all matrices of linear transformation have an inverse. Only square matrices (matrices with the same number of rows and columns) can have an inverse, and a square matrix is invertible exactly when its determinant is nonzero. The inverse of a matrix of linear transformation can be used to "undo" the original transformation. A matrix of linear transformation that has an inverse is called an invertible matrix.
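For the 2×2 case the formula for the inverse is easy to write out. A sketch in Python (the example matrix is made up): for A = [[a, b], [c, d]] with nonzero determinant ad − bc, the inverse is (1/det) · [[d, −b], [−c, a]].

```python
# Sketch of the 2x2 inverse formula described above.
def inverse_2x2(A):
    (a, b), (c, d) = A
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is not invertible")
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[1, 1], [0, 1]]      # hypothetical example (a shear)
print(inverse_2x2(A))     # -> [[1.0, -1.0], [0.0, 1.0]], which undoes A
```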
