Matrix Representation of a Linear Transformation

In summary, a matrix representation encodes a linear transformation as a matrix: the transformation is applied to each vector of a chosen ordered basis, the images are expressed in coordinates with respect to the target basis, and those coordinate vectors are written as the columns of the matrix. This representation allows efficient computation and comparison of linear transformations and is widely used in applications such as computer graphics and machine learning. The matrix depends on the choice of bases, so the same transformation can have different matrix representations.
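For a quick illustration (a standard textbook example, not specific to the thread below): rotation of ##\mathbb{R}^2## by an angle ##\theta##, using the standard basis on both sides, sends ##e_1\mapsto(\cos\theta,\sin\theta)## and ##e_2\mapsto(-\sin\theta,\cos\theta)##, so its matrix is
$$A=\begin{pmatrix}\cos\theta & -\sin\theta\\ \sin\theta & \cos\theta\end{pmatrix}.$$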
  • #1
KT KIM
[attached image: la_1.png]


This is where I am stuck. I studied ordered bases and coordinate vectors before this,
and of course I studied vector spaces, bases, linear..., etc. too.
However, I can't understand just this part (maybe this whole part). Especially

[attached image: la_2.png]

this one, which says that ##[[T(b_1)]]_C,\ldots,[[T(b_n)]]_C## are the columns of the matrix.

Can anyone please explain to me how this works? I've been stuck here too long.
 
  • #2
KT KIM said:
this one, which says that ##[[T(b_1)]]_C,\ldots,##[[T(b_n)]]_C## are the columns of the matrix.

Can anyone please explain to me how this works?

This is how ##A## is defined ("Define ##A## to be ..."): the images of the basis vectors of ##V## under the transformation ##T##, expressed in coordinates of ##W## with respect to the given basis ##C##, are taken as the column vectors of ##A##.

The author then shows that the ##A## defined this way fully determines the entire transformation ##T##: it maps any vector ##v##, expressed in the coordinates of ##V## with respect to the basis ##\mathit{B}## (the right-hand side), onto the image ##T(v)## expressed in the coordinates of ##W## with respect to the basis ##\mathit{C}## (the left-hand side).

EDIT: In short: the matrix ##A## of ##T## can be written as the images of the basis vectors of ##V##, in ##\mathit{C}##-coordinates, arranged as columns.
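A small numeric sketch of this relation (the map ##T## and the bases below are my own made-up example, not the ones from the quoted text): the matrix built column-by-column from the ##C##-coordinates of ##T(b_i)## does send ##B##-coordinates of ##v## to ##C##-coordinates of ##T(v)##.

```python
import numpy as np

# Made-up example map T: R^2 -> R^2, T(x, y) = (x + 2y, 3y)
def T(x):
    return np.array([x[0] + 2 * x[1], 3 * x[1]])

# Ordered bases: B for the domain V, C for the codomain W (chosen arbitrarily)
B = [np.array([1.0, 1.0]), np.array([1.0, -1.0])]
C = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]  # standard basis, for simplicity

B_mat = np.column_stack(B)  # columns are the B basis vectors
C_mat = np.column_stack(C)  # columns are the C basis vectors

def coords(w, basis_mat):
    # coordinate vector of w with respect to the basis stored in basis_mat's columns
    return np.linalg.solve(basis_mat, w)

# Columns of A are the C-coordinates of T applied to each B basis vector
A = np.column_stack([coords(T(b), C_mat) for b in B])

# Check: A times the B-coordinates of v equals the C-coordinates of T(v)
v = np.array([2.0, 5.0])
assert np.allclose(A @ coords(v, B_mat), coords(T(v), C_mat))
```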
 
  • #3
Any vector ##v\in V## can be uniquely written as a linear combination of ##\{b_1,\ldots,b_n\}##, i.e. ##v = \sum_{i=1}^n \beta_i b_i##. Operating ##T## on ##v##,
$$
Tv = \sum_{i=1}^n \beta_i (Tb_i)
$$
The right-hand side above shows that the action of ##T## on any vector in ##V## is completely determined once you know ##Tb_i## for ##i=1,\ldots,n##.

Now suppose ##A## is the matrix representation of ##T##. In ##k^n##, ##b_1 = (1,0,\ldots,0)^T##, ##b_2 = (0,1,\ldots,0)^T##, and so on. If you multiply ##A## with ##b_1 = (1,0,\ldots,0)^T##, you get a vector in ##k^m## that equals the first column of ##A##, right? Thus the first column of ##A## equals ##T## applied to ##b_1## and written in the ##\mathcal{C}## basis, which is ##[[T(b_1)]]_\mathcal{C}##. A similar argument applies to the other columns of ##A##.
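(A trivial check of that last step in code, using an arbitrary matrix rather than any particular ##A##:)

```python
import numpy as np

A = np.arange(12).reshape(3, 4)          # any 3x4 matrix, arbitrary example values
e1 = np.array([1, 0, 0, 0])              # b_1 written in coordinates: the first standard basis vector
assert np.array_equal(A @ e1, A[:, 0])   # multiplying by e_1 picks out the first column of A
```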
 
  • #4
I strongly suggest you stop here and consider some simple examples with actual numbers. Nothing makes a type of symbolic calculation clearer the first time you encounter it than working through concrete examples.

Use a 2-dimensional vector space over the real numbers. Make up simple numbers that take the place of the abstract symbols in the textbook or notes you quoted for us.

Do the explicit calculation separately for each of the two things that the quote claims are equal. This will very much get you used to this kind of calculation and help you see what is going on.
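For instance (my own choice of numbers, in the spirit of that suggestion): take ##T:\mathbb{R}^2\to\mathbb{R}^2##, ##T(x,y)=(x+y,\,2y)##, with ##\mathit{B}=\{(1,0),(1,1)\}## for the domain and the standard basis ##\mathit{C}=\{e_1,e_2\}## for the codomain. Then
$$[T(b_1)]_C = \begin{pmatrix}1\\0\end{pmatrix},\qquad [T(b_2)]_C = \begin{pmatrix}2\\2\end{pmatrix},\qquad A=\begin{pmatrix}1&2\\0&2\end{pmatrix}.$$
As a check, ##v=(3,1)=2\,b_1+1\,b_2##, so ##[v]_B=(2,1)^T## and ##A[v]_B=(4,2)^T##, which is indeed ##T(3,1)=(4,2)## in the standard coordinates of the codomain.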
 

1. What is a matrix representation of a linear transformation?

A matrix representation of a linear transformation is a way to encode the transformation as a matrix, so that applying the transformation amounts to matrix multiplication: the matrix times the coordinate vector of the input gives the coordinate vector of the output. Each entry of the matrix is the coefficient relating one coordinate of the input to one coordinate of the output.

2. How is the matrix representation of a linear transformation calculated?

The matrix representation of a linear transformation is calculated by first choosing ordered bases for the domain and the codomain. The transformation is then applied to each basis vector of the domain, each image is expressed in coordinates with respect to the codomain basis, and these coordinate vectors are written as the columns of a matrix. The resulting matrix is the matrix representation of the linear transformation.

3. What is the significance of the matrix representation of a linear transformation?

The matrix representation of a linear transformation provides an efficient and organized way to perform calculations involving linear transformations. It also allows for easy comparison and analysis of different linear transformations.

4. Can the matrix representation of a linear transformation change?

Yes, the matrix representation of a linear transformation can change depending on the choice of basis for the vector space. Different basis vectors will result in different matrix representations of the same linear transformation.
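As a small illustration in code (using a made-up map on ##\mathbb{R}^2##, with the same basis used for domain and codomain in each case):

```python
import numpy as np

# Matrix of a made-up map T(x, y) = (x + y, 2y) in the standard basis of R^2
M_standard = np.array([[1.0, 1.0],
                       [0.0, 2.0]])

# An alternative basis B' = {(1, 1), (1, -1)}, stored as the columns of P
P = np.array([[1.0, 1.0],
              [1.0, -1.0]])

# Representation of the same T with B' used for both domain and codomain
M_alt = np.linalg.inv(P) @ M_standard @ P

print(M_standard)   # [[1. 1.] [0. 2.]]
print(M_alt)        # [[2. -1.] [0. 1.]] -- a different matrix for the same transformation
```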

5. How is the matrix representation of a linear transformation used in real-world applications?

The matrix representation of a linear transformation is used in a variety of real-world applications, such as computer graphics, image processing, and machine learning. It allows for efficient manipulation and transformation of data, making it an essential tool in many scientific and technological fields.
