I don't quite understand what I read

  • Context: Undergrad 
  • Thread starter: Terrell

Discussion Overview

The discussion revolves around understanding linear transformations and their matrix representations, particularly the confusion surrounding the notation and the relationship between vectors in different vector spaces. Participants explore the implications of matrix multiplication in this context, addressing specific equations and representations in linear algebra.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • One participant expresses confusion about the notation used in a transformation from vector space V to W, questioning why y vectors appear instead of x vectors.
  • Another participant suggests that the transformation T is represented by the matrix A, implying that applying T to a basis vector xj from V yields a representation in W.
  • A different participant challenges this view, stating that the appearance of Axj in the quoted text is misleading and emphasizes the importance of distinguishing between a linear transformation and its matrix representation.
  • Some participants note that a transformation can have multiple matrix representations depending on the bases used for the domain and codomain.
  • There is a discussion about the conventional representation of vectors as column vectors and the implications of this choice on the understanding of linear transformations.
  • One participant points out that the equation in question is incorrect, highlighting that Axj is not defined in the context provided.
  • Another participant elaborates on the representation of vectors in different bases and how this affects the interpretation of linear transformations.

Areas of Agreement / Disagreement

Participants do not reach a consensus on the correctness of the equation in question. There are competing views regarding the interpretation of the transformation and its matrix representation, with some asserting that the equation is incorrect while others maintain a different perspective.

Contextual Notes

Participants express uncertainty about the definitions and assumptions underlying the notation used in the equations, particularly regarding the distinction between linear transformations and their matrix representations. There is also mention of potential sloppiness in notation that can lead to confusion.

Terrell
Please check the image file...
I am having a really tough time following what this is saying with all the abstractions. Since it's a transformation from V to W, why does the equation below have y vectors instead of x vectors? And since linear transformations follow traditional matrix multiplication, why is a column vector of A multiplied by the column vectors of W?
 

Attachments

  • linear transformations.png (12 KB)
I think what it's saying is that you have some basis vector ##x_j## from the space V that you would like to represent in the space W, so you apply the transformation T to it.

T is in fact the matrix A.

So you are doing a matrix multiplication of A and ##x_j## to get the ##x_j## vector represented by the ##y_i## basis vectors in the W space.

Does that make sense?

Hopefully @Mark44 will correct me if I'm wrong.
 
jedishrfu said:
T is in fact the matrix A.
No.
jedishrfu said:
Does that make sense?
No, it does not.

The appearance of ##Ax_j## in (6) in the quoted text is misleading and incorrect. Try to read (6) without it. The scalars ##a_{1j},\ldots,a_{mj}## appearing in the linear combination are the matrix elements of the representation ##A## of ##T## w.r.t. the particular bases in ##V## and ##W##.
 
Thanks Krylov. I'll remember to add you to the post when I need verification.

I guess I looked at it in a more operational sense, where you have the linear transformation T implemented as the matrix A, and you then apply it to a vector in the V space to get its representation in the W space via matrix multiplication. It's been many years since I've played with this, and when I'm not sure I call in an advisor.
 
jedishrfu said:
T is in fact the matrix A.
A is a matrix representation of the transformation T. A transformation can have multiple matrix representations, depending on the bases for the domain and codomain.

I've been busy with some other things this morning and early afternoon -- I'll try to get back to this a little later.
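The point that one transformation has many matrix representations can be sketched numerically. Below is a minimal NumPy illustration, with a hypothetical map ##T## and basis choices invented for the example (none of this comes from the thread's image): the same ##T## yields different matrices depending on which bases are chosen for the domain and codomain.

```python
import numpy as np

# A hypothetical linear map T: R^2 -> R^2, T(v) = (v0 + v1, v1).
def T(v):
    return np.array([v[0] + v[1], v[1]])

def matrix_of(T, domain_basis, codomain_basis):
    """Matrix of T w.r.t. the given bases (each basis is a list of vectors).

    Column j solves  C @ coords = T(b_j), where C has the codomain
    basis vectors as its columns.
    """
    C = np.column_stack(codomain_basis)
    return np.linalg.solve(C, np.column_stack([T(b) for b in domain_basis]))

e = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]    # standard basis
f = [np.array([1.0, 1.0]), np.array([1.0, -1.0])]   # a different basis

A_std = matrix_of(T, e, e)  # [[1, 1], [0, 1]]
A_alt = matrix_of(T, f, f)  # different entries, same transformation T
```

Both matrices represent the same ##T##; only the coordinates used to describe the input and output vectors differ.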
 
Krylov said:
No.

No, it does not.

The appearance of ##Ax_j## in (6) in the quoted text is misleading and incorrect. Try to read (6) without it. The scalars ##a_{1j},\ldots,a_{mj}## appearing in the linear combination are the matrix elements of the representation ##A## of ##T## w.r.t. the particular bases in ##V## and ##W##.
So the equation is incorrect? I think it is.
 
Terrell said:
So the equation is incorrect? I think it is.
Yes, it is incorrect. The vectors ##T(x_j)## and ##a_{1j}y_1 + a_{2j}y_2 + \ldots + a_{mj}y_m## are in W, which could be any ##m##-dimensional vector space.
##Ax_j## is not even defined, because ##A## maps from ##\mathbb{R}^n## to ##\mathbb{R}^m## but ##x_j## is a vector in ##V##, which could be any ##n##-dimensional vector space.

When you are learning linear algebra, it is important that your book distinguishes carefully between a linear transformation ##T## and its matrix representation ##A## w.r.t. a certain pair of bases. Afterwards, people often become sloppy about it. (There is a good reason for this sloppiness, but I do not consider it helpful at the beginning.)

(I remember that a tutorial was written about this topic a while ago: Matrix Representations of Linear Transformations. Its notation is different from your book, though.)
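The relationship Krylov describes can be sketched in NumPy. This is a hypothetical example (the map ##T## and the bases are invented, not taken from the thread's image): the scalars ##a_{1j},\ldots,a_{mj}## are found by expanding ##T(x_j)## in the ##y##-basis, and they become column ##j## of the representation ##A##. Here ##V = \mathbb{R}^3## and ##W = \mathbb{R}^2## merely stand in for arbitrary ##n##- and ##m##-dimensional spaces.

```python
import numpy as np

# Hypothetical linear map T: R^3 -> R^2, T(v) = (v0 + v1, 2*v2).
def T(v):
    return np.array([v[0] + v[1], 2.0 * v[2]])

# Chosen bases: x_1, x_2, x_3 for V and y_1, y_2 for W.
X = [np.array([1.0, 0.0, 0.0]),
     np.array([1.0, 1.0, 0.0]),
     np.array([1.0, 1.0, 1.0])]
Y = np.column_stack([np.array([1.0, 1.0]),
                     np.array([0.0, 1.0])])  # columns are y_1, y_2

# Column j of A holds the scalars a_{1j}, ..., a_{mj}: the coordinates
# of T(x_j) in the y-basis, found by solving Y @ coords = T(x_j).
A = np.column_stack([np.linalg.solve(Y, T(xj)) for xj in X])

# Check equation (6) without the "A x_j" part:
# T(x_j) = a_{1j} y_1 + ... + a_{mj} y_m for each j.
for j, xj in enumerate(X):
    assert np.allclose(T(xj), Y @ A[:, j])
```

Note that ##A## acts on coordinate columns in ##\mathbb{R}^3##, not on the abstract vectors ##x_j## themselves, which is exactly why ##Ax_j## is undefined in general.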
 
Mark44 said:
A is a matrix representation of the transformation T. A transformation can have multiple matrix representations, depending on the bases for the domain and codomain.

I've been busy with some other things this morning and early afternoon -- I'll try to get back to this a little later.
Krylov said:
Yes, it is incorrect. The vectors ##T(x_j)## and ##a_{1j}y_1 + a_{2j}y_2 + \ldots + a_{mj}y_m## are in W, which could be any ##m##-dimensional vector space.
##Ax_j## is not even defined, because ##A## maps from ##\mathbb{R}^n## to ##\mathbb{R}^m## but ##x_j## is a vector in ##V##, which could be any ##n##-dimensional vector space.

When you are learning linear algebra, it is important that your book distinguishes carefully between a linear transformation ##T## and its matrix representation ##A## w.r.t. a certain pair of bases. Afterwards, people often become sloppy about it. (There is a good reason for this sloppiness, but I do not consider it helpful at the beginning.)

(I remember that a tutorial was written about this topic a while ago: Matrix Representations of Linear Transformations. Its notation is different from your book, though.)
Exactly the cause of my confusion. I will check the article. Thanks!
 
Terrell said:
why does the equation below have y vectors instead of x?
Are you referring to the right hand side of the equation? The image of the transform is in W, so it would be a vector expressed in the basis for W and that basis is the set of the ##y_i##.

since linear transformations follow traditional matrix multiplication, why is a column vector of A multiplied by the column vectors of W?

The vectors of W aren't necessarily row vectors or column vectors. The usual way to represent a linear transformation is as multiplication of a column vector on the left by a matrix A, which produces another column vector. (However, I suppose there are a few books where the representation is done by multiplying a row vector on the right by a matrix A to produce another row vector. Let's assume your text uses the conventional approach.)

The representation of the linear transformation ##T## by a matrix involves assuming we will represent the vectors in V and W in a certain manner. For example, if ##X## is a vector in ##V## expressed in the (orthonormal) x-basis as ##X = a_1 x_1 + a_2 x_2 + a_3 x_3##, then a "natural" way is to represent ##X## by the ordered triple ##(a_1, a_2, a_3)##. Notice this is not "naturally" a column vector. In fact, it looks like a row vector. But, after reading a lot of algebra texts, one begins to accept that ##X## will be represented by a column vector, i.e. as the transpose ##(a_1, a_2, a_3)^T##.

Using the convention of representing the vectors involved as column vectors, the representation of ##x_2## in the x-basis is ##(0,1,0)^T##. The image of ##x_2## under a linear transformation ##T## will be some vector ##T(x_2)## in W. Suppose W has dimension two and ##T(x_2) = b_1 y_1 + b_2 y_2##, which is represented as ##(b_1, b_2)^T##. Compare ##(b_1, b_2)^T## with the result of multiplying ##(0,1,0)^T## on the left by a matrix ##A## of the appropriate dimensions.

##\begin{pmatrix} a_{11}&a_{12}& a_{13} \\ a_{21}&a_{22}&a_{23} \end{pmatrix} \begin{pmatrix}0\\1\\0\end{pmatrix} = \begin{pmatrix}a_{12}\\a_{22}\end{pmatrix}##

So ##a_{12} = b_1## and ##a_{22} = b_2##. As your text indicated, the second column of ##A## is determined by the coefficients of ##T(x_2)## when it is represented as a vector in the y-basis.
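The column-picking step above can be checked directly in NumPy. The entries below are placeholder values named after their positions (##a_{12} = 12##, and so on), not numbers from the thread's image:

```python
import numpy as np

# A 2x3 matrix with placeholder entries a_{ij} named after their positions.
A = np.array([[11.0, 12.0, 13.0],
              [21.0, 22.0, 23.0]])

# x_2 in the x-basis, written as a column vector.
x2 = np.array([0.0, 1.0, 0.0])

# A @ x2 picks out the second column of A: the y-basis coefficients
# b_1 = a_{12} and b_2 = a_{22} of T(x_2).
b = A @ x2
assert np.allclose(b, A[:, 1])
```

So multiplying by the j-th standard coordinate column always extracts column j of ##A##, which is why the columns of ##A## are exactly the y-basis coefficient lists of the ##T(x_j)##.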
 
