
I don't quite understand what I read

  1. Aug 31, 2016 #1
    please check the image file...
    i am having a really tough time following what this is saying with all the abstractions. i am assuming that since it's a transformation from V to W. then why does the equation below have y vectors instead of x? and since linear transformation follow traditional matrix multiplication, then why is a column vector of A multiplied to the column vectors of W???
     

  3. Aug 31, 2016 #2

    jedishrfu

    Staff: Mentor

    I think what it's saying is that you have some basis vector ##x_j## from the V space that you would like to represent in the W space, so you apply the transformation T to it.

    T is in fact the matrix A.

    So you are doing a matrix multiplication of A and ##x_j## to get the ##x_j## vector represented by the ##y_i## basis vectors in the W space.

    Does that make sense?

    Hopefully @Mark44 will correct me if I'm wrong.
     
  4. Aug 31, 2016 #3

    Krylov

    Science Advisor
    Education Advisor

    No.
    No, it does not.

    The appearance of ##Ax_j## in (6) in the quoted text is misleading and incorrect. Try to read (6) without it. The scalars ##a_{1j},\ldots,a_{mj}## appearing in the linear combination are the matrix elements of the representation ##A## of ##T## w.r.t. the particular bases in ##V## and ##W##.
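
    To make this concrete: the ##j##-th column of ##A## holds the coordinates ##a_{1j},\ldots,a_{mj}## of ##T(x_j)## in the ##W##-basis. Here is a minimal NumPy sketch; the map ##T## and both bases are made-up numbers for illustration, not taken from the quoted text:

    ```python
    import numpy as np

    # Hypothetical example: T : R^2 -> R^2 acting as T(v) = M v
    M = np.array([[2.0, 1.0],
                  [0.0, 3.0]])

    # Basis vectors stored as matrix columns (made up for illustration)
    X = np.array([[1.0, 1.0],   # x_1 = (1,0)^T, x_2 = (1,1)^T  (basis of V)
                  [0.0, 1.0]])
    Y = np.array([[1.0, 1.0],   # y_1 = (1,0)^T, y_2 = (1,1)^T  (basis of W)
                  [0.0, 1.0]])

    # Column j of A solves  a_{1j} y_1 + a_{2j} y_2 = T(x_j):
    A = np.linalg.solve(Y, M @ X)

    # Check: expanding each column of A in the y-basis recovers T(x_j)
    assert np.allclose(Y @ A, M @ X)
    ```

    Each column of ##A## is obtained by expanding ##T(x_j)## in the ##y##-basis; the matrix elements are exactly those expansion coefficients, which is (6) read without the ##Ax_j## term.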
     
    Last edited: Aug 31, 2016
  5. Aug 31, 2016 #4

    jedishrfu

    Staff: Mentor

    Thanks Krylov. I'll remember to add you to the post when I need verification.

    I guess I looked at it in a more operational sense, where you have the linear transformation T implemented as the matrix A, and you then apply it to a vector in the V space to get its representation in the W space via matrix multiplication. It's been many years since I've played with this, and when I'm not sure I call in an advisor.
     
  6. Aug 31, 2016 #5

    Mark44

    Staff: Mentor

    A is a matrix representation of the transformation T. A transformation can have multiple matrix representations, depending on the bases for the domain and codomain.
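
    As a quick numerical illustration of "multiple matrix representations" (the map and bases below are made up, and `rep` is just an ad-hoc helper):

    ```python
    import numpy as np

    # One fixed map T(v) = M v on R^2, written in the standard basis:
    M = np.array([[2.0, 1.0],
                  [0.0, 3.0]])

    B1 = np.eye(2)                  # standard basis
    B2 = np.array([[1.0, 1.0],      # a different basis (columns)
                   [0.0, 1.0]])

    # Matrix of T w.r.t. basis B (same basis used in domain and codomain):
    def rep(B):
        return np.linalg.solve(B, M @ B)

    A1, A2 = rep(B1), rep(B2)

    # Different arrays, yet both represent the same T; basis-independent
    # data such as the eigenvalues must agree.
    assert not np.allclose(A1, A2)
    assert np.allclose(sorted(np.linalg.eigvals(A1)),
                       sorted(np.linalg.eigvals(A2)))
    ```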

    I've been busy with some other things this morning and early afternoon -- I'll try to get back to this a little later.
     
  7. Aug 31, 2016 #6
    So the equation is incorrect? I think it is.
     
  8. Sep 1, 2016 #7

    Krylov

    Science Advisor
    Education Advisor

    Yes, it is incorrect. The vectors ##T(x_j)## and ##a_{1j}y_1 + a_{2j}y_2 + \ldots + a_{mj}y_m## are in W, which could be any ##m##-dimensional vector space.
    ##Ax_j## is not even defined, because ##A## maps from ##\mathbb{R}^n## to ##\mathbb{R}^m## but ##x_j## is a vector in ##V##, which could be any ##n##-dimensional vector space.
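
    In code, this distinction shows up as follows: ##A## multiplies the *coordinate column* of ##x_j## (which is the standard unit vector ##e_j##), never ##x_j## itself. A small sketch with made-up numbers:

    ```python
    import numpy as np

    M = np.array([[2.0, 1.0],   # a hypothetical map T(v) = M v
                  [0.0, 3.0]])
    X = np.array([[1.0, 1.0],   # x_1, x_2 as columns: a basis of V
                  [0.0, 1.0]])
    Y = np.array([[1.0, 1.0],   # y_1, y_2 as columns: a basis of W
                  [0.0, 1.0]])
    A = np.linalg.solve(Y, M @ X)   # matrix of T w.r.t. these bases

    # The coordinate column of x_2 relative to the x-basis is e_2, not x_2:
    e2 = np.array([0.0, 1.0])
    coords = A @ e2                 # coordinates of T(x_2) in the y-basis

    # Rebuilding from those coordinates recovers T applied to x_2 itself:
    assert np.allclose(Y @ coords, M @ X[:, 1])
    ```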

    When you are learning linear algebra, it is important that your book distinguishes carefully between a linear transformation ##T## and its matrix representation ##A## w.r.t. a certain pair of bases. Afterwards, people often become sloppy about it. (There is a good reason for this sloppiness, but I do not consider it helpful at the beginning.)

    (I remember that a tutorial was written about this topic a while ago: Matrix Representations of Linear Transformations. Its notation is different from your book, though.)
     
    Last edited: Sep 1, 2016
  9. Sep 1, 2016 #8
    Exactly the cause of my confusion. Will check the article. Thanks!
     
  10. Sep 28, 2016 #9

    Stephen Tashi

    User Avatar
    Science Advisor

    Are you referring to the right hand side of the equation? The image of the transform is in W, so it would be a vector expressed in the basis for W and that basis is the set of the ##y_i##.

    The vectors of W aren't necessarily row vectors or column vectors. The usual way to represent a linear transformation is as left-multiplication of a column vector by a matrix A, which produces another column vector. (However, I suppose there are a few books where the representation is done by multiplying a row vector on the right by a matrix A to produce another row vector. Let's assume your text uses the conventional approach.)

    The representation of the linear transformation ##T## by a matrix involves assuming we will represent the vectors in V and W in a certain manner. For example, if ##X## is a vector in ##V## expressed in the (orthonormal) x-basis as ##X = a_1 x_1 + a_2 x_2 + a_3 x_3##, then a "natural" way to represent ##X## is by the ordered triple ##(a_1, a_2, a_3)##. Notice this is not "naturally" a column vector. In fact, it looks like a row vector. But after reading a lot of algebra texts, one begins to accept that ##X## will be represented by a column vector, i.e. as the transpose ##(a_1, a_2, a_3)^T##.

    Using the convention of representing the vectors involved as column vectors, the representation of ##x_2## in the x-basis is ##(0,1,0)^T##. The image of ##x_2## under a linear transformation ##T## will be some vector ##T(x_2)## in W. Suppose W has dimension two and ##T(x_2) = b_1 y_1 + b_2 y_2##, which is represented as ##(b_1, b_2)^T##. Compare ##(b_1, b_2)^T## with the result of multiplying ##(0,1,0)^T## on the left by a matrix ##A## of the appropriate dimensions.

    ##\begin{pmatrix} a_{11}&a_{12}& a_{13} \\ a_{21}&a_{22}&a_{23} \end{pmatrix} \begin{pmatrix}0\\1\\0\end{pmatrix} = \begin{pmatrix}a_{12}\\a_{22}\end{pmatrix}##

    So ##a_{12} = b_1## and ##a_{22} = b_2##. As your text indicated, the second column of ##A## is determined by the coefficients of ##T(x_2)## when it is represented as a vector in the y-basis.
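
    A quick numerical check of the computation above (the entries of ##A## are arbitrary placeholder values):

    ```python
    import numpy as np

    A = np.array([[1.0, 2.0, 3.0],   # a_11 a_12 a_13  (placeholder values)
                  [4.0, 5.0, 6.0]])  # a_21 a_22 a_23

    e2 = np.array([0.0, 1.0, 0.0])   # x_2 represented in the x-basis

    # Multiplying picks out the second column of A, i.e. (a_12, a_22)^T:
    assert np.allclose(A @ e2, A[:, 1])
    ```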
     