Matrix Representations of Linear Transformations - Comments

In summary, Fredrik submitted a new PF Insights post discussing matrix representations of linear transformations.
  • #1
Fredrik
Fredrik submitted a new PF Insights post

Matrix Representations of Linear Transformations

Continue reading the Original PF Insights Post.
 
  • #2
Maybe I'm wrong and it's my browser, but it seems there are uncompiled LaTeX lines in the text.
 
  • #3
fresh_42 said:
Maybe I'm wrong and it's my browser, but it seems there are uncompiled LaTeX lines in the text.
Fixed, thanks!
 
  • #4
Thank you for the nice article!

I hope it will help beginning students to avoid the kind of confusion that I used to experience. (I think that part of this confusion is due to the fact that in physics literature, one usually doesn't distinguish between an operator and its matrix representation and, often, one also omits the specification of the underlying bases. For trained readers this is usually not a problem, but for students just coming from an LA course and looking to apply the theory in physics problems, I believe this can cause unnecessary difficulties.)

One typo:

In the line starting with: "We just define T to be the unique linear ##T:X→Y## such that (...)" you probably meant to write
$$
T e_j = \sum_{i=1}^m{T_{ij}f_i}
$$
since at this point in the text you have not yet assumed that ##X = Y##, etc.
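
(To make the index pattern concrete: with ##X = \mathbb{R}^n##, ##Y = \mathbb{R}^m## and the standard bases, the ##j##-th column of the matrix lists the coordinates of ##Te_j## in the codomain basis. A minimal numpy sketch, with made-up numbers:)

```python
import numpy as np

# m = 2, n = 3; M[i, j] = T_ij, numbers chosen arbitrarily for illustration.
M = np.array([[1., 0., 2.],
              [3., 1., 0.]])

# With the standard bases, T(e_j) = M @ e_j, which is exactly column j of M,
# i.e. sum_i T_ij * f_i as in the corrected formula.
for j in range(3):
    e_j = np.eye(3)[:, j]
    assert np.allclose(M @ e_j, M[:, j])
```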

Two suggestions:

  1. I think it would make the article even better if you would also discuss a second example, this time of an operator acting on an abstract (but still finite-dimensional) vector space, such as a space of polynomials. This way, it becomes clear that a vector / matrix and its representation w.r.t. a basis are really two different things, and it also showcases the power of matrix representations when doing computations with abstract operators.
  2. Perhaps, alluding to your remark on QM at the end, it would be nice if you would write a follow-up on how this generalizes quite easily to bounded linear operators on separable Hilbert spaces. Then, you could also comment on what happens when you replace a bounded operator with an unbounded (differential) operator, which is typically the case physicists encounter when studying QM.
Hopefully you do not consider these comments an interference, but rather an expression of my enthusiasm for the subject and the attention that it has recently received on PF.
 
  • #5
As pointed out by Krylov

"Given an m×n matrix M, there’s a simple way to define a linear transformation T:X→Y such that the matrix representation of T with respect to (A,B) is M. We just define T to be the unique linear ##T:X→Y ## such that ##Tej=∑ni=1Tijei## for all j∈{1,…,n}."

only works if ##X=Y##. A matrix determines a linear transformation for each choice of basis for ##X## and ##Y##. Without a choice of bases, the matrix does not determine a linear transformation.
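
(A small numpy sketch of this point, with arbitrary basis choices and a hypothetical helper `map_from_matrix`: the same matrix ##M##, read against two different pairs of bases, yields two different maps ##\mathbb{R}^2 \to \mathbb{R}^2##.)

```python
import numpy as np

M = np.array([[1., 1.],
              [0., 1.]])

def map_from_matrix(M, A, B):
    """The map represented by M w.r.t. the bases stored as the columns of
    A (domain) and B (codomain): x is converted to A-coordinates,
    multiplied by M, and read off against the basis B."""
    return lambda x: B @ (M @ np.linalg.solve(A, x))

E = np.eye(2)                        # standard basis
C = np.array([[1., 1.],
              [0., 2.]])             # another, arbitrarily chosen basis

x = np.array([1., 1.])
print(map_from_matrix(M, E, E)(x))   # [2.  1. ] -- map w.r.t. standard bases
print(map_from_matrix(M, C, C)(x))   # [1.5 1. ] -- a different map, same M
```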
 
  • #6
lavinia said:
A matrix determines a linear transformation for each choice of basis for ##X## and ##Y##. Without a choice of bases, the matrix does not determine a linear transformation.
But given a matrix, the unity vectors in both spaces always define a natural basis with respect to which the matrix is a linear transformation.
 
  • #7
Krylov said:
One typo:

In the line starting with: "We just define T to be the unique linear ##T:X→Y## such that (...)" you probably meant to write
$$
T e_j = \sum_{i=1}^m{T_{ij}f_i}
$$
Good catch. That line isn't present in the last draft that I discussed with other people (in February 2013) before I turned it into a FAQ post (in June 2013... I'm pretty slow, apparently), so I must have put it in later and not proofread it well enough.

Krylov said:
Two suggestions:

  1. I think it would make the article even better if you would also discuss a second example, this time of an operator acting on an abstract (but still finite-dimensional) vector space, such as a space of polynomials. This way, it becomes clear that a vector / matrix and its representation w.r.t. a basis are really two different things, and it also showcases the power of matrix representations when doing computations with abstract operators.
  2. Perhaps, alluding to your remark on QM at the end, it would be nice if you would write a follow-up on how this generalizes quite easily to bounded linear operators on separable Hilbert spaces. Then, you could also comment on what happens when you replace a bounded operator with an unbounded (differential) operator, which is typically the case physicists encounter when studying QM.
Hopefully you do not consider these comments an interference, but rather an expression of my enthusiasm for the subject and the attention that it has recently received on PF.
Your comments are welcome, and I like your suggestions. Unfortunately I don't have a lot of time to improve this post right now. If you would like to do it, I'm more than OK with that.

The LaTeX can be improved. When I wrote this in 2013, LaTeX behaved differently here. There was no automatic numbering of equations for example. I would like to make sure that only those equations that should be numbered are numbered. Removing all the numbers is also an option. Also, the equation that begins with Tx= wasn't split over two lines before. It needs an explicit line break followed by an alignment symbol. (I could edit the post when it was a normal FAQ post. I don't think I can now that it's an Insights post).
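
(For reference, the usual fix is an alignment environment, with `\\` as the line break and `&` as the alignment symbol. The equation below is a plausible reconstruction of the one meant, not the exact text of the article:)

```latex
$$\begin{align*}
Tx &= T\Big(\sum_{j=1}^n x_j e_j\Big) = \sum_{j=1}^n x_j\, Te_j \\
   &= \sum_{j=1}^n \sum_{i=1}^m x_j T_{ij} f_i.
\end{align*}$$
```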
 
  • #8
I also think the bases selected in each of ##X, Y## have to be ordered bases for there to be an isomorphism between ##L(X,Y)##, the linear maps between ##X## and ##Y##, and ##M_{n \times m}(R)##, where ##R## is the ring: ##M_{n \times m}(R)## is the space of matrices with coefficients in the ring, and ##X, Y## are (free, of course) ##R##-modules (both right- or left-, I think). I think this is the most general scope of the isomorphism.
 
  • #9
fresh_42 said:
But given a matrix, the unity vectors in both spaces always define a natural basis with respect to which the matrix is a linear transformation.

Yes, if one already has two bases, then the matrix defines a linear map. But there is no naturally given basis for a vector space; you need to select one. Not sure what you mean by the unity vectors.
 
  • #10
lavinia said:
Yes, if one already has two bases, then the matrix defines a linear map. But there is no naturally given basis for a vector space; you need to select one. Not sure what you mean by the unity vectors.
Physicists probably write them ##e_i = (\delta_{ij})_j##. I learned them as unit vectors. OK, it's not the i-th basis vector but the coordinate representation of the i-th basis vector, but that is hair-splitting. To give the impression that a matrix isn't a linear transformation is negligent. There is always a basis with respect to which the matrix is a linear transformation, and in the finite-dimensional case this holds even without the axiom of choice. I just wanted to avoid someone saying: "But I've read on the internet that a matrix isn't a linear transformation." The discussion distinguishing between the vectors themselves and their coordinate representations is, in my opinion, something for specialists and logicians.
 
  • #11
fresh_42 said:
Physicists probably write them ##e_i = (\delta_{ij})_j##. I learned them as unit vectors. OK, it's not the i-th basis vector but the coordinate representation of the i-th basis vector, but that is hair-splitting. To give the impression that a matrix isn't a linear transformation is negligent. There is always a basis with respect to which the matrix is a linear transformation, and in the finite-dimensional case this holds even without the axiom of choice. I just wanted to avoid someone saying: "But I've read on the internet that a matrix isn't a linear transformation." The discussion distinguishing between the vectors themselves and their coordinate representations is, in my opinion, something for specialists and logicians.

Hmmm. The point I was trying to make is that a matrix determines a continuum of linear transformations each of which depends on a choice of basis.
 
  • #12
lavinia said:
Hmmm. The point I was trying to make is that a matrix determines a continuum of linear transformations each of which depends on a choice of basis.
Good point. I remember I had my difficulties, too, when I first learned the concept. All of a sudden there surfaced matrices ##T## and ##T^{-1}## surrounding my original ##A##... or, even worse, ##T## and ##S^{-1}##.
 
  • #13
fresh_42 said:
Physicists probably write them ##e_i = (\delta_{ij})_j##. I learned them as unit vectors. OK, it's not the i-th basis vector but the coordinate representation of the i-th basis vector, but that is hair-splitting. To give the impression that a matrix isn't a linear transformation is negligent. There is always a basis with respect to which the matrix is a linear transformation, and in the finite-dimensional case this holds even without the axiom of choice. I just wanted to avoid someone saying: "But I've read on the internet that a matrix isn't a linear transformation." The discussion distinguishing between the vectors themselves and their coordinate representations is, in my opinion, something for specialists and logicians.
But a matrix does not necessarily describe a linear transformation (sorry if this is not what you mean). It can represent the adjacency conditions of a graph, a Markov process, etc. If you mean that there is a bijection (isomorphism) between linear maps and matrices, then I agree.
 
  • #14
WWGD said:
But a matrix does not necessarily describe a linear transformation (sorry if this is not what you mean). It can represent the adjacency conditions of a graph, a Markov process, etc. If you mean that there is a bijection (isomorphism) between linear maps and matrices, then I agree.
That's a good one. But to be honest, Markov processes, for example, didn't come to my mind in a thread about linear transformations.
It reminds me of a test I once recorded. The student could perfectly define a linear transformation and was asked for an example. The professor would have been satisfied with a rotation or just a matrix. Unfortunately for the student, he couldn't give one. I remember this because I still wonder what the professor would have said to my example. I would have answered: 0. (And 1 next.)
But of course you are completely right: a matrix is nothing other than elements of some set arranged in a rectangle. Or a movie ...
 
  • #15
lavinia said:
Hmmm. The point I was trying to make is that a matrix determines a continuum of linear transformations each of which depends on a choice of basis.
I don't know if this gets you into philosophy, but isn't a linear transformation expressed in different bases essentially the same linear transformation? I.e., given ##L## in any one basis, isn't the set ##\{S^{-1}LS\}## for all invertible matrices ##S## just one linear transformation?
 
  • #16
WWGD said:
I don't know if this gets you into philosophy, but isn't a linear transformation expressed in different bases essentially the same linear transformation? I.e., given ##L## in any one basis, isn't the set ##\{S^{-1}LS\}## for all invertible matrices ##S## just one linear transformation?

Yes, it is the same. But its matrix representation changes by a conjugation - at least for a linear map of a vector space into itself.
Perhaps the insight should explain this by showing how the matrix changes for a change of basis.
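
(A quick numpy check of the conjugation formula; the basis collected in the columns of ##S## is an arbitrary example:)

```python
import numpy as np

# L is the matrix of a map w.r.t. the standard basis; the columns of S
# form a new basis. The matrix of the *same* map in the new basis is
# S^{-1} L S: both routes compute the same vector, in different coordinates.
L = np.array([[2., 1.],
              [0., 3.]])
S = np.array([[1., 1.],
              [1., -1.]])

L_new = np.linalg.solve(S, L @ S)    # S^{-1} L S

v = np.array([4., 2.])               # a vector in standard coordinates
v_new = np.linalg.solve(S, v)        # the same vector in the new basis

# Applying the map in either coordinate system agrees:
assert np.allclose(S @ (L_new @ v_new), L @ v)
```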
 
  • #17
fresh_42 said:
Physicists probably write them ##e_i = (\delta_{ij})_j##. I learned them as unit vectors. OK, it's not the i-th basis vector but the coordinate representation of the i-th basis vector, but that is hair-splitting.
Really? Take ##P_2##, the vector space of, say, real polynomials of degree ##\le 2##, including the zero polynomial. There are no unit vectors here. (We cannot even normalize to unity, because there is no norm chosen yet.) Let's pick a basis, perhaps ##\mathcal{A} := \{1, x, x^2\}##, and consider ##p \in P_2## defined by ##p(x) = 6 - x^2##. Then its coordinate vector is ##[p]_{\mathcal{A}} = [6, 0, -1] \in \mathbb{R}^3##. However, with respect to the basis ##\mathcal{B} = \{2 - x, x, -x^2\}## we have ##[p]_{\mathcal{B}} = [3, 3, 1] \in \mathbb{R}^3##. Also, the representation of the first basis vector in ##\mathcal{A}## with respect to ##\mathcal{B}## is ##[1]_{\mathcal{B}} = [\tfrac{1}{2},\tfrac{1}{2},0] \in \mathbb{R}^3##, etc.
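
(These coordinates are easy to check numerically; a small sketch that encodes each polynomial by its coefficient vector w.r.t. the monomials ##\{1, x, x^2\}##:)

```python
import numpy as np

# Columns = the basis B = {2 - x, x, -x^2} in monomial coordinates.
B = np.array([[ 2., 0.,  0.],   # constant components
              [-1., 1.,  0.],   # x components
              [ 0., 0., -1.]])  # x^2 components

p = np.array([6., 0., -1.])     # p(x) = 6 - x^2

print(np.linalg.solve(B, p))                       # [3.  3.  1. ] = [p]_B
print(np.linalg.solve(B, np.array([1., 0., 0.])))  # [0.5 0.5 0. ] = [1]_B
```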

Especially when learning LA, it is very important that students, in mathematics and physics alike, distinguish between ##p##, ##[p]_{\mathcal{A}}## and ##[p]_{\mathcal{B}}##. (Later on, they may learn that these vectors are related through isomorphisms, but that is not how one starts.) It is also crucial when doing computations, for example in numerical analysis.
fresh_42 said:
To give the impression that a matrix isn't a linear transformation is negligent.
Nobody gave this impression, but as we have just seen, one has to be precise, especially when dealing with vector spaces different from ##\mathbb{R}^n## or ##\mathbb{C}^n##.
fresh_42 said:
There is always a basis with respect to which the matrix is a linear transformation, and in the finite-dimensional case this holds even without the axiom of choice.
In combination with your earlier comments on unit vectors, you seem to suggest that for infinite-dimensional vector spaces every matrix defines a linear transformation on the space. This is already false for separable Hilbert spaces. Take the sequence space ##\ell_2## with the canonical basis (yes indeed, the one consisting of the unit vectors ##\{(\delta_{mn})_{m = 1}^{\infty} \,:\,n \in \mathbb{N}\}##) and consider the infinite matrix ##M = (\delta_{mn}n)_{m,n=1}^{\infty}##. Then ##x = [n^{-1}]_{n=1}^{\infty}## is in ##\ell_2##, but ##Mx = [1]_{n=1}^{\infty}## is not.

In fact, by Parseval's identity there is no orthonormal basis of ##\ell_2## with respect to which ##M## represents a linear operator.
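
(On truncations, the divergence is easy to see numerically; a minimal sketch:)

```python
import numpy as np

# Truncate M = diag(1, 2, 3, ...) and x = (1, 1/2, 1/3, ...) to length N.
# ||x_N|| converges (to pi/sqrt(6)), while ||M x_N|| = sqrt(N) is unbounded,
# so Mx does not lie in l^2.
for N in (10, 1000, 100000):
    n = np.arange(1, N + 1)
    x = 1.0 / n
    Mx = n * x                  # the diagonal action: (Mx)_n = n * x_n
    print(N, np.linalg.norm(x), np.linalg.norm(Mx))
```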
fresh_42 said:
The discussion distinguishing between the vectors themselves and their coordinate representations is, in my opinion, something for specialists and logicians.
No, it is not, as we have already seen in the example on ##P_2##. It is part of any decent first course on linear algebra. On the other hand, if by "specialists" are meant people that actually know what they are talking about, then I agree.

Yes, I'm quite irritated. Your post lacks any, well... insight, and is one of those that sometimes makes me wonder whether I'm wasting my time here.
 
  • #18
I have made some minor edits. (Greg showed me how). I fixed the mistake that Krylov found, removed the equation numbers, and made some minor changes to the language.

I got a comment about my usage of the term "n-tuple". I have always felt that it's unnecessary to say "ordered n-tuple", since no one uses the term "n-tuple" to refer to a set of cardinality n. How do you guys feel about this? Do you feel that my usage is like saying "line" instead of "straight line", or that it's plain wrong?
 
  • #20
Fredrik said:
since no one uses the term "n-tuple" to refer to a set of cardinality n
Haven't met anyone either. If it's not ordered, you wouldn't say "tuple". Even the notation with round brackets implies it's ordered, imo.
 

What is a matrix representation of a linear transformation?

A matrix representation of a linear transformation is a way to describe the transformation itself as a matrix, relative to chosen bases for the domain and codomain. It encodes the relationship between the coordinates of the input and output vectors, and allows the transformation to be computed by matrix multiplication.

How is a matrix representation of a linear transformation different from a linear transformation?

A linear transformation is a function that maps one vector space to another, while a matrix representation is a specific way to encode that transformation as a matrix, relative to a choice of bases. The matrix representation is a more concrete and computationally useful description of the transformation, but it changes when the bases change.

What are the benefits of using a matrix representation of a linear transformation?

Using a matrix representation of a linear transformation allows for easier computation of the transformation, as matrix operations are well-defined and straightforward. It also allows for a clearer visualization of the transformation and its effects on the input vectors.
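
For instance (an illustrative sketch, not taken from the article): differentiation on the polynomial space ##P_2## becomes an ordinary matrix-vector product once a basis is fixed.

```python
import numpy as np

# d/dx on P_2 w.r.t. the monomial basis {1, x, x^2}: column j holds the
# coordinates of the derivative of the j-th basis vector.
D = np.array([[0., 1., 0.],   # d/dx(x)   = 1
              [0., 0., 2.],   # d/dx(x^2) = 2x
              [0., 0., 0.]])

p = np.array([6., 0., -1.])   # p(x) = 6 - x^2
print(D @ p)                  # [ 0. -2.  0.], i.e. p'(x) = -2x
```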

Can any linear transformation be represented as a matrix?

Any linear transformation between finite-dimensional vector spaces can be represented as a matrix once bases are chosen for the domain and codomain. For infinite-dimensional spaces this is no longer true in general, as the ##\ell_2## example above illustrates.

Do all matrices represent linear transformations?

A matrix by itself does not single out a linear transformation: every ##m \times n## matrix defines one only relative to a choice of bases for the domain and codomain, and different choices give different transformations. A matrix can also encode entirely different data, such as the adjacency relations of a graph or the transition probabilities of a Markov process.
