The vector space of linear transformations


Discussion Overview

The discussion revolves around the relationship between linear transformations and transformation matrices, specifically whether every linear mapping between arbitrary vector spaces can be represented as multiplication by a transformation matrix. The scope includes theoretical aspects of linear algebra and the representation of linear maps in finite-dimensional spaces.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • Some participants propose that linear transformations from ℝ^n to ℝ^m can be viewed as multiplication by an m × n matrix, suggesting a generalization to arbitrary vector spaces.
  • Others argue that every linear map between finite-dimensional vector spaces can be represented as a matrix, referencing standard topics in linear algebra.
  • A participant questions whether establishing an isomorphism is necessary for the representation of linear transformations as matrices.
  • Clarifications are made regarding the components of vectors in a basis and the distinction between linear maps and their matrix representations.
  • It is noted that the discussion may become more complex if the vector spaces are not finite-dimensional.

Areas of Agreement / Disagreement

Participants generally agree that linear maps between finite-dimensional vector spaces can be represented as matrices, but there is a lack of consensus on whether this representation holds for arbitrary vector spaces without additional conditions.

Contextual Notes

Some limitations include the dependence on the choice of basis and the potential complications arising from infinite-dimensional spaces, which remain unresolved in the discussion.

Bipolarity
Consider the operation of multiplying a vector in ℝ^n by an m × n matrix A. This can be viewed as a linear transformation from ℝ^n to ℝ^m. Since matrices form a vector space under matrix addition and scalar multiplication, we can define a "vector space of linear transformations" from ℝ^n to ℝ^m.

My question is whether this connection between linear transformations and transformation matrices exists for linear mappings to and from arbitrary vector spaces. So given some general n-dimensional vector space U and m-dimensional vector space W, can every linear mapping from U to W be viewed as multiplication by an m × n transformation matrix?

Or is there a linear transformation which cannot be viewed as multiplication by a transformation matrix?
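For the ℝ^n case, the two facts in the question, that an m × n matrix acts as a linear map ℝ^n → ℝ^m and that such matrices themselves form a vector space, are easy to check numerically. A minimal sketch with NumPy (the matrix and vectors below are arbitrary choices for illustration):

```python
import numpy as np

# An m x n matrix defines a map T: R^n -> R^m via T(x) = A @ x.
m, n = 3, 2
A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [4.0, -1.0]])   # shape (3, 2)

x = np.array([1.0, 2.0])
y = np.array([-3.0, 0.5])
c = 2.5

# Linearity: T(x + y) = T(x) + T(y) and T(c x) = c T(x).
assert np.allclose(A @ (x + y), A @ x + A @ y)
assert np.allclose(A @ (c * x), c * (A @ x))

# Matrices themselves form a vector space: closed under + and scalar *.
B = np.ones((m, n))
S = A + 2.0 * B               # still 3 x 2, i.e. another map R^2 -> R^3
assert S.shape == (m, n)
```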

BiP
 
Yes, because you can always reduce your general situation down to ##\mathbb{R}^n##.

Given a general n-dimensional vector space U, you can choose a basis ##\{ \hat e_1, \hat e_2, \ldots, \hat e_n \}## (any basis will do; note that a general vector space carries no inner product, so there is no notion of orthogonality to appeal to). Now consider the map $$\phi: U \to \mathbb{R}^n, \quad u_1 \hat e_1 + \cdots + u_n \hat e_n \mapsto ( u_1, \ldots, u_n).$$
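When the basis vectors can be stored as the columns of an invertible matrix, the coordinate map ##\phi## amounts to solving a linear system. A sketch in NumPy, using ℝ² itself as a concrete stand-in for U (the basis below is an arbitrary choice):

```python
import numpy as np

# Basis of U (here U = R^2 for concreteness), stored as columns:
# e1 = (1, 0), e2 = (1, 1).
E = np.array([[1.0, 1.0],
              [0.0, 1.0]])

def phi(u):
    """Coordinate map: return (u1, ..., un) with u = u1*e1 + ... + un*en."""
    return np.linalg.solve(E, u)

u = np.array([3.0, 2.0])
coords = phi(u)          # u = 1*e1 + 2*e2, so coords = (1, 2)
# Check: reassembling from the coordinates recovers u.
assert np.allclose(E @ coords, u)
```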
 
What is ##\{ u_1, u_2, \ldots, u_n \}##?
And once you've reduced to ℝ^n and ℝ^m, is it necessarily the case that the linear transformation can be viewed as multiplication by a transformation matrix? Would we need to establish an isomorphism between them?

BiP
 
Every linear map between two finite-dimensional vector spaces can be represented as a matrix acting between the associated Euclidean spaces. This is called the matrix representation of the linear map and is a standard topic in most linear algebra textbooks. See, for example, Chapter 5 of Lang's "Linear Algebra".
 
Bipolarity said:
What is ##\{ u_1, u_2, \ldots, u_n \}##?
They are the components of the vector in U. In the definition of the function I wrote an arbitrary element of U as ##u_1 \hat e_1 + u_2 \hat e_2 + \cdots + u_n \hat e_n## which I can do because ##\{ \hat e_i \mid i = 1, \ldots, n \}## is a basis.

WannabeNewton said:
Every linear map between two finite dimensional vector spaces can be represented as a matrix between the associated euclidean spaces.

To give you an idea: choose a basis ##\{ \hat e_1, \hat e_2, \ldots, \hat e_n \}## on ##\mathbb{R}^n## and ##\{ \hat f_1, \hat f_2, \ldots, \hat f_m \}## on ##\mathbb{R}^m##. Let ##T : \mathbb{R}^n \to \mathbb{R}^m## be a linear transformation. Then you can write
$$T \hat e_i = a_{1i} \hat f_1 + \cdots + a_{mi} \hat f_m = \sum_{j = 1}^m a_{ji} \hat f_j$$
The coefficients ##a_{ji}## form the entries of an ##m \times n## matrix ##A##, whose i-th column holds the coordinates of ##T \hat e_i##. It is straightforward to check that multiplying ##A## with the vector which contains a 1 in position i and 0 in all others (e.g. (0, 0, ..., 0, 1, 0, 0, ...)) gives you exactly that column of coefficients, so that A is the matrix representation of T.
Note that I make a distinction between the linear map T itself and the matrix A. Of course, you are free to choose another basis on ##\mathbb{R}^n## and/or ##\mathbb{R}^m##, which will change the numbers in the matrix. The transformation T doesn't change though - this can be confusing at first! :)
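The construction above can be mirrored in code: the i-th column of A is T applied to the i-th basis vector. A sketch in NumPy with the standard bases and an arbitrarily chosen linear map T:

```python
import numpy as np

def T(x):
    """Some linear map R^3 -> R^2 (chosen arbitrarily for illustration)."""
    return np.array([x[0] + 2 * x[1], 3 * x[2] - x[0]])

n = 3
# Column i of A is T(e_i), where e_i is the i-th standard basis vector.
A = np.column_stack([T(np.eye(n)[:, i]) for i in range(n)])

x = np.array([1.0, -2.0, 0.5])
assert np.allclose(A @ x, T(x))   # A represents T
```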

Now if ##S: U \to V## is another transformation you can let ##\phi: U \to \mathbb{R}^n## be as I described earlier, mapping an element of U into a vector in ##\mathbb{R}^n## by picking out the components relative to some arbitrary basis. Let ##\psi: V \to \mathbb{R}^m## be a similar map for V. Now you can show that ##\phi## and ##\psi## are bijections and therefore for any u in U, you can write S(u) as ## \psi^{-1} A \phi u## for some suitable matrix A, which just represents a linear transformation ##\mathbb{R}^n \to \mathbb{R}^m##.

If you don't follow the above paragraph, just think of it this way: if U is an n-dimensional vector space, you can see it as just ##\mathbb{R}^n## looking slightly different. So any map between U and an m-dimensional space V is secretly just a map from ##\mathbb{R}^n## to ##\mathbb{R}^m##.
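To make the ##\psi^{-1} A \phi## recipe concrete on a space that is not literally ℝ^n, take U = polynomials of degree ≤ 2 and S = d/dx, a standard textbook example. The coordinate maps ##\phi, \psi## send a polynomial to its coefficient list, and S becomes a 2 × 3 matrix (sketched here with hand-built columns):

```python
import numpy as np

# U = polynomials of degree <= 2, basis {1, x, x^2}; coordinates = coefficients.
# S = d/dx maps U into V = polynomials of degree <= 1, basis {1, x}.
# Columns of A: images of the basis, d/dx(1) = 0, d/dx(x) = 1, d/dx(x^2) = 2x.
A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 2.0]])

# p(x) = 3 + 5x - 2x^2  ->  phi(p) = (3, 5, -2)
p = np.array([3.0, 5.0, -2.0])
dp = A @ p   # coordinates of p'(x) = 5 - 4x in the basis {1, x}
assert np.allclose(dp, [5.0, -4.0])
```

The matrix A here is exactly the ##A## in ##S = \psi^{-1} A \phi##: differentiate, but expressed entirely in coordinates.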

(PS: Note that things do get trickier if the spaces are no longer finite-dimensional.)
 
