What is the importance of linear transformations in linear algebra?

Discussion Overview

The discussion revolves around the importance of linear transformations in linear algebra, exploring their role in teaching and understanding the subject. Participants debate whether linear transformations should be introduced before or after matrices, and the implications of each approach for student comprehension and motivation.

Discussion Character

  • Debate/contested
  • Conceptual clarification
  • Technical explanation

Main Points Raised

  • Some participants argue that linear transformations should be taught independently of matrices, emphasizing their geometric significance and foundational role in linear algebra.
  • Others suggest that matrices are essential for practical applications and computations, though they acknowledge that matrices are merely representations of transformations.
  • A participant presents differentiation as an example of a linear transformation that cannot be easily described by a matrix in infinite-dimensional spaces, but can be represented in finite-dimensional subspaces.
  • There is a mention of Emil Artin's perspective that linear transformations should be the primary focus, with matrices used only for computations.
  • One participant proposes introducing linear transformations first, suggesting that this approach can clarify the relationship between transformations and their matrix representations, particularly when discussing bases.
  • Another participant corrects a previous statement about the equivalence of bases and matrices, noting that multiple bases can yield the same matrix representation for certain transformations.

Areas of Agreement / Disagreement

Participants express differing views on the order in which linear transformations and matrices should be introduced; no consensus is reached on the best pedagogical approach.

Contextual Notes

Some limitations include the dependence on definitions of linear transformations and matrices, as well as the potential confusion arising from the relationship between bases and their corresponding matrices.

matqkks
How important are linear transformations in linear algebra? In some texts linear transformations are introduced first and then the idea of a matrix. In other books linear transformations are relegated to being an application of matrices. What is the best way of introducing linear transformation on a linear algebra course? How do we motivate students to study transformations as part of linear algebra? What is their real impact?
 
I personally feel that linear transformations can and should be taught without matrices at all. Matrices are a means of representing such transformations; they're not the transformations themselves. While students should be familiar with matrices as a matter of practicality, as a physicist, I can see the geometric content of this...

##\underline{P}(a) = a - (a \cdot \hat n)\,\hat n##

...more than I can in just being given the matrix ##\underline{P}(e_i) \cdot e_j##. But I know math is much, much bigger and broader than what I as a physicist use, so perhaps some of that breadth makes the matrix representation necessary.
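The point above can be checked numerically: the transformation is defined without any matrix, and a matrix for it is recovered only afterwards by applying it to a basis. A minimal sketch (the choice of ##\hat n## along the z-axis is an illustrative assumption, not from the post):

```python
import numpy as np

# Projection onto the plane orthogonal to n_hat: P(a) = a - (a . n_hat) n_hat.
# Here n_hat is chosen along the z-axis purely for illustration.
n_hat = np.array([0.0, 0.0, 1.0])

def P(a):
    """Apply the transformation directly -- no matrix involved."""
    return a - np.dot(a, n_hat) * n_hat

# The matrix representation (in the standard basis) is recovered column by
# column: column j is P(e_j).
I = np.eye(3)
M = np.column_stack([P(I[:, j]) for j in range(3)])

# The matrix and the transformation agree on an arbitrary vector.
a = np.array([1.0, 2.0, 3.0])
print(np.allclose(M @ a, P(a)))  # True
```

The code mirrors the pedagogical order being argued for: the map comes first, the matrix is derived from it.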
 
the basic example of a linear transformation in mathematics is perhaps differentiation.

this is not readily described by a matrix since the space of differentiable functions is infinite dimensional.

however when restricted to a finite dimensional subspace, such as polynomials of degree ≤ n, one obtains the basic example of a "nilpotent" matrix.

when restricted to a finite dimensional space of functions of form (e^ct)t^k, for 0≤ k ≤ n, one obtains as matrix a basic jordan block.

failure to give this example early on may explain why students of advanced linear algebra find jordan form so strange.
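The nilpotent example above is easy to verify concretely. A minimal sketch (the choice n = 4 and the monomial basis ##\{1, t, t^2, \ldots, t^n\}## are illustrative assumptions): since ##\frac{d}{dt}t^k = k\,t^{k-1}##, the matrix of differentiation in this basis has entry k in position (k-1, k) and zeros elsewhere.

```python
import numpy as np

# Matrix of d/dt on polynomials of degree <= n, in the basis {1, t, ..., t^n}.
# d/dt(t^k) = k t^(k-1), so column k has a single entry k in row k-1.
n = 4
D = np.zeros((n + 1, n + 1))
for k in range(1, n + 1):
    D[k - 1, k] = k

# D is nilpotent: differentiating a degree-n polynomial n+1 times gives zero,
# so D^(n+1) is the zero matrix (while D^n is not).
print(np.allclose(np.linalg.matrix_power(D, n + 1), 0))  # True
```

Seeing this matrix early makes the strictly-upper-triangular shape of nilpotent Jordan blocks look natural rather than strange.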

for a presentation of linear algebra including these examples, you may see either the nice book by Insel, Friedberg, and Spence, or my free notes (#7) for math 4050 on my website at UGA.

http://www.math.uga.edu/%7Eroy/4050sum08.pdf

One difference between these two sources is my heavier reliance on the basic concept of the minimal polynomial for a linear transformation. Their book, replete with examples, is also about 4 times as long as mine.

by the way the famous mathematician Emil Artin argues in his book Geometric Algebra precisely that linear transformations should be the primary object of study and that matrices should be left out almost entirely, except when a computation needs to be made, say of a determinant. Then he says to immediately throw them out again afterwards. I.e. matrices are entirely a device for representing and computing with linear transformations.

A matrix is given by a pair: (linear transformation of a vector space, basis for that vector space)

equivalently, given a vector space and a linear transformation, matrices for that transformation are equivalent to bases of the vector space. Hence a good choice of basis may be expected to give a good matrix.
 
oops my bad. i wasn't thinking. several different bases may give the same matrix, so the two concepts are not equivalent. e.g. the zero transformation has the same matrix in all bases, as does the identity transformation. a basis gives a matrix. also given a basis, transformations are equivalent to matrices.

the concept sliding around in my memory was that given an n dimensional vector space V, bases of V are equivalent to isomorphisms of V with k^n.
 
I think it's better to introduce linear transformations first. Matrices can be introduced as soon as the student understands bases. It might be a good idea to first use the notation that puts the row index upstairs, because that makes it almost impossible to get the formula for the components of a linear transformation with respect to a pair of bases wrong: ##A^i_j=(Ae_j)^i##. This way the definition of matrix multiplication can be properly motivated. If ##A:Y\to Z## and ##B:X\to Y## are linear transformations, and [A] and [B] denote the matrices corresponding to A and B (given bases for X, Y, Z), the product [A][B] is defined as [A##\circ##B], i.e. as the matrix corresponding to ##A\circ B:X\to Z##.
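The motivation for the definition of matrix multiplication can be checked numerically: build the matrix of the composition column by column from ##(A\circ B)(e_j)## and compare it with the matrix product. A minimal sketch, assuming standard bases and arbitrary (here random) matrices for the two maps:

```python
import numpy as np

# Random matrices standing in for the linear maps A: Y -> Z and B: X -> Y,
# with dim X = 2, dim Y = 4, dim Z = 3 (dimensions chosen for illustration).
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 2))

# Matrix of the composition A∘B, built column by column from its definition:
# column j is (A∘B)(e_j) = A(B(e_j)).
E = np.eye(2)
comp = np.column_stack([A @ (B @ E[:, j]) for j in range(2)])

# It coincides with the matrix product [A][B], which is exactly why matrix
# multiplication is defined the way it is.
print(np.allclose(comp, A @ B))  # True
```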
 