Motivating definitions in calculus on manifolds

SUMMARY

The discussion focuses on the challenges of understanding tensor products in Spivak's "Calculus on Manifolds." The user seeks resources that provide practical examples and visualizations to aid comprehension. They highlight the utility of outer products in constructing n x n matrices and emphasize that while outer products yield rank-1 matrices, any n x n matrix can be represented as a sum of outer-product matrices. The user expresses a preference for simplified proofs, similar to those presented by 3Blue1Brown.

PREREQUISITES
  • Understanding of vector spaces and linear algebra concepts
  • Familiarity with tensor products and their applications
  • Basic knowledge of matrix operations and ranks
  • Experience with visual learning tools, such as videos or graphical representations
NEXT STEPS
  • Explore "Linear Algebra Done Right" by Sheldon Axler for a deeper understanding of linear transformations and tensor products
  • Watch 3Blue1Brown's video series on linear algebra for visual explanations of outer products and matrix ranks
  • Study "Introduction to Tensor Analysis and the Calculus of Moving Surfaces" by Pavel Grinfeld for practical applications of tensors
  • Research the concept of multilinear algebra to gain insights into higher-dimensional arrays and their operations
USEFUL FOR

Students and self-learners in mathematics, particularly those studying calculus on manifolds, linear algebra, and tensor analysis. This discussion is beneficial for anyone seeking to grasp abstract mathematical concepts through practical examples and visual aids.

Avatrin
Hi

I am a person who has always had a hard time picking up new definitions. Once I do, the rest kind of falls into place. In the case of abstract algebra, Stillwell's Elements of Algebra saved me. However, in the case of Spivak's Calculus on Manifolds, which I am reading for self-study, I get demotivated when I reach concepts like tensor products.

I just need to see why something like the tensor product is a useful operation. So, I need examples of books or videos that can prepare me to read through a book as devoid of examples as Calculus on Manifolds.

Ideally, I would like to see a simplified/visualized proof à la 3Blue1Brown's videos.
There are many different ways. Here's one way.

We have n-dimensional vectors, which are columns of n numbers, and we now want to generalise them to an object that is an n x n matrix (which will be an order-2 tensor).

We can start by 'multiplying' two vectors u and v together by taking their outer product. This means that the element in row i and column j of the product matrix is the i-th element of u times the j-th element of v.
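As a quick sanity check, here is that elementwise definition in NumPy (the vectors u and v are arbitrary examples):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

# Outer product: entry (i, j) of the result is u[i] * v[j]
M = np.outer(u, v)

# The same matrix built directly from the elementwise definition
M2 = np.array([[u[i] * v[j] for j in range(3)] for i in range(3)])
```

Both constructions give the same 3 x 3 matrix.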

We can get a very large collection of n x n matrices as outer products of vectors, but we can't get all n x n matrices because our outer products all have the characteristic that all rows are multiples of one another, and the same goes for the columns. So the rank of each outer product matrix is only 1. But nonzero n x n matrices can have any rank from 1 to n, so our set of outer products is only a small part of the set of possible n x n matrices.
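To see the rank-1 claim concretely, one can check numerically that an outer product of nonzero vectors always has matrix rank 1 (example vectors chosen arbitrarily):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])
M = np.outer(u, v)

# Every row of M is a multiple of v, so the rank is 1
r = np.linalg.matrix_rank(M)

# By contrast, the 3 x 3 identity has full rank 3
r_full = np.linalg.matrix_rank(np.eye(3))
```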

How do we get the other matrices? By adding together outer-product matrices. Matrix rank is not preserved by addition, so we can alter the rank by adding them together. In fact we can get any n x n matrix as a sum of no more than ##n^2## outer-product matrices, since we can express it as a linear combination of the ##n^2## matrices that have a 1 in one spot and zeroes everywhere else. Each of those is an outer-product matrix, because the matrix with a 1 in the (i, j) position is the outer product of the vector with a 1 in the i-th position and the vector with a 1 in the j-th position.
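This basis decomposition can be spelled out in a few lines of NumPy (the matrix A is an arbitrary example):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
n = A.shape[0]
e = np.eye(n)   # e[i] is the standard basis vector with a 1 in slot i

# e_i (outer) e_j has a single 1 in position (i, j); weighting by A[i, j]
# and summing over all i, j rebuilds A from n**2 rank-1 matrices.
B = sum(A[i, j] * np.outer(e[i], e[j]) for i in range(n) for j in range(n))
```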

We can then generalise from matrices, which are two-dimensional arrays of numbers, to higher dimensional arrays, by making more outer products and taking linear combinations of them.
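As a sketch of the higher-order case, an order-3 outer product of three vectors can be formed with `np.einsum` (vectors chosen arbitrarily):

```python
import numpy as np

u = np.array([1.0, 2.0])
v = np.array([3.0, 4.0])
w = np.array([5.0, 6.0])

# Order-3 outer product: T[i, j, k] = u[i] * v[j] * w[k],
# a 2 x 2 x 2 array of numbers
T = np.einsum('i,j,k->ijk', u, v, w)
```

Sums of such arrays give general order-3 tensors, just as sums of outer-product matrices gave general matrices.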
 
For example, let ##f:\mathbb{R}^3\to\mathbb{R},\quad f\ne 0## be a linear function. Then ##\Pi=\{x\in\mathbb{R}^3\mid f(x)=0\}## is a plane. Let ##v\notin \Pi## be a vector; it defines a line ##L=\{t v\mid t\in\mathbb{R}\}##. Consider the operator ##P:\mathbb{R}^3\to L## of projection onto ##L## along the plane ##\Pi##. In terms of the tensor product this operator has an extremely simple expression:
$$P=\frac{v\otimes f}{f(v)}$$
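A quick numerical check of this formula, representing the functional f by a coefficient vector (the particular f and v below are made-up examples):

```python
import numpy as np

a = np.array([1.0, 1.0, 1.0])   # f(x) = a . x, so the plane Pi is x + y + z = 0
v = np.array([1.0, 0.0, 0.0])   # direction of the line L; f(v) = 1, so v is not in Pi

# P = (v outer f) / f(v), written as a matrix
P = np.outer(v, a) / a.dot(v)

x = np.array([1.0, -1.0, 0.0])  # lies in Pi, since f(x) = 0
```

One can verify that P fixes v, sends every vector in ##\Pi## to zero, and satisfies ##P^2 = P##, as a projection should.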
 
