What was the motivation to define what a vector space is?


Discussion Overview

The discussion revolves around the motivation for defining vector spaces, particularly in the context of linear algebra and their application to solving linear equations. Participants explore the theoretical underpinnings and practical implications of vector spaces, including their role in various mathematical and applied contexts.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification

Main Points Raised

  • One participant notes that the motivation for defining a vector space relates to Gauss' reduction method and the handling of linear combinations of rows.
  • Another participant explains that vector spaces provide a general framework for solving linear equations, allowing for scalar multiplication and addition to make sense in various contexts.
  • A further contribution highlights that many phenomena behave like vectors, and defining spaces in terms of inner products allows for a geometric interpretation of these behaviors, particularly in applications like Fourier series decomposition.
  • The concept of orthonormal bases in inner product spaces is discussed, emphasizing their utility in decomposing functions and ensuring the independence of components in mathematical applications.

Areas of Agreement / Disagreement

Participants generally agree on the importance of vector spaces in the context of linear equations and their applications, but multiple perspectives on their motivations and implications are presented without a clear consensus.

Contextual Notes

Some assumptions about the definitions and properties of vector spaces are not fully explored, and the discussion does not resolve the complexities involved in applying these concepts across different mathematical contexts.

Tosh5457
The book I use for linear algebra explains that the motivation for defining a vector space has to do with Gauss' reduction method, which works by taking linear combinations of the rows, but I don't understand the explanation very well. Can somebody explain?
 
Hi Tosh5457! :smile:

The idea behind linear algebra is to provide a theory for solving (or at least handling) linear equations. Solving linear equations over the real numbers is quite easy using Gauss' method.

Now, a vector space is the most general possible space in which linear equations still make sense and in which they are solvable.

For example, if V is an abstract vector space, then it makes sense to ask which vectors v and w satisfy

[tex]\left\{\begin{array}{c} 2v + 3w=0\\ 3v+4w=0\\ \end{array}\right.[/tex]

This makes sense. Indeed, 2v and 3w make sense because they are scalar multiplications (and we always have scalar multiplication on a vector space). Also, 2v+3w makes sense because it is an addition (and we always have addition on a vector space). And 0 makes sense since a vector space always has a zero vector. So this system of equations makes sense.

Furthermore, we can solve this system of equations by exactly the same methods as we would solve a system over [itex]\mathbb{R}[/itex]. Try to solve this system over this general vector space!
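(For anyone who wants to check their answer, here is one way the elimination can go, as a quick sketch. Multiply the first equation by the scalar 3 and the second by the scalar 2, then subtract:

[tex]3(2v+3w) - 2(3v+4w) = 9w - 8w = w = 0.[/tex]

Substituting back into the first equation gives 2v = 0, and multiplying by the scalar 1/2 gives v = 0. So the only solution is v = w = 0, found with exactly the same moves we would use over [itex]\mathbb{R}[/itex].)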

So a vector space is a structure in which it is possible to describe linear equations and in which it is possible to solve them. Also, Gauss elimination works in any vector space.

So if we want to study how to solve linear equations, we might as well study them over arbitrary vector spaces. And this is what we do.
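To make this concrete, here is a minimal Python sketch (an illustration under my own assumptions, not part of the original argument): the unknowns v and w are taken to be vectors in [itex]\mathbb{R}^3[/itex], and the elimination only ever uses addition and scalar multiplication, so the same steps would work for real numbers, column vectors, polynomials, or any other vector-space elements. The right-hand sides a and b are made-up example data.

[code]
import numpy as np

# Unknowns v, w live in R^3; a, b are arbitrary example vectors (made up).
a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

# Solve  2v + 3w = a,  3v + 4w = b  by Gauss elimination.
# Eliminate v:  3*(eq 1) - 2*(eq 2)  gives  (9 - 8) w = 3a - 2b.
w = 3 * a - 2 * b
# Back-substitute into eq 1:  2v = a - 3w.
v = (a - 3 * w) / 2

# The same scalar arithmetic as over the reals, applied to vectors.
assert np.allclose(2 * v + 3 * w, a)
assert np.allclose(3 * v + 4 * w, b)
print("v =", v, "w =", w)
[/code]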
 
Adding to what micromass has said, a lot of phenomena act like vectors, and when you have the ability to define things in terms of an "inner product space", you end up getting a "geometric picture" of how particular objects behave with respect to a particular basis.

Fourier series decomposition of periodic functions is a good demonstration of this. We can treat different frequency components as independent contributions to the total signal, so no frequency affects the contribution of another, in the same way that changing the x coordinate of a point doesn't change its y or z coordinate.

So with inner product spaces (which are vector spaces equipped with an inner product) we can actually find an orthonormal basis, and as long as the inner product satisfies the usual axioms, we can treat these objects like arrows: for an orthogonal basis, each contribution is at "right angles" to the others and can be pictured like adding up perpendicular lines, as in a right triangle.

As a result, this provides a standard way of building decompositions and of checking whether a decomposition is valid by showing orthonormality. Mathematically this is great because it is a way of breaking things down into independent "atoms", and using this you can do all kinds of things, from compressing signals to fast classification and many other applications.
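Here is a small numerical sketch of that idea (the grid, the normalisation, and the example signal are my own choices for illustration): functions on [0, 2π] are treated as vectors, the inner product is approximated by a sum, the sine/cosine family comes out orthonormal, and projecting a signal onto that basis recovers its Fourier coefficients.

[code]
import numpy as np

# Treat functions on [0, 2*pi] as vectors; approximate the inner product
# <f, g> = (1/pi) * integral of f(x) g(x) dx  by a sum on an equispaced grid.
x = np.linspace(0.0, 2.0 * np.pi, 4000, endpoint=False)
dx = x[1] - x[0]

def inner(f, g):
    return np.sum(f * g) * dx / np.pi

# The sines and cosines behave like perpendicular axes: different
# frequencies are orthogonal, and each basis function has "length" 1.
print(round(inner(np.sin(x), np.sin(2 * x)), 6))      # ~0
print(round(inner(np.sin(x), np.cos(x)), 6))          # ~0
print(round(inner(np.sin(3 * x), np.sin(3 * x)), 6))  # ~1

# Projecting an example signal onto the basis gives its Fourier
# coefficients; summing the projections rebuilds the signal.
signal = 2.0 * np.sin(x) - 0.5 * np.cos(3 * x)
recon = np.zeros_like(x)
for n in range(1, 6):
    recon += inner(signal, np.sin(n * x)) * np.sin(n * x)
    recon += inner(signal, np.cos(n * x)) * np.cos(n * x)
print(np.allclose(recon, signal, atol=1e-6))           # True
[/code]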
 
Thanks for the explanations :smile:
 
