Vectors as geometric objects and vectors as general mathematical objects

In summary, a vector in geometry is represented by a set of coordinates in n dimensions and follows its own laws of arithmetic. In linear analysis, a polynomial is also considered a vector, along with other mathematical objects that can be analyzed. A distinction is made, however, between the object itself and its measurement: coordinates are the measurements of the vector. The polynomials, for instance, form an infinite-dimensional vector space with the monomials as a basis, so an arbitrary polynomial can be written as a linear combination of basis vectors, the coefficients being its coordinates. This is like measuring a point in a Cartesian coordinate system, where the coordinates represent the components of the vector.
  • #1
TL;DR Summary
Matrix representation of Linear Transformations.
In geometry, a vector ##\vec{X}## in ##n## dimensions is something like this:
$$
\vec{X} = \left( x_1, x_2, \cdots, x_n\right)$$
And it follows its own laws of arithmetic.

In Linear Analysis, a polynomial ##p(x) = \sum_{i=0}^{n} a_i x^i## is a vector, along with all other mathematical objects on which analysis can be done.

So far, so consistent: no trespassing on each other's domain, just a shared name. Not a big deal.

But then come the Mattresses. A mattress ##A## is a rectangular array of numbers whose columns hold the coefficients, with respect to a basis of the codomain, of the images of the domain's basis elements under the linear transformation to which it corresponds. It is said that if ##x## is any vector in the domain of a linear transformation ##T##, then ##T(x) = b## is the same thing as
$$
A x = b$$
where ##Ax## is a matrix product. But isn't this interchange valid only for geometric vectors? That is,
$$
T(\vec{x}) = \vec{b}
$$
$$
T\left[ (x_1, x_2, \cdots, x_n) \right] = (b_1, b_2, \cdots, b_n)
$$
$$
\begin{bmatrix}
a_{11} & \cdots & a_{1n} \\
\vdots & \ddots & \vdots \\
a_{n1} & \cdots & a_{nn}
\end{bmatrix}
\begin{bmatrix}
x_1 \\ x_2 \\ \vdots \\ x_n
\end{bmatrix}
=
\begin{bmatrix}
b_1 \\ b_2 \\ \vdots \\ b_n
\end{bmatrix}
$$

All well and good.

But how do you represent, by mattress multiplication, the linear transformation that acts on a polynomial and outputs its derivative? A polynomial has no obvious components, so how would we get a column vector for ##p(x) = \sum_{i=0}^{n} a_i x^i##? Would each unlike term form a component?
 
  • #2
But how do you represent, by mattress multiplication, the linear transformation that acts on a polynomial and outputs its derivative? A polynomial has no obvious components, so how would we get a column vector for ##p(x) = \sum_{i=0}^{n} a_i x^i##? Would each unlike term form a component?
First of all, it is 'matrix' and 'matrices'.

The distinction you make is not the one between geometry and algebra or analysis, it is the distinction between an object and its measurement. Coordinates are the marks on the scales that measure the vectors.

If we look at the vector space of polynomials, then we have in general, i.e. in the case of arbitrary degrees and infinite scalar fields, an infinite-dimensional vector space. So any polynomial can be written as
$$
p(x)=a_0+a_1x+\ldots+a_nx^n \triangleq (a_0,a_1,\ldots,a_n, 0,\ldots) \triangleq (a_k)_{k\in \mathbb{N}_0} \text{ with } a_k=0 \text{ for almost all } k
$$
where "almost all" means all except finitely many.

The linear transformation ##p\longmapsto p'## is simply
$$
(a_k)_{k\in \mathbb{N}_0} \longmapsto (k\cdot a_k)_{k\in \mathbb{N}},
$$
i.e. the coefficient of ##x^{k-1}## in ##p'## is ##k\cdot a_k.##
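In code, here is a minimal Python sketch of this map on truncated coefficient sequences (the function name differentiate_coeffs is my own, not from the thread):

```python
def differentiate_coeffs(a):
    """Map (a_0, ..., a_n), the coefficients of p(x) = sum a_k x^k,
    to the coefficients (1*a_1, 2*a_2, ..., n*a_n) of p'(x)."""
    return [k * a[k] for k in range(1, len(a))]

# p(x) = 3 + 5x + 2x^2  has derivative  p'(x) = 5 + 4x
print(differentiate_coeffs([3, 5, 2]))  # [5, 4]
```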
 
  • #3
To add a bit: the vector space of polynomials has a basis (as do all vector spaces, if you assume the Axiom of Choice). You then define the transformation on the basis vectors and extend by linearity, as the identity below shows.
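In symbols (a standard identity, added here for illustration): once the values ##T(e_k)## are fixed on a basis ##\{e_k\}##, linearity determines ##T## on every vector:
$$
T\Big(\sum_k a_k e_k\Big)=\sum_k a_k\,T(e_k).
$$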
 
  • #4
The distinction you make is not the one between geometry and algebra or analysis, it is the distinction between an object and its measurement. Coordinates are the marks on the scales that measure the vectors.
Can you please expand on that a little more? Did you mean a polynomial is just ##p(x)## and its measure (I don't know if you intended the usual meaning of the word) is
$$
\left(a_0, a_1, \cdots, a_n\right)$$
?
 
  • #5
Can you please expand on that a little more? Did you mean a polynomial is just ##p(x)## and its measure (I don't know if you intended the usual meaning of the word) is
$$
\left(a_0, a_1, \cdots, a_n\right)$$
?
A polynomial is an expression ##p(x)=a_0+a_1x+a_2x^2+\ldots+a_nx^n=\sum_{k=0}^\infty a_kx^k## where almost all ##a_k=0.## It is represented by a sequence of scalars that is zero except at finitely many positions. Hence
$$
p(x)\triangleq (a_0,a_1,a_2,\ldots,a_n) \triangleq (a_0,a_1,a_2,\ldots, a_n, 0,0,0,0,\ldots)
$$
They span an infinite-dimensional vector space. E.g.
\begin{align*}
p_0(x)=1&\triangleq (1,0,0,0,0,\ldots) \\
p_1(x)=x&\triangleq (0,1,0,0,0,\ldots) \\
p_2(x)=x^2&\triangleq (0,0,1,0,0,\ldots) \\
p_3(x)=x^3&\triangleq (0,0,0,1,0,\ldots) \\
p_4(x)=x^4&\triangleq (0,0,0,0,1,\ldots) \\
\phantom{p_n(x)=x^n}&\,\,\,\vdots \\
p_n(x)=x^n&\triangleq (0,0,0,0,0,\ldots,0,1,0,\ldots) \\
\phantom{p_n(x)=x^n}&\,\,\,\vdots
\end{align*}

An arbitrary polynomial is thus
$$
p(x)=a_0\cdot p_0(x)+a_1\cdot p_1(x) +a_2\cdot p_2(x) +\ldots+a_n\cdot p_n(x)+ 0\cdot p_{n+1}(x)+0\cdot p_{n+2}(x)+\ldots
$$
expressed according to the basis I have chosen.
##\{p_k(x)=x^k\,|\,k\in \mathbb{N}_0\}## are the basis vectors
##(a_0,a_1,a_2,\ldots,a_n,0,0,\ldots)## are the coordinates of ##p(x)## with respect to this basis. If we draw a Cartesian coordinate system, we mark ##a_k## on the ##k##-th axis: we measure the ##k##-th portion of ##p(x).##
##p(x)=\sum_{k=0}^\infty a_kp_k(x)##, and ##\{a_0,a_1x,a_2x^2,\ldots,a_nx^n,0,0,\ldots\}## are the components of ##p(x),## the vectors we have to add in order to get ##p(x).##

So whether we add polynomials or add sequences makes no difference. Since the basis above is natural, i.e. comes automatically to mind first, many authors of algebra textbooks identify both representations. However, formally it requires a fixed basis.

You are probably used to measuring a point ##(x,y)=(2,1)## in a Cartesian coordinate system. It is a certain point, or a vector from ##(0,0)## to that point, depending on how you want to see it.

It is the same here. Only that this vector would be ##p(x)=2+1\cdot x.##

We need the infinite sequence (filled up with ##0##) because we do not have any restriction on the degree.
 
  • #6
Just to mention: the "matrix" for a linear transformation on an infinite-dimensional vector space is of course also infinite. You can also look at polynomials of degree at most ##N##; your vector space will then be ##(N+1)##-dimensional, and the matrix representing a linear transformation will be ##(N+1)\times(N+1)##.
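For example (my worked instance, not from the post): on polynomials of degree at most ##3##, with monomial basis ##\{1,x,x^2,x^3\}##, differentiation is represented by
$$
D=\begin{bmatrix}0&1&0&0\\0&0&2&0\\0&0&0&3\\0&0&0&0\end{bmatrix},
\qquad
D\begin{bmatrix}a_0\\a_1\\a_2\\a_3\end{bmatrix}
=\begin{bmatrix}a_1\\2a_2\\3a_3\\0\end{bmatrix},
$$
which matches ##p'(x)=a_1+2a_2x+3a_3x^2## for ##p(x)=a_0+a_1x+a_2x^2+a_3x^3.##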

You also do not need to choose the monomials ##x^n## as your basis (just as you do not need to choose ##(1,0)## and ##(0,1)## as a basis for ##\mathbb R^2##). Another basis would lead to a different matrix representing the same linear transformation.
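As an illustration (my choice of basis, not the post's): in the basis ##q_k(x)=x^k/k!## one has ##q_k'=q_{k-1}##, so the differentiation operator above is instead represented by a pure shift,
$$
D'=\begin{bmatrix}0&1&0&0\\0&0&1&0\\0&0&0&1\\0&0&0&0\end{bmatrix},
$$
with ##1##s on the superdiagonal in place of ##1,2,3.##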
 
  • #7
Hi @Hall. If you are still reading this thread, you might find the video below helpful. It includes a very good explanation of treating polynomials as vectors and how differentiation can then be treated as a matrix operating on such a vector.

If the video seems too long (it's about 17 minutes), jump straight to 6:50. But (IMO) the whole video is beautifully done and worth watching.

[embedded video]
  • #8
Hi @Hall. If you are still reading this thread, you might find the video below helpful. It includes a very good explanation of treating polynomials as vectors and how differentiation can then be treated as a matrix operating on such a vector.
Oh! thank you, Steve.
 
  • #9
Summary: Matrix representation of Linear Transformations.

But then come the Mattresses. A mattress ##A## is a rectangular array of numbers […]
I literally thought you meant "matrasses", to put on beds. I thought it was a new word used in math. Mattresses are rectangular, and their thickness could harbor vectors... Then I read the comment.
 
  • #10
Besides polynomials, can't all orthogonal sets of functions be described by vectors in vector spaces?
 
  • #11
Besides polynomials, can't all orthogonal sets of functions be described by vectors in vector spaces?
I believe so. Wikipedia says
"In mathematics, orthogonal functions belong to a function space that is a vector space equipped with a bilinear form."
https://en.wikipedia.org/wiki/Orthogonal_functions

In fact, the orthogonal functions are not merely vectors; they are basis vectors of the space they span.

Edit: Before I (possibly) get told-off, can I note that I was using Wikipedia for illustration, not as a definitive source.
 
  • #12
Besides polynomials, can't all orthogonal sets of functions be described by vectors in vector spaces?
Why do you restrict the set to orthogonal functions? Any set of functions can be described as vectors in a vector space. We have ##(f+g)(x)=f(x)+g(x)## and ##(\alpha f)(x)=\alpha f(x)## and that is all we need.
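As a minimal Python sketch of these two pointwise operations (the helper names add and scale are mine):

```python
import math

def add(f, g):
    """Pointwise sum: (f + g)(x) = f(x) + g(x)."""
    return lambda x: f(x) + g(x)

def scale(alpha, f):
    """Pointwise scalar multiple: (alpha * f)(x) = alpha * f(x)."""
    return lambda x: alpha * f(x)

# h(x) = sin(x) + 2*cos(x)
h = add(math.sin, scale(2.0, math.cos))
print(h(0.0))  # 2.0
```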
 
  • #13
Why do you restrict the set to orthogonal functions? Any set of functions can be described as vectors in a vector space. We have ##(f+g)(x)=f(x)+g(x)## and ##(\alpha f)(x)=\alpha f(x)## and that is all we need.

Provided the codomain of these functions is a field.
 
  • #14
Why do you restrict the set to orthogonal functions? Any set of functions can be described as vectors in a vector space. We have ##(f+g)(x)=f(x)+g(x)## and ##(\alpha f)(x)=\alpha f(x)## and that is all we need.
… and presuming ##f(x) + g(x)## and ##\alpha f(x)## belong to the set if ##f(x)## and ##g(x)## do …
 
  • #15
Why do you restrict the set to orthogonal functions? Any set of functions can be described as vectors in a vector space. We have ##(f+g)(x)=f(x)+g(x)## and ##(\alpha f)(x)=\alpha f(x)## and that is all we need.
Yes. And can we use any complete orthogonal set as a basis? Or any set of non-collinear functions?
 
  • #16
Yes. And can we use any complete orthogonal set as a basis? Or any set of non-collinear functions?
There is no need for there to be an inner product, which would define what "orthogonal" means.
 
  • #17
… and presuming ##f(x) + g(x)## and ##\alpha f(x)## belong to the set if ##f(x)## and ##g(x)## do …
No. We only spoke of sets, not subspaces. ##M:=\{f,g\}## is a set of vectors (given that their codomain and scalar domain is a field) regardless of whether ##f+g\in M## or not. Functions are vectors of some vector space. The point was that we do not need an angle. I did not intend to repeat any definitions.

Provided the codomain of these functions is a field.
And I think it is sufficient that the codomain is a vector space over the same field as the scalars for the function. I do not see where we need the field property of the codomain. If we are fussy, then we should be rigorously fussy.
 
  • #18
IIRC, you can turn a set into a vector space by just using formal sums of elements and the 'expected' scaling properties over a field/ring (for vector spaces/modules), to define the free vector space over a set ##S##:

Given ##S:=\{s_i : i \in I\}##, define ##s_j + s_k## as a formal sum, and ##c\left(\sum_j s_j + \sum_k s_k\right) = \sum_j c\,s_j + \sum_k c\,s_k##.

Basically, an injection between ##S## and the underlying set of a vector space ##V## allows you to pull back the vector space structure from ##V## onto ##S## (a toy sketch follows the link below).

IIRC @micromass (apparently not around anymore) talked about this here a few times:

https://math.stackexchange.com/questions/2916947/free-vector-space-over-a-set
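Here is a toy Python sketch of such formal sums, storing a vector as a dict from set elements to coefficients (the class FreeVector and its methods are my own illustration, not from the linked post):

```python
class FreeVector:
    """A formal linear combination of elements of an arbitrary set,
    stored as {basis_element: coefficient}."""
    def __init__(self, coeffs):
        # Drop zero coefficients so representations are canonical.
        self.coeffs = {s: c for s, c in coeffs.items() if c != 0}

    def __add__(self, other):
        keys = set(self.coeffs) | set(other.coeffs)
        return FreeVector({s: self.coeffs.get(s, 0) + other.coeffs.get(s, 0)
                           for s in keys})

    def __rmul__(self, c):
        return FreeVector({s: c * v for s, v in self.coeffs.items()})

    def __repr__(self):
        return " + ".join(f"{c}*{s}" for s, c in sorted(self.coeffs.items())) or "0"

# Formal sums over the set S = {"car", "tree"}:
v = FreeVector({"car": 1}) + 2 * FreeVector({"tree": 1})
print(v)  # 1*car + 2*tree
```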
 
  • #19
IIRC, you can turn a set into a vector space by just using formal sums of elements and the 'expected' scaling properties over a field/ring (for vector spaces/modules), to define the free vector space over a set ##S##. […]
It is called the linear hull of a set.
 
  • #20
It is called the linear hull of a set.
I think it's somewhat different. For the linear hull, you have a vector space to start with, and the linear hull is the smallest subspace containing the set ##S##. In this construction, you start with a 'pure' set ##S## without any additional structure and turn it into a vector space/module.
 
  • #21
I think it's somewhat different. For the linear hull, you have a vector space to start with, and the linear hull is the smallest subspace containing the set ##S##. In this construction, you start with a 'pure' set ##S## without any additional structure and turn it into a vector space/module.
And where would the difference be? I do not need a predefined vector space or module structure. I use the free object generated by the set and get the linear hull.

In nerdy English (3 pages in my paperback Homological Algebra):

The co-universal construction "generating free modules" (a representing pair of a co-universal mapping problem that can be shown to be solvable) is a covariant functor ##\text{Set}\longrightarrow \text{Mod}_R.##

And as always with homological algebra, there is a phrasing in common English: take the formal sums and multiples.
 
  • #22
And where would the difference be? I do not need a predefined vector space or module structure. I use the free object generated by the set and get the linear hull.

In nerdy English (3 pages in my paperback Homological Algebra):

The co-universal construction "generating free modules" (a representing pair of a co-universal mapping problem that can be shown to be solvable) is a covariant functor ##\text{Set}\longrightarrow \text{Mod}_R.##

And as always with homological algebra, there is a phrasing in common English: take the formal sums and multiples.
I mean, in my experience, the linear hull of a set ##S## is an object that exists in a given vector space ##V##: it's the smallest subspace of ##V## containing ##S##. It assumes the existence of ##V##. The other construction does not need such an assumption.
 
  • #23
I mean, in my experience, the linear hull of a set ##S## is an object that exists in a given vector space ##V##: it's the smallest subspace of ##V## containing ##S##. It assumes the existence of ##V##. The other construction does not need such an assumption.
That's the normal case, but do you see a necessity? You can do the same thing with formal sums and multiples and abstract elements. If you want it even nerdier, then let us say: The category ##\text{Mod}_R## has enough projective objects.
 
  • #24
That's the normal case, but do you see a necessity? You can do the same thing with formal sums and multiples and abstract elements. If you want it even nerdier, then let us say: The category ##\text{Mod}_R## has enough projective objects.
My knowledge of that area is in the Category of ##Rusty##.
 
  • #25
My knowledge of that area is in the Category of ##Rusty##.
Take ##S=\{\text{car}, \text{tree}\}##; then ##\{a\cdot\text{car} + b\cdot\text{tree} \mid a, b \text{ from a field}\}## is the linear hull or, if you like it better, the free vector space generated by cars and trees. But as already mentioned, it is 3 pages in the book plus the pages that define universal mapping problems.
 
  • #26
Take ##S=\{\text{car}, \text{tree}\}##; then ##\{a\cdot\text{car} + b\cdot\text{tree} \mid a, b \text{ from a field}\}## is the linear hull or, if you like it better, the free vector space generated by cars and trees. But as already mentioned, it is 3 pages in the book plus the pages that define universal mapping problems.
My point was that I didn’t read it as you talking about the linear hull being a vector space. Your statement to me read that any set itself is a vector space:
Any set of functions can be described as vectors in a vector space.
I guess this is not what you meant and I do see that it can be read differently.
 
  • #27
I guess this is not what you meant and I do see that it can be read differently.

Yes, but I did not say vector (sub)space, I said a set of vectors.

Any set of functions can be described as vectors in a vector space.

'In a vector space' was necessary for the context, not the set.
 
  • #28
Yes, but I did not say vector (sub)space, I said a set of vectors.

'In a vector space' was necessary for the context, not the set.
Sure, I'm just explaining how I read it at first.
 
