Hall

- TL;DR Summary
- Matrix representation of Linear Transformations.

In geometry, a vector ##\vec{X}## in ##n## dimensions is something like this:

$$\vec{X} = \left( x_1, x_2, \cdots, x_n \right)$$

And it follows its own laws of arithmetic.

In Linear Analysis, a polynomial ##p(x) = \sum_{i=0}^{n} a_i x^i## is a vector, along with all the other mathematical objects on which analysis can be done.
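To make "polynomials obey vector arithmetic" concrete, here is a minimal sketch in plain Python. Storing a polynomial as its coefficient list ##[a_0, a_1, \dots, a_n]## is an assumed representation, not something fixed by the discussion above:

```python
def poly_add(p, q):
    """Add two polynomials given as coefficient lists [a_0, a_1, ...]."""
    n = max(len(p), len(q))
    p = p + [0] * (n - len(p))  # pad the shorter one with zero coefficients
    q = q + [0] * (n - len(q))
    return [a + b for a, b in zip(p, q)]

def poly_scale(c, p):
    """Multiply a polynomial by a scalar, coefficient by coefficient."""
    return [c * a for a in p]

# (1 + 2x) + (3x + x^2) = 1 + 5x + x^2
print(poly_add([1, 2], [0, 3, 1]))  # [1, 5, 1]
print(poly_scale(2, [1, 2]))        # [2, 4]
```

Addition and scalar multiplication act componentwise on the coefficients, exactly as they do on the entries of a geometric vector.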

So far, so consistent: no trespassing in each other's domain, just a shared name. Not a big deal.

But then come the matrices. A matrix ##A## is a rectangular array of numbers whose columns are the coefficients, in the basis of the codomain, of the images of the domain's basis elements under the linear transformation to which ##A## corresponds. It is said that if ##x## is any vector in the domain of a linear transformation ##T##, then ##T(x) = b## is the same thing as

$$A x = b$$

where ##A x## is a matrix multiplication. But isn't this interchange valid only for geometric vectors? That is,

$$T(\vec{x}) = \vec{b}$$

$$T \left[ (x_1, x_2, \cdots, x_n) \right] = (b_1, b_2, \cdots, b_n)$$

$$
\begin{bmatrix}
a_{11} & \cdots & a_{1n} \\
\vdots & \ddots & \vdots \\
a_{n1} & \cdots & a_{nn}
\end{bmatrix}
\begin{bmatrix}
x_1 \\
x_2 \\
\vdots \\
x_n
\end{bmatrix}
=
\begin{bmatrix}
b_1 \\
b_2 \\
\vdots \\
b_n
\end{bmatrix}
$$

Quite well.
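The identification ##T(\vec{x}) = A\vec{x}## can be checked numerically. A minimal sketch in plain Python; the particular matrix ##A## below is an arbitrary illustration, not anything from the text:

```python
def mat_vec(A, x):
    """Multiply a matrix (list of rows) by a column vector (list of numbers)."""
    return [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) for row in A]

# An example 2x2 matrix standing in for some linear transformation T on R^2.
A = [[2, 1],
     [0, 3]]

x = [1, 4]
b = mat_vec(A, x)  # the same result applying T to x directly would give
print(b)           # [6, 12]
```

Each entry of ##b## is the dot product of a row of ##A## with ##\vec{x}##, which is exactly what the displayed equation above computes.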

But how do you represent, by matrix multiplication, the linear transformation that acts on a polynomial and gives out its derivative? There are no
*components* of a polynomial, so how would we get a column for ##p(x) = \sum_{i=0}^{n} a_i x^i##? Would each unlike term form a component?
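For what it's worth, one common convention (an assumption here, since the post leaves the question open) is to take the coefficients in the monomial basis ##\{1, x, x^2, \dots, x^n\}## as the components, so each unlike term does contribute one entry of the column. Under that assumption the derivative operator has a matrix; a sketch in plain Python, with a polynomial stored as its coefficient list ##[a_0, a_1, \dots, a_n]##:

```python
def derivative_matrix(n):
    """Matrix of d/dx on polynomials of degree <= n, in the basis 1, x, ..., x^n.
    Column j holds the coordinates of d/dx(x^j) = j * x^(j-1)."""
    D = [[0] * (n + 1) for _ in range(n + 1)]
    for j in range(1, n + 1):
        D[j - 1][j] = j
    return D

def mat_vec(A, v):
    """Multiply a matrix (list of rows) by a column vector (list)."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

# p(x) = 3 + 2x + 5x^2  ->  coefficient column [3, 2, 5]
p = [3, 2, 5]
D = derivative_matrix(2)
print(mat_vec(D, p))  # [2, 10, 0], i.e. p'(x) = 2 + 10x
```

Note the resulting matrix is square but not invertible (its last row is zero), reflecting the fact that differentiation destroys the constant term.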