Matrix Multiplication and Function Composition

Matrix multiplication is fundamentally linked to function composition in linear algebra, where matrices represent linear transformations. The discussion highlights the confusion around how to apply these transformations to basis vectors, specifically regarding the functions f and g. The author of the referenced article emphasizes that understanding how linear transformations affect basis vectors is key, leading to the equation f(g(w1)) = f(w1 + w2). Clarifications are provided on defining g(x) as Bx using matrix notation, which simplifies the representation of linear functions. Ultimately, the conversation underscores the importance of grasping the relationship between matrix operations and linear transformations for a deeper understanding of linear algebra.
Septimra
I am doing linear algebra and want to fully understand it, not just pass the class. I was recently taught matrix multiplication and decided to look up how it works. The good part is that I understand the concept: matrices are a way of representing linear transformations, so matrix multiplication is actually a composition of functions. That is why it is not commutative but is associative.
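As a quick sanity check of that idea (my own sketch, not part of the article, using the two matrices quoted below), applying one matrix and then the other to a vector gives the same result as applying their product, and the product is associative but not commutative:

import numpy as np

A = np.array([[2, 1],
              [4, 3]])
B = np.array([[1, 2],
              [1, 0]])
C = np.array([[0, 1],
              [1, 1]])
x = np.array([1, 0])

# Applying B first and then A is the same as applying the product AB.
print(A @ (B @ x))          # [3 7]
print((A @ B) @ x)          # [3 7]

# Matrix multiplication is associative but not commutative.
print(np.array_equal((A @ B) @ C, A @ (B @ C)))   # True
print(np.array_equal(A @ B, B @ A))               # False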

But I recently came across this article and I could not follow the math near the middle of the page.
http://nolaymanleftbehind.wordpress.com/2011/07/10/linear-algebra-what-matrices-actually-are/

The matrices being multiplied are

[ 2 1 ] [ 1 2 ]
[ 4 3 ] [ 1 0 ]

the basis vectors are w1 and w2,

with w1 = [ 1 0 ]
and w2 = [ 0 1 ]

The author states that all that is needed is to see how the linear transformation affects the basis vectors.

Then it states that f(g(w1)) = f(w1+w2)
How does that work? Where on Earth do you plug in the w1?
Please help
 
That would depend on what the author sees as f and g ... but the basic principle is that a linear transformation can be represented as a transformation of the coordinate system. A square in an oblique coordinate system looks the same as an oblique shape in a rectangular coordinate system.
 
Septimra said:
Then it states that f(g(w1)) = f(w1+w2)
How does that work? Where on Earth do you plug in the w1?
If we write elements of ##\mathbb R^2## as 2×1 matrices, the definition of ##g:\mathbb R^2\to\mathbb R^2## can be written as ##g(x)=Bx## for all ##x\in\mathbb R^2##. So
$$g(w_1)=Bw_1 =\begin{pmatrix}1 & 2\\ 1 & 0\end{pmatrix}\begin{pmatrix}1 \\ 0\end{pmatrix}=\begin{pmatrix}1\\ 1\end{pmatrix}=\begin{pmatrix}1\\ 0\end{pmatrix}+\begin{pmatrix}0\\ 1\end{pmatrix}=w_1+w_2.$$
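And if f is defined the same way by the other matrix (f(x)=Ax, where A is the left factor, which is what the article appears to intend), linearity finishes the computation:
$$f(g(w_1))=f(w_1+w_2)=Aw_1+Aw_2=\begin{pmatrix}2\\ 4\end{pmatrix}+\begin{pmatrix}1\\ 3\end{pmatrix}=\begin{pmatrix}3\\ 7\end{pmatrix},$$
which is exactly the first column of the product
$$AB=\begin{pmatrix}2 & 1\\ 4 & 3\end{pmatrix}\begin{pmatrix}1 & 2\\ 1 & 0\end{pmatrix}=\begin{pmatrix}3 & 4\\ 7 & 8\end{pmatrix}.$$
So this is where ##w_1## gets "plugged in": the first column of AB records what the composition does to ##w_1##.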
You may find https://www.physicsforums.com/showthread.php?p=4402648#post4402648 useful.
 
Thank you a lot, I appreciate it. I now see what the author was saying.
But I still have one minor question. I thought the author was trying to prove that g(x) = Bx. I now see that I was mistaken. But could one of you prove this? How does g(x) = Bx if x is a vector?
 
Septimra said:
How does g(x) = Bx if x is a vector?
x is an element of ##\mathbb R^2##. If we use the convention to write elements of ##\mathbb R^2## as 2×1 matrices, then we can just define ##g(x)=Bx## for all ##x\in\mathbb R^2##. If we instead use the convention to write elements of ##\mathbb R^2## in the standard ##(x_1,x_2)## notation for ordered pairs, the notation ##Bx## doesn't work, but we could e.g. define
$$g(x)=\left(\left(B\begin{pmatrix}x_1\\ x_2\end{pmatrix}\right)_1,\left(B\begin{pmatrix}x_1\\ x_2\end{pmatrix}\right)_2\right)$$ for all ##x\in\mathbb R^2##. This looks really awkward of course. This is why I chose to use the matrix notation instead of the ordered pair notation.

We could also define g by saying that it's the function defined by ##g(s,t)=(s+2t,s)## for all ##s,t\in\mathbb R##. The matrix of this function with respect to the standard ordered basis ##(e_1,e_2)## where ##e_1=(1,0)## and ##e_2=(0,1)##, has ##g(e_j)_i## on row i, column j, as explained in the FAQ post. This is the ith component of the vector we get when g takes e_j as input. For example, row 2, column 1, of this matrix is
$$g(e_1)_2=(g(1,0))_2=(1+2\cdot 0,1)_2=1.$$ Note that this is equal to ##B_{21}##, as it's supposed to be.

If you want to understand how matrix multiplication is really composition of linear functions, then you should study the FAQ post and do this exercise: Let A and B be linear functions from ##\mathbb R^n## to ##\mathbb R^n##. Let [A] and [B] denote their matrix representations with respect to the standard basis for ##\mathbb R^n##, and let ##[A\circ B]## denote the matrix representation of ##A\circ B## with respect to the same basis. Prove that for all ##i,j\in\{1,\dots,n\}##, we have
$$[A\circ B]_{ij}=([A][B])_{ij}.$$ This result tells us that the matrix representation of ##A\circ B## is equal to the matrix product of the matrix representations of A and B.

Hint: The definition of matrix multiplication is ##(XY)_{ij}=\sum_k X_{ik}Y_{kj}##. You will also have to use the fact that every vector is a linear combination of basis vectors.
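A quick numerical illustration of that exercise (a sketch of my own in NumPy, using the two 2×2 matrices from this thread): build the matrix of the composition column by column from what it does to the standard basis vectors, and compare it to the matrix product.

import numpy as np

A = np.array([[2, 1],
              [4, 3]])   # matrix of f
B = np.array([[1, 2],
              [1, 0]])   # matrix of g

def f(x): return A @ x   # the linear function represented by A
def g(x): return B @ x   # the linear function represented by B

n = 2
basis = np.eye(n)        # columns are the standard basis vectors e_1, ..., e_n

# Column j of the matrix of the composition is f(g(e_j)).
matrix_of_composition = np.column_stack([f(g(basis[:, j])) for j in range(n)])

print(matrix_of_composition)   # [[3. 4.]
                               #  [7. 8.]]
print(A @ B)                   # [[3 4]
                               #  [7 8]]  -- the same matrix, as the exercise claims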
 