What is the equation for representing a linear operator in terms of a matrix?

Favicon
I'm working through a proof that every linear operator, A, can be represented by a matrix, A_{ij}. So far I've got

Let \textbf{p}=\sum_{i}p_{i}\widehat{\textbf{e}}_{i}
A(\textbf{p}) = \sum_{i}p_{i}A(\widehat{\textbf{e}}_{i})

which is fine. Then it says that A(\textbf{e}_{i}) is a vector, given by:

A(e_{i}) = \sum_{j}A_{j}(p_{i})e_{j} = \sum_{j}A_{ji}e_{j}.

The fact that it's a vector is fine with me, but I can't get my head around the equation for it. Why does the operator acting on one of the basis vectors depend on p_{i}? Surely the basis vectors are independent of p_{i}, and so should be any operation acting on them.
 
Indeed, they don't depend on p_{i}.
I would write it like
A(\hat e_i) = \sum_j (A_i)_j \hat e_j
where A_i is some vector of coefficients.
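
For instance (an example of mine, not from your book): if A is rotation by 90 degrees in the plane, then

A(\hat e_1) = \hat e_2, \qquad A(\hat e_2) = -\hat e_1,

so the coefficient vectors are A_1 = (0, 1) and A_2 = (-1, 0). The components p_i never appear; they only enter later, when A acts on a general vector \textbf{p}.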
 
This is how I do this thing: Suppose A:U\rightarrow V is linear, and that \{u_j\} is a basis for U, and \{v_i\} is a basis for V. Consider the equation y=Ax, and expand in basis vectors.

y=y_i v_i

Ax = A(x_j u_j) = x_j Au_j = x_j (Au_j)_i v_i

I'm using the Einstein summation convention: since a sum is always implied over any index that appears exactly twice, we can keep track of the sums without writing any summation sigmas (and since the operator is linear, it wouldn't matter if we put the summation sigma to the left or right of the operator). Now define A_{ij}=(Au_j)_i. The above implies that

y_i=x_j(Au_j)_i=A_{ij}x_j

Note that this can be interpreted as a matrix equation in component form. y_i is the ith component of y in the basis \{v_i\}. x_j is the jth component of x in the basis \{u_j\}. A_{ij} is row i, column j, of the matrix of A in the pair of bases \{u_j\}, \{v_i\}.
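
To make that concrete (a small example of my own, not part of the original proof): take U = V = \mathbb{R}^2 with the standard bases, and suppose Au_1 = 3v_1 + v_2 and Au_2 = v_1 - 2v_2. Then A_{11} = 3, A_{21} = 1, A_{12} = 1, A_{22} = -2, and y = Ax in component form is

\begin{pmatrix} y_1 \\ y_2 \end{pmatrix} = \begin{pmatrix} 3 & 1 \\ 1 & -2 \end{pmatrix}\begin{pmatrix} x_1 \\ x_2 \end{pmatrix},

so the jth column of the matrix is just the list of components of Au_j.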

Favicon said:
A(e_{i}) = \sum_{j}A_{j}(p_{i})e_{j} = \sum_{j}A_{ji}e_{j}
This one should be

Ae_{i} = \sum_{j}(Ae_{i})_j e_{j} = \sum_{j}A_{ji}e_{j}

Note that the first step is just to express the vector Ae_i as a linear combination of basis vectors, and that (Ae_i)_j is just what I call the jth component.
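
If you want to sanity-check this numerically (a minimal sketch of my own, assuming Python with NumPy; none of it comes from the book), the statement A_{ji} = (Ae_i)_j just says that applying the matrix to the ith standard basis vector reads off its ith column:

import numpy as np

# An arbitrary matrix standing in for the operator A on R^3
A = np.array([[2.0, 0.0, 1.0],
              [1.0, 3.0, 0.0],
              [0.0, 1.0, 4.0]])

# Columns of the identity are the standard basis vectors e_1, e_2, e_3
E = np.eye(3)

for i in range(3):
    # A e_i equals the ith column of A, i.e. (A e_i)_j = A_{ji}
    assert np.allclose(A @ E[:, i], A[:, i])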
 