Math Amateur
I am spending time revising vector spaces. I am using Dummit and Foote: Abstract Algebra (Chapter 11) and also the book Linear Algebra by Stephen Friedberg, Arnold Insel and Lawrence Spence.
I am working on Theorem 10, which is a fundamental theorem regarding an isomorphism between the space of all linear transformations from a vector space $$V$$ to a vector space $$W$$, $$Hom_F(V, W)$$, and the space of $$m \times n$$ matrices with entries in $$F$$, $$M_{m \times n} (F)$$.
I need help to fully understand the proof of Theorem 10.
Theorem 10 and its proof (D&F, page 416) read as follows: View attachment 3029
Now to define the terminology for a formal and rigorous proof of Theorem 10 we have:
V, W are vector spaces over a field F.
$$ \mathcal{B} = \{ v_1, v_2, \ldots , v_n \} \text{ is an ordered basis of } V $$
$$ \mathcal{E} = \{ w_1, w_2, \ldots , w_m \} \text{ is an ordered basis of } W $$
Let $$\phi, \psi \in Hom_F(V,W)$$ be linear transformations from $$V$$ to $$W$$.
For each $$j \in \{ 1, 2, \ldots , n \}$$, write the images of $$v_j$$ under $$\phi$$ and $$\psi$$ in terms of the basis $$ \mathcal{E}$$ as follows:
$$\phi (v_j) = \sum_{i = 1}^m \alpha_{ij} w_i = \alpha_{1j}w_1 + \alpha_{2j}w_2 + \cdots + \alpha_{mj}w_m$$
and
$$\psi (v_j) = \sum_{i = 1}^m \beta_{ij} w_i = \beta_{1j}w_1 + \beta_{2j}w_2 + \cdots + \beta_{mj}w_m$$

We then define the coordinate vectors of $$\phi (v_j)$$ and $$\psi (v_j)$$ relative to the basis $$ \mathcal{E}$$ as follows:

$$[ \phi (v_j) ]_{\mathcal{E}} = \begin{bmatrix} \alpha_{1j} \\ \alpha_{2j} \\ \vdots \\ \alpha_{mj} \end{bmatrix} \ \text{ and } \ [ \psi (v_j) ]_{\mathcal{E}} = \begin{bmatrix} \beta_{1j} \\ \beta_{2j} \\ \vdots \\ \beta_{mj} \end{bmatrix}$$
Now Theorem 10 concerns the following map:
$$\Phi \ : \ Hom_F(V, W) \to M_{m \times n} (F)$$
where
$$ \Phi ( \phi ) = M_\mathcal{B}^\mathcal{E} ( \phi )$$ for all $$\phi \in Hom_F (V, W)$$
and $$ M_\mathcal{B}^\mathcal{E} ( \phi) $$ is the matrix of the linear transformation $$\phi$$ with respect to the bases $$\mathcal{B}$$ and $$\mathcal{E}$$.
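For concreteness - and assuming I am reading the definition in D&F Section 11.2 correctly - $$ M_\mathcal{B}^\mathcal{E} ( \phi ) $$ is the $$m \times n$$ matrix whose $$j$$-th column is the coordinate vector $$[ \phi (v_j) ]_{\mathcal{E}}$$ computed above, that is:

$$M_\mathcal{B}^\mathcal{E} ( \phi ) = ( \alpha_{ij} ) = \begin{bmatrix} \alpha_{11} & \alpha_{12} & \cdots & \alpha_{1n} \\ \alpha_{21} & \alpha_{22} & \cdots & \alpha_{2n} \\ \vdots & \vdots & & \vdots \\ \alpha_{m1} & \alpha_{m2} & \cdots & \alpha_{mn} \end{bmatrix}$$

and similarly $$ M_\mathcal{B}^\mathcal{E} ( \psi ) = ( \beta_{ij} ) $$.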
Further, Theorem 10 asserts that $$\Phi$$ is a vector space isomorphism.
So, the first thing to demonstrate is that for $$\phi, \psi \in Hom_F (V, W)$$ and $$c \in F$$ we have:
$$\Phi ( \phi + \psi ) = \Phi ( \phi ) + \Phi ( \psi ) $$ ... ... ... ... ... (1)
and
$$\Phi ( c \phi) = c \Phi ( \phi)$$ ... ... ... ... ... (2)
In respect of proving (1) and (2) above - that is, proving that $$\Phi$$ is a linear transformation - D&F (page 416) say the following:
" ... ... The columns of the matrix $$M_\mathcal{B}^\mathcal{E}$$ are determined by the action of $$\phi$$ on the basis $$\mathcal{B}$$ as in Equation (3). This shows in particular that the map $$\phi \to M_\mathcal{B}^\mathcal{E} ( \phi )$$ is an $$F$$-linear map since $$\phi$$ is $$F$$-linear ... ... ... "
[Equation (3) is the following:
$$\phi (v_j) = \sum_{i = 1}^m \alpha_{ij} w_i$$ ]
I do not follow this argument ... can anyone help me frame an explicit, formal and rigorous demonstration/proof that $$\Phi$$ is a linear transformation?
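My tentative start - assuming the matrix of a linear transformation is formed column by column as above - is to compute the action of $$\phi + \psi$$ on each basis vector $$v_j$$:

$$( \phi + \psi ) (v_j) = \phi (v_j) + \psi (v_j) = \sum_{i = 1}^m \alpha_{ij} w_i + \sum_{i = 1}^m \beta_{ij} w_i = \sum_{i = 1}^m ( \alpha_{ij} + \beta_{ij} ) w_i$$

so the $$j$$-th column of $$M_\mathcal{B}^\mathcal{E} ( \phi + \psi )$$ appears to be $$[ \phi (v_j) ]_{\mathcal{E}} + [ \psi (v_j) ]_{\mathcal{E}}$$, and similarly $$ (c \phi) (v_j) = \sum_{i = 1}^m c \alpha_{ij} w_i $$ ... but I am not sure how to assemble these column-by-column observations into a rigorous proof of (1) and (2).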
I note that in an explicit and formal proof we would need, firstly to show that:
$$\Phi ( \phi + \psi ) = \Phi ( \phi ) + \Phi ( \psi ) $$
... ... reflecting ... ... to do this we need to express $$\Phi, \Phi ( \phi ) , \Phi ( \psi ), \Phi ( \phi + \psi ) $$ ... in terms of the notation above, that is in terms of the notation of D&F Section 11.2 (see below) and ... we need a basis for $$Hom_F (V, W)$$ and a basis for $$M_{m \times n} (F)$$ ... but what is the nature/form of such bases ...
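(For what it is worth, I believe the standard basis of $$M_{m \times n} (F)$$ consists of the matrices $$E_{ij}$$ with a $$1$$ in position $$(i, j)$$ and $$0$$ elsewhere ... but I am not sure what the corresponding basis of $$Hom_F (V, W)$$ looks like, or indeed whether bases are needed at all to establish (1) and (2).)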
Can someone help ...?
I would appreciate the help, especially as Theorem 10 seems so fundamental!

Peter
***NOTE***
The relevant text in D&F introducing the definitions and notation for the matrix of a linear transformation is as follows:
View attachment 3030