Canonical Isomorphism and Tensor Products

  • Context: MHB
  • Thread starter: Sudharaka
  • Tags: Isomorphism, Tensor
SUMMARY

The discussion focuses on the canonical isomorphism \(\alpha\) from \(V^*\otimes V\) to \(L(V, V)\) and the computation of \(\alpha(t)(v)\) for \(t=(e^1+e^2)\otimes (e_3+e_4)\) and \(v=2e_1+3e_2+2e_3+3e_4\). The participants clarify that \(\alpha(e^i\otimes e_j)(v)\) evaluates to \(\langle e^i,v\rangle e_j\), effectively translating tensor products into linear transformations. The final result of the computation yields \(\alpha(t)(v) = 5e_3 + 5e_4\), demonstrating the application of linearity in tensor operations.

PREREQUISITES
  • Understanding of tensor products, specifically \(V^*\otimes V\)
  • Familiarity with linear transformations and the notation \(L(V, V)\)
  • Knowledge of dual spaces and linear functionals
  • Basic proficiency in matrix representation of linear mappings
NEXT STEPS
  • Study the properties of canonical isomorphisms in linear algebra
  • Learn about the dual space and its relationship with vector spaces
  • Explore the concept of bilinear forms and their applications in finite-dimensional spaces
  • Investigate the use of sesquilinear forms in complex vector spaces
USEFUL FOR

Mathematicians, students of linear algebra, and anyone interested in advanced topics in vector spaces and tensor analysis.

Sudharaka
Hi everyone, :)

Here's a problem that I have trouble understanding. Specifically, I am not quite getting what is meant by the expression \(\alpha (t)(v)\). I hope somebody can help me improve my understanding. :)

Problem:

Let \(\alpha\) be the canonical isomorphism from \(V^*\otimes V\) to \(L(V,\, V)\). Find \(\alpha(t)(v)\) where \(t=(e^1+e^2)\otimes (e_3+e_4)\) and \(v=2e_1+3e_2+2e_3+3e_4\).
 
Sudharaka said:
Let \(\alpha\) be the canonical isomorphism from \(V^*\otimes V\) to \(L(V,\, V)\). Find \(\alpha(t)(v)\) where \(t=(e^1+e^2)\otimes (e_3+e_4)\) and \(v=2e_1+3e_2+2e_3+3e_4\).
With the usual caveat that I'm not really at home with this tensor notation, the idea is that an element of $V^*\otimes V$ gives rise to a linear transformation from $V$ to $V$. The elementary tensor $e^i\otimes e_j$ gives rise to the linear transformation $T = \alpha(e^i\otimes e_j)$ defined by $T(v) = \alpha(e^i\otimes e_j)(v) = \langle e^i,v\rangle e_j$ (for all $v\in V$), where the angle brackets $\langle x,v\rangle$ denote the action of $x\in V^*$ on the element $v\in V$ under the duality between the two spaces. Thus $\alpha(e^i\otimes e_j)(e_k) = \begin{cases}e_j& \text{ if }k=i, \\ 0&\text{ if }k\ne i. \end{cases}$
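To see the formula in action, here is a minimal Python sketch, assuming $V=\mathbb{R}^4$ with its standard basis and modelling $e^i$ as "read off the $i$-th coordinate"; the helper names are my own, chosen only for illustration:

```python
import numpy as np

def e(j, n=4):
    """Standard basis vector e_j of R^n (1-indexed)."""
    v = np.zeros(n)
    v[j - 1] = 1.0
    return v

def alpha_elementary(i, j):
    """The linear map alpha(e^i (x) e_j): v -> <e^i, v> e_j."""
    return lambda v: v[i - 1] * e(j)

# Check the case distinction for alpha(e^1 (x) e_3):
T = alpha_elementary(1, 3)
print(T(e(1)))  # [0. 0. 1. 0.]  i.e. e_3, since k = i
print(T(e(2)))  # [0. 0. 0. 0.]  i.e. 0,   since k != i
```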
 
Opalg said:
The elementary tensor $e^i\otimes e_j$ gives rise to the linear transformation $T = \alpha(e^i\otimes e_j)$ defined by $T(v) = \alpha(e^i\otimes e_j)(v) = \langle e^i,v\rangle e_j$ (for all $v\in V$).

Thanks so much for your reply. It took me a long time to understand this due to my limited knowledge of tensors, but I think I am getting there now. :)
 
As I indicated in another thread, the basis of elementary tensors $e^j \otimes e_i$ of $V^{\ast} \otimes V$ can be identified with the basis of elementary matrices $E_{ij}$ of $\text{Hom}(V,V)$.

Note that this identification uses a basis choice for $V$, but it is possible to do this in a completely basis-free manner.

One thing to remember is that linear functionals (at least for finite-dimensional vector spaces) are pretty much just "glorified inner products". That is, every element $f \in V^{\ast}$ can be thought of as the function:

$\langle u,\_\rangle$ for some vector $u$.

For example, we have $e^j = \langle e_j,\_ \rangle$ for the standard basis for $V$.

***Note*** Vector spaces, of course, do not always come equipped with a "natural" inner product. However, any non-degenerate bilinear form $B$ can be used to induce an isomorphism (only in the finite-dimensional case, n.b.) between $V$ and $V^{\ast}$, and the "go-to" non-degenerate bilinear form in an inner product space is, of course, the inner product (BUT...in complex vector spaces, it is often more convenient to use a sesquilinear form, due to the peculiarities of the complex conjugate).

In particular, it is natural to identify the 1-form (dual basis element) $e^j$ with the $j$-th projection function.
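As a small illustration of this identification (a sketch only, assuming the standard inner product on $\mathbb{R}^4$; the function name is hypothetical):

```python
import numpy as np

def functional_from_vector(u):
    """Identify u in V with the functional <u, _> in V*."""
    return lambda v: float(np.dot(u, v))

# e^3 is <e_3, _>, which just reads off the 3rd coordinate:
e3_dual = functional_from_vector(np.array([0.0, 0.0, 1.0, 0.0]))

v = np.array([2.0, 3.0, 2.0, 3.0])
print(e3_dual(v))  # 2.0, the 3rd coordinate of v
```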

*******

Computing the values of these mappings is often tedious. $\alpha(e^i \otimes e_j) (v)$ basically takes the $i$-th coordinate of $v$ (in the given "$e$" basis) and sticks it in the $j$-th slot, with every other coordinate 0. To extend this to the entire tensor $t$, we use (multi-, in this case bi-) linearity.

So if:

$t = (e^1 + e^2) \otimes (e_3 + e_4) = e^1 \otimes e_3 + e^1 \otimes e_4 + e^2 \otimes e_3 + e^2 \otimes e_4$

Then:

$\alpha(t)(v) = \langle e^1,v\rangle e_3 + \langle e^1,v\rangle e_4 + \langle e^2,v\rangle e_3 + \langle e^2,v\rangle e_4 = 2e_3 + 2e_4 + 3e_3 + 3e_4 = 5e_3 + 5e_4$

or, perhaps more understandably, relative to our given basis $S$ of $V$:

$[T]_S([v]_S) = [T]_S(2,3,2,3)^T = (0,0,5,5)^T$

(if I have done my arithmetic correctly); that is, relative to the given basis, our linear mapping $T$ has the matrix:

$\begin{bmatrix}0&0&0&0\\0&0&0&0\\1&1&0&0\\1&1&0&0 \end{bmatrix}$
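For a quick numerical check of the arithmetic above, here is a short Python/NumPy sketch, again assuming the standard basis of $\mathbb{R}^4$; the helper name `E` is my own:

```python
import numpy as np

def E(i, j, n=4):
    """Elementary matrix with a 1 in row i, column j (1-indexed)."""
    M = np.zeros((n, n))
    M[i - 1, j - 1] = 1.0
    return M

# e^i (x) e_j sends e_i to e_j, so it corresponds to the matrix E(j, i).
T = E(3, 1) + E(4, 1) + E(3, 2) + E(4, 2)
print(T)
# [[0. 0. 0. 0.]
#  [0. 0. 0. 0.]
#  [1. 1. 0. 0.]
#  [1. 1. 0. 0.]]

v = np.array([2.0, 3.0, 2.0, 3.0])
print(T @ v)  # [0. 0. 5. 5.]  ->  5 e_3 + 5 e_4
```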
 
Deveno said:
$\alpha(t)(v) = 2e_3 + 2e_4 + 3e_3 + 3e_4 = 5e_3 + 5e_4$

Thanks so much. I will read all the details slowly to grab hold of them. The problem with this tensor mathematics is that I find a lot of different approaches to a given problem, which sometimes confuses me. :)
 
