# Linear decomposition of a tensor

1. Nov 17, 2013

### Jhenrique

Given a vector $$\vec{r}=\begin{bmatrix} x\\ y \end{bmatrix}$$
It can be decomposed linearly as:

$$\vec{r}=x\hat{i}+y\hat{j}$$
So, what would the linear decomposition of a tensor look like?

Thx!

2. Nov 17, 2013

### PSarkar

Let $T \in \mathcal{T}^k(V)$ be a $k$-tensor on the $n$-dimensional vector space $V$ over $\mathbb{R}$. In other words, $T$ is a multilinear map $T: V^k \to \mathbb{R}$. Let $\{\varphi_1, \dotsc, \varphi_n\}$ be the dual basis of some basis $\{v_1, \dotsc, v_n\}$ of $V$. Then $\{\varphi_{i_1} \otimes \dotsb \otimes \varphi_{i_k}: 1 \leq i_1, \dotsc, i_k \leq n\}$ is a basis of the vector space $\mathcal{T}^k(V)$ of all $k$-tensors. Moreover, we have
$$T = \sum_{i_1, \dotsc, i_k = 1}^n T(v_{i_1}, \dotsc, v_{i_k}) \varphi_{i_1} \otimes \dotsb \otimes \varphi_{i_k}$$
since $\{\varphi_1, \dotsc, \varphi_n\}$ is the dual basis. If you want, you can choose the standard basis for $V$ and the corresponding standard dual basis.
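To make the formula concrete, here is a small numerical sketch (Python with NumPy; the matrix $A$ and the test vectors are hypothetical examples, and the standard basis of $\mathbb{R}^2$ is used, so the dual basis just extracts coordinates). It computes the components $T(v_{i_1}, \dotsc, v_{i_k})$ and rebuilds $T$ from the sum over basis tensors:

```python
import itertools
import numpy as np

# A 2-tensor on R^2 given by T(u, v) = u^T A v (A is just an example matrix)
A = np.array([[1.0, 2.0], [3.0, 4.0]])
T = lambda u, v: u @ A @ v

n, k = 2, 2
basis = np.eye(n)  # rows are the standard basis vectors v_1, ..., v_n

# Components T(v_{i1}, ..., v_{ik}) for every multi-index (i1, ..., ik)
components = {idx: T(*(basis[i] for i in idx))
              for idx in itertools.product(range(n), repeat=k)}

# Reconstruct T from the sum, using
# (phi_{i1} (x) ... (x) phi_{ik})(u_1, ..., u_k) = u_1[i1] * ... * u_k[ik]
def T_reconstructed(*vectors):
    return sum(c * np.prod([vectors[m][i] for m, i in enumerate(idx)])
               for idx, c in components.items())

u, v = np.array([1.0, -2.0]), np.array([0.5, 3.0])
assert np.isclose(T(u, v), T_reconstructed(u, v))
```

The same loop works for any $n$ and $k$; only the size of the index set $n^k$ grows.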

3. Nov 17, 2013

### Jhenrique

I really don't understand the recursive notation and the summations!

Can you give me a concrete example? I mean a 3x3 or 2x2 matrix, i.e. a tensor of rank 2.

4. Nov 17, 2013

### PSarkar

Let $T$ be a $2$-tensor, i.e. a bilinear form, on $\mathbb{R}^2$ defined by
$$T(x, y) = x^T \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} y,$$
regarding $x$ and $y$ as column vectors. Let $\{e_1, e_2\}$ be the standard basis and $\{\varphi_1, \varphi_2\}$ be the standard dual basis, so that $\varphi_i(e_j) = \delta_{ij}$. We can also express the dual basis in matrix form as
$$[\varphi_1] = \begin{pmatrix} 1 & 0 \end{pmatrix} \;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\; [\varphi_2] = \begin{pmatrix} 0 & 1 \end{pmatrix}$$
where $[\varphi_i]$ denotes the matrix of $\varphi_i$ with respect to the standard basis. Now consider the set $\{\varphi_1 \otimes \varphi_1, \varphi_1 \otimes \varphi_2, \varphi_2 \otimes \varphi_1, \varphi_2 \otimes \varphi_2\}$. In general, we have
$$(\varphi_i \otimes \varphi_j)(e_k, e_l) = \varphi_i(e_k) \varphi_j(e_l) = \delta_{ik}\delta_{jl}.$$
So we can write this in matrix form as
$$(\varphi_i \otimes \varphi_j)(x, y) = x^T M_{ij} y$$
where $M_{ij}$ denotes the matrix with zeros everywhere except for a $1$ in the $i^\text{th}$ row and $j^\text{th}$ column. For example,
$$(\varphi_1 \otimes \varphi_2)(x, y) = x^T M_{12} y = x^T \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} y.$$
So we have
$$T(x, y) = x^T \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} y = x^T \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix} y + x^T \begin{pmatrix} 0 & 2 \\ 0 & 0 \end{pmatrix} y + x^T \begin{pmatrix} 0 & 0 \\ 3 & 0 \end{pmatrix} y + x^T \begin{pmatrix} 0 & 0 \\ 0 & 4 \end{pmatrix} y \\ = x^T \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix} y + 2 x^T \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} y + 3 x^T \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix} y + 4 x^T \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix} y \\ = (\varphi_1 \otimes \varphi_1)(x, y) + 2(\varphi_1 \otimes \varphi_2)(x, y) + 3(\varphi_2 \otimes \varphi_1)(x, y) + 4(\varphi_2 \otimes \varphi_2)(x, y).$$
So $T = \varphi_1 \otimes \varphi_1 + 2\varphi_1 \otimes \varphi_2 + 3\varphi_2 \otimes \varphi_1 + 4\varphi_2 \otimes \varphi_2$ (this is the decomposition in terms of the chosen basis). As you can confirm, $T(e_1, e_1) = 1$, $T(e_1, e_2) = 2$, $T(e_2, e_1) = 3$, $T(e_2, e_2) = 4$. So we can rewrite it as $T = T(e_1, e_1)\varphi_1 \otimes \varphi_1 + T(e_1, e_2)\varphi_1 \otimes \varphi_2 + T(e_2, e_1)\varphi_2 \otimes \varphi_1 + T(e_2, e_2)\varphi_2 \otimes \varphi_2$. This is exactly the sum in the previous post. It is not recursive; it merely states that the two expressions are equal.

If you want to see the above decomposition purely in terms of matrices, it is just the following statement
$$\begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix} + 2 \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} + 3 \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix} + 4 \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}.$$
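This matrix equality can also be checked numerically; a minimal NumPy sketch (using the example matrix and the elementary matrices $M_{ij}$ from above):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
e = np.eye(2)  # rows are the standard basis vectors e_1, e_2

# Coefficients T(e_i, e_j) = e_i^T A e_j pick out exactly the entries of A
coeffs = [[e[i] @ A @ e[j] for j in range(2)] for i in range(2)]

def M(i, j):
    """Elementary matrix: 1 in row i, column j, zeros elsewhere."""
    m = np.zeros((2, 2))
    m[i, j] = 1.0
    return m

# The sum of T(e_i, e_j) * M_ij reproduces the original matrix
reconstruction = sum(coeffs[i][j] * M(i, j) for i in range(2) for j in range(2))
assert np.array_equal(reconstruction, A)
```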
But tensors are actually multilinear maps $V^k \to \mathbb{R}$. To deal with their matrices when $k = 2$, we have to choose some basis. In the above example I chose the standard basis, but we would get a different equality in terms of matrices for a different basis.

5. Nov 18, 2013

### Jhenrique

I understood. But this multiplies my doubts! Although I know the Cauchy stress tensor, so far I haven't found any material that treats the tensor as a geometric element. Would it be all right if I continued asking my questions here? The geometric interpretation of a tensor, the geometric interpretation of the tensor product of two vectors, the determinant, the modulus... these things are obscure to me...

Last edited: Nov 18, 2013
6. Nov 18, 2013

### PSarkar

I am not an expert on tensors either. I learned them from chapter 4 of the book Calculus on Manifolds by Michael Spivak. Since it is not a book on algebra, I'm sure there is much more to tensors and multilinear algebra that I did not learn.

Geometrically, tensors are very hard to visualize because they are multilinear maps, i.e. functions, and you would typically need many dimensions to graph them in any way (since you need to consider both the domain space and the range space). I think it's better to see what they are algebraically instead.

Anyway, you should note that the approach to tensors I learned and used above is a coordinate-free approach. If you are doing physics, then tensors are dealt with differently, since coordinates are chosen. It's similar to the difference between working with linear transformations and working with their matrices in linear algebra.

7. Nov 18, 2013

### Jhenrique

I'm not studying physics. I'm self-taught. I learned calculus on my own and, honestly, I don't like it when others tell me what I need to study, as if I had not been born with this innate desire to learn. Anyway...

Can you tell me whether the determinant of the tensor product of two vectors equals the area of the parallelogram formed by these two vectors? If so, how do I reconcile this with the fact that the determinant of a 3x3 matrix corresponds to the volume of the parallelepiped formed by three vectors!?

8. Nov 18, 2013

### PSarkar

I have no idea what I said to get this response. I looked at my previous post a few times and I am still confused by your reply. Anyway, sorry if I offended you (whatever it was that offended you).

As for the determinant stuff... I'm not familiar with the determinant of the tensor product of two vectors, so unfortunately I don't have the answer.

9. Nov 18, 2013

### Jhenrique

Hahaha!

You didn't offend me! You supposed that I might be studying physics. I said no, and philosophized a bit, critiquing the education system.

10. Nov 18, 2013

### PSarkar

Hahaha. Ok. Sorry for the misunderstanding!