Petar Mali
Definition:
\{\vec{A},\vec{B}\}\cdot \vec{C}=\vec{A}(\vec{B}\cdot\vec{C})
\vec{C}\cdot \{\vec{A},\vec{B}\}=(\vec{C}\cdot\vec{A})\vec{B}
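If I write this in components in a basis \vec{e}_i, I think the dyad is just the outer product (assuming I read the definition correctly):
\{\vec{A},\vec{B}\}_{ij}=A_i B_j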
I have a question. I found in some books that the definition of a tensor is
\hat{T}=\{\vec{T}_k,\vec{e}_k\}
where \hat{T} is the tensor.
Is the summation convention implied here?
So is it actually
\hat{T}=\sum_k\{\vec{T}_k,\vec{e}_k\}
?
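If the sum really is implied and the basis is orthonormal, \vec{e}_k\cdot\vec{e}_j=\delta_{kj}, then the definition at least gives back the vectors \vec{T}_j, which seems consistent to me:
\hat{T}\cdot\vec{e}_j=\sum_k\vec{T}_k(\vec{e}_k\cdot\vec{e}_j)=\vec{T}_j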
In the summation convention we have, for example,
\sum_i A_i\vec{e}_i=A_i\vec{e}^i
I don't understand why the summation convention is used here when both indices are down.
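My only guess is that these books work in an orthonormal Cartesian basis, where
g_{ij}=\vec{e}_i\cdot\vec{e}_j=\delta_{ij}\quad\Rightarrow\quad\vec{e}^{\,i}=g^{ij}\vec{e}_j=\vec{e}_i
so the position of the index does not matter, but I am not sure if that is the reason.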
For example, if \hat{1} is the unit tensor, then, if I understand this correctly, we have
\hat{1}=\sum_k\{\vec{e}_k,\vec{e}_k\}
Or is it perhaps written, with the sum implied, simply as
\hat{1}=\{\vec{e}_k,\vec{e}_k\}
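With the sum written explicitly and an orthonormal basis, I can at least check that it acts as the identity on an arbitrary vector \vec{C}:
\hat{1}\cdot\vec{C}=\sum_k\vec{e}_k(\vec{e}_k\cdot\vec{C})=\sum_k C_k\vec{e}_k=\vec{C}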
Thanks for your answer!