Petar Mali

Definition of the dyad:

[tex]\{\vec{A},\vec{B}\}\cdot \vec{C}=\vec{A}(\vec{B}\cdot\vec{C})[/tex]

[tex]\vec{C}\cdot \{\vec{A},\vec{B}\}=(\vec{C}\cdot\vec{A})\vec{B}[/tex]
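As a numerical sanity check (my own sketch, not part of the original question), the dyad [tex]\{\vec{A},\vec{B}\}[/tex] can be represented by the outer-product matrix [tex]\vec{A}\vec{B}^T[/tex], and both defining identities then follow from matrix-vector multiplication:

```python
import numpy as np

# Arbitrary test vectors.
A = np.array([1.0, 2.0, 3.0])
B = np.array([4.0, 5.0, 6.0])
C = np.array([7.0, 8.0, 9.0])

# The dyad {A, B} as the outer product A B^T.
dyad = np.outer(A, B)

# {A, B} . C = A (B . C): acting on C from the right.
left = dyad @ C
right = A * (B @ C)
assert np.allclose(left, right)

# C . {A, B} = (C . A) B: acting on C from the left.
left2 = C @ dyad
right2 = (C @ A) * B
assert np.allclose(left2, right2)
```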

I have a question. I found in some books that the definition of a tensor is

[tex]\hat{T}=\{\vec{T}_k,\vec{e}_k\}[/tex]

where [tex]\hat{T}[/tex] is the tensor.

Is the summation convention implied here? That is, does it mean

[tex]\hat{T}=\sum_k\{\vec{T}_k,\vec{e}_k\}[/tex]

?

In the summation convention we have, for example,

[tex]\sum_i A_i\vec{e}_i=A_i\vec{e}^i[/tex]

I don't understand why people here use the summation convention when both indices are down.

For example, if [tex]\hat{1}[/tex] is the unit tensor, then, if I understand this correctly, we have

[tex]\hat{1}=\sum_k\{\vec{e}_k,\vec{e}_k\}[/tex]

Or is it instead written, with implied summation, as

[tex]\hat{1}=\{\vec{e}_k,\vec{e}_k\}[/tex]
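To check that the sum of dyads of an orthonormal basis really gives the unit tensor (again my own sketch, representing each dyad [tex]\{\vec{e}_k,\vec{e}_k\}[/tex] as an outer product):

```python
import numpy as np

# Orthonormal basis e_1, e_2, e_3 (rows of the identity matrix).
basis = np.eye(3)

# Sum of dyads {e_k, e_k} over k.
unit = sum(np.outer(e, e) for e in basis)

# The result is the unit tensor (identity matrix).
assert np.allclose(unit, np.eye(3))
```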

Thanks for your answer!
