# Intro to elementary index notation

This is a brief tutorial to cover the basics of index notation which are useful when handling complicated expressions involving cross and dot products.
I will skip over a lot of technicalities (such as covariant and contravariant vectors) and focus on 3 dimensions - but all of what I say here can easily be generalised and extended, and I encourage anyone with the background to do so.

Conventions and notation:
I use bold symbols to indicate vectors (invariably 3 dimensional) and use $$\bold{\hat{i}},\bold{\hat{j}},\bold{\hat{k}}$$ as the unit vectors in the x, y and z directions.

$$\nabla = \frac{\partial}{\partial x}\bold{\hat{i}} + \frac{\partial}{\partial y}\bold{\hat{j}} + \frac{\partial}{\partial z}\bold{\hat{k}}$$

The problem:
Suppose we have some complicated expression for example
$$\nabla \cdot (\bold A \times (\nabla V))$$
where $$\bold A$$ is some 3 dimensional vector and $$V=V(\bold r)$$ is some scalar function, and we want to write it in a simpler form.

There are formulas for this sort of thing, such as the BAC CAB rule:
$$\bold A \times (\bold B \times \bold C) = \bold{B}(\bold{A} \cdot \bold{C}) - \bold{C}(\bold{A} \cdot \bold{B})$$
but these are derived using commuting vectors, so we cannot apply them blindly when $$\nabla$$ is involved: $$\nabla$$ is a differential operator, and

$$(\nabla V)$$ does not equal $$(V \nabla)$$

However, index notation provides a quick and easy way to derive these types of expressions.

Indices and the summation convention

Indices allow us to rewrite an expression component by component. For example
$$\bold{A}=(A_1,A_2,A_3)$$
$$\bold{B}=(B_1,B_2,B_3)$$

So
$$\bold{A}\cdot\bold{B}=A_1 B_1 + A_2 B_2 + A_3 B_3=\sum_{i=1}^3 A_i B_i$$
(Clearly this can be generalised to any number of components)

Now for compactness we introduce the (Einstein) summation convention: If an index is repeated we sum over it. So
$$A_i B_i=\sum_{i=1}^3 A_i B_i$$
This cuts down on a lot of writing. Note that unpaired indices must be conserved - they must match on both sides of an equation. For example
$$A_i=B_j C_j D_i$$
is a fine expression - it says the ith component of A is the ith component of D (pre)multiplied by the dot product of B and C, that is
$$\bold{A}=(\bold{B} \cdot \bold{C}) \bold{D}$$
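As a quick sanity check, NumPy's `einsum` implements exactly this summation convention - repeated subscript letters are summed over. A minimal sketch (the particular vectors are arbitrary examples):

```python
import numpy as np

A = np.array([1.0, 2.0, 3.0])
B = np.array([4.0, 5.0, 6.0])
C = np.array([1.0, 0.0, 2.0])

# A_i B_i: the repeated index i is summed over, giving the dot product
dot = np.einsum('i,i->', A, B)
assert np.isclose(dot, A @ B)

# B_j C_j A_i: j is summed, the free index i survives on both sides
lhs = np.einsum('j,j,i->i', B, C, A)
assert np.allclose(lhs, (B @ C) * A)
```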

However
$$A_i=B_j C_j$$
only makes sense if it means that all components of A are the same. Even then this is bad notation and it is much better to use
$$A_i = B_j C_j I_i$$ where $$\bold{I}=(1,1,1)$$

If we stick to this kind of convention we always get the same unpaired indices on either side of an expression, in the above case i.

An expression like
$$A_i B_i C_i$$
makes no sense with this convention. If you are evaluating an expression such as $$(\bold{A} \cdot \bold{B})(\bold{C} \cdot \bold{D})$$ you must use different pairs of dummy indices: $$A_i B_i C_j D_j$$.

Finally, note that paired indices are dummy indices. We can change them (as long as we change both of them) to whatever we want (provided what we change them to is not already in use) without altering the result, because they are summed over. Unpaired indices are not dummy indices.
So we can write:
$$A_i B_j C_j = A_i B_k C_k = A_i B_{(cats)} C_{(cats)}$$
(where I take (cats) to represent a single variable) but NOT
$$A_i B_j C_j = A_i B_i C_i$$ or $$A_i B_j C_j = A_k B_j C_j$$

(Note that in the first of the two wrong expressions the right hand side has the index i appearing three times, so it must be wrong, and in the second the unpaired index is not conserved - i appears on the left hand side but not the right, so it too must be wrong.)

So that's a lot of boring detail without much gain, but stick with it - we'll get there.

Multiple indices and Symmetry
It's often useful to have expressions with multiple indices (in general, these represent tensors). If we stick to indices only taking values 1, 2, 3 then a two-index object
$$A_{ij}$$ represents the elements of a 3x3 matrix ($$A_{ij}$$ is the entry in the ith row and jth column).

If we have two matrices A and B, then their product is (by definition)
$$(AB)_{ij}=\sum_{k=1}^3A_{ik} B_{kj} = A_{ik} B_{kj}$$
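In code, this index expression reproduces ordinary matrix multiplication. A minimal sketch with NumPy's `einsum` (the random matrices are just placeholders):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# (AB)_ij = A_ik B_kj: k is repeated, so it is summed over
AB = np.einsum('ik,kj->ij', A, B)
assert np.allclose(AB, A @ B)
```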

Objects with more than 2 indices are not as easy to interpret, so I won't try to - I'll just use them.

An object with 2 or more indices is symmetric if it is unchanged under interchange of two indices, e.g.
$$S_{ij}=S_{ji}$$ is symmetric, as is
$$S_{ijk}=S_{kij}=S_{jki}=S_{ikj}=S_{jik}=S_{kji}$$

Note that, if we view $$S_{ij}$$ as the i-jth matrix element then $$S_{ji}=S^{T}_{ij}$$ is the i-jth element of the transpose. So a 2 index object is symmetric iff it corresponds to a symmetric matrix.

An antisymmetric object is one that changes sign every time two indices are interchanged, e.g.
$$A_{ij}=-A_{ji}$$ and
$$A_{ijk}=A_{kij}=A_{jki}=-A_{ikj}=-A_{jik}=-A_{kji}$$
(note that the 2nd and 3rd terms in the latter expression are obtained by interchanging TWO pairs of indices, so the two negative signs cancel).

Finally, if a symmetric object is contracted (i.e. summed over 2 or more indices) with an antisymmetric object, the result is zero. By this I mean if S is symmetric and A is antisymmetric then
$$A_{ij}S_{ij}=-A_{ji}S_{ij}=-A_{ji}S_{ji}=-A_{ij}S_{ij}$$
where in the last step I have renamed the dummy indices - switching i and j. So $$A_{ij}S_{ij}=0$$

(This also implies $$A_{ijk}S_{ij}=0$$ and similar results - all we need is two summed indices for the argument to work.)
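This symmetric-times-antisymmetric cancellation is easy to verify numerically; any matrix M yields a symmetric part M + Mᵀ and an antisymmetric part M − Mᵀ. A small sketch, assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((3, 3))
S = M + M.T          # symmetric: S_ij = S_ji
A = M - M.T          # antisymmetric: A_ij = -A_ji

# A_ij S_ij: contract over both indices - always zero
contraction = np.einsum('ij,ij->', A, S)
assert np.isclose(contraction, 0.0)
```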

Kronecker Delta and Levi-Civita Symbols

It is handy to use the symbols

Kronecker delta: $$\delta_{ij}$$ which is 1 if i=j and 0 otherwise.

The Kronecker delta is symmetric
$$\delta_{ij}=\delta_{ji}$$
and corresponds to the matrix elements of the 3x3 identity matrix (diag{1,1,1}).

So in $$\delta_{ij}A_i$$, the only term in the sum over i that survives is the one with i=j. So
$$\delta_{ij}A_i=A_j$$

(A common mistake is to say $$\delta_{ii}=1$$ but this is wrong. Why?
$$\delta_{ii}=\sum_{i=1}^3 1=3$$)
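Both of these delta facts can be checked by representing $$\delta_{ij}$$ as the identity matrix. A minimal NumPy sketch:

```python
import numpy as np

delta = np.eye(3)                  # delta_ij as the 3x3 identity matrix
A = np.array([2.0, 5.0, 7.0])

# delta_ij A_i = A_j: the sum over i picks out the i = j term
assert np.allclose(np.einsum('ij,i->j', delta, A), A)

# delta_ii sums over the repeated index, giving 3, not 1
trace = np.einsum('ii->', delta)
assert trace == 3.0
```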

Levi-Civita symbol: $$\varepsilon_{ijk}$$, which is +1 if (ijk) is (123), (231) or (312), is -1 if (ijk) is (132), (213) or (321), and is 0 otherwise (that is, whenever any index is repeated).

The Levi-Civita symbol is antisymmetric:
$$\varepsilon_{ijk}=\varepsilon_{kij}=\varepsilon_{jki}=-\varepsilon_{ikj}=-\varepsilon_{jik}=-\varepsilon_{kji}$$

The Levi-Civita symbol is related to the Kronecker Delta:
$$\varepsilon_{ijk}\varepsilon_{lmn} = \begin{vmatrix} \delta_{il} & \delta_{im}& \delta_{in}\\ \delta_{jl} & \delta_{jm}& \delta_{jn}\\ \delta_{kl} & \delta_{km}& \delta_{kn}\\ \end{vmatrix}$$
although I have not found this expression to be too useful in practice. Setting l=i (so that i is summed over) gives a very useful expression:
$$\varepsilon_{ijk}\varepsilon_{ilm}=\delta_{jl}\delta_{km}-\delta_{jm}\delta_{kl}$$
(Note the positive delta terms pair indices that occupy the same position in the two Levi-Civita symbols, and the negative terms pair indices in opposite positions.)

From this you can derive expressions for more summed indicies, such as:
$$\varepsilon_{ijk}\varepsilon_{ijl}=\delta_{jj}\delta_{kl}-\delta_{jl}\delta_{kj}=3\delta_{kl}-\delta_{kl}=2\delta_{kl}$$
And
$$\varepsilon_{ijk}\varepsilon_{ijk}=2\delta_{kk}=6$$
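These epsilon-delta identities can all be verified by brute force, representing the Levi-Civita symbol as a 3x3x3 array. A sketch, assuming NumPy:

```python
import numpy as np

# Levi-Civita symbol as a 3x3x3 array (0-based indices)
eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k] = 1.0   # even permutations of (1,2,3)
    eps[i, k, j] = -1.0  # odd permutations

delta = np.eye(3)

# epsilon_ijk epsilon_ilm = delta_jl delta_km - delta_jm delta_kl
lhs = np.einsum('ijk,ilm->jklm', eps, eps)
rhs = (np.einsum('jl,km->jklm', delta, delta)
       - np.einsum('jm,kl->jklm', delta, delta))
assert np.allclose(lhs, rhs)

# epsilon_ijk epsilon_ijl = 2 delta_kl, and epsilon_ijk epsilon_ijk = 6
assert np.allclose(np.einsum('ijk,ijl->kl', eps, eps), 2 * delta)
assert np.einsum('ijk,ijk->', eps, eps) == 6.0
```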

The Levi-Civita symbol is useful because of its relation to the cross product:
$$\det A =\varepsilon_{ijk} A_{1i} A_{2j} A_{3k}$$
and more importantly:
$$(\bold{A} \times \bold{B})_i=\varepsilon_{ijk} A_j B_k$$
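As a check of the cross-product formula, we can contract a 3x3x3 Levi-Civita array against two vectors and compare with NumPy's built-in cross product (the vectors are arbitrary examples):

```python
import numpy as np

# Levi-Civita symbol as a 3x3x3 array (0-based indices)
eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k], eps[i, k, j] = 1.0, -1.0

A = np.array([1.0, 2.0, 3.0])
B = np.array([4.0, 5.0, 6.0])

# (A x B)_i = epsilon_ijk A_j B_k
cross = np.einsum('ijk,j,k->i', eps, A, B)
assert np.allclose(cross, np.cross(A, B))
```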

That pretty much covers everything we're going to need.

Evaluating Expressions
Let's start with a very easy one:
$$\bold{A} \times \bold{A} = 0$$
This is well known, but provides an easy check:
$$(\bold{A} \times \bold{A})_i=\varepsilon_{ijk}A_jA_k$$
Now $$\varepsilon_{ijk}$$ is antisymmetric under interchange of j and k, but since $$A_j A_k = A_k A_j$$ the product $$A_j A_k$$ is symmetric under interchange of j and k. So the whole expression is zero.

What about the BAC CAB rule?
$$(\bold A \times (\bold B \times \bold C))_i =\varepsilon_{ijk}A_j(\bold B \times \bold C)_k=\varepsilon_{ijk}A_j\varepsilon_{klm}B_l C_m$$

$$=\varepsilon_{kij}\varepsilon_{klm}A_j B_l C_m = (\delta_{il}\delta_{jm}-\delta_{im}\delta_{jl})(A_j B_l C_m)$$

$$=A_m B_i C_m - A_l B_l C_i = B_i A_m C_m - C_i A_l B_l= B_i (\bold A \cdot \bold C) - C_i(\bold A \cdot \bold B)$$
That is dropping indicies:
$$\bold A \times (\bold B \times \bold C) = \bold{B}(\bold{A} \cdot \bold{C}) - \bold{C}(\bold{A} \cdot \bold{B})$$

This may look a tad messy, but it is much quicker than the normal way of doing this - expanding it out component by component.
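A quick numerical spot-check of the BAC CAB rule, assuming NumPy (the random vectors are placeholders):

```python
import numpy as np

rng = np.random.default_rng(2)
A, B, C = rng.standard_normal((3, 3))  # three random 3-vectors

# A x (B x C) should equal B (A.C) - C (A.B)
lhs = np.cross(A, np.cross(B, C))
rhs = B * (A @ C) - C * (A @ B)
assert np.allclose(lhs, rhs)
```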

I will write $$(\nabla)_i=\frac{\partial}{\partial r_i}=\partial_i$$

So let's try a slightly harder one

$$(\nabla \times (\nabla \times \bold{A}))_i=\varepsilon_{ijk}\partial_j \varepsilon_{klm} \partial_l A_m = \varepsilon_{kij}\varepsilon_{klm}\partial_l\partial_j A_m$$
$$=\partial_i\partial_m A_m - \partial_l \partial_l A_i=(\nabla)_i(\nabla \cdot \bold{A}) - \nabla^2 A_i$$
or: $$\nabla \times (\nabla \times \bold{A})=\nabla(\nabla \cdot \bold{A}) - \nabla^2 \bold{A}$$
(I have suppressed most of the detail here - once you get the hang of it you should be able to see these steps straight off, but for now, work through them in detail.)
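The curl-of-curl identity can also be confirmed symbolically. A sketch using SymPy, with an arbitrarily chosen (hypothetical) vector field - any sufficiently smooth field would do:

```python
import sympy as sp

x, y, z = sp.symbols('x y z')

# Arbitrarily chosen vector field for the check
A = [x * y**2, sp.cos(z), x + y * z]

def curl(F):
    return [sp.diff(F[2], y) - sp.diff(F[1], z),
            sp.diff(F[0], z) - sp.diff(F[2], x),
            sp.diff(F[1], x) - sp.diff(F[0], y)]

# grad(div A) - laplacian(A), component by component
div_A = sp.diff(A[0], x) + sp.diff(A[1], y) + sp.diff(A[2], z)
grad_div = [sp.diff(div_A, v) for v in (x, y, z)]
lap = [sp.diff(F, x, 2) + sp.diff(F, y, 2) + sp.diff(F, z, 2) for F in A]

# curl(curl A) should equal grad(div A) - laplacian(A)
lhs = curl(curl(A))
rhs = [g - l for g, l in zip(grad_div, lap)]
assert all(sp.simplify(a - b) == 0 for a, b in zip(lhs, rhs))
```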

I will do one more example, an identity I doubt you'd find in most books and would have to derive for yourself anyway:
$$\nabla \cdot (\bold A \times (\nabla V))=\partial_i (\varepsilon_{ijk} A_j \partial_k V) = \varepsilon_{ijk} (\partial_i (A_j) \partial_k V + A_j \partial_i \partial_k V)$$
Where the last step follows from the product rule for derivatives. Note that $$\partial_i \partial_k V = \partial_k \partial_i V$$ (assuming V is a sufficiently nice function - that is, its mixed second partial derivatives are continuous, so they commute; this assumption is fine most of the time). Consequently
$$\varepsilon_{ijk} \partial_i \partial_k V = 0$$ (Why?)
So
$$\nabla \cdot (\bold A \times (\nabla V)) = (\varepsilon_{kij}\partial_i (A_j)) \partial_k V = (\nabla \times \bold{A}) \cdot \nabla V$$
(Again: work through it)
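This final identity can likewise be confirmed symbolically. A sketch, assuming SymPy; the field $$\bold A$$ and scalar V are arbitrary (hypothetical) choices:

```python
import sympy as sp

x, y, z = sp.symbols('x y z')

# Arbitrarily chosen field and scalar for the check
A = [y * z, x**2, sp.sin(x) * z]
V = x * y + z**3

grad_V = [sp.diff(V, v) for v in (x, y, z)]

def curl(F):
    return [sp.diff(F[2], y) - sp.diff(F[1], z),
            sp.diff(F[0], z) - sp.diff(F[2], x),
            sp.diff(F[1], x) - sp.diff(F[0], y)]

def cross(F, G):
    return [F[1] * G[2] - F[2] * G[1],
            F[2] * G[0] - F[0] * G[2],
            F[0] * G[1] - F[1] * G[0]]

def div(F):
    return sp.diff(F[0], x) + sp.diff(F[1], y) + sp.diff(F[2], z)

# div(A x grad V) should equal (curl A) . grad V
lhs = div(cross(A, grad_V))
rhs = sum(c * g for c, g in zip(curl(A), grad_V))
assert sp.simplify(lhs - rhs) == 0
```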

Finally, I would like to point out that this notation is extremely powerful for non-commuting linear operators (see quantum mechanics, where it is particularly useful for deriving commutators), and it is a prelude to the notation used in relativity.

Do a few examples, you'll find once you get the hang of it you can derive identities very quickly.
