# Index notation is driving me CRAZY!

1. Mar 21, 2009

### AxiomOfChoice

I've already posted a couple of index notation questions on here and I've gotten very helpful responses. So I thought I'd try my luck again, though I'm a little more stumped on this question than I was on the others...

Let $\vec{x}$ be the position vector and $\vec{r}$ the radial unit vector. In index notation, I find myself confronted with simplifying

$$x_k \partial_i r_k.$$

I desperately want this to be zero, but I can't figure out why it should be. (I can't really figure out what this represents at all, as a matter of fact.) Isn't $\partial_i r_k$ a matrix, i.e. a rank-2 object? And if so, how would contracting with $x_k$ make everything go away? Then again, maybe it doesn't.

Can anyone help?

2. Mar 21, 2009

### matt grime

Edit: of course it helps if I (read and hence) write the correct thing; here's what I should have written.

You can just write the sum out. (Doing this is what helped me go from "summation convention, that's black magic" to "summation convention is really useful and easy".)

$$\sum_k x_k\partial_i r_k$$

Now, what exactly is $\partial_i$?
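Assuming $\partial_i = \partial/\partial x_i$ and $r_k = x_k/|\vec{x}|$ (the conventions that turn out to be in use later in the thread), writing the sum out can even be done symbolically; a minimal sketch of my own, not part of the original post:

```python
import sympy as sp

# Coordinates x_1, x_2, x_3 and r = |x|; assumed convention: r_k = x_k / r
x = sp.symbols('x1 x2 x3', real=True)
r = sp.sqrt(sum(xi**2 for xi in x))

i = 0  # pick the i = 1 component as an example (Python index 0)

# Expand the summation convention explicitly: sum over k of x_k * d/dx_i (x_k / r)
expr = sum(x[k] * sp.diff(x[k] / r, x[i]) for k in range(3))
print(sp.simplify(expr))  # simplifies to 0
```

Writing the sum out like this is exactly the "black magic to easy" step: once each term is explicit, there is nothing hidden left to trust.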

Last edited: Mar 21, 2009
3. Mar 21, 2009

### matt grime

No, it is what happens when you apply $\partial_i$ to $r_k$.

I'm assuming that $\partial_i$ means the partial derivative with respect to something ($x_i$ perhaps, or $r_i$).

Passing back to matrices, remember that if $M$ is a matrix, we write $M_{ij}$ for the $(i,j)$'th entry (say, a real number). $M_{ij}$ is _not_ a matrix; it is an element of a matrix.
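To make the distinction concrete (a small numpy illustration of my own, not from the original post):

```python
import numpy as np

# M is a matrix; M[i, j] (written M_{ij} in index notation) is one entry of it
M = np.array([[1.0, 2.0],
              [3.0, 4.0]])

entry = M[0, 1]   # the entry in row 1, column 2: a single real number
print(entry)      # prints 2.0
```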

4. Mar 21, 2009

### AxiomOfChoice

I think I've shown that $x_k \partial_i r_k$ is identically zero: Rewrite $x_k \partial_i r_k$ as

$$x_k \partial_i (\frac{x_k}{r}),$$

where as usual $r = |\vec{x}|$. We can apply the product rule to get

$$x_k \partial_i (\frac{x_k}{r}) = x_k x_k \partial_i(\frac{1}{r}) + \frac{x_k}{r} \partial_i x_k.$$

Things become easier now. The second term on the RHS is just

$$\frac{x_k}{r}\delta_{ik} = \frac{x_k}{r}$$.

Now, $\partial_i (1/r)$ is the same as $\nabla (1/r)$, and a quick calculation gives that $\nabla (1/r) = -x_k/r^3$ (in index notation). Since $x_k x_k = r^2$, the first term on the RHS is, using index notation,

$$-r^2\frac{x_k}{r^3} = -\frac{x_k}{r}.$$

Then

$$x_k \partial_i r_k = -\frac{x_k}{r} + \frac{x_k}{r} = 0.$$

I might have made a mistake, but I think this is right. Let me know if you disagree with my argument.
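As an independent numerical check (my addition, not part of the thread), one can contract $x_k$ with a finite-difference approximation of $\partial_i r_k$, assuming $r_k = x_k/|\vec{x}|$:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=3)   # a random position vector
h = 1e-6                 # finite-difference step

def r_hat(x):
    """Components r_k = x_k / |x| of the radial unit vector."""
    return x / np.linalg.norm(x)

# Form x_k * d(r_k)/d(x_i), summed over k, for each i
for i in range(3):
    e = np.zeros(3)
    e[i] = h
    d_rhat = (r_hat(x + e) - r_hat(x - e)) / (2 * h)  # central difference in x_i
    print(i, x @ d_rhat)  # each contraction comes out ~0 (up to rounding)
```

This only confirms the final answer; as the next posts point out, some of the intermediate index bookkeeping above still needs fixing.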

5. Mar 21, 2009

### matt grime

There is one clear mistake in your reasoning, and your last line of TeX tells you where to look: the LHS of that last line is a sum over k, but the central term is not a sum, yet still has k, the summation index, in it.

Here is one thing that is wrong: you write

$$\frac{x_k}{r}\delta_{ik} = \frac{x_k}{r}$$

The thing on the left is summed over k. The thing on the right has a k in it; it should have an i in it, not a k. You do something similar when you say that the i'th component of $\nabla(1/r)$ is $-x_k/r^3$. Let me quote the relevant part in the next reply.

Last edited: Mar 21, 2009
6. Mar 21, 2009

### matt grime

No, it's not: the former is a component, the latter is a vector. See the above comment about matrices and elements of matrices. It is important to know what the things you're manipulating are, and why you can do what you do to them.

Here you use k when you *must* use i; if it weren't for the next substitution, you would have ended up with more than two k's in a single term.

7. Mar 22, 2009

### AxiomOfChoice

$$\partial_i = \frac{\partial}{\partial x_i}.$$

Hopefully that clears things up.

8. Mar 22, 2009

### AxiomOfChoice

Matt:

I see your point. I think what I should have in my final expression is i's, not k's. If I follow this through, I still get 0 as my final answer.

I will try to post again in a second with my corrections.

9. Mar 22, 2009

### AxiomOfChoice

This should be corrected as follows:

$$\frac{x_k}{r}\delta_{ik} = \frac{x_i}{r}$$.

This should be corrected as follows:

A quick calculation gives

$$\partial_i (1/r) = -\frac{x_i}{r^3}$$

Hence

$$x_k x_k\partial_i(1/r) = r^2(-x_i/r^3) = -\frac{x_i}{r}.$$

This is because $x_k x_k$ is the same as $\vec{r} \cdot \vec{r} = r^2$. It follows that

$$x_k\partial_i r_k = \frac{x_i}{r} - \frac{x_i}{r} = 0.$$

(Oh God, PLEASE let that be right.)
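For what it's worth, a symbolic check (my addition, not part of the thread, assuming $\partial_i = \partial/\partial x_i$ and $r = |\vec{x}|$) confirms both corrected terms:

```python
import sympy as sp

x = sp.symbols('x1 x2 x3', real=True)
r = sp.sqrt(sum(xi**2 for xi in x))
i = 0  # check the i = 1 component; the others work the same way by symmetry

# First term: x_k x_k * d/dx_i (1/r), summed over k
term1 = sum(xk**2 for xk in x) * sp.diff(1 / r, x[i])
# Second term: (x_k / r) * d/dx_i (x_k), summed over k
term2 = sum((x[k] / r) * sp.diff(x[k], x[i]) for k in range(3))

print(sp.simplify(term1 + x[i] / r))  # term1 == -x_i / r, so this is 0
print(sp.simplify(term2 - x[i] / r))  # term2 == +x_i / r, so this is 0
```

Note that both results now carry a free index i, matching the free i on the left-hand side, with k fully summed away.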

10. Mar 23, 2009

### matt grime

Yes, that's it.

11. Apr 17, 2009

### Will_taylor22

I'm bumping this because it's related to the problem I'm having. Currently, though, I'm just trying to understand what's been written, and it would be great if you could help me out. Here's a quote from what is above:

We can apply the product rule to get

$$x_k \partial_i (\frac{x_k}{r}) = x_k x_k \partial_i(\frac{1}{r}) + \frac{x_k}{r} \partial_i x_k.$$

Things become easier now. The second term on the RHS is just

$$\frac{x_k}{r}\delta_{ik} = \frac{x_k}{r}$$.

How does that work?

$$\frac{x_k}{r} \partial_i x_k = \frac{x_k}{r}\delta_{ik} = \frac{x_k}{r}.$$

I don't understand what you've done there or why those are equal.

12. Apr 17, 2009

### matt grime

What don't you understand? (Note that you've copied a part that was wrong.)

This:
$$\frac{x_k}{r}\delta_{ik} = \frac{x_k}{r}$$

is not correct. Note again that on the left the k is a summed index. So it cannot appear on the right on its own. We are also trying to work out the i'th component of something so i should appear on the right. It should read

$$\frac{x_k}{r}\delta_{ik} = \frac{x_i}{r}$$

Is that the problem solved?
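To see the contraction concretely (an illustration of mine, using placeholder components $v_k$ standing in for $x_k/r$):

```python
import numpy as np

delta = np.eye(3)                  # Kronecker delta delta_{ik} as the identity matrix
v = np.array([2.0, 5.0, 7.0])      # placeholder components, standing in for x_k / r

i = 1
# sum over k of v_k * delta_{ik}: the summed index k disappears, leaving v_i
contracted = sum(v[k] * delta[i, k] for k in range(3))
print(contracted)                  # prints 5.0, i.e. v[i]
```

Only the k = i term of the sum survives, which is exactly why the free index on the right must be i, not k.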

13. Apr 17, 2009

### Will_taylor22

Yes, thank you.