# Changed post

1. Aug 26, 2007

### ehrenfest

divergence question

show that the divergence transforms as a vector under 2D rotations.

I am so confused about what this question wants me to do. Obviously the divergence is not invariant under rotations. Consider the divergence of the function f(x,y) = x^2 * x-hat: it is clearly different at (x,y) = (1,0) than at (x,y) = (-1,0). What can this possibly mean, then?
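For concreteness (my own addition, not part of the thread): a quick sympy check of this example, confirming that the divergence of f = x^2 x-hat really does take different values at different points.

```python
import sympy as sp

x, y = sp.symbols('x y')
fx, fy = x**2, sp.Integer(0)             # components of f(x,y) = x^2 x-hat
div_f = sp.diff(fx, x) + sp.diff(fy, y)  # Cartesian divergence
print(div_f)                             # 2*x
print(div_f.subs(x, 1), div_f.subs(x, -1))  # 2 -2
```

The point the thread goes on to settle is that this is no contradiction: any non-constant scalar field varies from point to point, and "transforms as a scalar" is a statement about how the value at a *given* point relates between frames.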

Last edited: Aug 26, 2007
2. Aug 26, 2007

### Mindscrape

Yes, it is sort of strange, but think about it as if you just threw a bunch of pine needles onto one spot in a pond. When the needles start to move they are diverging: if they move outward, the divergence is positive; if they move inward, it is negative. Now, if you stand at one spot on the pond and then move to another, the divergence of the needles (how much they are spreading out) will be the same.

What you need to do is show that

$$\nabla \cdot \mathbf{v} = \nabla \cdot \mathbf{v'}$$

You know from the rotation matrix that

$$x' = x\cos\theta + y\sin\theta$$
$$y' = -x\sin\theta + y\cos\theta$$

You also know from the chain rule that

$$\frac{\partial v}{\partial x'} = \frac{\partial v}{\partial x} \frac{\partial x}{\partial x'}$$

and the similar result for y'.

"Solving" for x and y, and using the above will help you out.
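As a sanity check on those partials (a sympy sketch of my own, not part of the original exchange), invert the rotation and read off the derivatives the chain rule needs:

```python
import sympy as sp

theta, xp, yp = sp.symbols('theta xprime yprime')
# "Solving" for x and y: apply the inverse rotation to the primed coordinates
x = xp*sp.cos(theta) - yp*sp.sin(theta)
y = xp*sp.sin(theta) + yp*sp.cos(theta)
print(sp.diff(x, xp))  # cos(theta)   = dx/dx'
print(sp.diff(y, xp))  # sin(theta)   = dy/dx'
print(sp.diff(x, yp))  # -sin(theta)  = dx/dy'
print(sp.diff(y, yp))  # cos(theta)   = dy/dy'
```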

3. Aug 27, 2007

### ehrenfest

That statement just seems false to me. What about my example in the thread-opening post?

Do you mean
$$\frac{\partial v}{\partial x'} = \frac{\partial v}{\partial x} \frac{\partial x}{\partial x'} + \frac{\partial v}{\partial y} \frac{\partial y}{\partial x'}$$

4. Aug 27, 2007

### Mindscrape

I don't see the problem in your opening post. Yes, the divergence is different at those two points, but it still transforms as a scalar under rotations in two dimensions. The divergence is defined as how much the vector field spreads out from the point in question.

That's what I meant; I don't really know why I missed it (lots of typing), sorry. So, combining the rotation transform with the partial-derivative equations, you end up getting

$$\frac{\partial v_x'}{\partial x'} = \frac{\partial v_x}{\partial x'}\cos\theta + \frac{\partial v_y}{\partial x'}\sin\theta = \left(\frac{\partial v_x}{\partial x} \frac{\partial x}{\partial x'} + \frac{\partial v_x}{\partial y} \frac{\partial y}{\partial x'}\right)\cos\theta + \left(\frac{\partial v_y}{\partial x} \frac{\partial x}{\partial x'} + \frac{\partial v_y}{\partial y} \frac{\partial y}{\partial x'}\right)\sin\theta$$

and similar for v_y'.

Have you already shown that the gradient transforms as a vector in two dimensions? If not, that might be a helpful warm-up. It might also help to try some concrete examples.
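To illustrate the "try some examples" suggestion (a sympy sketch I am adding; the specific field v = (x^2, 0) is taken from the opening post), rotate the field and check that the divergence computed entirely in the primed frame equals the original divergence at the corresponding point:

```python
import sympy as sp

theta, xp, yp = sp.symbols('theta xprime yprime')
# unprimed coordinates in terms of the primed ones (inverse rotation)
x = xp*sp.cos(theta) - yp*sp.sin(theta)
y = xp*sp.sin(theta) + yp*sp.cos(theta)
vx, vy = x**2, sp.Integer(0)  # the field from the opening post, v = x^2 x-hat
# rotated components: v'_x = v_x cos(theta) + v_y sin(theta), etc.
vxp = vx*sp.cos(theta) + vy*sp.sin(theta)
vyp = -vx*sp.sin(theta) + vy*sp.cos(theta)
div_primed = sp.simplify(sp.diff(vxp, xp) + sp.diff(vyp, yp))
# the primed-frame divergence equals 2x evaluated at the corresponding point
print(sp.simplify(div_primed - 2*x))  # 0
```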

5. Aug 28, 2007

### ehrenfest

Sorry, my initial post was wrong. The divergence transforms as a scalar and the gradient transforms as a vector. I should have caught that earlier. I have already done the gradient as a vector, and that works fine:

$$\frac{\partial f}{\partial x'} = \frac{\partial f}{\partial x}\frac{\partial x}{\partial x'} + \frac{\partial f}{\partial y} \frac{\partial y}{\partial x'} = \frac{\partial f}{\partial x}\cos\theta + \frac{\partial f}{\partial y}\sin\theta$$

Similarly for df/dy'. I think it is the primed notation that is confusing me. x' is the transformed x, but why are you transforming the function v_x as well? I thought we were working with the same function, just with a transformation of its domain. Would it not make more sense to write v_x(x',y') instead of v_x'? Look at the equation above: there are no primed coordinates on the far RHS.
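One way to see that the primes only relabel the coordinates (a sympy sketch of my own): compute both primed components of the gradient via the chain rule and check that the gradient's squared magnitude, a scalar, is unchanged by the rotation.

```python
import sympy as sp

X, Y, theta = sp.symbols('X Y theta')
f = sp.Function('f')(X, Y)  # an arbitrary scalar field
c, s = sp.cos(theta), sp.sin(theta)
# chain rule: d/dx' = cos(theta) d/dX + sin(theta) d/dY, and similarly for d/dy'
df_dxp = c*sp.diff(f, X) + s*sp.diff(f, Y)
df_dyp = -s*sp.diff(f, X) + c*sp.diff(f, Y)
mag2_primed = df_dxp**2 + df_dyp**2
mag2 = sp.diff(f, X)**2 + sp.diff(f, Y)**2
print(sp.simplify(mag2_primed - mag2))  # 0: |grad f|^2 is rotation-invariant
```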

You wrote:
$$\frac{\partial v_x'}{\partial x'} = \frac{\partial v_x}{\partial x'}\cos\theta + \frac{\partial v_y}{\partial x'}\sin\theta$$

Also, I do not see why you are using primed coordinates on the RHS of the above equation.

Last edited: Aug 28, 2007
6. Aug 28, 2007

### Mindscrape

Bars are more typical, and I did all my work on paper with bars, so I might have added or dropped a prime somewhere in translation, though I don't see any. I don't really know how to typeset bars in LaTeX, and primes are the next most standard notation.

You want
$$\nabla \cdot \mathbf{v'} = \frac{\partial v_x'}{\partial x'} + \frac{\partial v_y'}{\partial y'}$$

x' is the transformation of x, and the same principle works for transforming the components of v: think of x as v_x, and y as v_y. I have given the first primed partial with its transformation; you will need to work out and add the other. Then you will have to start working towards showing that the original vector v and the transformed vector v' have the same divergence.

7. Aug 28, 2007

### ehrenfest

I am aware that you are using primes instead of bars.

That is very confusing. Our original vector-valued function is v(x,y) = v_x(x,y) x-hat + v_y(x,y) y-hat. Its divergence is

$$\frac{\partial{v_x(x,y)}}{\partial{x}} + \frac{\partial{v_y(x,y)}}{\partial{y}}$$

After we transform the coordinates, we get

v(x',y') = v_x(x',y') x-hat + v_y(x',y') y-hat

correct?

v(x',y') has divergence

$$\frac{\partial v_x(x',y')}{\partial x'} + \frac{\partial v_y(x',y')}{\partial y'}$$

correct ?

$$\frac{\partial v_x(x',y')}{\partial x'} + \frac{\partial v_y(x',y')}{\partial y'}$$

=

$$\frac{\partial{v_x(x',y')}}{\partial{x}} \frac{\partial{x}}{\partial{x'}} + \frac{\partial{v_x(x',y')}}{\partial{y}}\frac{\partial{y}}{\partial{x'}}+ \frac{\partial{v_y(x',y')}}{\partial{y}}\frac{\partial{y}}{\partial{y'}} + \frac{\partial{v_y(x',y')}}{\partial{x}} \frac{\partial{x}}{\partial{y'}}$$

I can find expressions for the second partial derivative in each term, but it does not produce the correct answer, because I cannot get rid of the sines and cosines.

Last edited: Aug 28, 2007
8. Aug 28, 2007

### Mindscrape

Yes, it all looks good. I dropped the function of (x',y') because it is sort of implicit and a pain to write over and over again.

Yes, when you plug it all in you should get a lot of sines and cosines, something to the effect of (sorry, I'm bound to make a LaTeX mistake):

$$\nabla \cdot \mathbf{v'} = \frac{\partial v_x}{\partial x} \cos^2\theta + \frac{\partial v_x}{\partial y} \sin\theta\cos\theta + \frac{\partial v_y}{\partial x} \sin\theta\cos\theta + \frac{\partial v_y}{\partial y} \sin^2\theta + \frac{\partial v_x}{\partial x} \sin^2\theta - \frac{\partial v_x}{\partial y} \sin\theta\cos\theta - \frac{\partial v_y}{\partial x} \sin\theta\cos\theta + \frac{\partial v_y}{\partial y} \cos^2\theta$$

Check it against what you have. It will simplify after some factoring and cancellation.
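For a fully general field, the cancellation can be checked symbolically (my own sympy sketch, using undetermined functions for the components):

```python
import sympy as sp

X, Y, theta = sp.symbols('X Y theta')
vX = sp.Function('v_x')(X, Y)  # arbitrary components of the field
vY = sp.Function('v_y')(X, Y)
c, s = sp.cos(theta), sp.sin(theta)

# chain-rule operators: d/dx' and d/dy' in terms of d/dX and d/dY
def d_dxp(expr):
    return c*sp.diff(expr, X) + s*sp.diff(expr, Y)

def d_dyp(expr):
    return -s*sp.diff(expr, X) + c*sp.diff(expr, Y)

# rotated components of the field
vxp = vX*c + vY*s
vyp = -vX*s + vY*c
div_primed = d_dxp(vxp) + d_dyp(vyp)
div_unprimed = sp.diff(vX, X) + sp.diff(vY, Y)
print(sp.simplify(div_primed - div_unprimed))  # 0: the divergence is a scalar
```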

Last edited: Aug 28, 2007
9. Aug 28, 2007

### ehrenfest

I am sure this will elucidate the flaw in my logic:

I got

$$x = x'\cos\theta - y'\sin\theta$$
$$y = x'\sin\theta + y'\cos\theta$$

just by applying the inverse rotation matrix. Then

$$\frac{\partial x}{\partial x'} = \cos\theta$$
and similarly for the three other partials. Plugging that into the last equation of my previous post does not give me anything like what you posted. Indeed, none of the terms even has more than one sine or cosine factor!

Last edited: Aug 28, 2007
10. Aug 28, 2007

### Mindscrape

The inverse matrix just switches a few signs around, but ultimately gives the same result. What did you get for your vector transformation of the gradient? You can basically just plug that in for your primed coordinates.

So, maybe you were confused on where I actually got
$$\frac{\partial v_x'}{\partial x'} = \frac{\partial v_x}{\partial x'}\cos\theta + \frac{\partial v_y}{\partial x'}\sin\theta$$

it is merely that
$$v_x(x',y') = v_x(x,y)\cos\theta + v_y(x,y)\sin\theta$$

so
$$\frac{\partial v_x(x',y')}{\partial x'} = \frac{\partial}{\partial x'}\left(v_x(x,y)\cos\theta + v_y(x,y)\sin\theta\right)$$

the same approach works for v_y (x',y').

So get those two down and do the chain-rule work. Then look at the divergence you have so far, and see whether anything in it relates to the gradient transformation you did. Hopefully that will get you on track.

11. Aug 28, 2007

### ehrenfest

Are you using $$v_x(x',y')$$ interchangeably with $$v'_x$$? If you are, I understand completely now!

Last edited: Aug 28, 2007
12. Aug 28, 2007

### Mindscrape

Yeah, I think it is annoying to write the function dependencies all the time.

13. Aug 28, 2007

### ehrenfest

I see. Thanks.
