
Gradient question

  • #1

Homework Statement


This is not a homework problem, just a question.

[tex]\mathbf{\nabla}(\textbf{A}\cdot\textbf{B}) = (\textbf{B}\cdot\mathbf{\nabla})\textbf{A} + (\textbf{A}\cdot\mathbf{\nabla})\textbf{B} + \textbf{B}\times(\mathbf{\nabla}\times\textbf{A}) + \textbf{A}\times(\mathbf{\nabla}\times\textbf{B})[/tex]

A and B are vectors.

Homework Equations





The Attempt at a Solution



I can't make sense of the first two terms on the right-hand side. Is [itex](\textbf{B}\cdot\mathbf{\nabla})[/itex]
just the divergence of B?

Also, how do I evaluate [itex]\mathbf{\nabla}\left(\frac{e^{-x}}{r^{2}}\,\hat{\mathbf{r}}\right)[/itex]?
Can I treat the unit vector [itex]\hat{\mathbf{r}}[/itex] as constant?
 

Answers and Replies

  • #2
gabbagabbahey
Homework Helper
Gold Member
I can't make sense of the first two terms on the right-hand side. Is [itex](\textbf{B}\cdot\mathbf{\nabla})[/itex]
just the divergence of B?
No, it's more complicated than that. In Cartesian coordinates, the del operator is

[tex]\mathbf{\nabla}=\hat{\mathbf{i}}\frac{\partial}{\partial x}+\hat{\mathbf{j}}\frac{\partial}{\partial y}+\hat{\mathbf{k}}\frac{\partial}{\partial z}[/tex]

So,

[tex]\textbf{B}\cdot\mathbf{\nabla}=B_x\frac{\partial}{\partial x}+B_y\frac{\partial}{\partial y}+B_z\frac{\partial}{\partial z}[/tex]

And so,

[tex](\textbf{B}\cdot\mathbf{\nabla})\textbf{A}=B_x\frac{\partial \textbf{A}}{\partial x}+B_y\frac{\partial \textbf{A}}{\partial y}+B_z\frac{\partial \textbf{A}}{\partial z}[/tex]
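
If you want to convince yourself that the full identity in post #1 holds, one option is to check it symbolically for example fields, e.g. with SymPy. This is just a minimal sketch; the fields A and B below are chosen arbitrarily.

[code]
import sympy as sp

x, y, z = sp.symbols('x y z')
coords = (x, y, z)

# arbitrary smooth example fields
A = sp.Matrix([x*y, y*z**2, x*z])
B = sp.Matrix([x**2, x + z, y*z])

def grad(f):
    # gradient of a scalar field
    return sp.Matrix([sp.diff(f, c) for c in coords])

def curl(F):
    return sp.Matrix([sp.diff(F[2], y) - sp.diff(F[1], z),
                      sp.diff(F[0], z) - sp.diff(F[2], x),
                      sp.diff(F[1], x) - sp.diff(F[0], y)])

def dir_deriv(F, V):
    # (V . nabla) F, applied component-wise
    return sp.Matrix([sum(V[i]*sp.diff(F[j], coords[i]) for i in range(3))
                      for j in range(3)])

lhs = grad(A.dot(B))
rhs = dir_deriv(A, B) + dir_deriv(B, A) + B.cross(curl(A)) + A.cross(curl(B))
print(sp.simplify(lhs - rhs))   # prints the zero vector
[/code]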


Also, how do I evaluate [itex]\mathbf{\nabla}\left(\frac{e^{-x}}{r^{2}}\,\hat{\mathbf{r}}\right)[/itex]?
Can I treat the unit vector [itex]\hat{\mathbf{r}}[/itex] as constant?
The gradient of a vector is a second rank tensor. Is this really what you are trying to calculate? What is the original problem?
 
  • #3
The gradient of a vector is a second rank tensor. Is this really what you are trying to calculate? What is the original problem?
The original problem is :

Derive the vector identity:

[tex](\textbf{J}_i\cdot\mathbf{\nabla}')\mathbf{\nabla}'\left(\frac{e^{-\gamma r}}{r}\right)=\left[\gamma^2(\textbf{J}_i\cdot\hat{\mathbf{r}})\hat{\mathbf{r}}+\frac{3}{r}\left(\gamma+\frac{1}{r}\right)(\textbf{J}_i\cdot\hat{\mathbf{r}})\hat{\mathbf{r}}-\frac{1}{r}\left(\gamma+\frac{1}{r}\right)\textbf{J}_i\right]\frac{e^{-\gamma r}}{r}[/tex]



I solved

[tex]\mathbf{\nabla}'\left(\frac{e^{-\gamma r}}{r}\right)[/tex]

and got

[tex]\left(\gamma+\frac{1}{r}\right)\frac{e^{-\gamma r}}{r}\,\hat{\mathbf{r}}[/tex]
 
  • #4
gabbagabbahey
Homework Helper
Gold Member
Derive the vector identity:

[tex](\textbf{J}_i\cdot\mathbf{\nabla}')\mathbf{\nabla}'\left(\frac{e^{-\gamma r}}{r}\right)=\left[\gamma^2(\textbf{J}_i\cdot\hat{\mathbf{r}})\hat{\mathbf{r}}+\frac{3}{r}\left(\gamma+\frac{1}{r}\right)(\textbf{J}_i\cdot\hat{\mathbf{r}})\hat{\mathbf{r}}-\frac{1}{r}\left(\gamma+\frac{1}{r}\right)\textbf{J}_i\right]\frac{e^{-\gamma r}}{r}[/tex]
I'm having a difficult time reading your expression...

[tex](\textbf{J}\cdot\mathbf{\nabla})\left(\mathbf{\nabla}\frac{e^{-\gamma r}}{r}\right)=\left[\gamma^2(\textbf{J}\cdot\hat{\mathbf{r}})\hat{\mathbf{r}}+\frac{3}{r}\left(\gamma+\frac{1}{r}\right)(\textbf{J}\cdot\hat{\mathbf{r}})\hat{\mathbf{r}}-\frac{1}{r}\left(\gamma+\frac{1}{r}\right)\textbf{J}\right]\frac{e^{-\gamma r}}{r}[/tex]

^^^ Is this what you meant? I assume that [itex]\textbf{r}[/itex] is just the usual position vector, [itex]r[/itex] is its magnitude, and [itex]\hat{\mathbf{r}}[/itex] is its direction? If so, why does your expression have primes next to the nablas?



I solved

[tex]\mathbf{\nabla}'\left(\frac{e^{-\gamma r}}{r}\right)[/tex]

and got

[tex]\left(\gamma+\frac{1}{r}\right)\frac{e^{-\gamma r}}{r}\,\hat{\mathbf{r}}[/tex]
Again, I can only assume that you mean

[tex]\mathbf{\nabla}\left(\frac{e^{-\gamma r}}{r}\right)=-\left(\gamma+\frac{1}{r}\right)\frac{e^{-\gamma r}}{r}\hat{\mathbf{r}}[/tex]

If so, then yes, that's correct.
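
One quick way to see this: since the argument depends only on [itex]r[/itex], you can use [itex]\mathbf{\nabla}f(r)=f'(r)\hat{\mathbf{r}}[/itex], and here

[tex]\frac{d}{dr}\left(\frac{e^{-\gamma r}}{r}\right)=-\gamma\frac{e^{-\gamma r}}{r}-\frac{e^{-\gamma r}}{r^{2}}=-\left(\gamma+\frac{1}{r}\right)\frac{e^{-\gamma r}}{r}[/tex]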

Are you still having difficulty carrying out the derivative [tex](\textbf{J}\cdot\mathbf{\nabla})[/tex] on this expression?
 
  • #5
The expressions do have primes next to the nablas. The nabla operates on the primed coordinates: the source is at the primed coordinates and the effect is at the unprimed coordinates.

Yes, the difficulty I was having was with the [itex]\textbf{J}\cdot\mathbf{\nabla}[/itex] part.
Also, say you try to take the gradient of

[tex]-\left(\gamma+\frac{1}{r}\right)\frac{e^{-\gamma r}}{r}\hat{\mathbf{r}}[/tex]

How would you deal with the direction vector [itex]\hat{\mathbf{r}}[/itex]?
 
  • #6
gabbagabbahey
Homework Helper
Gold Member
The expressions do have primes next to the nablas. The nabla operates on the primed coordinates: the source is at the primed coordinates and the effect is at the unprimed coordinates.
Surely this means that [itex]\textbf{r}[/itex] isn't the position vector, but rather the separation vector between the position vectors of the field and source points;

[tex]\textbf{r}=(x-x')\hat{\mathbf{x}}+(y-y')\hat{\mathbf{y}}+(z-z')\hat{\mathbf{z}}[/tex]

...right?


Yes, the difficulty I was having was with the [itex]\textbf{J}\cdot\mathbf{\nabla}[/itex] part.
Also, say you try to take the gradient of

[tex]-\left(\gamma+\frac{1}{r}\right)\frac{e^{-\gamma r}}{r}\hat{\mathbf{r}}[/tex]

How would you deal with the direction vector [itex]\hat{\mathbf{r}}[/itex]?
First, the negative sign shouldn't be there if my above assumption is correct.

Second, you aren't taking the gradient of that (if you did, you would end up with a second rank tensor, not a vector). To take the partial derivative of something like [tex]f(r)\hat{\mathbf{r}}[/tex], you will have to use the product and chain rules. For example,

[tex]\begin{aligned}\frac{\partial}{\partial x'} \left[f(r)\hat{\mathbf{r}}\right] &= \frac{\partial f}{\partial x'}\hat{\mathbf{r}}+f(r)\frac{\partial \hat{\mathbf{r}}}{\partial x'} \\ &= \left(\frac{\partial f}{\partial r}\right)\left(\frac{\partial r}{\partial x'}\right)\hat{\mathbf{r}}+f(r)\frac{\partial}{\partial x'}\left[\frac{(x-x')\hat{\mathbf{x}}+(y-y')\hat{\mathbf{y}}+(z-z')\hat{\mathbf{z}}}{r}\right] \\ &= f'(r)\left(\frac{\partial r}{\partial x'}\right)\hat{\mathbf{r}}+f(r)\left[\frac{-1}{r}\left(\frac{\partial r}{\partial x'}\right)\hat{\mathbf{r}}-\frac{1}{r}\hat{\mathbf{x}}\right]\end{aligned}[/tex]
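
The only other ingredient you need is the derivative of [itex]r[/itex] itself with respect to a primed coordinate:

[tex]\frac{\partial r}{\partial x'}=\frac{\partial}{\partial x'}\sqrt{(x-x')^2+(y-y')^2+(z-z')^2}=-\frac{x-x'}{r}[/tex]

and similarly for the [itex]y'[/itex] and [itex]z'[/itex] derivatives.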
 
  • #7
Yes, you are right:

[tex]\textbf{r}=(x-x')\hat{\mathbf{x}}+(y-y')\hat{\mathbf{y}}+(z-z')\hat{\mathbf{z}}[/tex]

and also there should be no negative sign in the expression

[tex]\left(\gamma+\frac{1}{r}\right)\frac{e^{-\gamma r}}{r}\hat{\mathbf{r}}[/tex]

Thanks for explaining the second rank tensor expression. I thought of the product rule, but didn't pursue it because I couldn't figure out how to evaluate [itex]\partial\hat{\mathbf{r}}/\partial x'[/itex].
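
For anyone who wants to double-check the final identity, here is a small SymPy sketch that evaluates both sides at an arbitrary test point (the current components and test coordinates below are made up purely for illustration):

[code]
import sympy as sp

x, y, z, xp, yp, zp, gamma = sp.symbols("x y z x' y' z' gamma")
Jx, Jy, Jz = sp.symbols('J_x J_y J_z')

# separation vector r = field point - source point
rvec = sp.Matrix([x - xp, y - yp, z - zp])
r = sp.sqrt(rvec.dot(rvec))
rhat = rvec / r
J = sp.Matrix([Jx, Jy, Jz])

phi = sp.exp(-gamma*r)/r
primed = (xp, yp, zp)

# nabla' phi (gradient with respect to the primed/source coordinates)
grad_phi = sp.Matrix([sp.diff(phi, c) for c in primed])

# (J . nabla') applied component-wise to nabla' phi
lhs = sp.Matrix([sum(J[i]*sp.diff(grad_phi[j], primed[i]) for i in range(3))
                 for j in range(3)])

Jr = J.dot(rhat)
rhs = (gamma**2*Jr*rhat + (3/r)*(gamma + 1/r)*Jr*rhat
       - (1/r)*(gamma + 1/r)*J) * sp.exp(-gamma*r)/r

test = {x: 0.3, y: -1.2, z: 0.7, xp: 1.1, yp: 0.4, zp: -0.5,
        gamma: 2.0, Jx: 1.0, Jy: -2.0, Jz: 0.5}
print((lhs - rhs).subs(test).evalf())   # each component should be ~0 (rounding error only)
[/code]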
 
