Vector analysis problem about a gradient

  • Thread starter patric44
  • #1
patric44

Homework Statement:

If Φ(λx, λy, λz) = λ^n Φ(x, y, z),
prove that r·grad(Φ) = nΦ

Relevant Equations:

Φ(λx, λy, λz) = λ^n Φ(x, y, z)
r·grad(Φ) = nΦ
Hi guys, I saw this problem in my college textbook on vector calculus. I don't know if the statement is wrong, because it doesn't make sense to me, so if anyone can help with a hint on where to start I will appreciate it. Basically it says:
$$ \phi(\lambda x,\lambda y,\lambda z)=\lambda^{n}\phi(x,y,z) $$
prove that
$$\vec{r}\cdot\vec{\nabla}\phi=n\phi$$
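For a concrete instance (an illustration added here, not part of the original problem), take ##\phi = x^2 y + z^3##, which satisfies ##\phi(\lambda x,\lambda y,\lambda z)=\lambda^{3}\phi(x,y,z)##. Then
$$\vec{r}\cdot\vec{\nabla}\phi = x(2xy) + y(x^2) + z(3z^2) = 3(x^2 y + z^3) = 3\phi,$$
consistent with ##n = 3##.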
 

Answers and Replies

  • #2
wrobel
Science Advisor
Insights Author
The correct statement is $$f(\lambda x)=\lambda^s f(x),\quad \forall \lambda>0,\quad \forall x\in D\subset \mathbb{R}^m.$$
Here the domain ##D## is such that ##x\in D\Longrightarrow \lambda x\in D##.


Differentiate this equation in ##\lambda## and then put ##\lambda=1##:
$$\Big(x,\frac{\partial f}{\partial x}\Big)=sf(x).$$
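As a quick symbolic sanity check of this identity (a sketch added for illustration with an assumed sample function; it is not part of the original reply), both the homogeneity relation and the Euler identity can be verified with SymPy:

```python
import sympy as sp

x, y, z, lam = sp.symbols('x y z lambda', positive=True)

# Assumed sample function, homogeneous of degree s = 3.
f = x**2*y + y*z**2 + z**3
s = 3

# Homogeneity: f(lambda*x, lambda*y, lambda*z) = lambda**s * f(x, y, z).
scaled = f.subs({x: lam*x, y: lam*y, z: lam*z}, simultaneous=True)
assert sp.simplify(scaled - lam**s * f) == 0

# Euler identity: x*f_x + y*f_y + z*f_z = s*f.
euler = x*sp.diff(f, x) + y*sp.diff(f, y) + z*sp.diff(f, z)
assert sp.simplify(euler - s*f) == 0

print("Both identities hold for this sample f.")
```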
 
  • Informative
Likes etotheipi
  • #3
patric44
The correct statement is $$f(\lambda x)=\lambda^s f(x),\quad \forall \lambda>0,\quad \forall x\in D\subset \mathbb{R}^m.$$
Here the domain ##D## is such that ##x\in D\Longrightarrow \lambda x\in D##.


Differentiate this equation in ##\lambda## and then put ##\lambda=1##:
$$\Big(x,\frac{\partial f}{\partial x}\Big)=sf(x).$$
Why should I differentiate with respect to ##\lambda##? The gradient of ##\phi## is with respect to ##(x,y,z)##.
And what about ##\vec{r}\cdot\vec{\nabla}\phi##?
 
  • #4
PeroK
Science Advisor
Homework Helper
Insights Author
Gold Member
Why should I differentiate with respect to ##\lambda##? The gradient of ##\phi## is with respect to ##(x,y,z)##.
And what about ##\vec{r}\cdot\vec{\nabla}\phi##?
You can let ##g(\lambda) = f(\lambda x, \lambda y, \lambda z) = \lambda^n f(x, y, z)## and differentiate that with respect to ##\lambda##.
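For instance (a concrete illustration added here, not part of the original reply), with ##f = x^2 + y^2 + z^2## (so ##n = 2##) the right-hand form is ##g(\lambda) = \lambda^2(x^2+y^2+z^2)##, hence
$$\frac{dg}{d\lambda} = 2\lambda(x^2+y^2+z^2), \qquad \frac{dg}{d\lambda}\bigg|_{\lambda=1} = 2f = nf,$$
while differentiating the left-hand form ##f(\lambda x, \lambda y, \lambda z)## is where the gradient will enter.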
 
  • Informative
Likes etotheipi
  • #5
patric44
You can let ##g(\lambda) = f(\lambda x, \lambda y, \lambda z) = \lambda^n f(x, y, z)## and differentiate that with respect to ##\lambda##.
OK:
Let ##g(\lambda) = f(\lambda x,\lambda y,\lambda z)=\lambda^{n}f(x,y,z)##; then by differentiating:
$$\frac{dg(\lambda)}{d\lambda}=n\lambda^{n-1}f(x,y,z)$$
and the gradient part:
$$\vec{r}\cdot\vec{\nabla}f(\lambda x,\lambda y,\lambda z) = \lambda^{n}\Big(x\frac{\partial f}{\partial x}+y\frac{\partial f}{\partial y}+z\frac{\partial f}{\partial z}\Big)$$
I don't get how this is related to my question!
 
  • #6
PeroK
Science Advisor
Homework Helper
Insights Author
Gold Member
OK:
Let ##g(\lambda) = f(\lambda x,\lambda y,\lambda z)=\lambda^{n}f(x,y,z)##; then by differentiating:
$$\frac{dg(\lambda)}{d\lambda}=n\lambda^{n-1}f(x,y,z)$$
and the gradient part:
$$\vec{r}\cdot\vec{\nabla}f(\lambda x,\lambda y,\lambda z) = \lambda^{n}\Big(x\frac{\partial f}{\partial x}+y\frac{\partial f}{\partial y}+z\frac{\partial f}{\partial z}\Big)$$
I don't get how this is related to my question!
There's another way you can differentiate ##g(\lambda)##.
 
  • #7
patric44
There's another way you can differentiate ##g(\lambda)##.
A little more help? You mean using something like the chain rule?
 
  • #8
PeroK
Science Advisor
Homework Helper
Insights Author
Gold Member
A little more help? You mean using something like the chain rule?
Yes, because the chain rule invokes the gradient.
 
  • #9
patric44
Yes, because the chain rule invokes the gradient.
:smile: Thanks, I guess I got it now:
If I let ##u=\lambda x,\ v = \lambda y,\ w = \lambda z##, then
$$\frac{dg}{d\lambda} = \frac{\partial\phi}{\partial u}\frac{\partial u}{\partial x}+\frac{\partial\phi}{\partial v}\frac{\partial v}{\partial y}+\frac{\partial\phi}{\partial w}\frac{\partial w}{\partial z}$$
then
$$\frac{dg}{d\lambda} = \frac{\partial\phi}{\partial \lambda x} x +\frac{\partial\phi}{\partial \lambda y}y+\frac{\partial\phi}{\partial \lambda z}z=n\lambda^{n-1}\phi(x,y,z)$$
If I let ##\lambda = 1##:
$$ \frac{\partial\phi}{\partial x} x +\frac{\partial\phi}{\partial y}y+\frac{\partial\phi}{\partial z}z=n\phi(x,y,z)=\vec{r}\cdot\vec{\nabla}\phi$$
Thanks so much, guys.
 
  • #10
PeroK
Science Advisor
Homework Helper
Insights Author
Gold Member
:smile: Thanks, I guess I got it now:
If I let ##u=\lambda x,\ v = \lambda y,\ w = \lambda z##, then
$$\frac{dg}{d\lambda} = \frac{\partial\phi}{\partial u}\frac{\partial u}{\partial x}+\frac{\partial\phi}{\partial v}\frac{\partial v}{\partial y}+\frac{\partial\phi}{\partial w}\frac{\partial w}{\partial z}$$
then
$$\frac{dg}{d\lambda} = \frac{\partial\phi}{\partial \lambda x} x +\frac{\partial\phi}{\partial \lambda y}y+\frac{\partial\phi}{\partial \lambda z}z=n\lambda^{n-1}\phi(x,y,z)$$
If I let ##\lambda = 1##:
$$ \frac{\partial\phi}{\partial x} x +\frac{\partial\phi}{\partial y}y+\frac{\partial\phi}{\partial z}z=n\phi(x,y,z)=\vec{r}\cdot\vec{\nabla}\phi$$
Thanks so much, guys.
This is not quite right. First, you should have:
$$\frac{dg}{d\lambda} = \frac{\partial\phi}{\partial u}\frac{\partial u}{\partial \lambda}+\frac{\partial\phi}{\partial v}\frac{\partial v}{\partial \lambda}+\frac{\partial\phi}{\partial w}\frac{\partial w}{\partial \lambda }$$
And then this becomes:
$$\frac{dg}{d\lambda} = \frac{\partial\phi}{\partial u}x +\frac{\partial\phi}{\partial v} y +\frac{\partial\phi}{\partial w} z = \vec{\nabla}\phi(u, v, w) \cdot \vec r = \vec{\nabla}\phi(\lambda x, \lambda y, \lambda z) \cdot \vec r$$
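Equating this with the other expression for ##\frac{dg}{d\lambda}## found earlier (a step spelled out here for completeness; it is not in the original reply) gives
$$\vec{\nabla}\phi(\lambda x, \lambda y, \lambda z) \cdot \vec r = n\lambda^{n-1}\phi(x,y,z).$$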
 
  • Like
Likes etotheipi
  • #11
patric44
This is not quite right. First, you should have:
$$\frac{dg}{d\lambda} = \frac{\partial\phi}{\partial u}\frac{\partial u}{\partial \lambda}+\frac{\partial\phi}{\partial v}\frac{\partial v}{\partial \lambda}+\frac{\partial\phi}{\partial w}\frac{\partial w}{\partial \lambda }$$
And then this becomes:
$$\frac{dg}{d\lambda} = \frac{\partial\phi}{\partial u}x +\frac{\partial\phi}{\partial v} y +\frac{\partial\phi}{\partial w} z = \vec{\nabla}\phi(u, v, w) \cdot \vec r = \vec{\nabla}\phi(\lambda x, \lambda y, \lambda z) \cdot \vec r$$
Oh, my bad. But the book stated it explicitly as ##\vec{r}\cdot\vec{\nabla}\phi##, so I thought the book was right. Does it matter, since I can change the order of the factors in the dot product and get the reversed expression, or can I not do that?
 
  • #12
PeroK
Science Advisor
Homework Helper
Insights Author
Gold Member
You still have to set ##\lambda = 1##. That sorts everything out.
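Spelled out (added for completeness; not in the original reply): setting ##\lambda = 1## in both expressions for ##\frac{dg}{d\lambda}## gives
$$\vec{\nabla}\phi(x,y,z)\cdot \vec r = n\phi(x,y,z), \qquad \text{i.e.} \qquad \vec{r}\cdot\vec{\nabla}\phi = n\phi.$$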
 
  • Like
Likes patric44
  • #13
patric44
Thanks so much! :smile:
 
  • Like
Likes PeroK
  • #14
wrobel
Science Advisor
Insights Author
The converse theorem also holds.
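A sketch of that converse (added for illustration; it is not in the original post): suppose ##\big(x,\nabla f(x)\big) = s f(x)## on a domain closed under positive scaling. Fix ##x##, and for ##\lambda>0## set ##h(\lambda)=\lambda^{-s}f(\lambda x)##. Then
$$h'(\lambda) = -s\lambda^{-s-1}f(\lambda x) + \lambda^{-s}\,x\cdot\nabla f(\lambda x) = -s\lambda^{-s-1}f(\lambda x) + \lambda^{-s-1}\,(\lambda x)\cdot\nabla f(\lambda x) = 0,$$
so ##h## is constant and ##h(\lambda)=h(1)## gives ##f(\lambda x)=\lambda^{s}f(x)##.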
 