Vector analysis problem about a gradient

In summary, the conversation is about proving that if $$f(\lambda x)=\lambda^s f(x),\quad \forall \lambda>0,\quad \forall x\in D\subset \mathbb{R}^m,$$ then ##\vec{r}\cdot\vec{\nabla}f = s f##. The key step is to differentiate the identity with respect to ##\lambda## using the chain rule, which brings in the gradient, and then set ##\lambda=1##, as worked out in the conversation.
  • #1
patric44
Homework Statement
If Φ(λx, λy, λz) = λ^n Φ(x, y, z),
prove that r · grad(Φ) = nΦ.
Relevant Equations
Φ(λx, λy, λz) = λ^n Φ(x, y, z)
r · grad(Φ) = nΦ
Hi guys, I saw this problem in my college textbook on vector calculus. I don't know if the statement is wrong, but it doesn't make sense to me, so if anyone can give me a hint on where to start I would appreciate it. Basically it says: given
$$ \phi(\lambda x,\lambda y,\lambda z)=\lambda^{n}\phi(x,y,z), $$
prove that
$$\vec{r} \cdot \vec{\nabla}\phi=n\phi.$$
 
  • #2
The correct statement is $$f(\lambda x)=\lambda^s f(x),\quad \forall \lambda>0,\quad \forall x\in D\subset \mathbb{R}^m.$$
Here the domain ##D## is such that ##x\in D\Longrightarrow \lambda x\in D##. Differentiate this equation in ##\lambda## and then put ##\lambda=1##:
$$\Big(x,\frac{\partial f}{\partial x}\Big)=sf(x).$$
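Here ##\Big(x,\frac{\partial f}{\partial x}\Big)## denotes the inner product ##x\cdot\nabla f(x)##, i.e. ##\vec{r}\cdot\vec{\nabla}f## in the notation of the problem; applying the chain rule to the left-hand side is what produces it.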
 
  • #3
wrobel said:
The correct statement is $$f(\lambda x)=\lambda^s f(x),\quad \forall \lambda>0,\quad \forall x\in D\subset \mathbb{R}^m.$$
Here the domain ##D## is such that ##x\in D\Longrightarrow \lambda x\in D##. Differentiate this equation in ##\lambda## and then put ##\lambda=1##:
$$\Big(x,\frac{\partial f}{\partial x}\Big)=sf(x).$$
Why should I differentiate with respect to ##\lambda##? The gradient of ##\phi## is with respect to ##(x,y,z)##.
And what about ##\vec{r}\cdot\vec{\nabla}\phi##?
 
  • #4
patric44 said:
Why should I differentiate with respect to ##\lambda##? The gradient of ##\phi## is with respect to ##(x,y,z)##.
And what about ##\vec{r}\cdot\vec{\nabla}\phi##?
You can let ##g(\lambda) = f(\lambda x, \lambda y, \lambda z) = \lambda^n f(x, y, z)## and differentiate that with respect to ##\lambda##.
 
  • #5
PeroK said:
You can let ##g(\lambda) = f(\lambda x, \lambda y, \lambda z) = \lambda^n f(x, y, z)## and differentiate that with respect to ##\lambda##.
OK:
Let ##g(\lambda) = f(\lambda x ,\lambda y,\lambda z)=\lambda^{n}f(x,y,z)##. Then, differentiating:
$$\frac{dg(\lambda)}{d\lambda}=n\lambda^{n-1}f(x,y,z)$$
And the gradient part:
$$\vec{r}\cdot\vec{\nabla}f(\lambda x ,\lambda y,\lambda z) = \lambda^{n}\Big(x\frac{\partial f}{\partial x}+y\frac{\partial f}{\partial y}+z\frac{\partial f}{\partial z}\Big)$$
I don't get how this is related to my question!
 
  • #6
patric44 said:
OK:
Let ##g(\lambda) = f(\lambda x ,\lambda y,\lambda z)=\lambda^{n}f(x,y,z)##. Then, differentiating:
$$\frac{dg(\lambda)}{d\lambda}=n\lambda^{n-1}f(x,y,z)$$
And the gradient part:
$$\vec{r}\cdot\vec{\nabla}f(\lambda x ,\lambda y,\lambda z) = \lambda^{n}\Big(x\frac{\partial f}{\partial x}+y\frac{\partial f}{\partial y}+z\frac{\partial f}{\partial z}\Big)$$
I don't get how this is related to my question!
There's another way you can differentiate ##g(\lambda)##.
 
  • #7
PeroK said:
There's another way you can differentiate ##g(\lambda)##.
A little more help: do you mean using something like the chain rule?
 
  • #8
patric44 said:
A little more help: do you mean using something like the chain rule?
Yes, because the chain rule invokes the gradient.
 
  • #9
PeroK said:
Yes, because the chain rule invokes the gradient.
:smile: Thanks, I guess I got it now:
If I let ##u=\lambda x ,\ v = \lambda y ,\ w = \lambda z##, then
$$\frac{dg}{d\lambda} = \frac{\partial\phi}{\partial u}\frac{\partial u}{\partial x}+\frac{\partial\phi}{\partial v}\frac{\partial v}{\partial y}+\frac{\partial\phi}{\partial w}\frac{\partial w}{\partial z}$$
then
$$\frac{dg}{d\lambda} = \frac{\partial\phi}{\partial (\lambda x)} x +\frac{\partial\phi}{\partial (\lambda y)}y+\frac{\partial\phi}{\partial (\lambda z)}z=n\lambda^{n-1}\phi(x,y,z)$$
If I let ##\lambda = 1##:
$$ \frac{\partial\phi}{\partial x} x +\frac{\partial\phi}{\partial y}y+\frac{\partial\phi}{\partial z}z=n\phi(x,y,z)=\vec{r}\cdot\vec{\nabla}\phi$$
Thanks guys so much.
 
  • #10
patric44 said:
:smile: Thanks, I guess I got it now:
If I let ##u=\lambda x ,\ v = \lambda y ,\ w = \lambda z##, then
$$\frac{dg}{d\lambda} = \frac{\partial\phi}{\partial u}\frac{\partial u}{\partial x}+\frac{\partial\phi}{\partial v}\frac{\partial v}{\partial y}+\frac{\partial\phi}{\partial w}\frac{\partial w}{\partial z}$$
then
$$\frac{dg}{d\lambda} = \frac{\partial\phi}{\partial (\lambda x)} x +\frac{\partial\phi}{\partial (\lambda y)}y+\frac{\partial\phi}{\partial (\lambda z)}z=n\lambda^{n-1}\phi(x,y,z)$$
If I let ##\lambda = 1##:
$$ \frac{\partial\phi}{\partial x} x +\frac{\partial\phi}{\partial y}y+\frac{\partial\phi}{\partial z}z=n\phi(x,y,z)=\vec{r}\cdot\vec{\nabla}\phi$$
Thanks guys so much.
This is not quite right. First, you should have:
$$\frac{dg}{d\lambda} = \frac{\partial\phi}{\partial u}\frac{\partial u}{\partial \lambda}+\frac{\partial\phi}{\partial v}\frac{\partial v}{\partial \lambda}+\frac{\partial\phi}{\partial w}\frac{\partial w}{\partial \lambda }$$
And then this becomes:
$$\frac{dg}{d\lambda} = \frac{\partial\phi}{\partial u}x +\frac{\partial\phi}{\partial v} y +\frac{\partial\phi}{\partial w} z = \vec{\nabla}\phi(u, v, w) \cdot \vec r = \vec{\nabla}\phi(\lambda x, \lambda y, \lambda z) \cdot \vec r$$
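As a quick sanity check (not needed for the proof), the identity can be verified symbolically for a concrete homogeneous function. A minimal SymPy sketch, using the assumed example ##\phi = x^2 y + z^3## (homogeneous of degree ##3##):
```python
import sympy as sp

x, y, z, lam = sp.symbols('x y z lambda', positive=True)

# Assumed example: phi is homogeneous of degree 3.
phi = x**2*y + z**3
n = 3

# Homogeneity: phi(lam*x, lam*y, lam*z) = lam**n * phi(x, y, z)
scaled = phi.subs({x: lam*x, y: lam*y, z: lam*z}, simultaneous=True)
print(sp.simplify(scaled - lam**n*phi))   # 0

# The identity to prove: r . grad(phi) = n * phi
r_dot_grad = x*sp.diff(phi, x) + y*sp.diff(phi, y) + z*sp.diff(phi, z)
print(sp.simplify(r_dot_grad - n*phi))    # 0
```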
 
  • Like
Likes etotheipi
  • #11
PeroK said:
This is not quite right. First, you should have:
$$\frac{dg}{d\lambda} = \frac{\partial\phi}{\partial u}\frac{\partial u}{\partial \lambda}+\frac{\partial\phi}{\partial v}\frac{\partial v}{\partial \lambda}+\frac{\partial\phi}{\partial w}\frac{\partial w}{\partial \lambda }$$
And then this becomes:
$$\frac{dg}{d\lambda} = \frac{\partial\phi}{\partial u}x +\frac{\partial\phi}{\partial v} y +\frac{\partial\phi}{\partial w} z = \vec{\nabla}\phi(u, v, w) \cdot \vec r = \vec{\nabla}\phi(\lambda x, \lambda y, \lambda z) \cdot \vec r$$
Oh, my bad. But the book stated it explicitly as ##\vec{r}\cdot\vec{\nabla}\phi##, so I thought the book was right. Does it matter, since I can change the order of the factors in the dot product and get the reversed expression, or can I not do that?
 
  • #12
You still have to set ##\lambda = 1##. That sorts everything out.
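Putting the two expressions for ##\frac{dg}{d\lambda}## together,
$$\vec{\nabla}\phi(\lambda x, \lambda y, \lambda z)\cdot\vec{r} = \frac{dg}{d\lambda} = n\lambda^{n-1}\phi(x,y,z),$$
and setting ##\lambda = 1## gives
$$\vec{r}\cdot\vec{\nabla}\phi(x,y,z) = n\,\phi(x,y,z),$$
since the order of the factors in a dot product doesn't matter.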
 
  • #13
Thanks so much :smile:
 
  • #14
The converse theorem also holds: if ##\vec{r}\cdot\vec{\nabla}f = n f## on a suitable domain, then ##f## is homogeneous of degree ##n##.
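A sketch of why (assuming ##f## is differentiable on a domain closed under scaling): for fixed ##x## let ##h(\lambda) = \lambda^{-n} f(\lambda x)## for ##\lambda > 0##. Then
$$h'(\lambda) = \lambda^{-n-1}\Big(\lambda x\cdot\nabla f(\lambda x) - n f(\lambda x)\Big) = 0$$
by the assumed identity applied at the point ##\lambda x##, so ##h## is constant, ##h(\lambda)=h(1)##, and ##f(\lambda x) = \lambda^{n} f(x)##.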
 

1. What is a gradient in vector analysis?

A gradient in vector analysis is a mathematical concept that represents the rate of change of a function in a specific direction. It is a vector that points in the direction of the steepest increase of the function, and its magnitude is the rate of change in that direction.

2. How is a gradient calculated?

A gradient is calculated by taking the partial derivatives of a multivariable function with respect to each variable and arranging them into a vector. The resulting vector is the gradient of the function.
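Concretely, in three dimensions $$\vec{\nabla} f = \Big(\frac{\partial f}{\partial x},\ \frac{\partial f}{\partial y},\ \frac{\partial f}{\partial z}\Big).$$ As an illustration, a minimal SymPy sketch (the function here is just an assumed example):
```python
import sympy as sp

x, y, z = sp.symbols('x y z')
f = x**2*y + sp.sin(z)   # assumed example function

# The gradient is the vector of partial derivatives.
grad_f = [sp.diff(f, var) for var in (x, y, z)]
print(grad_f)            # [2*x*y, x**2, cos(z)]
```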

3. What is the significance of a gradient in vector analysis?

The gradient is a crucial tool in vector analysis as it helps in determining the direction and magnitude of the steepest increase of a function. It is also used in optimization problems to find the maximum or minimum value of a function.
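For example, gradient descent repeatedly steps against the gradient to approach a minimum; a minimal sketch for the assumed function ##f(x,y) = (x-1)^2 + (y+2)^2##:
```python
# Gradient of the assumed example f(x, y) = (x - 1)**2 + (y + 2)**2.
def grad(x, y):
    return 2*(x - 1), 2*(y + 2)

x, y, lr = 0.0, 0.0, 0.1          # starting point and learning rate
for _ in range(100):
    gx, gy = grad(x, y)
    x, y = x - lr*gx, y - lr*gy   # step opposite to the gradient

print(round(x, 3), round(y, 3))   # converges to the minimum at (1, -2)
```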

4. How is a gradient used in physics and engineering?

In physics and engineering, the gradient is used to represent physical quantities such as temperature, pressure, and velocity, which vary in space. It is also used in fields like fluid mechanics, electromagnetism, and thermodynamics to analyze and solve problems.

5. Can a gradient have a negative value?

Yes, the components of a gradient can be negative, and so can the rate of change of the function along a given direction (the directional derivative). A negative value means the function is decreasing in that direction, while a positive value means it is increasing; the gradient vector itself always points in the direction of steepest increase.
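For example, for ##f(x,y) = y - x## the gradient is ##\vec{\nabla} f = (-1,\ 1)##: the negative ##x##-component means ##f## decreases as ##x## increases, while the positive ##y##-component means ##f## increases with ##y##.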
