Proving that a function is gradient vector of another function

Trying to prove that a vector field is the gradient of a scalar field if and only if its partial derivatives are symmetric. Struggling with the formatting here, so please see the linked image. Thanks.

http://i.imgur.com/9ZelT.png
 
So you have shown that if g is the gradient of a scalar function, then
\frac{\partial g_i}{\partial x_j} = \frac{\partial g_j}{\partial x_i}

This is because the mixed partial derivatives commute, i.e.
\frac{\partial^2 V}{\partial x_j \partial x_i} = \frac{\partial^2 V}{\partial x_i \partial x_j}
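
As a quick sanity check of this direction (with an example I'm picking purely for illustration), take V(x,y) = x^2 y. Then
g_1 = \frac{\partial V}{\partial x} = 2xy, \qquad g_2 = \frac{\partial V}{\partial y} = x^2
and indeed
\frac{\partial g_1}{\partial y} = 2x = \frac{\partial g_2}{\partial x}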

Now you need to consider the other direction of the proof.
 
Unfortunately it's the other direction that has me baffled. Any help would be highly appreciated considering I am already on an extension. Thanks!
 
How about trying to construct the scalar function by integrating the partials you have? Or you could maybe try a proof by contradiction... though I'm not convinced on that one.
 
Integrating the partials is a good idea.

In two dimensions for example, we want to find a function V(x,y) such that
\frac{\partial V}{\partial x} = g_1(x,y)
and
\frac{\partial V}{\partial y} = g_2(x,y)

We can integrate the first equation w.r.t. x and the second w.r.t. y and get by the fundamental theorem of calculus
V(x,y) = \int g_1(x,y)dx + F(y)
V(x,y) = \int g_2(x,y)dy + G(x)
The constant of integration when you integrate w.r.t. x is really an arbitrary function of y, and vice versa. (Here the integration just means any choice of antiderivative you like, because we're writing out the "constant" of integration explicitly.) So we're set as long as we can find functions G(x) and F(y) such that
\int g_1(x,y)dx + F(y)=\int g_2(x,y)dy + G(x)

So the question boils down to: why do these functions F and G exist, given the condition on the partial derivatives of g_1 and g_2?
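
To get a concrete feel for how this works (using an example field I'm making up for illustration, not the one from the original problem), take
g_1(x,y) = 2xy \qquad g_2(x,y) = x^2 + \cos y
which satisfies the symmetry condition, since
\frac{\partial g_1}{\partial y} = 2x = \frac{\partial g_2}{\partial x}
Integrating each component gives
\int g_1(x,y)dx + F(y) = x^2 y + F(y)
\int g_2(x,y)dy + G(x) = x^2 y + \sin y + G(x)
and the two expressions agree if we choose F(y) = \sin y and G(x) = 0, giving V(x,y) = x^2 y + \sin y. The symmetry condition on the partial derivatives is exactly what guarantees that a consistent choice of F and G is always possible.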
 