Proving that a function is the gradient of another function

Trying to prove that a function is the gradient of a scalar field if and only if its partial derivatives are symmetric. Struggling with the formatting here, so please see the linked image. Thanks.

http://i.imgur.com/9ZelT.png


lanedance
Homework Helper
so you have shown if g is the gradient of a scalar function then
$$\frac{\partial g_i}{\partial x_j} = \frac{\partial g_j}{\partial x_i}$$

this is because the mixed partial derivatives commute, i.e.
$$\frac{\partial^2 V}{\partial x_j \partial x_i} = \frac{\partial^2 V}{\partial x_i \partial x_j}$$

now you need to consider the other direction of the proof
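As a quick sanity check of this forward direction, here is a small worked example (my own choice of $V$, not from the problem):

```latex
% Take V(x,y) = x^2 y, so g = \nabla V.
\begin{align*}
g_1 &= \frac{\partial V}{\partial x} = 2xy, &
g_2 &= \frac{\partial V}{\partial y} = x^2, \\
\frac{\partial g_1}{\partial y} &= 2x, &
\frac{\partial g_2}{\partial x} &= 2x.
\end{align*}
% The mixed partials agree, exactly as the symmetry condition requires.
```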

Unfortunately it's the other direction that has me baffled. Any help would be highly appreciated considering I am already on an extension. Thanks!

lanedance
Homework Helper
how about trying to construct the scalar function by integrating the partials you have? Or you could maybe try a proof by contradiction... though I'm not convinced on that one

Office_Shredder
Staff Emeritus
Gold Member
Integrating the partials is a good idea.

In two dimensions for example, we want to find a function V(x,y) such that
$$\frac{\partial V}{\partial x} = g_1(x,y)$$
and
$$\frac{\partial V}{\partial y} = g_2(x,y)$$

We can integrate the first equation w.r.t. x and the second w.r.t. y and get by the fundamental theorem of calculus
$$V(x,y) = \int g_1(x,y)dx + F(y)$$
$$V(x,y) = \int g_2(x,y)dy + G(x)$$
The constant of integration when you integrate w.r.t. x is really an arbitrary function of y, and vice versa. (here the integration is really just any choice of antiderivative that you want, because we're writing out the constant of integration explicitly). So we're set as long as we can find functions G(x) and F(y) such that
$$\int g_1(x,y)dx + F(y)=\int g_2(x,y)dy + G(x)$$

So the question boils down to why do these functions F and G exist given the condition on the partial derivatives of g1 and g2?
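One standard way to finish is a direct construction rather than solving for F and G explicitly. This is a sketch, assuming g is $C^1$ on a domain where the line segments below lie in the domain (e.g. a rectangle containing the origin), so that we may differentiate under the integral sign:

```latex
% Define V by integrating g_1 along x and g_2 along the y-axis:
V(x,y) = \int_0^x g_1(t,y)\,dt + \int_0^y g_2(0,s)\,ds.
% By the fundamental theorem of calculus, \partial V/\partial x = g_1(x,y).
% For the other partial, differentiate under the integral sign:
\frac{\partial V}{\partial y}
  = \int_0^x \frac{\partial g_1}{\partial y}(t,y)\,dt + g_2(0,y)
  = \int_0^x \frac{\partial g_2}{\partial x}(t,y)\,dt + g_2(0,y)
  = g_2(x,y),
% using the hypothesis \partial g_1/\partial y = \partial g_2/\partial x
% and then the fundamental theorem of calculus once more.
```

This makes explicit where the symmetry condition on the partials is used: it is exactly what lets the $y$-derivative of the first integral collapse back to $g_2(x,y) - g_2(0,y)$.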