Partial derivatives (question I am grading).

MathematicalPhysicist
We have a function f:\mathbb{R}^2\to\mathbb{R} which has second-order partial derivatives.
Show that f_{xy}=0 \;\forall (x,y)\in \mathbb{R}^2 \Leftrightarrow f(x,y)=g(x)+h(y) for some functions g and h.

The \Leftarrow direction is self-explanatory; it's the \Rightarrow direction where I'm not sure my reasoning is right.
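(For the record, the \Leftarrow direction is just the computation

$$f(x,y)=g(x)+h(y)\ \Longrightarrow\ f_x(x,y)=g'(x)\ \Longrightarrow\ f_{xy}(x,y)=\partial_y\bigl(g'(x)\bigr)=0,$$

since g'(x) does not depend on y.)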

I mean, from the above we know that f_x=F(x), i.e. f_x depends only on x (that was the previous question), but besides taking an integral I don't see how to show the consequent.
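(The argument there was presumably the usual one: reading f_{xy} as \partial_y(\partial_x f), for each fixed x we have

$$\frac{\partial}{\partial y}\,f_x(x,y)=f_{xy}(x,y)=0 \quad\text{for all } y,$$

so y\mapsto f_x(x,y) is constant by the mean value theorem, and its value F(x) can only depend on x.)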

Any thoughts on how to show this without invoking integration?
 
My first thought is the mean value theorem, which states that if f'(x)=0 everywhere then f\equiv\text{constant}. Apply this in x with y held fixed: it yields \partial_{y}f\equiv\text{constant}, but this constant will depend on y and is therefore a function of y.
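Written out (this reads f_{xy} as \partial_x(\partial_y f); the same argument with the other order gives f_x=F(x) instead):

$$\frac{\partial}{\partial x}\,f_y(x,y)=f_{xy}(x,y)=0 \quad\text{for all } x,$$

so for each fixed y the map x\mapsto f_y(x,y) is constant by the mean value theorem, and its value may still depend on y, i.e. f_y(x,y)=G(y).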

Get the general idea now?
 
That's basically what I have written, so I know that f_x =F(x) and f_y=G(y). But this is where I am not sure how to proceed.

I mean, I know that if F(x)=h'(x) for some function h, then setting \varphi(x,y)= f(x,y)-h(x) gives \varphi_x=0, and then \varphi=g(y).
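(Spelled out, granting for the moment that such an h exists:

$$\varphi(x,y):=f(x,y)-h(x),\qquad \varphi_x(x,y)=f_x(x,y)-h'(x)=F(x)-F(x)=0,$$

so for each fixed y the map x\mapsto\varphi(x,y) is constant, its value g(y) depends only on y, and f(x,y)=h(x)+g(y).)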

The problem is: how do I know that F(x)=h'(x) for some h? I am assuming that F has an antiderivative, aren't I?
 
I think that since you effectively have f_x=F(x), you want to show that there is a function h(x) with the property h'(x)=F(x), and I think this follows from a version of the fundamental theorem of calculus. Any other antiderivative will be of the form h(x)+C. I think you can say this because F is continuous: F'(x)=f_{xx}(x,y) exists by hypothesis, so F is differentiable and hence continuous.
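A minimal sketch of that step, under the stated assumption that F is continuous:

$$h(x):=\int_0^x F(t)\,dt \quad\Longrightarrow\quad h'(x)=F(x)$$

by the fundamental theorem of calculus, and then the reduction above gives f(x,y)=h(x)+g(y).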
 
