Quantumpencil
Homework Statement
Prove that if a map f: R^n -> R with f(0) = 0 is differentiable and satisfies tf(x) = f(tx) for all t in R and all x in R^n, then f is linear. (That is, prove additivity.)
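In symbols (I'm reading the homogeneity condition as holding for every real t):
$$
f:\mathbb{R}^n \to \mathbb{R} \ \text{differentiable}, \quad f(tx) = t\,f(x) \ \ \forall\, t \in \mathbb{R},\ x \in \mathbb{R}^n
\;\;\Longrightarrow\;\; f(x+y) = f(x) + f(y) \ \ \forall\, x, y \in \mathbb{R}^n.
$$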
Homework Equations
The Attempt at a Solution
So the first thing I tried was taking the derivative, i.e. the Jacobian matrix (which in this case is just a gradient vector).
We have that f'(x+y) = f'(x) + f'(y) by the linearity of the derivative (though I'm not sure that step is legitimate: differentiation is linear in the function, and here I'm using it as if it were linear in the point).
Could one then take line integrals to get the desired equality? With a path from 0 to x and a path from 0 to y, the two integrals added together should give the integral over a path from 0 to (x+y), so long as we are integrating from the same starting point. (What's bothering me about this is that it doesn't seem to matter that f(0) = 0, since we can just make the starting points of the path integrals equal...)
However, we haven't defined the line integral in class, so I'm thinking this is probably not what the instructor wants us to gain from the problem. Does anyone have insights about how I might approach this? Another idea I have had: show that the partials are all constant. If they are all constant, then the original function which produced them can only depend linearly on each of the n coordinates, and together with f(0) = 0 that would imply f is linear, yes? (A sketch of that computation follows.)
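To spell out why constant partials would finish it: assume for the moment that the gradient really is constant, say f'(x) ≡ c for some fixed vector c (this still has to be shown). Then, by the fundamental theorem of calculus along the straight segment from 0 to x,
$$
f(x) - f(0) = \int_0^1 \frac{d}{ds} f(sx)\, ds = \int_0^1 f'(sx)\cdot x \, ds = c \cdot x,
$$
so f(x) = c·x (using f(0) = 0), and then
$$
f(x+y) = c\cdot(x+y) = c\cdot x + c\cdot y = f(x) + f(y).
$$
This is really the line-integral idea above, just along straight segments starting at the origin.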
EDIT: So I explored the second idea. If we differentiate both sides of tf(x) = f(tx) with respect to x (taking gradients), we get
tf'(tx) = tf'(x), hence f'(tx) = f'(x) for t ≠ 0 (see the computation below). So if homogeneity holds, f'(x) is constant, which shows that all of the partials are constant. If all the partials are constant, then in each coordinate we have linearity (∂f/∂x_1 = c, so f can depend on x_1 at most to the first power). f is a scalar field, though, so while I have convinced myself this is true and drawn pictures, how might I rigorously arrive at the equality I need? (That f(x_i + y_i) = f(x_i) + f(y_i) for all i, where x_i denotes the projection of x onto the i-th coordinate axis.)
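Spelling that differentiation step out (chain rule on the f(tx) side, t just a constant factor on the other):
$$
D_x\big[t\,f(x)\big] = t\, f'(x), \qquad D_x\big[f(tx)\big] = t\, f'(tx)
\;\;\Longrightarrow\;\; t\, f'(tx) = t\, f'(x)
\;\;\Longrightarrow\;\; f'(tx) = f'(x) \quad (t \neq 0).
$$
(To be careful: as stated this says f' takes the same value at any two points on a common line through the origin; I'm treating that as f' being constant.)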
EDIT: I think I got it.
Define a function O(x,y)=f(x+y)-f(x)-f(y)
Then, differentiating O in x and in y separately: the x-derivative is f'(x+y) - f'(x) and the y-derivative is f'(x+y) - f'(y). But since f' is constant, f'(x+y) = f'(x) = f'(y), so both pieces vanish and O'(x,y) = 0.
Then, since f(0) = 0, O(0,0) = f(0) - f(0) - f(0) = 0, and since O has zero derivative everywhere, O(x,y) must be zero everywhere, which is exactly f(x+y) = f(x) + f(y).
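Written out in full (using that f' is constant; call the common value c):
$$
D_x O(x,y) = f'(x+y) - f'(x) = c - c = 0, \qquad
D_y O(x,y) = f'(x+y) - f'(y) = c - c = 0,
$$
so O is constant on all of \mathbb{R}^n \times \mathbb{R}^n, and
$$
O(0,0) = f(0+0) - f(0) - f(0) = 0
\;\;\Longrightarrow\;\; O \equiv 0
\;\;\Longrightarrow\;\; f(x+y) = f(x) + f(y).
$$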