Minimum distance of functions in a metric space

copacetic

Homework Statement


A metric on C[0,1] is defined by

d(f,g) = \left( \int_0^1 (f(x) - g(x))^2 \, dx \right)^{1/2}

Find t \in \mathbb{R} such that the distance between the functions f(x) = e^x - 1 and g_t(x) = tx is minimal.


Homework Equations


Given above


The Attempt at a Solution


The first thing I did was multiply out the integrand and then evaluate the integral:

\left( \int_0^1 \big( (e^x - 1) - tx \big)^2 \, dx \right)^{1/2}

= \left( \int_0^1 e^{2x} - 2e^x - 2txe^x + 1 + 2tx + t^2x^2 \, dx \right)^{1/2}

= \left( \frac{1}{2}e^2 - \frac{1}{2} - (2e - 2) - 2t(e) + 1 + 2t\left(\frac{1}{2}\right) + t^2\left(\frac{1}{3}\right) \right)^{1/2}

= \left( \frac{1}{2}e^2 - 2e + \frac{5}{2} - 2et + t + \frac{1}{3}t^2 \right)^{1/2}

But I'm not sure I did that right, because now I don't know where to go from here. Any tips?
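One way to double-check the expansion step is a quick sympy sketch (the script and variable names here are just illustrative):

import sympy as sp

x, t = sp.symbols('x t', real=True)

# integrand of the squared distance d(f, g_t)^2
integrand = (sp.exp(x) - 1 - t*x)**2

# expand and compare with the hand calculation above:
# e^{2x} - 2e^x - 2txe^x + 1 + 2tx + t^2 x^2
print(sp.expand(integrand))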
 
So the distance between f and g depends on t. You might even say it's a function of t... do you know how to minimize a function that has only a single variable?
 
I would need to find the first derivative, find its roots, and then check the sign of the second derivative at those roots. Can I do this without the square root around the entire function? With the square root, the derivative is enormous and I'm not sure how I'd go about setting it equal to zero and solving.
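In sympy, that recipe on a made-up single-variable function h(t) would look something like this (h is just a placeholder, not the distance from the problem):

import sympy as sp

t = sp.symbols('t', real=True)

# placeholder single-variable function (hypothetical example only)
h = sp.Rational(1, 3)*t**2 - 2*t + 5

critical_points = sp.solve(sp.diff(h, t), t)   # roots of h'(t)
for p in critical_points:
    # a positive second derivative at a root of h' signals a local minimum
    print(p, sp.diff(h, t, 2).subs(t, p) > 0)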
 
It's a pretty common technique to use the fact that \sqrt{a}<\sqrt{b} is true exactly when a<b (assuming everything here is positive), so to minimize a function \sqrt{f(x)} it's enough to minimize f(x) itself (the same thing holds for maximizing).

Even without using that, though, by the chain rule the only difference is that the square root of the function shows up again in the denominator, and you can clear that out when you set the derivative equal to zero.
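Spelling that out: writing F(t) for the quantity under the square root, the chain rule gives

\frac{d}{dt}\sqrt{F(t)} = \frac{F'(t)}{2\sqrt{F(t)}}

so as long as F(t) > 0, the derivative of \sqrt{F(t)} vanishes exactly where F'(t) does, and both are minimized at the same t.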
 
You should not need to integrate: you can differentiate directly, since by the "Fundamental Theorem of Calculus" the derivative of \int_a^x f(t)\,dt is just f(x) itself.
 
HallsofIvy said:
You should not need to integrate: you can differentiate directly, since by the "Fundamental Theorem of Calculus" the derivative of \int_a^x f(t)\,dt is just f(x) itself.

This isn't really a Fundamental Theorem of Calculus situation: the limits of integration are fixed at 0 and 1, and the variable we are minimizing over, t, only appears inside the integrand.
 
Office_Shredder said:
It's a pretty common technique to use the fact that \sqrt{a}<\sqrt{b} is true exactly when a<b (assuming everything here is positive), so to minimize a function \sqrt{f(x)} it's enough to minimize f(x) itself (the same thing holds for maximizing).

Even without using that, though, by the chain rule the only difference is that the square root of the function shows up again in the denominator, and you can clear that out when you set the derivative equal to zero.
Ok thanks, in the end I have

\frac{2}{3}t - 2e + 1 = 0

t = 3e - \frac{3}{2}
 
copacetic said:
Ok thanks, in the end I have

\frac{2}{3}t - 2e + 1 = 0

t = 3e - \frac{3}{2}

Double check that. It looks like you are integrating x*e^x to get e. That's not right.
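For reference, integration by parts gives

\int_0^1 x e^x \, dx = \Big[ x e^x - e^x \Big]_0^1 = (e - e) - (0 - 1) = 1

not e.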
 
Dick said:
Double check that. It looks like you are integrating x*e^x to get e. That's not right.
Whoops! You're right, I had x*e^x instead of x*e^x - e^x. Well, that got rid of the nasty e; now I'm left with t = 3/2. Thanks!
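As a final sanity check, here is a short sympy sketch that redoes the whole computation symbolically (the script and variable names are my own):

import sympy as sp

x, t = sp.symbols('x t', real=True)

# squared distance between f(x) = e^x - 1 and g_t(x) = t*x on [0, 1]
D2 = sp.integrate((sp.exp(x) - 1 - t*x)**2, (x, 0, 1))

# minimize over t by solving D2'(t) = 0
t_star = sp.solve(sp.diff(D2, t), t)[0]
print(t_star)                        # 3/2
print(sp.sqrt(D2.subs(t, t_star)))   # the minimal distance itself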
 