Question regarding Gateaux Derivative

  • Thread starter: naericson
  • Tags: Derivative
naericson

Homework Statement


I am trying to solve the following problem:

Let ##X## be the space of continuous functions on ##[0,1]## and let ##F:X\rightarrow\mathbb{R}## be defined by ##F(f)=\max\limits_{0\leq x\leq 1} f(x)## for any ##f\in X##. Show that the Gateaux derivative does not exist if ##f## achieves its maximum at two different points ##x_1,x_2## in ##[0,1]##.

Homework Equations


The Gateaux derivative of ##F## at ##f## in the direction ##h\in X## is given by

$$\lim\limits_{t\to0}\frac{1}{t}\left(F(f+th)-F(f)\right),$$

provided the above limit exists for every increment ##h##.
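For intuition, here is a minimal numerical sketch of this difference quotient, assuming a uniform grid approximation of the max over [0,1]; the sample f, h, and the grid resolution are arbitrary illustrative choices.

Code:
import numpy as np

# Uniform grid standing in for [0, 1]; the resolution is an arbitrary choice.
xs = np.linspace(0.0, 1.0, 10_001)

def F(g):
    """F(g) = max of g over [0, 1], approximated on the grid."""
    return np.max(g(xs))

def quotient(f, h, t):
    """The difference quotient (F(f + t*h) - F(f)) / t whose t -> 0
    limit, when it exists for every h, is the Gateaux derivative."""
    return (F(lambda x: f(x) + t * h(x)) - F(f)) / t

f = lambda x: x                 # unique maximizer at x = 1
h = lambda x: np.cos(3.0 * x)   # an arbitrary direction

for t in (0.1, 0.01, 0.001, -0.001, -0.01, -0.1):
    print(f"t = {t:+.3f}: quotient = {quotient(f, h, t):.6f}")

# With a unique maximizer the quotient settles to h(1) = cos(3) from
# both sides, consistent with the derivative existing in this case.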

The Attempt at a Solution


Using the limit definition of the Gateaux derivative, we see that

$$\lim\limits_{t\to0}\frac{1}{t}\left(F(f+th)-F(f)\right)
=\lim\limits_{t\to0}\frac{1}{t}\left(\max\limits_{x}(f+th)(x)-\max\limits_{x}f(x)\right)$$
$$=\lim\limits_{t\to0}\frac{1}{t}\left(\max\limits_{x}f(x)+\max\limits_{x}th(x)-\max\limits_{x}f(x)\right)
=\lim\limits_{t\to0}\frac{t\max\limits_{x}h(x)}{t}
=\max\limits_{x}h(x).$$

This seems to work regardless of whether or not the function has a unique maximum, so that is the part I don't understand. Any help would be appreciated.
 
naericson said:

$$\lim\limits_{t\to0}\frac{1}{t}\left(\max\limits_{x}(f+th)(x)-\max\limits_{x}f(x)\right)
=\lim\limits_{t\to0}\frac{1}{t}\left(\max\limits_{x}f(x)+\max\limits_{x}th(x)-\max\limits_{x}f(x)\right)$$

This seems to work regardless of whether or not the function has a unique maximum, so that is the part I don't understand. Any help would be appreciated.

While I'm still not clear on the whole problem, I'll tell you one thing that's wrong. ##\max_x\left(f(x)+th(x)\right)## is not generally equal to ##\max_x f(x)+\max_x\left(th(x)\right)##. Try it with a simple example like ##f(x)=x## and ##h(x)=-x##.
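To see this numerically, here is a quick check of that example (a sketch assuming a uniform grid approximation of the max over [0,1]), together with a second test function that has two maximizers, where the two one-sided limits of the difference quotient disagree.

Code:
import numpy as np

xs = np.linspace(0.0, 1.0, 10_001)      # grid standing in for [0, 1]

def F(g):
    """F(g) = max of g over [0, 1], approximated on the grid."""
    return np.max(g(xs))

# The suggested example: f(x) = x, h(x) = -x, with t = 0.5.
f = lambda x: x
h = lambda x: -x
t = 0.5
print(F(lambda x: f(x) + t * h(x)))     # max(f + th) = 0.5
print(F(f) + F(lambda x: t * h(x)))     # max(f) + max(th) = 1.0: the split fails

# A function with two maximizers (x = 0 and x = 1, both with value 1)
# and a direction that distinguishes them.
g = lambda x: 1.0 - np.sin(np.pi * x)
d = lambda x: x

for t in (0.01, 0.001, -0.001, -0.01):
    q = (F(lambda x: g(x) + t * d(x)) - F(g)) / t
    print(f"t = {t:+.3f}: quotient = {q:.4f}")

# t -> 0+ gives 1 = d(1), while t -> 0- gives 0 = d(0): the one-sided
# limits disagree, so the Gateaux derivative cannot exist at this g.

The direction d here is deliberately chosen to take different values at the two maximizers; that asymmetry is exactly what the two one-sided limits detect.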
 