Proving |f(x)-f(y)| ≤ K ||x-y|| for f:R^2→R

  • Context: MHB
  • Thread starter: onie mti

Discussion Overview

The discussion revolves around proving the inequality |f(x)-f(y)| ≤ K ||x-y|| for a differentiable function f: R² → R, under the assumption that the 2-norm of the gradient of f is bounded by a constant K. The scope includes mathematical reasoning and application of the mean value theorem and Cauchy-Schwarz inequality.

Discussion Character

  • Mathematical reasoning
  • Technical explanation
  • Homework-related

Main Points Raised

  • One participant proposes applying the mean value theorem to the function g(t) = f((1-t)x + ty) to prove the desired inequality.
  • Another participant suggests defining a function k(t) = (1-t)x + ty and using the chain rule to express g'(t) in terms of the gradient of f.
  • It is noted that the Cauchy-Schwarz inequality can be applied to show that |g'(t)| is bounded by the product of the norms of the gradient and the derivative of k(t).
  • Some participants report difficulty deriving |f(x)-f(y)| ≤ K ||x-y||₂ from the Cauchy-Schwarz bound alone.
  • A later reply reiterates the importance of the mean value theorem in connecting the values of f at x and y through g(t).

Areas of Agreement / Disagreement

Participants generally agree on the approach of using the mean value theorem and the chain rule, but there remains uncertainty and difficulty in fully proving the inequality, indicating that the discussion is unresolved.

Contextual Notes

Some participants have not fully resolved the application of the mean value theorem and the implications of the Cauchy-Schwarz inequality in this context, leaving certain mathematical steps and assumptions unaddressed.

onie mti
Suppose that f: R^2 → R is differentiable on R^2. Also assume that there exists a real number K ≥ 0 such that the 2-norm of the gradient of f satisfies ||∇f(x)||₂ ≤ K for all x in R^2. Prove that |f(x)-f(y)| ≤ K ||x-y||₂ for all x, y in R^2.

I tried applying the mean value theorem to the function g(t) = f((1-t)x + ty), t in [0,1], but I cannot move forward.

It is no. 2 in the uploaded files.
 

Re: converging maps

onie mti said:
Suppose that f: R^2 → R is differentiable on R^2. Also assume that there exists a real number K ≥ 0 such that the 2-norm of the gradient of f satisfies ||∇f(x)||₂ ≤ K for all x in R^2. Prove that |f(x)-f(y)| ≤ K ||x-y||₂ for all x, y in R^2.

I tried applying the mean value theorem to the function g(t) = f((1-t)x + ty), t in [0,1], but I cannot move forward.

It is no. 2 in the uploaded files.
Hi onie mti, and welcome to MHB! Suppose you define a function $k:\mathbb{R}\to \mathbb{R}^2$ by $k(t) = (1-t)\mathbf{x} + t\mathbf{y}$, with derivative $k'(t) = \mathbf{y} - \mathbf{x}$. Then $g$ is the composition $f\circ k$, and the chain rule says that $g'(t) = \nabla f\bigl(k(t)\bigr)\cdot k'(t)$. Now apply the Cauchy–Schwarz inequality to see that $|g'(t)| \leqslant \|\nabla f\bigl(k(t)\bigr)\|_2\| k'(t)\|_2$.
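The chain-rule and Cauchy–Schwarz steps in the hint above can be checked numerically. The sketch below uses a hypothetical example function (not from the thread), f(x₁, x₂) = sin(x₁) + cos(x₂), and compares the chain-rule formula for g'(t) against a finite-difference derivative, then verifies the Cauchy–Schwarz bound:

```python
import numpy as np

# Hypothetical example function (an assumption for illustration):
# f(x1, x2) = sin(x1) + cos(x2), with gradient (cos(x1), -sin(x2)).
def f(p):
    return np.sin(p[0]) + np.cos(p[1])

def grad_f(p):
    return np.array([np.cos(p[0]), -np.sin(p[1])])

# Two arbitrary points x, y in R^2.
x = np.array([0.3, -1.2])
y = np.array([2.0, 0.7])

def k(t):
    return (1 - t) * x + t * y   # k(t) = (1-t)x + ty, so k'(t) = y - x

def g(t):
    return f(k(t))               # g = f ∘ k

t = 0.4
h = 1e-6
# Central finite difference approximation of g'(t).
g_prime_numeric = (g(t + h) - g(t - h)) / (2 * h)
# Chain rule: g'(t) = ∇f(k(t)) · k'(t) = ∇f(k(t)) · (y - x).
g_prime_chain = grad_f(k(t)) @ (y - x)

# The two agree to numerical precision.
assert abs(g_prime_numeric - g_prime_chain) < 1e-5

# Cauchy–Schwarz: |g'(t)| <= ||∇f(k(t))||_2 * ||y - x||_2.
bound = np.linalg.norm(grad_f(k(t))) * np.linalg.norm(y - x)
assert abs(g_prime_chain) <= bound + 1e-12
```

Any differentiable f and any pair of points would do here; the specific function and points are only placeholders for the argument's structure.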
 
Re: converging maps

Opalg said:
Hi onie mti, and welcome to MHB! Suppose you define a function $k:\mathbb{R}\to \mathbb{R}^2$ by $k(t) = (1-t)\mathbf{x} + t\mathbf{y}$, with derivative $k'(t) = \mathbf{y} - \mathbf{x}$. Then $g$ is the composition $f\circ k$, and the chain rule says that $g'(t) = \nabla f\bigl(k(t)\bigr)\cdot k'(t)$. Now apply the Cauchy–Schwarz inequality to see that $|g'(t)| \leqslant \|\nabla f\bigl(k(t)\bigr)\|_2\| k'(t)\|_2$.

thank you :) let me get on it (Bow)
 
Re: converging maps

Opalg said:
Hi onie mti, and welcome to MHB! Suppose you define a function $k:\mathbb{R}\to \mathbb{R}^2$ by $k(t) = (1-t)\mathbf{x} + t\mathbf{y}$, with derivative $k'(t) = \mathbf{y} - \mathbf{x}$. Then $g$ is the composition $f\circ k$, and the chain rule says that $g'(t) = \nabla f\bigl(k(t)\bigr)\cdot k'(t)$. Now apply the Cauchy–Schwarz inequality to see that $|g'(t)| \leqslant \|\nabla f\bigl(k(t)\bigr)\|_2\| k'(t)\|_2$.

I managed to use the Cauchy-Schwarz inequality, but I cannot show that |f(x)-f(y)| ≤ K ||x-y||₂.
 
Re: converging maps

onie mti said:
Opalg said:
Hi onie mti, and welcome to MHB! Suppose you define a function $k:\mathbb{R}\to \mathbb{R}^2$ by $k(t) = (1-t)\mathbf{x} + t\mathbf{y}$, with derivative $k'(t) = \mathbf{y} - \mathbf{x}$. Then $g$ is the composition $f\circ k$, and the chain rule says that $g'(t) = \nabla f\bigl(k(t)\bigr)\cdot k'(t)$. Now apply the Cauchy–Schwarz inequality to see that $|g'(t)| \leqslant \|\nabla f\bigl(k(t)\bigr)\|_2\| k'(t)\|_2$.

I managed to use the Cauchy-Schwarz inequality, but I cannot show that |f(x)-f(y)| ≤ K ||x-y||₂.
Have you used the hint about the mean value theorem? It says that $g(1) - g(0) = g'(t)$ for some $t\in(0,1)$. Then $$|f(\mathbf{y}) - f(\mathbf{x})| = |g(1) - g(0)| = |g'(t)| \leqslant \|\nabla f\bigl(k(t)\bigr)\|_2\| k'(t)\|_2 \leqslant K\| \mathbf{y} - \mathbf{x}\|_2.$$
 
