Proving |f(x)-f(y)| ≤ K ||x-y|| for f:R^2→R

  • Context: MHB 
  • Thread starter: onie mti
SUMMARY

The discussion centers on proving the inequality |f(x) - f(y)| ≤ K ||x - y|| for a differentiable function f: R² → R, where K ≥ 0 is a bound on the 2-norm of the gradient of f. Participants use the Mean Value Theorem and the Cauchy-Schwarz inequality to establish the result. The key steps are to restrict f to the segment from x to y by setting g(t) = f((1 - t)x + ty), apply the chain rule to express g'(t) in terms of the gradient of f, bound |g'(t)| by K ||x - y|| via Cauchy-Schwarz, and finish with the Mean Value Theorem, which yields the desired inequality.

PREREQUISITES
  • Differentiation in multivariable calculus
  • Understanding of the Mean Value Theorem
  • Application of the Cauchy-Schwarz inequality
  • Knowledge of gradient and 2-norm concepts
NEXT STEPS
  • Study the Mean Value Theorem in the context of multivariable functions
  • Explore the properties of gradients and their implications in optimization
  • Learn about the Cauchy-Schwarz inequality and its applications in analysis
  • Investigate the implications of Lipschitz continuity in mathematical functions
USEFUL FOR

Mathematicians, students of calculus, and anyone interested in advanced topics in analysis and optimization will benefit from this discussion.

onie mti
Suppose that $f:\mathbb{R}^2 \to \mathbb{R}$ is differentiable on $\mathbb{R}^2$. Also assume that there exists a real number $K \geqslant 0$ such that $\|\nabla f(\mathbf{x})\|_2 \leqslant K$ for all $\mathbf{x} \in \mathbb{R}^2$. Prove that $|f(\mathbf{x}) - f(\mathbf{y})| \leqslant K\|\mathbf{x} - \mathbf{y}\|_2$ for all $\mathbf{x}, \mathbf{y} \in \mathbb{R}^2$.

I tried applying the mean value theorem to the function $g(t) = f\bigl((1-t)\mathbf{x} + t\mathbf{y}\bigr)$, $t \in [0,1]$, but I cannot move forward.

It is no. 2 in the uploaded files.
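For reference, the endpoints of this $g$ recover the two function values (a fact used implicitly later in the thread): $$g(0) = f\bigl((1-0)\mathbf{x} + 0\cdot\mathbf{y}\bigr) = f(\mathbf{x}), \qquad g(1) = f(\mathbf{y}),$$ so any bound on $|g(1) - g(0)|$ is exactly a bound on $|f(\mathbf{y}) - f(\mathbf{x})|$.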
 

Re: converging maps

Hi onie mti, and welcome to MHB! Suppose you define a function $k:\mathbb{R}\to \mathbb{R}^2$ by $k(t) = (1-t)\mathbf{x} + t\mathbf{y}$, with derivative $k'(t) = \mathbf{y} - \mathbf{x}$. Then $g$ is the composition $f\circ k$, and the chain rule says that $g'(t) = \nabla f\bigl(k(t)\bigr)\cdot k'(t)$. Now apply the Cauchy–Schwarz inequality to see that $|g'(t)| \leqslant \|\nabla f\bigl(k(t)\bigr)\|_2\| k'(t)\|_2$.
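Written out for concreteness, the hint gives the following short computation (a sketch using only the chain rule and the assumed gradient bound $\|\nabla f\|_2 \leqslant K$): since $k'(t) = \mathbf{y} - \mathbf{x}$, $$g'(t) = \nabla f\bigl(k(t)\bigr)\cdot(\mathbf{y} - \mathbf{x}), \qquad\text{so}\qquad |g'(t)| \leqslant \|\nabla f\bigl(k(t)\bigr)\|_2\,\|\mathbf{y} - \mathbf{x}\|_2 \leqslant K\,\|\mathbf{y} - \mathbf{x}\|_2 \quad\text{for every } t\in[0,1].$$ Combining this bound with the mean value theorem for $g$ on $[0,1]$ is the remaining step.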
 
Re: converging maps

thank you :) let me get on it
 
Re: converging maps


I managed to use the Cauchy–Schwarz inequality, but I cannot show that $|f(\mathbf{x}) - f(\mathbf{y})| \leqslant K\,\|\mathbf{x} - \mathbf{y}\|_2$.
 
Re: converging maps

Have you used the hint about the mean value theorem? It says that $g(1) - g(0) = g'(t)$ for some $t \in (0,1)$. Then $$|f(\mathbf{y}) - f(\mathbf{x})| = |g(1) - g(0)| = |g'(t)| \leqslant \|\nabla f\bigl(k(t)\bigr)\|_2\,\| k'(t)\|_2 \leqslant K\| \mathbf{y} - \mathbf{x}\|_2.$$
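Not part of the original thread, but a quick numerical sanity check can make the bound concrete. The example function $f(x_1,x_2) = \sin x_1 + \cos x_2$ and the constant $K = \sqrt{2}$ below are assumptions chosen purely for illustration (the gradient $(\cos x_1,\,-\sin x_2)$ has 2-norm at most $\sqrt{2}$); the script only tests the inequality on random pairs of points and is a sketch, not a proof.

Code:
# Numerical sanity check of |f(x) - f(y)| <= K * ||x - y||_2 for an assumed
# example function with a known gradient bound (illustration only, not a proof).
import numpy as np

def f(p):
    # f(x1, x2) = sin(x1) + cos(x2); gradient (cos(x1), -sin(x2)) has 2-norm <= sqrt(2)
    return np.sin(p[0]) + np.cos(p[1])

K = np.sqrt(2.0)  # bound on ||grad f||_2 for this particular f

rng = np.random.default_rng(0)
for _ in range(10_000):
    x = rng.uniform(-10.0, 10.0, size=2)
    y = rng.uniform(-10.0, 10.0, size=2)
    # small tolerance guards against floating-point rounding
    assert abs(f(x) - f(y)) <= K * np.linalg.norm(x - y) + 1e-12
print("Lipschitz bound |f(x)-f(y)| <= K*||x-y||_2 held on all sampled pairs.")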
 
