From a proof on directional derivatives

SUMMARY

This discussion focuses on proving the limit of a directional derivative for a function \( f(x,y) \) with continuous partial derivatives. The limit is expressed as \( \lim_{h\to 0}\frac{f(x+hv_x,y+hv_y)-f(x,y+hv_y)}{h}=v_x\frac{\partial f}{\partial x}(x,y) \). Participants clarify that the total derivative \( Df(x,y) \) can be represented as \( \Bigl(\frac{\partial f(x,y)}{\partial x}, \frac{\partial f(x,y)}{\partial y}\Bigr) \), which aids in understanding the relationship between directional derivatives and the gradient. The discussion emphasizes the importance of Fréchet and Gateaux derivatives in this context.

PREREQUISITES
  • Understanding of directional derivatives in multivariable calculus
  • Familiarity with total derivatives and their definitions
  • Knowledge of Fréchet and Gateaux derivatives
  • Basic concepts of linear algebra, particularly matrix-vector multiplication
NEXT STEPS
  • Study the properties of Fréchet and Gateaux derivatives in detail
  • Learn about the implications of total differentiability in multivariable functions
  • Explore the relationship between directional derivatives and gradients in depth
  • Investigate the application of linear maps in the context of multivariable calculus
USEFUL FOR

Mathematicians, students of calculus, and anyone interested in advanced topics in multivariable analysis, particularly those focusing on derivatives and their applications in higher dimensions.

Delta2
Given that the partial derivatives of a function ##f(x,y)## exist and are continuous, how can we prove the following limit?
$$\lim_{h\to 0}\frac{f(x+hv_x,y+hv_y)-f(x,y+hv_y)}{h}=v_x\frac{\partial f}{\partial x}(x,y)$$

I can understand why the factor ##v_x## (which is viewed as a constant) appears out front. My difficulty is that the second argument of the function is ##y+hv_y##; if it were just ##y##, everything would be fine.
 
It all seems right there, since ##y+hv_y \rightarrow y## as ##h \to 0##.
 
anuttarasammyak said:
It all seems right there, since ##y+hv_y \rightarrow y## as ##h \to 0##.
Yes, I can see that, and it gives some sort of intuitive argument, but I am looking for a rigorous proof.
 
Delta2 said:

Given that the partial derivatives of a function ##f(x,y)## exist and are continuous, how can we prove the following limit?
$$\lim_{h\to 0}\frac{f(x+hv_x,y+hv_y)-f(x,y+hv_y)}{h}=v_x\frac{\partial f}{\partial x}(x,y)$$

I can understand why the factor ##v_x## (which is viewed as a constant) appears out front. My difficulty is that the second argument of the function is ##y+hv_y##; if it were just ##y##, everything would be fine.
Are you allowed to use that, from your assumption of continuous partial derivatives, it follows that the total derivative exists and is given by ##Df(x,y) = \Bigl(\frac{\partial f(x,y)}{\partial x} \, \frac{\partial f(x,y)}{\partial y}\Bigr)##?

If yes, can you see how that helps?

EDIT: To be more explicit, if yes, then
$$
\begin{aligned}
\frac{f(x+hv_x,y+hv_y)-f(x,y+hv_y)}{h} &=\frac{f(x+hv_x,y+hv_y)-f(x,y)}{h} - \frac{f(x,y+hv_y) - f(x,y) }{h}\\
&\to Df(x,y)(v_x,v_y) - \frac{\partial f(x,y)}{\partial y}v_y = \frac{\partial f(x,y)}{\partial x}v_x,
\end{aligned}
$$
as ##h \to 0##.
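As a quick numerical sanity check of this limit (using a hypothetical function ##f(x,y) = x^2 y + \sin y## of my own choosing, not from the thread):

```python
import math

# Hypothetical example function with continuous partials (my choice, not from the thread).
def f(x, y):
    return x**2 * y + math.sin(y)

def fx(x, y):
    # exact partial derivative: d/dx (x^2 y + sin y) = 2xy
    return 2 * x * y

x, y = 1.0, 0.5
vx, vy = 2.0, -1.0

# The difference quotient from the limit above, for shrinking h.
for h in (1e-2, 1e-4, 1e-6):
    quotient = (f(x + h * vx, y + h * vy) - f(x, y + h * vy)) / h
    print(h, quotient, vx * fx(x, y))  # quotient approaches vx * df/dx(x, y)
```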
 
S.G. Janssens said:
Are you allowed to use that, from your assumption of continuous partial derivatives, it follows that the total derivative exists and is given by ##Df(x,y) = \Bigl(\frac{\partial f(x,y)}{\partial x} \, \frac{\partial f(x,y)}{\partial y}\Bigr)##?

If yes, can you see how that helps?
Sorry, I can't see how the total derivative helps here. What I am trying to prove is that the directional derivative (with respect to a vector ##\vec{v}=(v_x,v_y)##) equals the dot product of the gradient with the vector ##\vec{v}##.
 
Delta2 said:
Sorry, I can't see how the total derivative helps here. What I am trying to prove is that the directional derivative (with respect to a vector ##\vec{v}=(v_x,v_y)##) equals the dot product of the gradient with the vector ##\vec{v}##.
The total derivative ##Df(x,y)## of ##f : \mathbb{R}^2 \to \mathbb{R}## at the point ##(x,y)## is a linear map from ##\mathbb{R}^2## to ##\mathbb{R}##. I regard the gradient as the coordinate representation of ##Df(x,y)## with respect to the standard basis of ##\mathbb{R}^2##.

(By traditional abuse of notation (of which I am also regularly guilty, for instance in post #4), the distinction between ##Df(x,y)## and its coordinate representation is ignored, but this is not always good practice.)

If ##v## is a direction vector in ##\mathbb{R}^2## with standard coordinate representation ##(v_x,v_y)##, then application of the linear map ##Df(x,y)## to ##v## is identical to matrix-vector multiplication (in this case: the dot product) of the gradient with ##(v_x,v_y)##. (That is a fact from linear algebra, more so than from calculus.)
 
If, on the other hand, you are asking why application of the coordinate-free total derivative ##Df(x,y)## to the coordinate-free vector ##v## gives you the directional derivative of ##f## at ##(x,y)## in the direction of ##v##, then note that
$$
\|f((x,y) + hv) - f(x,y) - Df(x,y)hv\| = o(\|hv\|)
$$
by total differentiability of ##f## at ##(x,y)##. So,
$$
\lim_{h \to 0}\frac{\|f((x,y) + hv) - f(x,y) - hDf(x,y)v\|}{h} = 0
$$
as well. (This is just "Fréchet differentiability implies Gateaux differentiability".)
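To see this "Fréchet implies Gateaux" step numerically (again with a hypothetical ##f(x,y) = x^2 y + \sin y## of my own choosing, not from the thread), the difference quotient along ##v## should tend to ##Df(x,y)v##, computed here as the dot product of the gradient with ##v##:

```python
import math

# Hypothetical example (my choice, not from the thread).
def f(x, y):
    return x**2 * y + math.sin(y)

def grad_f(x, y):
    return (2 * x * y, x**2 + math.cos(y))  # (df/dx, df/dy), exact

x, y = 1.0, 0.5
vx, vy = 2.0, -1.0
gx, gy = grad_f(x, y)
expected = gx * vx + gy * vy  # Df(x,y) applied to v, as a dot product

for h in (1e-2, 1e-4, 1e-6):
    quotient = (f(x + h * vx, y + h * vy) - f(x, y)) / h
    print(h, quotient, expected)  # quotient -> expected as h -> 0
```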
 
@S.G. Janssens I admit I am a bit lost. I haven't heard about Fréchet and Gateaux derivatives before, but anyway let me ask this: what is the primary definition of the total derivative? And when you say:
S.G. Janssens said:
If ##v## is a direction vector in ##\mathbb{R}^2## with standard coordinate representation ##(v_x,v_y)##, then application of the linear map ##Df(x,y)## to ##v## is identical to matrix-vector multiplication (in this case: the dot product) of the gradient with ##(v_x,v_y)##. (That is a fact from linear algebra, more so than from calculus.)
How do we prove the above?
 
Delta2 said:
I haven't heard about Fréchet and Gateaux derivatives before
Fréchet = total, Gateaux = directional.
Delta2 said:
What is the primary definition of the total derivative?
A function ##f : \mathbb{R}^n \to \mathbb{R}^m## is differentiable at a point ##\mathbf{x} \in \mathbb{R}^n## if there exists a linear map ##Df(\mathbf{x}) : \mathbb{R}^n\to \mathbb{R}^m## such that
$$
\|f(\mathbf{x} + \mathbf{z}) - f(\mathbf{x}) - Df(\mathbf{x})\mathbf{z}\| = o(\|\mathbf{z}\|)
$$
for ##\|\mathbf{z}\| \to 0##.
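A small numerical illustration of this definition (with a hypothetical ##f(x,y) = x^2 y + \sin y## of my own choosing, not from the thread): the remainder, divided by ##\|\mathbf{z}\|##, should tend to ##0## as ##\|\mathbf{z}\| \to 0##.

```python
import math

# Hypothetical example (my choice, not from the thread).
def f(x, y):
    return x**2 * y + math.sin(y)

def Df(x, y):
    # gradient, i.e. the matrix representing the linear map Df(x,y)
    return (2 * x * y, x**2 + math.cos(y))

x, y = 1.0, 0.5
gx, gy = Df(x, y)
for t in (1e-1, 1e-3, 1e-5):
    zx, zy = 0.3 * t, -0.7 * t            # z shrinking to the zero vector
    norm_z = math.hypot(zx, zy)
    remainder = abs(f(x + zx, y + zy) - f(x, y) - (gx * zx + gy * zy))
    print(norm_z, remainder / norm_z)     # ratio tends to 0, i.e. o(||z||)
```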
Delta2 said:
How do we prove the above?
This is linear algebra: Applying a linear map to a vector is equivalent to applying the representation of that map (in some chosen basis) to the representation of that vector (in the same basis).
 
Also, it was not my intention to make this more difficult than necessary, and I am sorry if that happened anyway, but I cannot resist the coordinate-free definition, for a variety of reasons, such as: It keeps a clean separation between linear maps and their representations, and it generalizes directly from operators on ##\mathbb{R}^n## to operators on infinite-dimensional normed linear spaces. Here "directly" means that the proofs of the standard theorems carry over almost verbatim. (As long as these proofs do not rely on the local compactness of ##\mathbb{R}^n##.)
 
