Undergrad Derivative of a Vector-Valued Function of a Real Variable ...

Summary
The discussion revolves around understanding the proof of Proposition 9.1.2 from Junghenn's "A Course in Real Analysis," specifically regarding the differentiation of vector-valued functions. Participants seek clarification on the inequalities presented in the proposition and how they relate to the definition of differentiability for functions from ##\mathbb{R}## to ##\mathbb{R}##. A key point is the use of placeholder variables in Junghenn's notation, which allows for a more general argument before establishing the existence of derivatives. The conversation highlights the importance of understanding the relationship between differentiability and the inequalities that arise in the context of vector-valued functions. Overall, the thread emphasizes the nuances of mathematical notation and the logical progression in proving differentiation concepts.
Math Amateur
I am reading Hugo D. Junghenn's book: "A Course in Real Analysis" ...

I am currently focused on Chapter 9: "Differentiation on ##\mathbb{R}^n##"

I need some help with the proof of Proposition 9.1.2 ...

Proposition 9.1.2 and the preceding relevant Definition 9.1.1 read as follows:
Junghenn - 1 -  Proposition 9.1.2  ... PART 1 ... .png

Junghenn - 2 -  Proposition 9.1.2  ... PART 2 ... .png

In the above text from Junghenn we read the following:

" ... ... The assertions follow directly from the inequalities

$$\left\vert \frac{f_j(a+h) - f_j(a)}{h} - x_j \right\vert^2 \le \left\| \frac{f(a+h) - f(a)}{h} - (x_1, \ldots, x_m) \right\|^2 \le \sum_{j=1}^m \left\vert \frac{f_j(a+h) - f_j(a)}{h} - x_j \right\vert^2$$

... ... "
Can someone please show why the above inequalities hold true, and further, how they lead to the proof of Proposition 9.1.2?

Help will be much appreciated ...

Peter
 

Let ##\mathcal{E}## denote the expression inside the norm, resp. the absolute value (out of laziness, not because it has a particular meaning). If all the ##f_j## are differentiable, then ##|\mathcal{E}(f_j)|^2 < \frac{1}{n}\varepsilon## in a small neighborhood of ##a##, which gives you the estimate for ##\|\mathcal{E}(f)\|^2##, and vice versa.

Now the first question is, why is ##|x_j|^2 \leq ||x||^2 \leq \sum_j |x_j|^2## for any vector ##x=(x_1,\ldots,x_n) \in \mathbb{R}^n##.
Can you prove this?
 
fresh_42 said:
Let ##\mathcal{E}## denote the expression inside the norm, resp. the absolute value (out of laziness, not because it has a particular meaning). If all the ##f_j## are differentiable, then ##|\mathcal{E}(f_j)|^2 < \frac{1}{n}\varepsilon## in a small neighborhood of ##a##, which gives you the estimate for ##\|\mathcal{E}(f)\|^2##, and vice versa.

Now the first question is, why is ##|x_j|^2 \leq ||x||^2 \leq \sum_j |x_j|^2## for any vector ##x=(x_1,\ldots,x_n) \in \mathbb{R}^n##.
Can you prove this?
Thanks fresh_42 ...

To prove ##|x_j|^2 \leq ||x||^2 \leq \sum_j |x_j|^2## for any vector ##x=(x_1,\ldots,x_n) \in \mathbb{R}^n##.

... well ... we have ...

##\| x \|^2 = x_1^2 + x_2^2 + \cdots + x_n^2##

... so clearly ...

## \mid x_j \mid^2 \le \| x \|^2 ## ... ... ... ... ... (1)

Now ...

## \sum_j \mid x_j \mid^2 = \mid x_1 \mid^2 + \mid x_2 \mid^2 + \cdots + \mid x_n \mid^2 = x_1^2 + x_2^2 + \cdots + x_n^2 ##

hence

## \sum_j \mid x_j \mid^2 = \| x \|^2 ## ... ... ... ... ... (2)

and hence (2) implies

##\| x \|^2 \le \sum_j \mid x_j \mid^2 ## ... ... ... ... ... (3)

So ... (1) and (3) imply that ##\mid x_j \mid^2 \le \| x \|^2 \le \sum_j \mid x_j \mid^2## ...
Is that correct?

Peter
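As a quick aside, the sandwich inequality can also be checked numerically. Below is a minimal sketch in Python (NumPy and the random test vectors are my own illustrative choices):

```python
import numpy as np

# Check |x_j|^2 <= ||x||^2 <= sum_j |x_j|^2 for random vectors x in R^n.
# For the Euclidean norm the second relation is in fact an equality.
rng = np.random.default_rng(0)

for _ in range(1000):
    n = int(rng.integers(1, 10))
    x = rng.normal(size=n)
    comp_sq = np.abs(x) ** 2              # |x_j|^2 for each j
    norm_sq = np.linalg.norm(x) ** 2      # ||x||^2
    sum_sq = comp_sq.sum()                # sum_j |x_j|^2

    assert np.all(comp_sq <= norm_sq + 1e-12)   # (1)
    assert norm_sq <= sum_sq + 1e-12            # (3), with equality up to rounding

print("sandwich inequality verified on 1000 random vectors")
```

The second assertion holds with equality (up to rounding), which is exactly statement (2) above.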
 
Yes, that's correct, and the other claim is even easier, as we only have to change the order of reasoning: in one direction we use the first inequality, in the other the second, depending on what is given. We only have to use the fact that the dimension ##n## is a constant, so it does not really affect the ##\varepsilon##.
 
fresh_42 said:
Yes, that's correct, and the other claim is even easier, as we only have to change the order of reasoning: in one direction we use the first inequality, in the other the second, depending on what is given. We only have to use the fact that the dimension ##n## is a constant, so it does not really affect the ##\varepsilon##.
Thanks for all your help on this matter fresh_42 ...

But I am still struggling to relate what you are saying to the differentiation of functions from ##\mathbb{R}## to ##\mathbb{R}## ... that is, when you write:

" ... ... If you have all the ##f_j## differentiable, then ##|\mathcal{E}(f_j)|^2 < \frac{1}{n}\varepsilon## in a small neighborhood of ##a## which gives you the estimation for ##||\mathcal{E}(f)||^2## and vice versa. ... ..."

... how does this arise out of the definition of differentiation of functions from ##\mathbb{R}## to ##\mathbb{R}## ...? This seems to me to be important since the only definition/discussion of differentiation in Junghenn before the above case of the derivative of a vector-valued function of a real variable is the case of functions from ##\mathbb{R}## to ##\mathbb{R}## ... ...

Junghenn's introduction to the differentiation of functions from ##\mathbb{R}## to ##\mathbb{R}## reads as follows:
Junghenn - 1 -  Differention on R   ... PART 1 ... .png

Junghenn - 2 -  Differention on R   ... PART 2 ... .png

Junghenn - 3 -  Differention on R   ... PART 3 ... .png

Junghenn - 4 -  Differention on R   ... PART 4 ... .png


Peter
 

Differentiability of ##g(x)## at a point ##x=a## according to definition 9.1.1 means
$$
\lim_{h \to 0} \dfrac{g(a+h)-g(a)}{h} = g\,'(a)
$$
which means that for every ##\varepsilon > 0## we can find a ##\delta(\varepsilon) > 0## such that, as soon as ##|h| < \delta(\varepsilon)##, we have ##\left\| \dfrac{g(a+h)-g(a)}{h} - g\,'(a) \right\| < \varepsilon##. Now we can set ##g = f_j## or ##g = f## alike.
If the ##f_j## are differentiable, we find for every ##\varepsilon > 0## a ##\delta_j(\varepsilon) > 0## with ##\left| \dfrac{f_j(a+h)-f_j(a)}{h} - f_j\,'(a) \right| < \varepsilon## whenever ##|h| < \delta_j(\varepsilon)##, and then
$$
\left\| \dfrac{f(a+h)-f(a)}{h} - f\,'(a) \right\|^2 \leq \sum_j \left| \dfrac{f_j(a+h)-f_j(a)}{h} - f_j\,'(a) \right|^2 < \sum_j \varepsilon^2 = n \varepsilon^2 =: \varepsilon'
$$
for all ##|h| < \delta := \min_j \delta_j(\varepsilon)##, where ##\varepsilon = \sqrt{\varepsilon'/n}##. So for every ##\varepsilon' > 0## there is a ##\delta = \delta(\varepsilon')## with the required property. The other direction, starting from the differentiability of ##f##, is analogous.
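To see these estimates in action, here is a small numerical illustration in Python (the particular function ##f(t) = (t^2, \sin t)## and the point ##a = 1## are my own choices, not from Junghenn):

```python
import numpy as np

# Illustrate the estimate for a concrete f: R -> R^2.
# f(t) = (t^2, sin t) at a = 1, so the componentwise derivatives are (2, cos 1).
def f(t):
    return np.array([t ** 2, np.sin(t)])

a = 1.0
x = np.array([2.0, np.cos(1.0)])          # (f_1'(a), f_2'(a))

for h in [1e-1, 1e-2, 1e-3, 1e-4]:
    err = (f(a + h) - f(a)) / h - x       # E(f): vector difference-quotient error
    comp_sq = np.abs(err) ** 2            # |E(f_j)|^2, one entry per component
    norm_sq = np.linalg.norm(err) ** 2    # ||E(f)||^2
    # Sandwich: max_j |E(f_j)|^2 <= ||E(f)||^2 <= sum_j |E(f_j)|^2
    print(f"h={h:.0e}  max|E(f_j)|^2={comp_sq.max():.3e}  "
          f"||E(f)||^2={norm_sq:.3e}  sum={comp_sq.sum():.3e}")
```

All three quantities shrink together as ##h \to 0##, which is the point of the equivalence in Proposition 9.1.2.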
 
fresh_42 said:
Differentiability of ##g(x)## at a point ##x=a## according to definition 9.1.1 means
$$
\lim_{h \to 0} \dfrac{g(a+h)-g(a)}{h} = g\,'(a)
$$
which means that for every ##\varepsilon > 0## we can find a ##\delta(\varepsilon) > 0## such that, as soon as ##|h| < \delta(\varepsilon)##, we have ##\left\| \dfrac{g(a+h)-g(a)}{h} - g\,'(a) \right\| < \varepsilon##. Now we can set ##g = f_j## or ##g = f## alike.
If the ##f_j## are differentiable, we find for every ##\varepsilon > 0## a ##\delta_j(\varepsilon) > 0## with ##\left| \dfrac{f_j(a+h)-f_j(a)}{h} - f_j\,'(a) \right| < \varepsilon## whenever ##|h| < \delta_j(\varepsilon)##, and then
$$
\left\| \dfrac{f(a+h)-f(a)}{h} - f\,'(a) \right\|^2 \leq \sum_j \left| \dfrac{f_j(a+h)-f_j(a)}{h} - f_j\,'(a) \right|^2 < \sum_j \varepsilon^2 = n \varepsilon^2 =: \varepsilon'
$$
for all ##|h| < \delta := \min_j \delta_j(\varepsilon)##, where ##\varepsilon = \sqrt{\varepsilon'/n}##. So for every ##\varepsilon' > 0## there is a ##\delta = \delta(\varepsilon')## with the required property. The other direction, starting from the differentiability of ##f##, is analogous.

Hi fresh_42,

Just now reflecting on your above reply ...

... BUT ... just a minor clarification ...

In the above post, you are dealing with expressions like ## \left| \dfrac{f_j(a+h)-f_j(a)}{h} - f_j\,'(a) \right| ## while Junghenn's inequalities deal with expressions like ## \left| \dfrac{f_j(a+h)-f_j(a)}{h} - x_j \right| ## ... ...

Can you explain this apparent difference...

I find the difference quite perplexing ... how do they mean the same ...?

Peter
 
Math Amateur said:
I find the difference quite perplexing ... how do they mean the same ...?
I have chosen the derivatives to concentrate on the argument. Junghenn's notation is a bit better. The point is that, at the start, we do not know whether all the derivatives exist, so he has chosen ##x_j, x## as placeholders. The arguments formally go:
##f_j## differentiable ##\Rightarrow |\ldots - f_j'(a)| < \varepsilon_j \Rightarrow \|\ldots - (f_1'(a),\ldots ,f_n'(a)) \| < \varepsilon ## with the corresponding choices for the ##\varepsilon_j, \varepsilon## and the deltas, and so on. Finally, by the uniqueness of the derivative, we get ##f\,'(a) = (f_1'(a),\ldots ,f_n'(a))##. Conversely, we have ##\|\ldots - f\,'(a)\| < \varepsilon \Rightarrow |\ldots - (f\,'(a))_j| < \varepsilon_j## again with the corresponding choices, and uniqueness again gives us ##f_j'(a) = (f\,'(a))_j \,.## By writing ##x_j , x## instead, he saves all these details and can write the two inequalities in one line. Otherwise, you might have objected: "But we don't know the existence of the derivative yet! We want to prove it!" Using dummy variables for the derivatives is just shorthand, and my version with the derivatives was, strictly speaking, not rigorous, as I assumed their existence from the start in order to show how the inequalities work. In detail it is a combination of the inequalities and the uniqueness argument.
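Putting the two directions together, the conclusion of this discussion (and, presumably, the content of Proposition 9.1.2) can be summarized as
$$
f \text{ is differentiable at } a \iff \text{each } f_j \text{ is differentiable at } a,
$$
in which case ##f\,'(a) = \big(f_1'(a), \ldots, f_n'(a)\big)##.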
 
fresh_42 said:
I have chosen the derivatives to concentrate on the argument. Junghenn's notation is a bit better. The point is that, at the start, we do not know whether all the derivatives exist, so he has chosen ##x_j, x## as placeholders. The arguments formally go:
##f_j## differentiable ##\Rightarrow |\ldots - f_j'(a)| < \varepsilon_j \Rightarrow \|\ldots - (f_1'(a),\ldots ,f_n'(a)) \| < \varepsilon ## with the corresponding choices for the ##\varepsilon_j, \varepsilon## and the deltas, and so on. Finally, by the uniqueness of the derivative, we get ##f\,'(a) = (f_1'(a),\ldots ,f_n'(a))##. Conversely, we have ##\|\ldots - f\,'(a)\| < \varepsilon \Rightarrow |\ldots - (f\,'(a))_j| < \varepsilon_j## again with the corresponding choices, and uniqueness again gives us ##f_j'(a) = (f\,'(a))_j \,.## By writing ##x_j , x## instead, he saves all these details and can write the two inequalities in one line. Otherwise, you might have objected: "But we don't know the existence of the derivative yet! We want to prove it!" Using dummy variables for the derivatives is just shorthand, and my version with the derivatives was, strictly speaking, not rigorous, as I assumed their existence from the start in order to show how the inequalities work. In detail it is a combination of the inequalities and the uniqueness argument.
Thanks fresh_42 ...

... just reflecting on what you have written ...

Thanks for all your help ... it is much appreciated...

Peter
 
