Differentiating a Vector-Valued Function: Junghenn Prop 9.1.2 - Peter seeks help

  • #1
Math Amateur
Gold Member
MHB
Derivative of a Vector-Valued Function of a Real Variable - Junghenn Proposition 9.1.2 ...

I am reading Hugo D. Junghenn's book: "A Course in Real Analysis" ...

I am currently focused on Chapter 9: "Differentiation on \(\displaystyle \mathbb{R}^n \)"

I need some help with the proof of Proposition 9.1.2 ...

Proposition 9.1.2 and the preceding relevant Definition 9.1.1 read as follows:
https://www.physicsforums.com/attachments/7861
View attachment 7862

In the above text from Junghenn we read the following:

" ... ... The assertions follow directly from the inequalities

\(\displaystyle \left\vert \frac{f_j ( a + h ) - f_j (a)}{ h } - x_j \right\vert^2 \le \left\| \frac{ f( a + h ) - f(a) }{ h } - ( x_1, \ldots , x_m ) \right\|^2 \le \sum_{ i = 1 }^m \left\vert \frac{f_i ( a + h ) - f_i (a)}{ h } - x_i \right\vert^2 \)

... ... "
Can someone please show why the above inequalities hold true ... and further how they lead to the proof of Proposition 9.1.2 ...

Help will be much appreciated ...

Peter
 
  • #2
Re: Derivative of a Vector-Valued Function of a Real Variable - Junghenn Proposition 9.1.2 ...

Peter said:
I am reading Hugo D. Junghenn's book: "A Course in Real Analysis" ... I need some help with the proof of Proposition 9.1.2 ...

Can someone please show why the above inequalities hold true ... and further how they lead to the proof of Proposition 9.1.2 ...

Peter

After some reflection I think that the following is a proof that the above inequalities hold true ...

Put \(\displaystyle y_j = \frac{f_j ( a + h ) - f_j (a)}{ h } - x_j \) and \(\displaystyle y = \frac{ f( a + h ) - f(a) }{ h } - ( x_1, \ldots , x_m ) \) ...

Then we have to show \(\displaystyle |y_j|^2 \leq \| y \|^2 \leq \sum_j |y_j|^2 \) for any vector \(\displaystyle y = ( y_1, \ldots , y_m ) \in \mathbb{R}^m \) ...

... well ... we have ...

\(\displaystyle \| y \|^2 = y_1^2 + y_2^2 + \ldots + y_m^2 \)

... so clearly ...

\(\displaystyle \mid y_j \mid^2 \le \| y \|^2 \) ... ... (1)

Now ...

\(\displaystyle \sum_j \mid y_j \mid^2 = \mid y_1 \mid^2 + \mid y_2 \mid^2 + \ldots + \mid y_m \mid^2 = y_1^2 + y_2^2 + \ldots + y_m^2 \)

hence \(\displaystyle \sum_j \mid y_j \mid^2 = \| y \|^2 \) ... ... (2)

and hence (2) implies \(\displaystyle \| y \|^2 \le \sum_j \mid y_j \mid^2 \) ... ... (3)

So ... (1) and (3) imply that \(\displaystyle \mid y_j \mid^2 \le \| y \|^2 \le \sum_j \mid y_j \mid^2 \) ... ...
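As a quick numerical sanity check of (1), (2) and (3) (a sketch with toy numbers of my own, purely for illustration), the following verifies the chain for a sample vector:

import numpy as np

# Toy error vector y = (y_1, ..., y_m); the entries are arbitrary.
y = np.array([0.3, -1.2, 0.7])

norm_sq = np.sum(y**2)  # ||y||^2 = y_1^2 + ... + y_m^2

# (1): |y_j|^2 <= ||y||^2 for every j
assert all(yj**2 <= norm_sq for yj in y)

# (2)/(3): ||y||^2 equals (and hence is <=) sum_j |y_j|^2
assert np.isclose(norm_sq, np.sum(np.abs(y)**2))

Each assertion passes, as the algebra above guarantees.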
Is that correct?
But even if the above is correct I still have a problem ... namely, how the above inequalities actually prove Proposition 9.1.2 ...

... indeed ... further to the proof of the proposition ... I am somewhat perplexed by expressions like \(\displaystyle \left\vert \frac{f_j ( a + h ) - f_j (a)}{ h } - x_j \right\vert^2 \) ... particularly the \(\displaystyle x_j \) ... what is \(\displaystyle x_j \) and how do we interpret \(\displaystyle \left\vert \frac{f_j ( a + h ) - f_j (a)}{ h } - x_j \right\vert^2 \) in terms of a derivative ... ?

Note that we must prove the proposition from the definition above of the derivative of a vector-valued function of a real variable together with the definition of differentiation of functions from \(\displaystyle \mathbb{R} \) to \(\displaystyle \mathbb{R} \) ... since these are the only definitions available at this point in the text ...

Hope someone can help ...

Peter
 
  • #3


Hello Peter,

I would be happy to help with the proof of Proposition 9.1.2. Let's start with the inequalities.

Your argument in post #2 is correct, and nothing as strong as Cauchy-Schwarz is needed. With \(\displaystyle y_j = \frac{f_j ( a + h ) - f_j (a)}{ h } - x_j \) and \(\displaystyle y = ( y_1, \ldots , y_m ) \), the chain

\(\displaystyle |y_j|^2 \le \| y \|^2 = \sum_{ i = 1 }^m |y_i|^2 \)

is immediate from the definition \(\displaystyle \| y \|^2 = y_1^2 + \cdots + y_m^2 \): every term in the sum is non-negative, so keeping only the \(j\)-th term can only decrease it. Note that the second relation is in fact an equality, which is why Junghenn's "\(\displaystyle \le \)" there is harmless.

As for the meaning of \(\displaystyle x_j \): in the proposition, \(\displaystyle x = ( x_1, \ldots , x_m ) \) is the candidate value of the derivative \(\displaystyle f'(a) \), and \(\displaystyle x_j \) is its \(j\)-th coordinate. By Definition 9.1.1, \(\displaystyle f'(a) = x \) means

\(\displaystyle \lim_{ h \to 0 } \left\| \frac{ f( a + h ) - f(a) }{ h } - x \right\| = 0 \)

while \(\displaystyle f_j'(a) = x_j \) means that \(\displaystyle \left\vert \frac{f_j ( a + h ) - f_j (a)}{ h } - x_j \right\vert \to 0 \), which is exactly the definition of differentiability for the real-valued component function \(\displaystyle f_j \). The inequalities now give the proposition by a squeeze in both directions:

If \(\displaystyle f'(a) = x \), then \(\displaystyle \| y \|^2 \to 0 \) as \(h \to 0\), so by the first inequality each \(\displaystyle |y_j|^2 \to 0 \); that is, each \(\displaystyle f_j \) is differentiable at \(a\) with \(\displaystyle f_j'(a) = x_j \).

Conversely, if each \(\displaystyle f_j'(a) = x_j \), then every term of \(\displaystyle \sum_{ i = 1 }^m |y_i|^2 \) tends to \(0\), so \(\displaystyle \| y \|^2 \to 0 \); that is, \(f\) is differentiable at \(a\) with \(\displaystyle f'(a) = x \).

This is exactly the assertion of Proposition 9.1.2: \(f\) is differentiable at \(a\) if and only if every component \(\displaystyle f_j \) is, and in that case \(\displaystyle f'(a) = ( f_1'(a), \ldots , f_m'(a) ) \).
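To see the proposition in action numerically, here is a small sketch; the function \(\displaystyle f(t) = ( t^2, \sin t ) \) and the point \(a = 1\) are examples of my own, not from Junghenn. The vector difference quotient approaches the vector of component derivatives \(\displaystyle ( 2a, \cos a ) \) as \(h \to 0\):

import numpy as np

# Example function f : R -> R^2 (my own choice, for illustration).
def f(t):
    return np.array([t**2, np.sin(t)])

a = 1.0
expected = np.array([2*a, np.cos(a)])  # (f_1'(a), f_2'(a)) = (2a, cos a)

for h in [1e-1, 1e-3, 1e-5]:
    quotient = (f(a + h) - f(a)) / h  # vector difference quotient
    error = np.linalg.norm(quotient - expected)
    print(f"h = {h:g}: quotient = {quotient}, error = {error:.2e}")

The printed error shrinks with \(h\), matching \(\displaystyle f'(a) = ( f_1'(a), f_2'(a) ) \).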
 

FAQ: Differentiating a Vector-Valued Function: Junghenn Prop 9.1.2 - Peter seeks help

1. What is a vector-valued function?

A vector-valued function is a mathematical function that takes in one or more input variables and outputs a vector of values. In other words, it maps a set of numbers to a set of vectors in a coordinate space.
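For example, \(\displaystyle f(t) = ( \cos t, \sin t ) \) takes a single real number \(t\) and returns a point on the unit circle in \(\displaystyle \mathbb{R}^2 \).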

2. What is differentiation?

Differentiation is a mathematical operation that calculates the rate of change of a function with respect to its input variables. It is used to find the slope of a curve at a specific point and is an important tool in calculus and other branches of mathematics.
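In symbols, for a function of a single real variable the derivative at a point \(a\) is \(\displaystyle f'(a) = \lim_{ h \to 0 } \frac{ f( a + h ) - f(a) }{ h } \), provided this limit exists.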

3. How is differentiation applied to vector-valued functions?

When differentiating a vector-valued function of a real variable, we calculate the derivative of each component of the vector separately with respect to that variable. This results in a new vector-valued function whose components give the rates of change of the corresponding components of the original function.
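For instance, here is a minimal sketch with SymPy (the curve is an example of my own choosing, not from the thread):

import sympy as sp

t = sp.symbols('t')
f = sp.Matrix([sp.cos(t), sp.sin(t), t])  # example curve in R^3
f_prime = f.diff(t)                       # differentiates each component
print(f_prime)  # Matrix([[-sin(t)], [cos(t)], [1]])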

4. Why is Junghenn's Proposition 9.1.2 important in differentiating vector-valued functions?

Proposition 9.1.2 in Junghenn's "A Course in Real Analysis" justifies differentiating a vector-valued function of a real variable componentwise: it shows that the function is differentiable exactly when each of its component functions is, and that the derivative is the vector of the component derivatives.

5. How can I use Proposition 9.1.2 to differentiate vector-valued functions?

To use Proposition 9.1.2, first identify the real variable and the component functions of the vector-valued function. Then apply the usual derivative rules to each component and combine the results into a new vector-valued function. You should also check that each component is differentiable at the point in question, so that the resulting derivative is well-defined.
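For example (an illustration of my own): for \(\displaystyle f(t) = ( e^t, t^3 ) \), differentiating each component gives \(\displaystyle f'(t) = ( e^t, 3t^2 ) \), which is again a vector-valued function of \(t\).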
