Differentials in R^n ... Remark by Browder, Section 8.2

In summary: Peter asks whether the differential at $p$ is a linear transformation $L$ that is itself a good approximation to $f$ near $p$ -- that is, whether $f$ is differentiable at $p$ when it can be approximated near $p$ by a linear function $L$ -- or whether, as Browder asserts, the right statement involves an affine function. The upshot of the thread: the differential at $p$ is a linear transformation $L$, and the best affine approximation to $f$ near $p$ is $f(p) + L(x-p)$, where $x$ is the variable of the function.
  • #1
Math Amateur
I am reading Andrew Browder's book: "Mathematical Analysis: An Introduction" ... ...

I am currently reading Chapter 8: Differentiable Maps and am specifically focused on Section 8.2 Differentials ... ...

I need some further help in fully understanding a remark by Browder after Definition 8.9 ...

The relevant text from Browder reads as follows:
[Image: Browder - Definition 8.9 ... Differentials (attachment 9402)]

At the end of the above text from Browder, we read the following: "... Thus Definition 8.9 says roughly that a function is differentiable at \(\displaystyle p\) if it can be approximated near \(\displaystyle p\) by an affine function ..."

My question is as follows: Can someone please demonstrate formally and rigorously that Definition 8.9 implies that a function is differentiable at \(\displaystyle p\) if it can be approximated near \(\displaystyle p\) by an affine function?

Help will be much appreciated ...

Peter
 

  • #2
Peter, what are you looking for here? Can you prove this in the special case of $\mathbf c =\mathbf 0$? (Then can you show why this implies the general case?)

I know this is technically a different problem, but in spirit it feels basically the same as this recent one:
https://mathhelpboards.com/analysis-50/diiferentiability-functions-complex-variable-markushevich-theorem-7-1-a-26712.html

The point in both cases is that you can make $\Big \vert \epsilon ( z, z_0 ) \Big \vert$ arbitrarily small -- or, changing notation, I'd say $\Big \vert \epsilon ( \mathbf x, \mathbf x_0 ) \Big \vert$ may be made arbitrarily small.
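
To pin down the statement to aim for (this is my paraphrase, not Browder's exact wording): say that $f$ is approximated near $p$ by the affine function $A(x) = c + Lx$, with $L$ linear, if

$\lim_{x \to p} \frac{ \| f(x) - (c + Lx) \| }{ \| x - p \| } = 0 .$

With the choice $c = f(p) - Lp$, i.e. $A(x) = f(p) + L(x - p)$, the substitution $x = p + h$ turns this into the differentiability condition

$\lim_{h \to 0} \frac{1}{\| h \|} \big\| f(p+h) - f(p) - L(h) \big\| = 0 ,$

which is the limit from Definition 8.9 as it is written out later in the thread.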
 
  • #3
That says explicitly that, near p, f(x) can be approximated by f(p) + L(x - p), where L is linear. That is the "affine function" approximating f(x).
 
  • #4
HallsofIvy said:
That says explicitly that, near p, f(x) can be approximated by f(p) + L(x - p), where L is linear. That is the "affine function" approximating f(x).
My thanks to steep and HallsofIvy for the help ... I will try to make clear my dilemma/issues ...

As I see it ... if \(\displaystyle f\) is approximated near \(\displaystyle p\) by \(\displaystyle c + Lx\) then \(\displaystyle f(p) \approx c + Lp\) ... ... (1)

Now I think ... ? ... that (1) is equivalent to saying that \(\displaystyle f(p + h) - f(p)\) is approximated by \(\displaystyle c + Lh\) ... ... (2) ... (is that correct?)

... so that \(\displaystyle \lim_{ h \to 0 } \frac{1}{ \| h \| } ( [ f(p + h) - f(p) ] - [ c + Lh ] ) = 0\) ... ... (3)

BUT ... how does (3) reduce (exactly) to Browder's (8.4)? I must say that I'm not even sure that (1) \(\displaystyle \Longrightarrow\) (2) ... let alone how (3) \(\displaystyle \Longrightarrow\) (8.4) ...

Can someone please comment on and clarify the above ...

Peter
 
  • #5
Peter said:
My thanks to steep and HallsofIvy for the help ... I will try to make clear my dilemma/issues ...

As I see it ... if \(\displaystyle f\) is approximated near \(\displaystyle p\) by \(\displaystyle c + Lx\) then \(\displaystyle f(p) \approx c + Lp\) ... ... (1)

Now I think ... ? ... that (1) is equivalent to saying that \(\displaystyle f(p + h) - f(p)\) is approximated by \(\displaystyle c + Lh\) ... ... (2) ... (is that correct?)

... so that \(\displaystyle \lim_{ h \to 0 } \frac{1}{ \| h \| } ( [ f(p + h) - f(p) ] - [ c + Lh ] ) = 0\) ... ... (3)

BUT ... how does (3) reduce (exactly) to Browder's (8.4)? I must say that I'm not even sure that (1) \(\displaystyle \Longrightarrow\) (2) ... let alone how (3) \(\displaystyle \Longrightarrow\) (8.4) ...

Can someone please comment on and clarify the above ...

Peter

You didn't do the estimation correctly -- it should be

$\lim_{ h \to 0 } \frac{1}{ \| h \| } \Big \vert \big( f(p + h) - f(p) \big) - \big( \{ c + L(p+h) \} - \{ c + L(p) \} \big) \Big \vert = \lim_{ h \to 0 } \frac{1}{ \| h \| } \big \vert \big( f(p + h) - f(p) \big) - L(h) \big \vert = 0$
i.e. the $c$'s cancel -- what you had is a common bug with affine functions, which is why I suggested first considering $c = 0$. Equivalently, your statement (2) is false.
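
To see the cancellation in a one-variable example (my own illustration, not taken from Browder): take $f(x) = x^2$ and $p = 1$, so $f'(1) = 2$. The affine approximation is

$A(x) = f(1) + 2(x - 1) = 2x - 1 ,$

i.e. $c = -1$ and $L(h) = 2h$. Then $A(p+h) - A(p) = L(h) = 2h$, while $f(p+h) - f(p) = (1+h)^2 - 1 = 2h + h^2$, so

$\frac{1}{|h|} \big\vert \big( f(p+h) - f(p) \big) - L(h) \big\vert = \frac{h^2}{|h|} = |h| \to 0 .$

The constant $c = -1$ sits inside $A$ but drops out of the difference, exactly as in the display above.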
 
  • #6
steep said:
You didn't do the estimation correctly -- it should be

$\lim_{ h \to 0 } \frac{1}{ \| h \| } \Big \vert \big( f(p + h) - f(p) \big) - \big( \{ c + L(p+h) \} - \{ c + L(p) \} \big) \Big \vert = \lim_{ h \to 0 } \frac{1}{ \| h \| } \big \vert \big( f(p + h) - f(p) \big) - L(h) \big \vert = 0$
i.e. the $c$'s cancel -- what you had is a common bug with affine functions, which is why I suggested first considering $c = 0$. Equivalently, your statement (2) is false.

Hi steep ...

... thanks for a very helpful post indeed! ... But ... sorry to belabour the point, but I'm trying to make sure I understand the issue ...

... it seems to challenge my (possibly wrongheaded ...) interpretation of the differential/derivative ... I have previously thought that the differential/derivative at \(\displaystyle p\) was the linear transformation \(\displaystyle L\)

such that \(\displaystyle L\) was a good approximation to \(\displaystyle f\) near \(\displaystyle p\) ...

... that is ... \(\displaystyle f\) is differentiable at \(\displaystyle p\) if it can be approximated near \(\displaystyle p\) by a linear transformation/function \(\displaystyle L\) ... Is the above correct?

I suspect the above interpretation is wrong in some sense ...

... because Browder is asserting ... and you have proved that a function \(\displaystyle f\) is differentiable at \(\displaystyle p\)

if it can be approximated near \(\displaystyle p\) by an affine function ...

Can you clarify? ...

Peter
 
  • #7
Peter said:
Hi steep ...

... thanks for a very helpful post indeed! ... But ... sorry to belabour the point, but I'm trying to make sure I understand the issue ...

... it seems to challenge my (possibly wrongheaded ...) interpretation of the differential/derivative ... I have previously thought that the differential/derivative at \(\displaystyle p\) was the linear transformation \(\displaystyle L\)

such that \(\displaystyle L\) was a good approximation to \(\displaystyle f\) near \(\displaystyle p\) ...

... that is ... \(\displaystyle f\) is differentiable at \(\displaystyle p\) if it can be approximated near \(\displaystyle p\) by a linear transformation/function \(\displaystyle L\) ... Is the above correct?
sounds good to me
Peter said:
I suspect the above interpretation is wrong in some sense ...

... because Browder is asserting ... and you have proved that a function \(\displaystyle f\) is differentiable at \(\displaystyle p\)

if it can be approximated near \(\displaystyle p\) by an affine function ...

Unless I'm misunderstanding you, you are hung up on whether the approximating function (the derivative) is linear or affine ... is that right?
Let me try to run this forward and backward.

Forward case: geometrically, if you consider the derivative at a point x and orient yourself so that f(x) is the origin of the codomain (the space containing the image of your domain), then the derivative is a linear approximation. Of course, when I say "orient yourself so that f(x) is your origin", all I'm talking about is a translation by a constant -- and the sum of a constant translation and a linear map is an affine map.

This is another way of saying "a function is differentiable if it is locally approximately linear ... a function is differentiable at a point x if it is locally approximately linear, with an error which decreases to zero faster than linearly, as we approach x".

Backward case: I think "we all know" that if a derivative exists and has magnitude zero everywhere in a path-connected domain, then the function is constant in that domain. (Good proof: use the mean value inequality.) This implies that if two functions have the same derivative in such a domain, then they are the same up to a constant, i.e. f(x) = g(x) + b. So the derivative is the linear map, and the constant is really just a tweak used to specify a particular detail of the function.
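
Putting the forward and backward remarks into symbols (my own sketch, using the notation already used in the thread):

Forward: if $\lim_{h \to 0} \frac{1}{\| h \|} \big\| f(p+h) - f(p) - L(h) \big\| = 0$, set $A(x) := f(p) + L(x - p)$. Then $A$ is affine and, writing $x = p + h$,

$\frac{ \| f(x) - A(x) \| }{ \| x - p \| } = \frac{ \| f(p+h) - f(p) - L(h) \| }{ \| h \| } \to 0 ,$

so $f$ is approximated near $p$ by the affine function $A$.

Backward: if some affine function $A(x) = c + Lx$ with $A(p) = f(p)$ (equivalently $c = f(p) - Lp$) satisfies $\| f(x) - A(x) \| / \| x - p \| \to 0$ as $x \to p$, then reading the same identity from right to left gives the differentiability limit with differential $L$.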

- - - -
Your book's layout and formatting remind me of Koerner's A Companion to Analysis (which is where I got that quote from) -- I think it is a nice "companion" to another book on analysis.
 
  • #8
steep said:
sounds good to me

Unless I'm misunderstanding you, you are hung up on whether the approximating function (the derivative) is linear or affine ... is that right?
Let me try to run this forward and backward.

Forward case: geometrically, if you consider the derivative at a point x and orient yourself so that f(x) is the origin of the codomain (the space containing the image of your domain), then the derivative is a linear approximation. Of course, when I say "orient yourself so that f(x) is your origin", all I'm talking about is a translation by a constant -- and the sum of a constant translation and a linear map is an affine map.

This is another way of saying "a function is differentiable if it is locally approximately linear ... a function is differentiable at a point x if it is locally approximately linear, with an error which decreases to zero faster than linearly, as we approach x".

Backward case: I think "we all know" that if a derivative exists and has magnitude zero everywhere in a path-connected domain, then the function is constant in that domain. (Good proof: use the mean value inequality.) This implies that if two functions have the same derivative in such a domain, then they are the same up to a constant, i.e. f(x) = g(x) + b. So the derivative is the linear map, and the constant is really just a tweak used to specify a particular detail of the function.

- - - -
Your book's layout and formatting remind me of Koerner's A Companion to Analysis (which is where I got that quote from) -- I think it is a nice "companion" to another book on analysis.
Thanks so much for the above post, steep ...

... ... just now working through what you have written and reflecting on it ...

Thanks again,

Peter
 

1. What is a differential in R^n?

The differential of a function at a point in R^n is the linear map that best approximates the change in the function near that point; it describes how the function responds, to first order, to changes in its independent variables. It is used to estimate small changes in the value of a function as its input variables change.

2. How is a differential calculated in R^n?

In R^n, the differential is built from partial derivatives: the partial derivatives of the function with respect to each independent variable are collected into the Jacobian matrix (for a real-valued function, the gradient), and the differential applied to a small increment is that matrix times the increment, i.e. a sum of partial derivatives multiplied by the corresponding changes in the variables. A numerical sketch of this is shown below.
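
For instance, here is a small numerical sketch (my own illustrative example; the function f and the point p below are arbitrary choices, not taken from any particular text) that assembles the Jacobian from partial derivatives and checks the limit discussed in the thread, namely that ||f(p+h) - f(p) - J h|| / ||h|| shrinks as h -> 0:

```python
import numpy as np

# Illustrative check of the differentiability limit from the thread:
# f(x, y) = (x^2 * y, x + sin(y)), with Jacobian J at p assembled from partial derivatives.
def f(v):
    x, y = v
    return np.array([x**2 * y, x + np.sin(y)])

def jacobian(v):
    x, y = v
    # Rows are the partial derivatives of each component of f.
    return np.array([[2 * x * y, x**2],
                     [1.0,       np.cos(y)]])

p = np.array([1.0, 2.0])
J = jacobian(p)
direction = np.array([0.6, -0.8])   # fixed unit direction for the increment h

for t in [1e-1, 1e-2, 1e-3, 1e-4]:
    h = t * direction
    err = np.linalg.norm(f(p + h) - f(p) - J @ h) / np.linalg.norm(h)
    print(f"||h|| = {t:.0e}   error / ||h|| = {err:.3e}")
```

The printed ratios decrease roughly in proportion to ||h||, i.e. the error itself vanishes faster than linearly, which is exactly the behaviour described in the thread above.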

3. What is the purpose of studying differentials in R^n?

Studying differentials in R^n allows us to understand how small changes in the input variables of a function affect its output. This is useful in many areas of science and engineering, such as in optimization problems and modeling complex systems.

4. How does the concept of differentials relate to the concept of derivatives?

Differentials and derivatives are closely related concepts. The differential is the (linear approximation to the) change in a function produced by a small change in its input, while the derivative is the rate of change of the function. In other words, the differential can be thought of as the "change in y" and the derivative as the "change in y over the change in x"; a one-variable illustration follows below.
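
In the one-variable case (standard notation, not tied to any particular text): if \(\displaystyle y = f(x)\) is differentiable at \(\displaystyle x\), the differential at \(\displaystyle x\) is the linear map \(\displaystyle h \mapsto f'(x)\, h\), often written

$dy = f'(x) \, dx ,$

so the derivative \(\displaystyle f'(x)\) is the coefficient of that linear map, and \(\displaystyle dy\) approximates the actual change \(\displaystyle f(x + dx) - f(x)\) with an error that vanishes faster than \(\displaystyle |dx|\).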

5. Can differentials in R^n be applied to real-world problems?

Yes, differentials in R^n have many real-world applications. For example, they can be used to calculate the sensitivity of a system to small changes, to optimize functions and find maximum or minimum values, and to model complex systems such as weather patterns or financial markets.
