How is Equation (1) Equivalent to the Derivative Definition in Theorem 7.1?


Discussion Overview

The discussion revolves around the equivalence of two definitions of the derivative as presented in Theorem 7.1 of "Theory of Functions of a Complex Variable" by A. I. Markushevich. Participants explore the formal and rigorous connections between the definitions, particularly focusing on the expression of the derivative in terms of a difference quotient and the implications of the limit process.

Discussion Character

  • Technical explanation
  • Conceptual clarification
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant questions how equation (1) is equivalent to the expression involving the error term $$\epsilon(z, z_0)$$, suggesting that a derivation from equation (1) should be made explicit.
  • Another participant notes that the notation is unclear and points out the absence of an explicit limit in the definition of the derivative as presented, which contrasts with their understanding of derivative definitions.
  • One participant proposes using the triangle inequality to estimate the difference between the two definitions, leading to a limit argument that could clarify the relationship between them.
  • The original poster later acknowledges a typo in equation (1) as first posted: the defining limit had been omitted, and the corrected definition includes it explicitly.
  • Participants discuss the historical context of the derivative definition, noting that the limiting difference quotient is the original definition from single-variable calculus, while the second definition emphasizes the derivative as the best linear approximation in a neighborhood.
  • There is mention of the relevance of these definitions to higher dimensions and the nature of complex analysis as a one-variable analysis that relates to multivariable concepts.
  • One participant mentions the upcoming Dover reissue of Beardon's "Complex Analysis: The Argument Principle", suggesting that consulting multiple texts can aid in understanding mathematical topics.

Areas of Agreement / Disagreement

Participants express differing views on the clarity and completeness of the definitions as first posted; the notation is questioned, and the original poster acknowledges that the limit was omitted from equation (1). One reply sketches a triangle-inequality estimate connecting the two expressions, and the thread ends amicably, though the original poster does not explicitly confirm that the equivalence is fully settled for him.

Contextual Notes

Limitations in the discussion include the unclear notation and the need for explicit limits in the definitions. The exploration of the definitions is contingent on the participants' interpretations and understanding of derivative concepts.

Math Amateur
Gold Member
MHB
I am reading the book: "Theory of Functions of a Complex Variable" by A. I. Markushevich (Part 1) ...

I need some help with an aspect of the proof of Theorem 7.1 ... The statement of Theorem 7.1 reads as follows:

View attachment 9330

At the start of the above proof by Markushevich we read the following:

"If $$f(z)$$ has a derivative $$f'_E(z_0)$$ at $$z_0$$, then by definition

$$\frac{ \Delta_E f(z) }{ \Delta z } = f'_E(z_0) + \epsilon ( z, z_0 )$$

where $$\epsilon ( z, z_0 ) \to 0$$ as $$\Delta z \to 0$$. ... "

Now previously, in Equation (7.1) at the start of Chapter 7, Markushevich has defined $$f'_E(z_0)$$ as follows:

$$f'_E(z_0) = \frac{ f(z) - f(z_0) }{ z - z_0 } = \frac{ \Delta_E f(z) }{ \Delta z }$$ ... ... ... (1)

How exactly (formally and rigorously) is equation (1) the same as

$$\frac{ \Delta_E f(z) }{ \Delta z } = f'_E(z_0) + \epsilon ( z, z_0 )$$ ?

Strictly speaking, shouldn't Markushevich be deriving $$\frac{ \Delta_E f(z) }{ \Delta z } = f'_E(z_0) + \epsilon ( z, z_0 )$$ from equation (1)?

Peter
 

Attachments

  • Markushevich - Theorem 7.1 and Proof ... .png
Peter said:
I am reading the book: "Theory of Functions of a Complex Variable" by A. I. Markushevich (Part 1) ...

$$f'_E(z_0) = \frac{ f(z) - f(z_0) }{ z - z_0 } = \frac{ \Delta_E f(z) }{ \Delta z }$$ ... ... ... (1)

How exactly (formally and rigorously) is equation (1) the same as $$\frac{ \Delta_E f(z) }{ \Delta z } = f'_E(z_0) + \epsilon ( z, z_0 )$$ ? ... Strictly speaking, shouldn't Markushevich be deriving it from equation (1)?

I think I know what is being said, though the notation is a bit all over the place here. E.g. you wrote
$$f'_E(z_0) = \frac{ f(z) - f(z_0) }{ z - z_0 } = \frac{ \Delta_E f(z) }{ \Delta z }$$
but I didn't see a limit explicitly taken, so it doesn't seem like a definition of a derivative that I'm familiar with.

Anyway, as is often the case, why not try to estimate the difference between those two expressions and use the oh-so-important triangle inequality?

This gives

$\Big \vert \frac{ f(z) - f(z_0) }{ z - z_0 } - \big(f'_E(z_0) + \epsilon ( z, z_0 )\big) \Big \vert \leq \Big \vert \frac{ f(z) - f(z_0) }{ z - z_0 } - f'_E(z_0) \Big \vert + \Big \vert \epsilon ( z, z_0 ) \Big \vert $

by the triangle inequality. Now pass to the limit:
i.e. for any $\epsilon \gt 0$ we can select a $\delta_1$-neighborhood (i.e. all $z$ with $0 \lt \big \vert z - z_0 \big \vert \lt \delta_1$) such that
$\Big \vert \frac{ f(z) - f(z_0) }{ z - z_0 } - f'_E(z_0) \Big \vert \lt \frac{\epsilon}{2}$

and a $\delta_2$-neighborhood such that
$ \Big \vert \epsilon ( z, z_0 ) \Big \vert \lt \frac{\epsilon}{2}$

and select $\delta = \min \big(\delta_1, \delta_2\big)$, so that for $0 \lt \big \vert z - z_0 \big \vert \lt \delta$ you have

$\Big \vert \frac{ f(z) - f(z_0) }{ z - z_0 } - \big(f'_E(z_0) + \epsilon ( z, z_0 )\big) \Big \vert \leq \Big \vert \frac{ f(z) - f(z_0) }{ z - z_0 } - f'_E(z_0) \Big \vert + \Big \vert \epsilon ( z, z_0 ) \Big \vert \lt \frac{\epsilon}{2} + \frac{\epsilon}{2} = \epsilon$
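
To tie this directly to Markushevich's wording (a minimal sketch, assuming the limit form of definition (1)): for $z \neq z_0$ define the error term by

$$\epsilon ( z, z_0 ) := \frac{ \Delta_E f(z) }{ \Delta z } - f'_E(z_0).$$

Then the statement "$$\epsilon ( z, z_0 ) \to 0$$ as $$\Delta z \to 0$$" is, word for word, the statement that $$\lim_{ \Delta z \to 0} \frac{ \Delta_E f(z) }{ \Delta z } = f'_E(z_0),$$ and rearranging gives exactly the line in the proof, $$\frac{ \Delta_E f(z) }{ \Delta z } = f'_E(z_0) + \epsilon ( z, z_0 ).$$

For a concrete check, take $f(z) = z^2$ (with $E$ the whole plane, say): the difference quotient is $\frac{z^2 - z_0^2}{z - z_0} = z + z_0$, so $f'_E(z_0) = 2 z_0$ and $\epsilon(z, z_0) = z - z_0 \to 0$ as $z \to z_0$.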
 
steep said:
I think I know what is being said, though the notation is a bit all over the place here ... why not try to estimate the difference between those two expressions and use the oh-so-important triangle inequality ...
Hi steep...

Thanks so much for your post ...

I am still reflecting on what you have written ... I must apologize for a serious typo in equation (1) ...

I wrote

$$f'_E(z_0) = \frac{ f(z) - f(z_0) }{ z - z_0 } = \frac{ \Delta_E f(z) }{ \Delta z }$$ ... ... ... (1)

when I should have written

$$ f'_E(z_0) = \lim_{ z \to z_0} \frac{ f(z) - f(z_0) }{ z - z_0 } = \lim_{ \Delta z \to 0} \frac{ \Delta_E f(z) }{ \Delta z }$$ ... ... ... (1)

I should also have posted the beginning of Markushevich's Chapter 7 to give readers access to his definitions ... so I am posting that now ... as follows:

View attachment 9332
View attachment 9333
Hope that helps ...

Peter
 

Attachments

  • Markushevich - 1 - Start of Ch. 7, Section 28 ... PART 1 .png
  • Markushevich - 2 - Start of Ch. 7, Section 28 ... PART 2 ... .png
btw it seems worth pointing out that the (limiting) difference quotient is in some sense the 'original' derivative definition for single variable calc.

However, the second definition introduced here says, in effect, that the derivative (if it exists) gives the best linear approximation to the function over a sufficiently small neighborhood, period. This definition / interpretation is the one that generalizes to higher dimensions. And since complex analysis is one-variable analysis, but is 'kind of like' multivariable analysis ($\mathbb R^2$ looms), any perceived difference between the definitions is a good thing to dwell on.
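
To make that concrete (a sketch using Markushevich's error term from the proof): since $\Delta_E f(z) = f(z) - f(z_0)$ and $\Delta z = z - z_0$, multiplying his equation through by $\Delta z$ gives

$$f(z) = f(z_0) + f'_E(z_0)\,(z - z_0) + \epsilon(z, z_0)\,(z - z_0), \qquad \epsilon(z, z_0) \to 0 \text{ as } z \to z_0,$$

i.e. the linear function $f(z_0) + f'_E(z_0)(z - z_0)$ approximates $f(z)$ with an error that is $o(\vert z - z_0 \vert)$. Written this way, nothing requires dividing by the increment, which is why the same formulation carries over to maps between higher-dimensional spaces: $f(x_0 + h) = f(x_0) + A h + o(\Vert h \Vert)$ for some linear map $A$.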

Another thing-- a long overdue release / update to Beardon's "Complex Analysis: The Argument Principle" is coming next month as a Dover book. I think you may have a bigger math library than me but I thought I'd mention it.
 
steep said:
btw it seems worth pointing out that the (limiting) difference quotient is in some sense the 'original' derivative definition for single variable calc ... Another thing -- a long overdue release / update to Beardon's "Complex Analysis: The Argument Principle" is coming next month as a Dover book ...

Thanks for your most helpful posts, steep ...

I'll definitely keep a watch out for the release of Beardon's book ... I find that being able to consult a number of texts' treatments of mathematical topics is helpful to learning ...

Thanks again ...

Peter
 
