gauss mouse
I have the following statement:
Let [itex]A\subseteq \mathbb{C}[/itex] be open and let [itex]f\colon A \to \mathbb{C}[/itex] be holomorphic (in [itex]A[/itex]). Suppose that [itex]D(z_0,R)\subseteq A.[/itex] Then
[itex]f(z)=\sum_{k=0}^\infty a_k(z-z_0)^k\ \forall z\in D(z_0,R),[/itex] where
[itex]a_k=\displaystyle\frac{1}{2\pi i}\int_{|\zeta-z_0|=r}\frac{f(\zeta)}{(\zeta-z_0)^{k+1}}d\zeta[/itex] and [itex]0<r<R[/itex].
My problem is that, plugging [itex]z=z_0[/itex] into the series, it seems that [itex]f(z_0)=0[/itex]; and since this can be done for every [itex]z_0\in A[/itex], we would get [itex]f(z_0)=0[/itex] for all [itex]z_0 \in A,[/itex] which is absurd. Can anybody tell me what's going on here?
Sorry I haven't formatted this in the usual coursework question way but I don't think it would suit it.
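For what it's worth, the coefficient formula above can be sanity-checked numerically. Below is a minimal sketch (function name and parameters are mine, not from any textbook): it parametrizes the circle [itex]|\zeta-z_0|=r[/itex] and approximates the contour integral with the trapezoidal rule, then checks it against [itex]f(z)=e^z[/itex] about [itex]z_0=0[/itex], where [itex]a_k=1/k![/itex].

```python
import numpy as np

def taylor_coefficient(f, z0, k, r=0.5, n=4000):
    # a_k = (1/(2*pi*i)) * integral over |zeta - z0| = r of
    #       f(zeta) / (zeta - z0)**(k+1) dzeta.
    # Parametrize zeta = z0 + r*exp(i*theta), so dzeta = i*r*exp(i*theta) dtheta,
    # and apply the trapezoidal rule (spectrally accurate for periodic integrands).
    theta = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    zeta = z0 + r * np.exp(1j * theta)
    dzeta_dtheta = 1j * r * np.exp(1j * theta)
    integrand = f(zeta) / (zeta - z0) ** (k + 1) * dzeta_dtheta
    # (1/(2*pi*i)) * (2*pi/n) * sum  ==  mean(integrand) / i
    return np.mean(integrand) / 1j

# f(z) = e^z about z0 = 0 has Taylor coefficients a_k = 1/k!
print(taylor_coefficient(np.exp, 0.0, 3))  # close to 1/6
```

Note in particular that the [itex]k=0[/itex] coefficient this computes is not zero.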