MisterX
Homework Statement
Given that g(z) = ln(1-z^2) is defined on \mathbb{C}\backslash \left(-\infty, 1\right], i.e. the branch cut runs from -\infty to 1 along the real axis, find g(-i) given g(i) = ln(2).
Homework Equations
The Attempt at a Solution
I tried drawing it out, but I'm having trouble making sense of this. I understand how (-\infty, 0] works as a branch cut for ln(z): if ln(z) = a + bi, we restrict -\pi < b < \pi so that ln(z) is single-valued. This means that every time z passes through (-\infty, 0] there is a discontinuous jump in the value of ln(z).
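For concreteness, here is a quick check of that jump (assuming the principal branch ln(z) = ln|z| + i\,\arg(z) with -\pi < \arg(z) < \pi, and a point z = -r on the cut with r > 0, approached from above and from below):

\lim_{\epsilon \to 0^+} \ln(-r + i\epsilon) = \ln r + i\pi, \qquad \lim_{\epsilon \to 0^+} \ln(-r - i\epsilon) = \ln r - i\pi,

so the imaginary part b jumps by 2\pi as z crosses (-\infty, 0].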
However, I've not been successful in explaining how a cut from -\infty to 1 works for g(z) = ln(1-z^2).
From my drawing I think this means that the branch cut for the logarithm used inside g is not (-\infty, 0]. But maybe that's not the right way to think about this.
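One rough check of this suspicion (a sketch assuming the principal branch of the logarithm, not from the problem itself): the principal logarithm is discontinuous exactly where its argument lands on (-\infty, 0], and

1 - z^2 \in (-\infty, 0] \iff z^2 \in [1, \infty) \iff z \in (-\infty, -1] \cup [1, \infty).

The interval (1, \infty) lies inside the stated domain \mathbb{C}\backslash(-\infty, 1], so a branch of ln(1-z^2) that is continuous on that whole domain cannot simply be the principal logarithm composed with 1-z^2.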
It seems like, if \left| z\right|^2 > 1, then letting z go once around a circle centered at the origin causes 1 - z^2 to wind around the origin twice.
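A quick way to see this (a sketch, writing z = R e^{i\theta} with R > 1): as \theta runs from 0 to 2\pi,

1 - z^2 = 1 - R^2 e^{2i\theta}

traces the circle of radius R^2 centered at 1 twice, and since R^2 > 1 that circle encloses the origin, so \arg(1 - z^2) increases by 2 \times 2\pi = 4\pi.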