# Finding Roots of Polynomials in C

## Homework Statement

Show that the equation f(z) = z^4 + iz^2 + 2 = 0 has two roots with |z| = 1 and two roots with |z| = sqrt(2), without actually solving the equation.

## Homework Equations

Rouché's theorem, the argument principle?

## The Attempt at a Solution

This is what I have done so far. First, I show that no solutions lie outside the circle |z| = sqrt(2). Suppose there were a solution p with |p| > sqrt(2). Then by the reverse triangle inequality, 0 = |p^4 + ip^2 + 2| >= |p|^4 - |p|^2 - 2 > 0, since |p|^4 - |p|^2 - 2 = (|p|^2 - 2)(|p|^2 + 1) > 0 when |p| > sqrt(2). This contradiction shows that all solutions lie in the closed disk |z| <= sqrt(2).

Next, I wanted to show that f(z) has only two zeros strictly INSIDE the circle |z| = sqrt(2), which would force the remaining two to lie ON the circle. I tried to use Rouché's theorem, but I could not get the strict inequality to hold in any case. Then I realized that the statement of the theorem requires that no zeros lie on the circle itself, so I could not use Rouché's theorem here anyway.

In exactly the same way, no solutions exist inside the unit circle. Suppose there were a solution q with |q| < 1. Then 0 = |q^4 + iq^2 + 2| >= 2 - |q|^4 - |q|^2 > 0, since |q|^4 + |q|^2 < 2 when |q| < 1, which is again a contradiction. Hence all solutions lie in the annulus {z : 1 <= |z| <= sqrt(2)}.

This is where I am stuck: I do not see how to rule out solutions strictly between the two circles. For the parts I have done, I have only used basic properties of the absolute value, so I feel like I should be invoking some theorem in this last part. Any hints would be greatly appreciated. Thanks!!
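Not a proof, of course (the problem asks us not to solve the equation), but as a numerical sanity check that the claimed moduli are even plausible, one can ask NumPy for the roots and look at their absolute values. The coefficient list and the use of `np.roots` here are my own setup, not part of the problem:

```python
import numpy as np

# Coefficients of z^4 + 0*z^3 + i*z^2 + 0*z + 2, highest degree first
coeffs = [1, 0, 1j, 0, 2]

roots = np.roots(coeffs)
moduli = sorted(abs(r) for r in roots)

# Expect two moduli near 1 and two near sqrt(2)
print(moduli)
```

Running this shows two roots of modulus 1 and two of modulus sqrt(2) ≈ 1.41421, consistent with what the problem asks us to prove.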