SrEstroncio
Homework Statement
Let \lambda_0 \in \mathbb{C} be an eigenvalue of the n \times n matrix A with algebraic multiplicity m; that is, \lambda_0 is a zero of order m of \det(A - \lambda I). Consider the perturbed matrix A + \epsilon B, where |\epsilon| \ll 1 and B is an arbitrary n \times n matrix.
Show that for any given \delta \gt 0 there exists \alpha \gt 0 such that, for |\epsilon| \lt \alpha, the matrix A + \epsilon B has exactly m eigenvalues (counted with algebraic multiplicity) inside |z - \lambda_0| \lt \delta.
Homework Equations
Rouché's theorem states that if f and g are holomorphic in an open region and |g(z)| \lt |f(z)| on a closed curve (suitable for integration) inside that region, then f and f + g have exactly the same number of zeros (counted with multiplicity) inside the curve.
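If I understand the TA's hint correctly (this identification is my own guess, not something given in the problem), the natural choice would be

f(z) = \det(A - zI), \qquad g(z) = \det(A + \epsilon B - zI) - \det(A - zI),

so that f + g = \det(A + \epsilon B - zI), and the problem reduces to showing |g(z)| \lt |f(z)| on the circle |z - \lambda_0| = \delta, assuming \delta is small enough that \lambda_0 is the only eigenvalue of A in the closed disk |z - \lambda_0| \le \delta.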
The Attempt at a Solution
I first tried expanding the determinant in a power series and applying the integral that counts the number of zeros inside a region (the argument principle, written out below), but I got stuck with several terms I couldn't get rid of. The TA told me to apply Rouché's theorem, but I can't figure out how to extract the perturbation out of the determinant, i.e. how to bound the difference g(z) above on the circle.
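For reference, the zero-counting integral I was trying to use is the argument principle (valid as long as no eigenvalue of A + \epsilon B lies on the circle itself):

N(\epsilon) = \frac{1}{2\pi i} \oint_{|z - \lambda_0| = \delta} \frac{\frac{d}{dz} \det(A + \epsilon B - zI)}{\det(A + \epsilon B - zI)} \, dz.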
Any ideas? Any help would be appreciated.
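For what it's worth, here is a quick numerical sanity check of the statement (the matrices and the values of \lambda_0, m, \delta are made up for illustration, and this of course proves nothing):

import numpy as np

# Toy check: A has eigenvalue lam0 = 2 with algebraic multiplicity m = 2
# (a 2x2 Jordan block), plus a simple eigenvalue at 5.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])
lam0, m, delta = 2.0, 2, 0.5

rng = np.random.default_rng(0)
B = rng.standard_normal((3, 3))  # an arbitrary perturbation

for eps in [1e-2, 1e-4, 1e-6]:
    eig = np.linalg.eigvals(A + eps * B)
    inside = np.sum(np.abs(eig - lam0) < delta)
    # For eps small enough, exactly m eigenvalues should lie in the disk.
    print(f"eps = {eps:.0e}: {inside} eigenvalue(s) in |z - lam0| < delta")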