Complex Analysis

  1. Mar 26, 2005 #1

    Gza


    I was just wondering what exactly complex analysis is, and what types of applications its study can be applied to. By the way, an excellent discussion/class on differential forms is taking place here; if anyone would be interested in starting a similar type of forum on complex analysis, that would seem like a good idea. Which brings up another point, which I should really post in Feedback, about making a section that would act as a sort of collection of virtual classrooms, where one can read the posts, get help from knowledgeable people on the topics covered within the "classrooms," and get notes on an agreed-upon text/reference material. Okay, I'm rambling now, time to get some sleep. Any input is greatly appreciated, thanks.
     
  3. Mar 26, 2005 #2

    matt grime

    Science Advisor
    Homework Helper

    Complex analysis is the study of complex differentiable functions of complex variables. This isn't quite the same as studying real differentiable functions even of many variables (the complex numbers are after all just like R^2).

    One can, though I won't in case I get it wrong, examine what f'(z) ought to be, as a function of a complex number z, in terms of limits, and then consider f as a function of two real variables by writing z = x + iy and examining

    f(x,y) = u(x,y)+iv(x,y)

    where u and v are real valued functions of two variables.

    From this we see that we are interested in functions that satisfy the Cauchy-Riemann equations:

    [tex]\frac{\partial u}{\partial x} = \frac{\partial v}{\partial y}[/tex]

    and

    [tex]\frac{ \partial v}{\partial x} = -\frac{\partial u}{\partial y}[/tex]

    We call such functions analytic (others call them holomorphic).

    This simple restriction has amazing implications.

    Firstly, note that if you take partial derivatives of both sides of those equations you'll find that both u and v are harmonic functions, and that is where the real-world use of complex analysis comes in: much of the time we seek harmonic solutions to physical models.
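    Spelling out that step (assuming the second partials are continuous, so mixed partials commute): differentiate the first Cauchy-Riemann equation with respect to x and the second with respect to y to get

    [tex]\frac{\partial^2 u}{\partial x^2} = \frac{\partial^2 v}{\partial x \partial y} = -\frac{\partial^2 u}{\partial y^2}, \quad\text{so}\quad \frac{\partial^2 u}{\partial x^2} + \frac{\partial^2 u}{\partial y^2} = 0,[/tex]

    and the same argument with the roles of the two equations swapped shows v is harmonic too.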

    Secondly, there is the conformal mapping theorem (this is the deep end, but it lets us explain the uses). Roughly, it implies that if we want to solve the equations of fluid flow through a pipe with an odd cross-section, say one whose bottom has a step in it, then we need only take the trivial solution for flow through a smooth pipe and apply a conformal transformation of the domain to get the solution in the other shape. This is a far from trivial theorem.
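    A tiny illustration of that idea (my example, not part of the original post): uniform flow in the upper half-plane has complex potential W(w) = w, and the conformal map w = z^2 sends the right-angled corner (the first quadrant) onto the half-plane, so the corner flow has potential

    [tex]W(z^2) = z^2 = (x^2 - y^2) + 2ixy,[/tex]

    whose imaginary part 2xy is the stream function; the streamlines xy = const are hyperbolas hugging the corner walls.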

    Less trivial facts: analytic functions aren't just differentiable but infinitely differentiable, and if two analytic functions agree on a set of points with a limit point in a connected domain, they agree everywhere. This is a consequence, I seem to recall, of Liouville's theorem: any analytic function that is bounded on the whole plane is constant.

    Complex analysis is a rich and wonderful source of problems and theorems, and really makes ordinary analysis look like the dull handmaiden it is.


    I suppose I ought to provide examples of analytic functions: all polynomials, trig functions, hyperbolic trig, exponentials, the usual suspects.

    And here is how to use them to produce harmonic functions and check they are analytic:

    z^2 = (x+iy)^2 = x^2 - y^2 + 2ixy, so u(x,y) = x^2 - y^2 and v(x,y) = 2xy.

    It is easy to check that the Cauchy-Riemann equations hold, and hence that x^2 - y^2 is harmonic, and so is xy.
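    A quick symbolic check of that example (a minimal sketch using sympy; not part of the original post):

[code]
from sympy import symbols, diff, re, im, simplify, expand, I

x, y = symbols('x y', real=True)
f = expand((x + I*y)**2)      # the example above: z^2

u = re(f)                     # u(x, y) = x^2 - y^2
v = im(f)                     # v(x, y) = 2*x*y

# Cauchy-Riemann equations: u_x = v_y and v_x = -u_y
print(simplify(diff(u, x) - diff(v, y)))        # 0
print(simplify(diff(v, x) + diff(u, y)))        # 0

# u and v are harmonic: their Laplacians vanish
print(simplify(diff(u, x, 2) + diff(u, y, 2)))  # 0
print(simplify(diff(v, x, 2) + diff(v, y, 2)))  # 0
[/code]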
     
    Last edited: Mar 26, 2005
  4. Mar 26, 2005 #3

    shmoe

    Science Advisor
    Homework Helper

    Complex integration is very powerful. If you have an analytic function on a disc, then its integral around any closed curve inside the disc is zero. There's also Cauchy's integral formula, which says that if f is analytic on an open disc then

    [tex]f(z)=\frac{1}{2\pi i}\int \frac{f(s)}{s-z}ds[/tex]

    where the integral is taken counterclockwise around any simple closed curve in the disc that encloses z. In other words, we can evaluate the function inside a region by an integral over its boundary. Cauchy's integral formula gives a different representation of our analytic function. It can be shown that this integral can be differentiated with respect to z as many times as you like, and the derivative can 'pass through' the integral:

    [tex]f^{(n)}(z)=\frac{n!}{2\pi i}\int \frac{f(s)}{(s-z)^{n+1}}ds[/tex]

    showing that an analytic function is infinitely differentiable (as matt mentioned). If we assume f is analytic on the entire plane and is bounded, say by M, then the above gives:

    [tex]|f^{(1)}(z)|\leq\frac{1!}{2\pi}\int \frac{|f(s)|}{|s-z|^{2}}\,|ds|\leq \frac{M}{R}[/tex]

    where we've taken the circle of radius R about the point z as our contour of integration. Since the bound on f is assumed to hold everywhere, we can take our circle as large as we like, and we must have [tex]f^{(1)}(z)=0[/tex] for all z. So f's derivative is zero everywhere, and f must be constant. This is Liouville's theorem.
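    As a sanity check on Cauchy's formula, here's a small numerical experiment (my sketch, not part of the original post; it assumes f(z) = e^z and a unit-circle contour) that approximates the contour integral with a Riemann sum and compares it with the function value:

[code]
import numpy as np

def cauchy_integral(f, z, center=0.0, radius=1.0, n=2000):
    """Approximate (1/(2*pi*i)) * integral of f(s)/(s - z) ds over the
    circle |s - center| = radius, traversed counterclockwise."""
    t = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    s = center + radius * np.exp(1j * t)                   # contour points
    ds = 1j * radius * np.exp(1j * t) * (2.0 * np.pi / n)  # s'(t) dt
    return np.sum(f(s) / (s - z) * ds) / (2j * np.pi)

z = 0.3 + 0.2j                     # a point inside the unit circle
print(cauchy_integral(np.exp, z))  # close to exp(0.3 + 0.2j)
print(np.exp(z))
[/code]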

    A nice consequence of Liouville's theorem is the fundamental theorem of algebra: every non-constant polynomial f(z) has a zero. If f had no zeros, then 1/f(z) would be analytic on the whole plane, and it would be bounded, since it -> 0 as z -> infinity. Liouville's theorem would then say 1/f(z), and hence f, is constant, and we get a contradiction.

    Also, unlike the real case, if you know that f is analytic in a disc centered at a, then its Taylor series converges to f on this entire disc. So in some sense analytic functions are really just 'infinite polynomials'. If you have an analytic function on the whole plane that doesn't grow 'too fast' as z -> infinity, then you can even express it nicely as a product over its zeros (even if there are infinitely many of them). For example:

    [tex]\sin \pi z=\pi z\prod_{n=1}^{\infty}(1-z^2/n^2)[/tex]
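    Just to illustrate (my check, not part of the original post), truncating that product at a few thousand factors already approximates sin(pi z) well:

[code]
import numpy as np

def sin_product(z, terms=2000):
    """Truncation of the product  sin(pi z) = pi z * prod(1 - z^2/n^2)."""
    n = np.arange(1, terms + 1)
    return np.pi * z * np.prod(1.0 - z**2 / n**2)

z = 0.37 + 0.5j
print(sin_product(z))        # partial product
print(np.sin(np.pi * z))     # the limit as terms -> infinity
[/code]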

    Now, if you have an analytic function that's zero on a set of points with a limit point inside the domain, look at its Taylor series at that limit point. Analytic implies continuous, so the function is zero at the limit point. Likewise its first derivative will be zero, and in fact all derivatives will be zero. Therefore your function must be identically zero (at least on a connected domain), since all the coefficients in the Taylor series are zero. This is equivalent to matt's statement about two functions that agree on such a set being equal everywhere. I'm not sure this can be deduced from Liouville's bounded theorem, though.


    This was all pretty random and glossed over a hundred or so pages worth of details. Maybe later I'll say something about the glory of residues and their many uses. A classroom sort of thread is an interesting idea, but it sounds like piles of work.
     
    Last edited: Mar 26, 2005
  5. Mar 26, 2005 #4

    matt grime

    Science Advisor
    Homework Helper

    Liouville's theorem also has another method of proof, via the maximum modulus principle: the modulus of a non-constant analytic function on a bounded region attains its maximum on the boundary. Somewhere in the back of my mind this rings a bell for showing that functions with the same Taylor series (in some domain, i.e. locally equivalent) are globally equivalent, though I can't recall the details and I'm almost certainly wrong. I haven't looked at complex analysis for some years (5, I think). And I'm not necessarily thinking of functions that are defined on the whole of C, just on some domain, such as log, or, if you like, functions defined on the extended complex plane.
     
  6. Mar 26, 2005 #5

    Hurkyl

    Staff Emeritus
    Science Advisor
    Gold Member

    You need to specify the domain too: consider two different branch cuts of z^(1/2).

    I think the theorem is this:

    If two analytic functions are defined on an open connected set, and there exists a point in that set at which their Taylor series are equal, then the two functions are equal on the entire set.
     
  7. Mar 26, 2005 #6

    mathwonk

    Science Advisor
    Homework Helper
    2015 Award

    Another of the many amazing uses of complex analysis is its application to number theory, via Dirichlet series. One can apply the incredible principle of analytic continuation Hurkyl mentioned above to prove Dirichlet's theorem on primes in arithmetic progressions. Here are notes from a course I taught years ago.

    433/633: Summary of proof of Dirichlet's theorem
    Introduction: Except for 2 and 5, all primes end in 1, 3, 7 or 9. We know there are infinitely many primes, and we could ask if there are infinitely many that end in each of those four digits. This can be phrased as: are there infinitely many primes p such that p is congruent to a (mod 10), for each a with gcd(a,10) = 1?

    The answer is yes, and the idea for the proof is to show a prime is "equally likely" to have one ending as another, i.e. that given any choice a among the four numbers 1,3,7,9, the proportion of all primes ≤ n, and ending in a, approaches 1/4 as n --> infinity. The actual proof grows out of generalizing the fact that Summation 1/p, summed over all primes p, diverges.

    A fancy way to say this is that g(s) = Summation over primes p of 1/p^s is asymptotic to log(1/(s-1));

    i.e., as s --> 1+, g(s) approaches infinity like log(1/(s-1)). Recall this gives a proof that there are infinitely many primes, since otherwise Summation of 1/p^s, being a finite sum of exponential functions, would be finite everywhere, hence also at s = 1.

    Suppose A is a subset of the primes consisting, say, of "half" of all primes in some sense. Then we might expect that the sum gA(s) = Sum over A of 1/p^s would only go to infinity "half as fast" as the full sum of 1/p^s, i.e. we might expect that gA(s) is asymptotic to (1/2) log(1/(s-1)).

    Let's turn this intuition around and make this a definition:

    I.e., A consists of "half" of all primes if the quotient gA(s)/[(1/2) log(1/(s-1))] approaches 1 as s --> 1+, and we write this as: gA(s) is asymptotic to (1/2) log(1/(s-1)).

    More generally if A is any subset of primes, we say that A has density k, where 0 ≤ k ≤ 1, if Sum over A of 1/p^s is asymptotic to k log(1/(s-1)).

    Easy Remark: If density(A) > 0, then A is infinite.

    Dirichlet's theorem: Given m ≥ 2, and a with gcd(a,m) = 1,

    if Pa = {p : p congruent to a (mod m)},

    then Density(Pa) = 1/phi(m) > 0, where phi(m) is the number of positive integers less than m and relatively prime to m. In particular, there are infinitely many primes congruent to a mod m.

    The proof method is roughly to apply analytic continuation to the function Summation of 1/p^s, viewed as a function of the complex exponent s!!!

    Were these guys not geniuses?
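    To make the introductory claim concrete, here is a small numerical sketch (mine, not part of the notes) that counts primes below a bound by their last digit; each proportion comes out near 1/4, matching density 1/phi(10) = 1/4:

[code]
from collections import Counter
from sympy import primerange

bound = 200_000
# ignore 2, 3 and 5; every larger prime ends in 1, 3, 7 or 9
counts = Counter(p % 10 for p in primerange(2, bound) if p > 5)
total = sum(counts.values())
for digit in (1, 3, 7, 9):
    print(digit, round(counts[digit] / total, 4))  # each is close to 0.25
[/code]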
     
    Last edited: Mar 26, 2005
  8. Apr 1, 2005 #7
    The coolest thing I ever saw in complex analysis was solving real integrals with residues. It makes solving an impossible-looking integral pretty routine.
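    A standard illustration (my example, not from the post above): closing the contour in the upper half-plane and picking up the residue at z = i gives

    [tex]\int_{-\infty}^{\infty}\frac{dx}{1+x^2} = 2\pi i\,\mathrm{Res}_{z=i}\frac{1}{1+z^2} = 2\pi i\cdot\frac{1}{2i} = \pi,[/tex]

    since the contribution from the large semicircular arc vanishes as its radius goes to infinity.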
     