Analytic functions on a simply connected region (complex analysis)

Homework Help Overview

The problem involves analytic functions f and g defined on a simply connected domain Ω, with the condition that f²(z) + g²(z) = 1 for all z in Ω. The goal is to demonstrate the existence of an analytic function h such that f(z) = cos(h(z)) and g(z) = sin(h(z)).

Discussion Character

  • Exploratory, Conceptual clarification, Mathematical reasoning

Approaches and Questions Raised

  • Participants discuss the implications of the equation f² + g² = 1 and explore the definition of H as a composition of analytic functions. There is an attempt to connect the definition of H to the existence of an antiderivative h. Some participants express uncertainty about how to show the relationships between f, g, and h without circular reasoning.

Discussion Status

Some participants have suggested using a theorem regarding analytic and non-zero functions in simply connected domains to simplify the problem. Others are exploring the implications of defining h in terms of H and the exponential function, indicating a productive direction in the discussion.

Contextual Notes

There is mention of constraints related to the continuity of the logarithm function and the avoidance of zero in the domain, which are relevant to the problem's setup.

lonewolf5999
Here's the problem:
Let f and g be analytic functions on a simply connected domain Ω such that f²(z) + g²(z) = 1 for all z in Ω. Show that there exists an analytic function h such that f(z) = cos(h(z)) and g(z) = sin(h(z)) for all z in Ω.

Here's my attempt at a solution:
f² + g² = 1 on Ω, so (f+ig)(f-ig) = 1 on Ω, so neither of those factors is ever 0 on Ω. Then, defining H = (1/i)(f' + ig')/(f+ig), H is built from analytic functions (derivatives, sums, and a quotient with nonvanishing denominator), and hence is analytic on Ω, a simply connected domain. Therefore H has an antiderivative h on Ω.
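As a quick sanity check of this definition of H (a toy illustration, not part of the argument: the choice f = cos z, g = sin z is an assumption made for the check, and derivatives are approximated by central differences):

```python
import cmath

def H(z, eps=1e-6):
    """H = (1/i)(f' + i g')/(f + i g), with the illustrative choice
    f = cos, g = sin; f' and g' approximated by central differences."""
    fp = (cmath.cos(z + eps) - cmath.cos(z - eps)) / (2 * eps)
    gp = (cmath.sin(z + eps) - cmath.sin(z - eps)) / (2 * eps)
    return (fp + 1j * gp) / (1j * (cmath.cos(z) + 1j * cmath.sin(z)))

# For this pair, H should be identically 1, so its antiderivative is
# h(z) = z (up to a constant), and indeed f = cos h, g = sin h.
for z in [0.3, 1 + 2j, -0.5 + 0.1j]:
    print(abs(H(z) - 1) < 1e-5)  # prints True at each sample point
```

Analytically, f' + ig' = -sin z + i cos z = i(cos z + i sin z), so the quotient cancels to 1 exactly.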

Now I'd like to argue that cos(h(z)) = f(z) and similarly for g(z), but it's not clear to me how to do this. If we integrate H directly (I know this is not allowed, but supposing it is), we get something like h = (1/i) log(f+ig), so that exp(ih) = exp(log(f+ig)) = f+ig; on the other hand, exp(ih) = cos(h) + i sin(h), which is more or less what I want. I've checked that if cos(h(z)) = f(z), then the other equality follows, and conversely as well, so my problem is in showing that one of those equalities holds without using the other.

I'd really appreciate any help on how to proceed from my definition of H to argue that its antiderivative h satisfies f(z) = cos (h(z)) and g(z) = sin(h(z)). Thanks in advance!
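One standard way to close this gap without integrating log(f+ig) directly (a sketch using only the definitions above): differentiate (f+ig)e^{-ih} and use h' = H.

```latex
\frac{d}{dz}\Bigl((f+ig)\,e^{-ih}\Bigr)
  = e^{-ih}\bigl((f'+ig') - ih'(f+ig)\bigr)
  = e^{-ih}\bigl((f'+ig') - iH\,(f+ig)\bigr)
  = 0,
```

since iH(f+ig) = f' + ig' by the definition of H. So (f+ig)e^{-ih} equals a nonzero constant c on the connected domain Ω. Writing c = e^{iβ} for some complex β and replacing h by h + β (still an antiderivative of H) gives f + ig = e^{ih}; then e^{-ih} = 1/(f+ig) = f - ig, and combining the two exponentials recovers cos h = f and sin h = g.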
 
lonewolf5999 said:
I'd really appreciate any help on how to proceed from my definition of H to argue that its antiderivative h satisfies f(z) = cos(h(z)) and g(z) = sin(h(z)).

There is a theorem that if a(z) is analytic and nonzero on a simply connected domain, then there is an analytic function b(z) such that a(z) = exp(b(z)). That's what you're looking for, isn't it?
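A sketch of why that theorem holds, using the same antiderivative idea already in play: since a is analytic and nonvanishing on the simply connected domain Ω, the function a'/a is analytic there and has an antiderivative b. Then

```latex
\frac{d}{dz}\Bigl(a(z)\,e^{-b(z)}\Bigr)
  = e^{-b(z)}\bigl(a'(z) - a(z)\,b'(z)\bigr)
  = e^{-b(z)}\Bigl(a'(z) - a(z)\,\frac{a'(z)}{a(z)}\Bigr)
  = 0,
```

so a e^{-b} is a nonzero constant c on the connected Ω; replacing b by b + w, where w is any fixed complex number with e^{w} = c, gives a = e^{b}.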
 
I can't find that theorem in my book, but if I use it, the answer comes out very easily: since f+ig is analytic and non-zero on Ω, let H be an analytic function such that f+ig = exp(H), then define h = H/i = -iH (so that exp(ih) = f+ig), and I can show that f = cos(h), g = sin(h). I guess I'll just prove it for my problem. Thanks for the help!
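Spelled out, with h chosen so that e^{ih} = f + ig (which also forces e^{-ih} = 1/(f+ig) = f - ig, since (f+ig)(f-ig) = 1):

```latex
\cos h = \frac{e^{ih} + e^{-ih}}{2} = \frac{(f+ig) + (f-ig)}{2} = f,
\qquad
\sin h = \frac{e^{ih} - e^{-ih}}{2i} = \frac{(f+ig) - (f-ig)}{2i} = g.
```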
 
lonewolf5999 said:
I can't find that theorem in my book, but if I use it, the answer comes out very easily...

It's really pretty easy to see if you know how the log function works. As long as your domain stays away from zero and doesn't contain any loops around zero, you can define a continuous log function on the domain.
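That continuous branch can be sketched numerically (a toy illustration, not the analytic statement: the function `log_along_path` and the semicircular sample path are made up for this example). The idea is to continue log along a path of nonzero points by always choosing the branch value closest to the previous one:

```python
import cmath

def log_along_path(points):
    """Continue log along a path of nonzero complex points:
    at each step, adjust the principal value by a multiple of 2*pi*i
    so the result stays close to the previous value (continuity)."""
    w = cmath.log(points[0])
    values = [w]
    for z in points[1:]:
        w0 = cmath.log(z)  # principal branch
        k = round((w.imag - w0.imag) / (2 * cmath.pi))
        w = w0 + 2j * cmath.pi * k
        values.append(w)
    return values

# Upper semicircle from 1 to -1, staying away from 0 and not looping around it:
path = [cmath.exp(1j * k * cmath.pi / 10) for k in range(11)]
print(log_along_path(path)[-1])  # close to i*pi, the continuous value of log(-1)
```

On a domain with a loop around zero this recipe would come back shifted by 2πi after one circuit, which is exactly why the "no loops around zero" condition matters.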
 
