# Region of convergence of a Laplace transform

In summary, the one-sided Laplace transform of a function is defined as the integral, from 0 to infinity, of the function multiplied by a decaying complex exponential. The region of convergence of a Laplace transform is a right half-plane where the real part of the complex variable exceeds a certain value. In many cases the transform can be extended to a larger region by analytic continuation. For the Laplace transform of 1, which is 1/s, the transform exists at s=i because 1/s is analytic everywhere in the complex plane except at s=0; indeed, 1/s has a Taylor series about i that converges for |s-i|<1. In basic electronics problems the Laplace transform is often a rational function, making it trivially its own analytic continuation into the entire complex plane except at its poles.

#### mjtsquared

If a Laplace transform has a region of convergence starting at Re(s)=0, does the Laplace transform evaluated on the imaginary axis exist? For example, the Laplace transform of 1 is 1/s. Does this Laplace transform exist at, say, s=i?

I know this thread is old, but in case you are still interested, here is an answer.

I am assuming you are using the single-sided Laplace transform, with the standard definition
$$F(s) = \int_0^\infty f(t) e^{-s t} dt$$
so the region of convergence is a right half-plane ##\Re(s)>s_0## (where the notation ##\Re(s)## indicates the real part of ##s##). This means that ##F(s)## is analytic in that half-plane, and in general is not defined outside of it. However, in many cases ##F(s)## can be extended so that it is analytic in a larger region of the complex plane (this is called analytic continuation). In your example you have ##F(s) = 1/s## with ##\Re(s)>0##. Since ##1/s## is analytic everywhere in the complex plane except at ##s=0##, ##1/s## is (trivially) the analytic continuation of ##F(s)## into the entire plane except the origin. So yes, in your example ##1/s## is defined at ##s=i##.
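If you want to check this concretely, here is a small sketch using SymPy (assuming it is available): ##\texttt{laplace\_transform}## returns the transform along with the abscissa of convergence, and substituting ##s=i## into ##1/s## gives the analytically continued value ##-i##.

```python
import sympy as sp

t, s = sp.symbols('t s')

# One-sided Laplace transform of f(t) = 1; SymPy also returns the
# abscissa of convergence s0, i.e. the transform converges for Re(s) > s0.
F, s0, cond = sp.laplace_transform(sp.Integer(1), t, s)

print(F)                 # the transform, 1/s
print(s0)                # abscissa of convergence, 0
print(F.subs(s, sp.I))   # value of the continuation at s = i
```

The integral itself does not converge at ##s=i##; what the last line evaluates is the closed-form expression ##1/s##, which is the analytic continuation.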

If the term analytic is new to you, a function ##F(s)## is called analytic at a point if it has a convergent Taylor series about that point. In your example, ##1/s## has a Taylor series about ##i##: ##F(s) = -i + (s-i) + i (s-i)^2 + \ldots##, which converges for ##|s-i|<1##.

Note that in some cases the process of analytic continuation is much more complicated. However, in the types of problems you are likely to find in basic electronics, ##F(s)## is often a rational function, and so is trivially its own analytic continuation into the entire complex plane except at the poles of ##F(s)##.
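To illustrate the rational-function case (again a SymPy sketch; the exponent ##-2## is just an arbitrary example), the transform of a decaying exponential is rational, and its poles can be read off from the denominator:

```python
import sympy as sp

t, s = sp.symbols('t s')

# Transform of exp(-2t): a rational function, 1/(s + 2),
# with abscissa of convergence s0 = -2.
F, s0, cond = sp.laplace_transform(sp.exp(-2*t), t, s)

# The transform is defined everywhere except at its poles.
poles = sp.solve(sp.denom(sp.together(F)), s)
print(F, s0, poles)
```

The closed form ##1/(s+2)## is valid on the whole imaginary axis and everywhere else except the single pole at ##s=-2##.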

Jason