How to abstractly prove a Laplace transform identity?

Eclair_de_XII

Homework Statement


"Suppose that ##F(s) = L[f(t)]## exists for ##s > a ≥ 0##.
(a) Show that if c is a positive constant, then
##L[f(ct)]=\frac{1}{c}F(\frac{s}{c})##"

Homework Equations


##L[f(t)]=\int_0^\infty f(t)e^{-st}dt##

The Attempt at a Solution


##L[f(ct)]=\int_0^\infty f(ct)e^{-st}dt##

D: ##e^{-st}## I: ##f(ct)##
D: ##-e^{-st}## I: ##\frac{1}{c}F(ct)##
I: ##F(ct)##

##L[f(ct)]=\int_0^\infty f(ct)e^{-st}dt=(\frac{1}{c}e^{-st}F(ct))+\frac{1}{c}\int_0^\infty e^{-st}F(ct)dt##

D: ##F(ct)## I: ##e^{-st}##
D: ##cf(ct)## I: ##-\frac{1}{s}e^{-st}##

##L[f(ct)]=\int_0^\infty f(t)e^{-st}dt=(\frac{1}{c}e^{-st}F(t))+\frac{1}{c}(-\frac{1}{s}e^{-st}F(ct))+\frac{c}{s}\int_0^\infty e^{-st}f(ct)dt##
##\frac{s-c}{s}\int_0^\infty f(ct)e^{-st}dt=(\frac{1}{c}e^{-st}F(ct))+(-\frac{1}{cs}e^{-st}F(ct))##
##L[f(ct)]=\frac{s}{s-c}(\frac{1}{c}(e^{-st}F(ct)-\frac{1}{s}e^{-st}F(ct))=\frac{s}{s-c}(e^{-st}F(ct))(\frac{1}{c}-\frac{1}{s})##
##L[f(ct)]=\frac{s}{s-c}(e^{-st}F(ct))(\frac{s}{cs}-\frac{c}{cs})=\frac{s}{c}e^{-st}F(ct)##

Can anyone tell me what it is I'm doing wrong?
 
Try ##\int_0^\infty f(ct)e^{-st}\,dt = \int_0^\infty f(ct)e^{-(s/c)\cdot ct}\,dt = \frac{1}{c}\int_0^\infty f(h)e^{-(s/c)h}\,dh##, where ##h=ct##.

Note: If c were negative, you would have to account for the reversal of the limits of integration. Since it was stated that c > 0, that is not a concern here.
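The scaling identity in the hint can also be sanity-checked numerically. Below is a minimal sketch in Python; the choices ##f(t)=e^{-t}## (so ##F(s)=\frac{1}{s+1}##), ##c=3##, ##s=2##, and the trapezoid quadrature routine are illustrative assumptions, not part of the thread.

```python
import math

def laplace(f, s, T=50.0, n=200000):
    """Approximate L[f](s) = integral of f(t) e^{-st} from 0 to T (trapezoid rule)."""
    h = T / n
    total = 0.5 * (f(0.0) + f(T) * math.exp(-s * T))
    for k in range(1, n):
        t = k * h
        total += f(t) * math.exp(-s * t)
    return total * h

c, s = 3.0, 2.0
f = lambda t: math.exp(-t)  # assumed example; F(s) = 1/(s+1)

lhs = laplace(lambda t: f(c * t), s)   # L[f(ct)](s)
rhs = (1.0 / c) * laplace(f, s / c)    # (1/c) F(s/c)

print(abs(lhs - rhs) < 1e-6)  # True
```

Both sides come out to ##\frac{1}{s+c} = 0.2## here, since ##L[e^{-ct}](s)=\frac{1}{s+c}##.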
 
This is a straightforward consequence of integration by substitution. First, to avoid confusion, define ##g(t) = f(ct)##. Then ##L[g(t)] = G(s) = \int_0^\infty g(t)e^{-st}\,dt = \int_0^\infty f(ct)e^{-st}\,dt##. Now set ##ct = u## and compare to ##F(p) = \int_0^\infty f(u)e^{-pu}\,du##.
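Carrying that substitution one step further (a brief completion, using the same symbols as above): with ##u = ct## we have ##dt = \frac{du}{c}##, and since ##c > 0## the limits ##0## and ##\infty## are unchanged, so

##G(s) = \int_0^\infty f(u)\,e^{-(s/c)u}\,\frac{du}{c} = \frac{1}{c}\int_0^\infty f(u)e^{-(s/c)u}\,du = \frac{1}{c}F\!\left(\frac{s}{c}\right)##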
 
Thanks for the help, everyone. I couldn't have finished the homework without it.
 