# Range of wavelengths from a laser pulse?

## Homework Statement

"A 1 fs pulse of laser light would be 0.3 um long. What is the range of wavelengths in a 0.3 um long pulse of (approximately) 600nm laser light?"

## Homework Equations

$$\Delta \omega \, \Delta t \geq \frac{1}{2}$$

$$c = \lambda f$$

## The Attempt at a Solution

I replaced (delta omega) with 2*Pi*(Delta F), which I then replaced with 2*Pi*c/(Delta Lambda).

Then, solving for (Delta Lambda), I got:

Delta lambda = 4*Pi*c*(delta T).

This gives me 3.77*10^(-6) m, which is 3.77 um.

However, the book claims the answer is 95 nm. I'm not seeing what I did wrong. Is this completely the wrong approach?
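For reference, the arithmetic in the attempt above can be checked numerically. This is just a sanity check of the expression Delta lambda = 4*Pi*c*(delta T), assuming c = 3×10^8 m/s and delta T = 1 fs:

```python
import math

c = 3.0e8    # speed of light, m/s (assumed value)
dt = 1e-15   # pulse duration: 1 fs

# Evaluate the attempt's result: Delta lambda = 4*pi*c*dt
dlam = 4 * math.pi * c * dt
print(dlam)  # ≈ 3.77e-6 m, i.e. 3.77 um, matching the number quoted above
```

So the 3.77 um figure does follow from that formula; the issue is in the formula itself, not the arithmetic.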

fzero
You need to use some calculus to relate the variations. If

$$f = \frac{c}{\lambda},$$

then

$$\delta f = - \frac{c}{\lambda^2} \delta \lambda.$$

You can deal with the minus sign by computing absolute values.

Hmm, so I gave that a try:

$$\Delta \omega \Delta t \geq \frac{1}{2}$$

$$\delta \omega = 2 \pi \, \delta f$$

$$\delta \omega = 2 \pi \frac{-c}{\lambda^2} \delta \lambda$$

Then, solving that for δλ and plugging everything in, I get about 0.95 um, which is an order of magnitude higher than what I wanted. Did I miss a power of 10 somewhere, or is that just a coincidence?
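Putting the pieces together, Δω Δt ≥ 1/2 with Δω = (2πc/λ²) Δλ gives Δλ ≥ λ²/(4πc Δt). Evaluating this numerically (assuming λ = 600 nm, Δt = 1 fs, c = 3×10^8 m/s) is a quick way to check against the book's 95 nm:

```python
import math

c = 3.0e8     # speed of light, m/s (assumed value)
lam = 600e-9  # central wavelength: 600 nm
dt = 1e-15    # pulse duration: 1 fs

# Minimum wavelength spread: Delta lambda = lam^2 / (4*pi*c*dt)
dlam = lam**2 / (4 * math.pi * c * dt)
print(dlam)   # ≈ 9.5e-8 m = 95 nm
```

This matches the book's answer, which suggests the 0.95 um result came from a slipped decimal in the arithmetic rather than a wrong approach.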

fzero