## Homework Statement

"A 1 fs pulse of laser light would be 0.3 μm long. What is the range of wavelengths in a 0.3 μm long pulse of (approximately) 600 nm laser light?"

## Homework Equations

Δω · Δt ≥ 1/2

c = λf

## The Attempt at a Solution

I replaced Δω with 2π·Δf, which I then rewrote as 2πc/Δλ.

Then, solving for Δλ, I got:

Δλ = 4π·c·Δt

This gives 3.77×10⁻⁶ m, which is 3.77 μm.
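As a quick sanity check on the arithmetic only (not on whether the substitution itself is valid), the formula Δλ = 4π·c·Δt can be evaluated directly:

```python
import math

c = 3.0e8    # speed of light, m/s
dt = 1.0e-15 # pulse duration: 1 fs

# Delta lambda = 4*pi*c*dt, as derived above
d_lambda = 4 * math.pi * c * dt
print(d_lambda)  # about 3.77e-6 m, i.e. 3.77 um
```

So the numerical evaluation matches the derivation; any discrepancy with the book must come from the setup rather than the arithmetic.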

However, the book claims the answer is 95 nm. I'm not seeing what I did wrong. Is this completely the wrong approach?