## Multipath dispersion of a pulse of light in an optical fiber.

1. The problem statement, all variables and given/known data
Multipath dispersion of a pulse of light in an optical fiber.
How is this problem solved?

Multipath dispersion can be reduced by: 1) making the fibre core very narrow; 2) making the refractive index of the cladding very close to that of the core. The closer the two refractive indices, the better: any ray striking the core-cladding boundary at less than the critical angle is lost, so only light entering at close to the right angle, preferably travelling straight down the fibre, reaches the receiver.
 Why does making the core narrower reduce multipath dispersion?
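The index-matching point above can be made quantitative. Here is a minimal sketch (using hypothetical values for a step-index fibre: core index 1.48, cladding index 1.46, length 1 km) of the critical angle and the standard ray-optics estimate of the modal time spread, Δt ≈ (L·n₁/c)(n₁/n₂ − 1):

```python
import math

C = 3.0e8  # speed of light in vacuum, m/s (approximate)

def critical_angle(n_core, n_clad):
    """Angle to the boundary normal above which total internal reflection occurs."""
    return math.asin(n_clad / n_core)

def modal_time_spread(length, n_core, n_clad):
    """Arrival-time spread (s) between the axial ray and the steepest guided ray."""
    return (length * n_core / C) * (n_core / n_clad - 1.0)

# Hypothetical example values, not from the thread:
theta_c = critical_angle(1.48, 1.46)          # ~1.41 rad (~80.6 deg from the normal)
dt = modal_time_spread(1000.0, 1.48, 1.46)    # tens of nanoseconds over 1 km
```

Note that in this ray picture the spread depends only on the two indices and the length, which is why matching the cladding index to the core helps.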



At first I thought that it was obvious why reducing the width of the fibre would reduce the multipath dispersion. But having tried the problem with maths, I'm not so sure.

I've tried to calculate the maximum possible path difference along a fibre of length l and width d that has a maximum transmission angle of $$\theta$$ to the normal of the core-cladding boundary. I've come up with an expression for this path difference in terms of the length of the fibre... <Attached are my scribbles>

The problem with this formula is that the path difference becomes a larger fraction of the total length of the fibre as the width of the fibre gets smaller.

Can anyone tell me where my maths has gone wrong?
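For what it's worth, a quick numeric check of the zigzag geometry (my own sketch, assuming the ray crosses the core at a fixed angle θ to the boundary normal and ignoring the partial segments at either end) suggests the total path length works out to l/sin θ, with the width d cancelling entirely:

```python
import math

def zigzag_path_length(length, width, theta):
    """Total path of a ray bouncing down a fibre of the given length and core width.
    theta is measured from the normal to the core-cladding boundary.
    Each straight segment spans the core: segment = width / cos(theta),
    and advances width * tan(theta) along the fibre axis."""
    segment = width / math.cos(theta)
    axial_advance = width * math.tan(theta)
    n_segments = length / axial_advance  # treated as continuous for simplicity
    return n_segments * segment          # algebraically equals length / sin(theta)

theta = math.radians(80.0)               # hypothetical maximum transmission angle
p_wide = zigzag_path_length(1000.0, 50e-6, theta)   # 50 micron core
p_narrow = zigzag_path_length(1000.0, 5e-6, theta)  # 5 micron core
# p_wide and p_narrow agree: in pure ray optics the path difference
# is set by theta alone, not by the core width.
```

If that algebra is right, the geometric path difference really is independent of d, so the benefit of a narrow core would have to come from something the ray picture leaves out.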

Thanks