# Voltage Controlled Variable Delay?

1. May 17, 2004

### Pfft

How can I introduce a voltage-controlled variable delay in the path of a monochromatic light beam? That is, how can I phase modulate the light?

2. May 17, 2004

### faust9

So, are you just trying to turn a light on/off with a slight delay based on a voltage input? How accurate does this need to be? PICs, AVRs, H8s, and many other microcontrollers have built-in 8-bit ADCs which could easily be used as voltage-controlled delay circuits. A 12-series PIC (the tiny 8-pin 12C671) running on its internal oscillator could implement a simple delay using a handful (fewer than 5) of components.
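The ADC-to-delay mapping described above can be sketched in Python rather than PIC firmware, purely to illustrate the arithmetic; the linear mapping and the 1 ms full-scale delay are assumptions, not anything from the thread:

```python
def adc_to_delay_us(adc_code, max_delay_us=1000.0):
    """Map an 8-bit ADC reading (0-255) linearly to a delay in microseconds.

    max_delay_us is an assumed full-scale value for illustration.
    """
    if not 0 <= adc_code <= 255:
        raise ValueError("8-bit ADC codes run from 0 to 255")
    return adc_code / 255.0 * max_delay_us

# Zero input gives no delay; full-scale input gives the maximum delay.
print(adc_to_delay_us(0))    # 0.0
print(adc_to_delay_us(255))  # 1000.0
```

On a real PIC the same mapping would be a lookup table or a multiply feeding a timer compare register.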

3. May 17, 2004

### Pfft

The source of the light is a laser and will not be controlled. What I need is a device which will vary the velocity of the laser beam passing through it. That device will be voltage controlled.

Such a device might be a first-surface mirror made with a piezoelectric crystal such as quartz. As the voltage varies, the thickness of the mirror varies, and with it the path length of the reflected laser beam. The net effect would be a phase shift at the distant receiver. There are mechanical problems with such a solution, so I would prefer to identify some liquid crystal that varies the velocity of the light beam passing through it.
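The piezo-mirror idea above can be put into numbers. On reflection the path-length change is twice the mirror displacement, so the phase shift is 2π(2Δd)/λ. A minimal sketch, with a 633 nm HeNe wavelength assumed only as an example:

```python
import math

def reflection_phase_shift(displacement_m, wavelength_m=633e-9):
    """Phase shift from moving a mirror by displacement_m along the beam.

    Reflection doubles the path-length change, so
    delta_phi = 2 * pi * (2 * displacement) / wavelength.
    The default wavelength (633 nm, HeNe) is an illustrative assumption.
    """
    return 2 * math.pi * (2 * displacement_m) / wavelength_m

# Moving the mirror half a wavelength shifts the phase by one full cycle.
shift = reflection_phase_shift(633e-9 / 2)
print(shift / (2 * math.pi))  # ~1.0 (one full cycle)
```

This is why the mechanical tolerances are so tight: sub-wavelength mirror motion already produces a large fraction of a cycle of phase shift.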

Pfft

4. May 17, 2004

### enigma

Staff Emeritus
I'm pretty sure you can rig up a 555 timer to give a delay that varies with voltage.

5. May 17, 2004

### Averagesupernova

You want to phase modulate the actual frequency of the light? In other words, the color? Or is the laser light already pulsed and you want to vary the phase of the pulses mid-stream?

6. May 18, 2004

### Pfft

The frequency of the light wave, the color if you wish, will vary during the shift. Once the shift has been completed, the "color" would return to the original value. This is a direct parallel to phase modulation of a radio wave.

Think of it as two optical fibers passing through the room. You cut one fiber and insert the device. In the next room, the lights from the two fibers are compared and the phase difference is noted. The color change is only there while the phase is being shifted. Once shifted, if the shift equals one wavelength, it would be impossible to detect the change.
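The transient "color change" described above is just the instantaneous-frequency relation f_inst = f0 + (1/2π) dφ/dt: the frequency only deviates while the phase is ramping. A minimal numeric sketch (the carrier value and ramp time are assumptions for illustration):

```python
import math

def instantaneous_freq(f0_hz, phase_rate_rad_per_s):
    """Instantaneous frequency of a carrier whose phase is being ramped.

    f_inst = f0 + (1/2pi) * d(phi)/dt. While the phase is changing, the
    apparent frequency shifts; once the ramp stops, it returns to f0.
    """
    return f0_hz + phase_rate_rad_per_s / (2 * math.pi)

f0 = 4.74e14  # roughly a 633 nm HeNe line, used only as an example
# Sweeping through one full cycle (2*pi rad) in 1 microsecond...
ramp = instantaneous_freq(f0, 2 * math.pi / 1e-6)
print(ramp - f0)  # ~1e6 Hz shift, present only while the ramp lasts
# ...and no shift at all once the phase is constant again.
print(instantaneous_freq(f0, 0.0) - f0)  # 0.0
```

This is exactly the radio-wave phase-modulation parallel: a completed 2π shift is undetectable, but the sweep itself momentarily moves the frequency.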

7. May 18, 2004

### enigma

Staff Emeritus
Uhm... yeah... disregard my previous post. I misread what you were asking for.

8. May 18, 2004

### Averagesupernova

What is the frequency of modulation?

9. May 19, 2004

### Pfft

My face is red!

Ten megahertz for starters. (Sorry, I made a mistake in calculating the delay. Let me get some sleep, and I will run it through again.)
Incidentally, the light source is continuous, not pulsed.

Last edited: May 19, 2004
10. May 19, 2004

### Njorl

30 microseconds?!?!?!

That is nuts for light.

You'd need to physically switch light into different delay lines of optical fiber.

You could use a planar waveguide imaging splitter with an absorption modulator on each output. Hook up a variable length fiber to each output. Only allow light to pass through the channel with the appropriate delay. It would not be true phase modulation. It would have digitization error. You're not going to get that kind of variable delay electro-optically, or with any single channel, physical device.
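The switched-delay-line scheme above is inherently quantized: the best you can do is route the light to whichever fiber is closest to the requested delay. A minimal sketch of that selection and its digitization error (the channel count and 1 ns step are hypothetical values, not from the thread):

```python
def nearest_delay_channel(target_delay_s, channel_delays_s):
    """Pick the fiber delay line closest to the requested delay.

    Returns (channel_index, digitization_error_s), where the error is
    the difference between the chosen channel's delay and the target.
    """
    idx = min(range(len(channel_delays_s)),
              key=lambda i: abs(channel_delays_s[i] - target_delay_s))
    return idx, channel_delays_s[idx] - target_delay_s

# Eight channels stepped by 1 ns, purely as an example.
channels = [i * 1e-9 for i in range(8)]
idx, err = nearest_delay_channel(2.4e-9, channels)
print(idx, err)  # channel 2, about -0.4 ns of digitization error
```

Finer steps shrink the error but multiply the channel count, which is why this is not true continuous phase modulation.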

Just looking at your requirements (a 30-microsecond delay at 10 megahertz) makes me think you don't really know what you need.

Njorl

Last edited: May 19, 2004
11. May 19, 2004

### Pfft

I am very, very sorry for the error. The delay is not anything like 30 microseconds. I was horrified when I reviewed my message.