Voltage Controlled Variable Delay?

  1. How can I introduce a voltage controlled variable delay in the path of a mono-chromatic light beam? That is, how can I phase modulate the light?

    Thank you in advance for your input!
  3. So, are you just trying to turn a light on/off with a slight delay based on a voltage input? How accurate does this need to be? PICs, AVRs, H8s, and many other microcontrollers have built-in 8-bit ADCs which could easily be used as voltage-controlled delay circuits. A 12-series PIC (the tiny 8-pin 12C671) running on its internal oscillator could implement a simple delay with a handful (fewer than five) of external components.
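    The ADC-to-delay idea above can be sketched in Python (the 8-bit width and the 1 ms full-scale delay are illustrative assumptions, not values from the thread):

```python
MAX_DELAY_US = 1000.0  # assumed full-scale delay of 1 ms (illustrative)

def adc_to_delay_us(adc_value: int, bits: int = 8) -> float:
    """Map an ADC reading (0 .. 2**bits - 1) linearly to a delay in microseconds."""
    full_scale = (1 << bits) - 1
    if not 0 <= adc_value <= full_scale:
        raise ValueError("ADC reading out of range")
    return MAX_DELAY_US * adc_value / full_scale

# Full-scale input gives the full 1 ms delay; zero input gives no delay.
print(adc_to_delay_us(255))  # 1000.0
print(adc_to_delay_us(0))    # 0.0
```

    On a real PIC the same mapping would run as a timer loop rather than a function call, but the linear voltage-to-delay relationship is the whole trick.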
  4. The source of the light is a laser and will not be controlled. What I need is a device which will vary the velocity of the laser beam passing through. That device will be voltage controlled.

    Such a device might be a first-surface mirror made with a piezoelectric crystal such as quartz. As the voltage is varied, the thickness of the mirror will vary, and the path length of the reflected laser beam will vary with it. The net effect will be a phase shift at the distant receiver. There are mechanical problems with such a solution, and I would prefer to identify a liquid crystal which will vary the velocity of the light beam passing through it.
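    A quick sanity check on the piezo-mirror idea (the 633 nm HeNe wavelength is an illustrative choice; note that reflection doubles the path-length change):

```python
import math

def reflection_phase_shift(displacement_m: float, wavelength_m: float) -> float:
    """Phase shift (radians) produced by moving a mirror toward the beam
    by displacement_m; the reflected path shortens by twice that amount."""
    return 2 * math.pi * (2 * displacement_m) / wavelength_m

# Moving the mirror a quarter wavelength shifts the phase by a half cycle (pi).
wavelength = 633e-9  # HeNe laser line, assumed for illustration
print(reflection_phase_shift(wavelength / 4, wavelength) / math.pi)  # 1.0
```

    So the mirror only has to move a fraction of a micron per cycle of phase, which is well within piezo range; the mechanical problems are bandwidth and ringing, not travel.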

    Your thoughts would be appreciated.
  5. enigma


    I'm pretty sure you can rig up a 555 timer as a voltage-controlled variable delay.
  6. You want to phase modulate the actual frequency of the light? In other words, the color? Or is the laser light already pulsed, and you want to vary the phase of the pulses mid-stream?
  7. The frequency of the light wave, the color if you wish, will vary during the shift. Once the shift has been completed, the "color" would return to the original value. This is a direct parallel to phase modulation of a radio wave.

    Think of it as two optical fibers passing through the room. You cut one fiber and insert the device. In the next room, the light from the two fibers is compared and the phase difference is noted. The color change is present only while the phase is being shifted. Once the shift is complete, if it equals exactly one wavelength, it would be impossible to detect.
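    The "undetectable after one wavelength" point can be made concrete with a small sketch (the assumption is a comparator that sees phase only modulo one cycle):

```python
import math

def residual_phase(shift_in_wavelengths: float) -> float:
    """Phase difference (radians, in [0, 2*pi)) seen by a detector that
    compares the two beams and measures phase only modulo one cycle."""
    return (2 * math.pi * shift_in_wavelengths) % (2 * math.pi)

print(residual_phase(0.25))  # quarter-wave shift: pi/2, easily detectable
print(residual_phase(1.0))   # full-wave shift: 0.0, looks like no shift at all
```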
  8. enigma


    Uhm... yeah... disregard my previous post. I misread what you were asking for.
  9. What is the frequency of modulation?
  10. My face is red!

    Ten megahertz for starters. (Sorry, I made a mistake in calculating the delay. Let me get some sleep, and I will run it through again.)
    Incidentally, the light source is continuous, not pulsed.
    Last edited: May 19, 2004
  11. Njorl


    30 microseconds?!?!?!

    That is nuts for light.

    You'd need to physically switch light into different delay lines of optical fiber.

    You could use a planar waveguide imaging splitter with an absorption modulator on each output. Hook up a different length of fiber to each output, and only allow light to pass through the channel with the appropriate delay. It would not be true phase modulation, and it would have digitization error. You're not going to get that kind of variable delay electro-optically, or with any single-channel physical device.
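    A back-of-envelope calculation shows why a 30-microsecond delay is extreme for light (the group index n = 1.5 for silica fiber is an assumed round value):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def fiber_length_for_delay(delay_s: float, n: float = 1.5) -> float:
    """Length of fiber (meters) needed to delay light by delay_s,
    assuming the light travels at c/n inside the fiber."""
    return delay_s * C / n

# A 30 microsecond delay needs roughly 6 km of fiber per delay line.
print(round(fiber_length_for_delay(30e-6)))  # 5996
```

    Every distinct delay step in the switched-fiber scheme would need its own kilometers-long spool, which is why nobody builds a continuously variable 30 µs optical delay.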

    Just looking at your requirements (a 30-microsecond delay at 10 megahertz) makes me think you don't really know what you need.

    Last edited: May 19, 2004
  12. I am very very sorry for the error. The delay is not anything like 30 microsecs. I was horrified when I reviewed my message.