Assembly language time delay

  1. I need to write a program that will create a constant time delay in x86 assembly, can anybody help?
  2. jcsd
  3. NoTime

    Science Advisor
    Homework Helper

    There are at least three ways to do this, depending on just what you are trying to do.
    Loop cycle count - high resolution, but CPU dependent.

    The other two are not CPU dependent:
    Hardware clock byte fetch - with or without interrupt overlay, resolution about 1/100th of a second.

    BIOS timer interrupt - low resolution, about 55 ms per tick (18.2 ticks per second) IIRC.

    Any ideas on which one is more suitable?
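    For the BIOS timer route, one common approach is to poll the tick count that the INT 08h handler keeps at 0040h:006Ch in the BIOS data area; it increments about 18.2 times per second, so each tick is roughly 55 ms. The following is a rough real-mode DOS sketch in MASM-style syntax, added as an illustration and not taken from the thread:

```asm
; Wait roughly 220 ms by counting 4 BIOS timer ticks (~55 ms each).
; Assumes real-mode DOS; the tick count lives at 0040h:006Ch.
        MOV     AX, 0040h
        MOV     ES, AX
        MOV     CX, 4               ; number of ticks to wait
        MOV     BX, ES:[006Ch]      ; last observed tick count (low word)
TICK:   MOV     AX, ES:[006Ch]
        CMP     AX, BX
        JE      TICK                ; spin until the count changes
        MOV     BX, AX
        LOOP    TICK                ; repeat for the remaining ticks
```

    Counting changes rather than comparing against an absolute target sidesteps the midnight rollover of the tick count; note that the ~55 ms resolution means a 200 ms target can only be hit approximately.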
  4. Hardware clock byte fetch or the BIOS timer interrupt would be perfect. I'm trying to write assembly code to initialize an LCD, so I need a delay of 200 ms.
  5. By the way, is there a link on how to use either of them?
  6. berkeman

    Staff: Mentor

    Using the hardware timers and interrupts should be described in your x86 programmer's manual and the datasheet for the chip. Is it not there?
  7. actually I'm doing it on a PC.
  8. berkeman


    Oh, that's different. You're not going to get consistent timing on a PC running a non-real-time OS like Windows. The jitter is awful, due to task scheduling and interrupts from all over the place. If you want consistent timing from a PC, you'll need to build external hardware that generates the real-time waveforms and timings, and then just do overall control and monitoring from the PC despite its jittery responses.

    You can get real-time operating systems that you can run on the PC, and they'll probably have plenty of documentation on how to get consistent timings and execution.

    You sure you want to do it on a Windows PC?
  9. berkeman, there seems to be a way: bit 4 of port 61h (the 5th bit) toggles roughly every 15 µs, which can be used for a CPU-clock-independent time delay, but it's not working for me.

    Will it work on a non-multitasking operating system like DOS, for example?
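    For reference, the technique mentioned here reads bit 4 of port 61h (the DRAM refresh-request toggle, which flips roughly every 15 µs on classic PC hardware) and counts transitions. A hedged real-mode sketch, assuming MASM-style syntax and that the toggle behaves as described (it does not on all chipsets or emulators):

```asm
; Busy-wait roughly 200 ms by counting refresh toggles on port 61h bit 4.
; ~15 us per toggle -> about 13,300 toggles for 200 ms.
        MOV     CX, 13300
        IN      AL, 61h
        AND     AL, 10h             ; isolate bit 4
        MOV     AH, AL              ; remember its last state
POLL:   IN      AL, 61h
        AND     AL, 10h
        CMP     AL, AH
        JE      POLL                ; spin until the bit flips
        MOV     AH, AL
        LOOP    POLL                ; repeat for the remaining toggles
```

    If the loop never exits, the refresh toggle is probably not visible on that machine or emulator, which would explain why the method "is not working".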
  10. berkeman


    DOS is a good idea, but I think there are still interrupts from system functions (keyboard, mouse, etc.) that cause jitter. There are real-time OSes available, but sorry, I'm not that familiar with them. I mostly use microcontrollers, where I control the whole shebang.

    I checked, and they have a pretty good entry on real-time OSes.
  11. thanks, berkeman.
  12. Typical Pentium software delay loops can be written using the MOV and LOOP instructions.
    For example, the following instruction sequence forms a delay loop:

        MOV CX, count
    DELAY: LOOP DELAY

    The initial loop counter value "count" can be calculated from the cycles required to execute the instructions involved: MOV reg, imm takes 1 cycle, and LOOP takes 5 or 6 cycles.
    Note that the Pentium LOOP instruction has two different execution times. LOOP requires six cycles when the Pentium branches, i.e. when CX is not zero after autodecrementing CX by 1. When CX = 0 after the autodecrement, the Pentium does not branch but falls through to the next instruction, which requires five cycles.
    This means that the DELAY loop requires six cycles for each of the first (count - 1) iterations, and the last iteration takes five cycles.
    For a 100-MHz Pentium clock, each cycle is 10 ns, so a 2 ms delay needs 2 ms / 10 ns = 200,000 cycles. The loop requires six cycles per iteration while CX ≠ 0, and five cycles when no branch is taken (CX = 0). Thus, total cycles including the MOV = 1 + 6 × (count - 1) + 5 = 200,000. Hence count ≈ 33,333, so CX must be loaded with 33,333 (decimal).
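    The arithmetic above can be checked with a few lines of Python (a sketch added for verification; the variable names are illustrative, not from the thread):

```python
# Cycle budget for a 2 ms delay on a 100 MHz Pentium (10 ns per cycle).
total_cycles = round(2e-3 / 10e-9)   # 200,000 cycles

# MOV reg,imm = 1 cycle; the final LOOP (fall-through) = 5 cycles;
# each earlier LOOP iteration (branch taken) = 6 cycles:
#     1 + 6 * (count - 1) + 5 = total_cycles
count = round((total_cycles - 1 - 5) / 6 + 1)
print(count)  # 33333
```

    The exact solution is 33,333.33, so loading CX with 33,333 undershoots the 2 ms target by a couple of cycles.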