Assembly language time delay

In summary, using MOV and LOOP instructions you can create a delay loop; for example, a 2 ms delay on a 100-MHz Pentium requires loading CX with 33,333.
  • #1
abdo375
I need to write a program that will create a constant time delay in x86 assembly. Can anybody help?
 
  • #2
There are at least three ways to do this, depending on just what you are trying to do.

Loop cycle count - high resolution, but CPU dependent.

Hardware clock byte fetch - not CPU dependent; with or without interrupt overlay, resolution about 1/100th of a second.

BIOS timer interrupt - low resolution, about 55 ms per tick (18.2 Hz) IIRC.

Any ideas on which one is more suitable?
 
  • #3
Hardware clock byte fetch and BIOS timer interrupt would be perfect. I'm trying to write assembly code that will initialize an LCD, so I need a delay of 200 ms.
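For reference, here is a minimal real-mode sketch of the BIOS-timer option (assumptions on my part: MASM-style syntax, the standard ~54.9 ms tick, and a 4-tick threshold, since 4 x 54.9 ms ≈ 220 ms is the closest the tick gets to 200 ms):

        MOV     AH, 0           ; BIOS time-of-day service 00h: read tick count
        INT     1AH             ; returns CX:DX = ticks since midnight
        MOV     BX, DX          ; save the low word of the starting count
WAITLP: MOV     AH, 0
        INT     1AH             ; re-read the current tick count
        SUB     DX, BX          ; ticks elapsed (low word; ignores the
                                ;   once-a-day midnight rollover)
        CMP     DX, 4           ; 4 ticks x ~54.9 ms ≈ 220 ms
        JB      WAITLP          ; keep polling until enough ticks have passed

If 220 ms is too coarse, the tick source (the 8253/8254 PIT) can be reprogrammed, but that is more involved.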
 
  • #4
BTW, is there a link on how to use any of them?
 
  • #5
Using the hardware timers and interrupts should be described in your x86 programmer's manual and the datasheet for the chip. Is it not there?
 
  • #6
Actually, I'm doing it on a PC.
 
  • #7
abdo375 said:
Actually, I'm doing it on a PC.

Oh, that's different. You're not going to get consistent timing on a PC running a non-real-time OS like Windows. The jitter is awful due to the scheduling of tasks and interrupts from all over the place. If you want consistent timing with a PC, you'll need to build external hardware that generates the real-time waveforms and timings, and then just do overall control and monitoring from the PC, jittery responses and all.

You can get real-time operating systems that you can run on the PC, and they'll probably have plenty of documentation on how to get consistent timings and execution.

You sure you want to do it on a Windows PC?
 
  • #8
berkeman, there seems to be a way, but it's not working for me: the 5th bit (bit 4) of port 61h toggles every ~15 µs, which can be used for a CPU-clock-independent time delay.
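For what it's worth, a rough sketch of that technique under DOS (MASM-style syntax assumed; the count of 13,300 is just 200 ms / 15 µs, so treat it as approximate):

        MOV     CX, 13300       ; ~13,300 toggles x ~15 us ≈ 200 ms
        IN      AL, 61H         ; read system control port B
        AND     AL, 10H         ; isolate the refresh-detect bit (bit 4)
        MOV     AH, AL          ; remember its current state
POLL:   IN      AL, 61H         ; re-read the port
        AND     AL, 10H         ; current state of the bit
        CMP     AL, AH          ; has it toggled yet?
        JE      POLL            ; no - keep spinning
        MOV     AH, AL          ; yes - record the new state
        LOOP    POLL            ; count one toggle; repeat until CX = 0

One possible reason it isn't working: under Windows, user-mode programs are not allowed direct port I/O, so the IN instruction may fault or be virtualized; under plain DOS it should work.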

Will it work on a non-multitasking operating system like DOS, for example?
 
  • #9
DOS is a good idea, but I think there are still interrupts from system functions (like the keyboard and mouse, etc.) that cause jitter. There are real-time OSes available, but sorry, I'm not that familiar with them. I mostly use µCs (microcontrollers), where I control the whole shebang.

I checked wikipedia.org, and they have a pretty good entry on real-time OS:

http://en.wikipedia.org/wiki/Real_time_operating_system
 
  • #10
Thanks, berkeman.
 
  • #11
abdo375 said:
I need to write a program that will create a constant time delay in x86 assembly. Can anybody help?

Typical Pentium software delay loops can be written using MOV and LOOP instructions.
For example, the following instruction sequence can be used as a delay loop:

        MOV     CX, count
DELAY:  LOOP    DELAY

The initial loop counter value "count" can be calculated from the cycles required to execute the following Pentium instructions:

MOV reg, imm - 1 cycle
LOOP label - 5 or 6 cycles

Note that the Pentium LOOP instruction has two different execution times. LOOP requires six cycles when the Pentium branches, which it does when CX is not equal to zero after auto-decrementing CX by 1. When CX = 0 after the auto-decrement, the Pentium does not branch but goes on to the next instruction, and this requires five cycles.
This means that the DELAY loop will require six cycles (count - 1) times, and the last iteration will take five cycles.
For a 100-MHz Pentium clock, each cycle is 10 ns. For a 2 ms delay, total cycles = 2 ms / 10 ns = 200,000. The loop requires six cycles (count - 1) times while CX ≠ 0, plus five cycles for the final iteration, when no branch is taken (CX = 0). Thus, total cycles including the MOV = 1 + 6 × (count - 1) + 5 = 200,000. Hence count = 200,000 / 6 ≈ 33,333. Therefore, CX must be loaded with 33,333 (decimal).
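Putting that together, a sketch of the complete 2 ms delay (the cycle counts assume a 100-MHz Pentium with no interrupts, cache misses, or instruction-pairing effects, so the delay is approximate):

        MOV     CX, 33333       ; 1 cycle: load the loop count
DELAY:  LOOP    DELAY           ; 6 cycles per taken branch, 5 on the last;
                                ;   total = 1 + 6 x 33332 + 5 = 199,998 cycles ≈ 2 ms

Since CX is only 16 bits, a longer delay such as the 200 ms mentioned above can be built (again as a sketch under the same assumptions) by nesting the 2 ms loop inside an outer counter:

        MOV     BX, 100         ; 100 x 2 ms = 200 ms
OUTER:  MOV     CX, 33333       ; reload the inner count on every pass
INNER:  LOOP    INNER           ; ~2 ms busy-wait
        DEC     BX              ; one 2 ms pass finished
        JNZ     OUTER           ; repeat until all 100 passes are done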
 

1. What is an assembly language time delay?

An assembly language time delay is a technique used in programming to pause the execution of a program for a specific amount of time. This can be useful for controlling the timing of events or for creating animations.

2. How is a time delay implemented in assembly language?

In assembly language, a time delay can be implemented using a loop that repeats a certain number of instructions. The number of repetitions determines the length of the delay. Alternatively, some processors have specific instructions for creating time delays.
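For instance, a minimal x86 sketch of the loop technique might look like this (the count is purely hypothetical, and the resulting delay depends entirely on the processor's speed):

        MOV     CX, 50000       ; hypothetical repetition count
SPIN:   NOP                     ; burn one iteration doing nothing
        LOOP    SPIN            ; decrement CX; repeat until it reaches zero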

3. How accurate is an assembly language time delay?

The accuracy of an assembly language time delay depends on the speed of the processor and the number of instructions or repetitions used in the delay. Generally, it can be accurate to within a few milliseconds.

4. Are there any drawbacks to using assembly language time delays?

One potential drawback of using assembly language time delays is that they can be processor-specific, meaning they may not work on all types of processors. Additionally, relying too heavily on time delays can lead to inefficient code and may not be the best solution for all timing needs.

5. Can an assembly language time delay be interrupted?

Yes, an assembly language time delay can be interrupted by other processes or interrupts. This can cause the delay to be inaccurate or may even cause the program to crash. It is important to carefully consider the use of time delays in a program and ensure they will not be interrupted by other processes.
