
Time Delay in C++.

  1. Jan 21, 2006 #1
    Hello. I'm wondering how to make a time delay of a few seconds using C++. The simplest way that comes to mind is just to run a loop a few million times, but then the delay would change from computer to computer.
  2. jcsd
  3. Jan 21, 2006 #2



    You could simply use the ordinary C time functions. See "time" and "clock" in the "ctime" header.
  4. Jan 21, 2006 #3
    Wait approximately 5 seconds with something like this:

    for (time_t t = time() + 5; time() < t; ) {}

    If you need more accuracy, use clock_t and clock() instead, along with the CLK_TCK value. If you are programming for Windows, use Sleep(5000) to let the process sleep for 5000 milliseconds without wasting all those CPU cycles. For other OSes, check the documentation for a similar function.
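    For reference, a minimal sketch of that clock()-based wait, using CLOCKS_PER_SEC (the standard name for the older CLK_TCK); the function name busy_wait_ms is just for illustration:

    ```cpp
    #include <ctime>

    // Busy-wait for roughly `ms` milliseconds using clock().
    // This keeps the CPU spinning the whole time, so prefer the
    // OS sleep call when one is available.
    void busy_wait_ms(long ms)
    {
        // CLOCKS_PER_SEC / 1000 = ticks per millisecond (assumes
        // CLOCKS_PER_SEC >= 1000, which holds on common systems)
        std::clock_t goal = std::clock() + ms * (CLOCKS_PER_SEC / 1000);
        while (std::clock() < goal)
            ; // spin until the tick count passes the goal
    }
    ```

    One caveat: on POSIX systems clock() measures CPU time rather than wall time, but in a busy loop the two track each other closely.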
  5. Jan 21, 2006 #4



    sleep(<time in ms>) is the standard way of doing it.
  6. Jan 21, 2006 #5
    Are those functions in C or C++? If the latter, what library should I include?
  7. Jan 21, 2006 #6



    Under Unix, sleep is in unistd.h. I'm not sure about Windows.
  8. Jan 21, 2006 #7
    No such header on Windows. There is a time.h on Windows, though, but no sleep function. Is there a way to view the source file?
  9. Jan 21, 2006 #8

    error C2660: 'time' : function does not take 0 parameters
  10. Jan 21, 2006 #9
    I found this piece of code on the net:

    time_t start_time, cur_time;
    time(&start_time);
    do { time(&cur_time); } while ((cur_time - start_time) < n);

    where n is the number of seconds. When I run my programme, it first does the delay, then runs the rest.

    I mean that the test programme should cout my first name, delay 3 seconds, and then output my last name, but instead it delays 3 seconds and then outputs my full name. Weird.
  11. Jan 21, 2006 #10



    Ugh, I did a Google search and it said windows.h
  12. Jan 21, 2006 #11
    void sleep(clock_t wait)
    {
        clock_t goal;
        goal = wait + clock();
        while (goal > clock())
            ;
    }

    Found the sleep function on the net, but still, the programme does the delay, then outputs the full name.

  13. Jan 21, 2006 #12
    Oh right, that should have been time(0) instead of just time().

    But are you using Unix or Windows? Because #include <windows.h> with Sleep(milliseconds) on Windows, or #include <unistd.h> with sleep(seconds) on Unix, is simpler and more reliable. The sleep function you show, which uses clock_t instead of time_t, counts not in seconds or milliseconds but in system clock ticks. These vary from system to system, so the pause duration is not consistent across systems (if consistency matters).

    Your first name may be in cout's output buffer but it won't show until the buffer is flushed. Try using cout.flush() to force it out before the pause.
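    A sketch of what that fix looks like, with placeholder names in place of the poster's real output:

    ```cpp
    #include <iostream>
    #include <ctime>

    int main()
    {
        std::cout << "First";
        std::cout.flush();  // push the buffered text to the screen before the pause

        // plain 3-second busy-wait pause, as in the snippets above
        for (std::time_t end = std::time(0) + 3; std::time(0) < end; )
            ;

        std::cout << " Last" << std::endl;  // endl flushes as well
        return 0;
    }
    ```

    Without the flush() call, both words tend to appear together after the delay, which is exactly the symptom described.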
  14. Jan 21, 2006 #13
    I have both Windows and Linux, but I usually program on Windows since I'm new to Linux. Flushing the buffer did take care of the problem. Thanks, guys!
  15. Jan 21, 2006 #14



    Sure it is -- if you remember to convert using CLOCKS_PER_SEC! (I think that's how the macro is spelled)
  16. Jan 23, 2006 #15

    jim mcnamara


    Edit: the C standard does not require any fine granularity for clock(); an implementation is allowed to update it as rarely as once per second, even if CLOCKS_PER_SEC is 1000000. So if you ask for 50 ms, depending on the C/C++ implementation, you may still get a full second.

    Sleep(DWORD milliseconds) is the Windows API call you want, in windows.h. Consider using it.

    Implementing a roll-your-own time delay means the delay will not port, and this may mean that going from Windows XP on a PIII 500 MHz to Windows XP on a P4 3.2 GHz causes the code to misbehave.

    Plus, turning on optimization may let the compiler optimize away these kinds of homegrown loops entirely.

    Use the API: all time-delay calls are hardware/OS dependent anyway, so you are not losing portability. You are probably gaining some.
    Last edited: Jan 23, 2006
  17. Jan 23, 2006 #16



    And if you really want, demarcate a platform-dependent block of your source code, and define your own "my_sleep" which is merely a wrapper for whatever the best method of sleeping is on a given system.

    (So, when porting, the only code that ought to need changing is the code in this block)
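    A sketch of such a wrapper, assuming Sleep() on Windows and sleep()/usleep() on POSIX (the name my_sleep comes from the suggestion above):

    ```cpp
    // The platform-dependent block lives in one place; everything
    // else calls my_sleep() and never touches the OS headers.
    #ifdef _WIN32
        #include <windows.h>

        void my_sleep(unsigned int ms)
        {
            Sleep(ms);                   // Win32 Sleep() takes milliseconds
        }
    #else
        #include <unistd.h>

        void my_sleep(unsigned int ms)
        {
            sleep(ms / 1000);            // POSIX sleep() takes whole seconds
            usleep((ms % 1000) * 1000);  // remainder in microseconds (< 1,000,000)
        }
    #endif
    ```

    When porting, only this block should need changing; callers stay the same.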
  18. Mar 8, 2007 #17
    Sleep(milliseconds) in C++

    You can simply use the Sleep(milliseconds) function, which requires windows.h to be included...
    I'm a little late, but just in case you still check it a year after the fact :)


    #include <windows.h>
    // this will sleep for 1 second (1000 milliseconds)
    // It of course needs to be put into some function, but I'm a smidge too lazy :)
    Sleep(1000);
  19. Mar 8, 2007 #18

    D H


    The sleep function in Unix is in seconds. This might create a problem if the Windows program is ported to Unix. A better solution might be to use the Posix usleep function, which takes integer microseconds as an argument (but the time needs to be less than 1 million microseconds).
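    To make the units concrete, a small sketch assuming a POSIX system (the function name pause_demo is just for illustration):

    ```cpp
    #include <unistd.h>   // POSIX sleep() and usleep()

    // Pauses for about 2.5 seconds in total.
    void pause_demo()
    {
        sleep(2);         // argument is SECONDS: pauses about two seconds
        usleep(500000);   // argument is MICROseconds; keep it under 1,000,000
        // On Windows, Sleep(2) would pause for only two milliseconds,
        // which is exactly the porting hazard described above.
    }
    ```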

    Never, ever do something like this. Sending the CPU into a busy loop is a very, very bad thing to do.
  20. Mar 8, 2007 #19



    Let's not resurrect ancient threads. Thanks.

    - Warren