C - precision of clock() function in time.h on different OS

scottmso

Limp Gawd
Joined
Apr 15, 2004
Messages
380
Hey all:

I just finished a programming assignment (written in C) for one of my courses. The assignment involves running sorting algorithms and measuring the runtime of each. Precision is important here, as many of these will complete in a fraction of a second since I am sorting fairly small data sets. Therefore, I used the clock() function in time.h, which reports time in units of 1/CLOCKS_PER_SEC seconds — 1,000,000 per second on POSIX systems, i.e., microseconds.

I compiled the program on my OS X box using the built-in gcc compiler. The program ran completely fine, and I received results that showed the time being accurate to millionths of a second.

Next, I compiled and ran the program on my CS department's Linux system using GCC 4.1.2 (a newer version than on my Mac...forgot the exact version on the Mac though). When I ran it, the times came out accurate only to hundredths of a second...although a tick still represented the same amount of time (one millionth of a second). I tried a couple of other Linux-based systems, and got the same result.

What could account for the difference here? Would it be an OS/runtime issue, or could it be the compiler? Maybe different versions of the libraries?

Any advice appreciated. Thanks!

-scott
 
Timer accuracy is OS-dependent.

Even more than that, it's system dependent. Most operating systems provide a facility (often compile-time) to modify how often the system timer fires, which is usually when these things are managed.

I've never used clock(), but the manpage indicates that it keeps track of CPU time for the process, not wallclock time. It doesn't make a ton of sense to update this at a more granular level than processes can be switched, so using the system timer makes sense for this purpose.

You can get better precision with gettimeofday(), which is in POSIX, but the equivalent on Windows is different (I can't remember what it's called...easy enough to find...then you can write a wrapper macro IIRC).
 
If you just want to time a program, there's the "time" command on unix systems. "time ./a.out" will display how much time the program took after termination, broken down into real (wall-clock), user, and sys time.
 
Just out of curiosity what were your results? For completely random arrays in stack space I found Shellsort for less than a few hundred, Quicksort for less than tens of thousands, and Mergesort after that.
 