Why are .NET timers limited to 15 ms resolution?


.Net Problem Overview

Note that I'm asking about something that will call a callback function more often than once every 15 ms using something like System.Threading.Timer. I'm not asking about how to accurately time a piece of code using something like System.Diagnostics.Stopwatch or even QueryPerformanceCounter.

Also, I've read the related questions:



Neither supplies a useful answer to my question.

In addition, the recommended MSDN article, Implement a Continuously Updating, High-Resolution Time Provider for Windows, is about timing things rather than providing a continuous stream of ticks.

With that said. . .

There's a whole lot of bad information out there about the .NET timer objects. For example, System.Timers.Timer is billed as "a high performance timer optimized for server applications," and System.Threading.Timer is somehow considered a second-class citizen. The conventional wisdom is that System.Threading.Timer is a wrapper around Windows Timer Queue Timers and that System.Timers.Timer is something else entirely.

The reality is much different. System.Timers.Timer is just a thin component wrapper around System.Threading.Timer (use Reflector or ILDASM to peek inside System.Timers.Timer and you'll see the reference to System.Threading.Timer), plus some code that provides automatic thread synchronization so you don't have to do it yourself.

System.Threading.Timer, as it turns out, is not a wrapper for the Timer Queue Timers. At least not in the 2.0 runtime, which was used from .NET 2.0 through .NET 3.5. A few minutes with the Shared Source CLI shows that the runtime implements its own timer queue that is similar to the Timer Queue Timers, but never actually calls the Win32 functions.

It appears that the .NET 4.0 runtime also implements its own timer queue. My test program (see below) provides similar results under .NET 4.0 as it does under .NET 3.5. I've created my own managed wrapper for the Timer Queue Timers and proved that I can get 1 ms resolution (with quite good accuracy), so I consider it unlikely that I'm reading the CLI source wrong.
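For reference, the Win32 Timer Queue API that such a managed wrapper sits on top of looks roughly like this. This is a minimal, Windows-only C sketch of CreateTimerQueueTimer (not the author's actual wrapper); the tick count it reports still depends on the system timer resolution discussed in the answers below.

```c
#include <windows.h>
#include <stdio.h>

// Callback invoked on a timer-queue worker thread for each tick.
static VOID CALLBACK TickCallback(PVOID param, BOOLEAN timerOrWaitFired)
{
    (void)timerOrWaitFired;
    InterlockedIncrement((volatile LONG *)param);
}

int main(void)
{
    volatile LONG ticks = 0;
    HANDLE timer = NULL;

    // Request a 1 ms period (due immediately, then every 1 ms).
    if (!CreateTimerQueueTimer(&timer, NULL, TickCallback,
                               (PVOID)&ticks, 0, 1, WT_EXECUTEDEFAULT))
    {
        fprintf(stderr, "CreateTimerQueueTimer failed: %lu\n", GetLastError());
        return 1;
    }

    Sleep(1000);   // let the timer run for about one second

    // Cancel the timer; INVALID_HANDLE_VALUE blocks until any
    // in-flight callback has finished.
    DeleteTimerQueueTimer(NULL, timer, INVALID_HANDLE_VALUE);

    printf("%ld ticks in ~1000 ms\n", ticks);
    return 0;
}
```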

I have two questions:

First, what causes the runtime's implementation of the timer queue to be so slow? I can't get better than 15 ms resolution, and accuracy seems to be in the range of -1 to +30 ms. That is, if I ask for 24 ms, I'll get ticks anywhere from 23 to 54 ms apart. I suppose I could spend some more time with the CLI source to track down the answer, but thought somebody here might know.

Second, and I realize that this is harder to answer, why not use the Timer Queue Timers? I realize that .NET 1.x had to run on Win9x, which didn't have those APIs, but they've existed since Windows 2000, which if I remember correctly was the minimum requirement for .NET 2.0. Is it because the CLI had to run on non-Windows boxes?

My timers test program:

using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Threading;

namespace TimerTest
{
    class Program
    {
        const int TickFrequency = 5;
        const int TestDuration = 15000;   // 15 seconds

        static void Main(string[] args)
        {
            // Create a list to hold the tick times
            // The list is pre-allocated to prevent list resizing
            // from slowing down the test.
            List<double> tickTimes = new List<double>(2 * TestDuration / TickFrequency);

            // Start a stopwatch so we can keep track of how long this takes.
            Stopwatch Elapsed = Stopwatch.StartNew();

            // Create a timer that saves the elapsed time at each tick
            Timer ticker = new Timer((s) =>
                {
                    tickTimes.Add(Elapsed.ElapsedMilliseconds);
                }, null, 0, TickFrequency);

            // Wait for the test to complete
            Thread.Sleep(TestDuration);

            // Destroy the timer and stop the stopwatch
            ticker.Dispose();
            Elapsed.Stop();

            // Now let's analyze the results
            Console.WriteLine("{0:N0} ticks in {1:N0} milliseconds", tickTimes.Count, Elapsed.ElapsedMilliseconds);
            Console.WriteLine("Average tick frequency = {0:N2} ms", (double)Elapsed.ElapsedMilliseconds / tickTimes.Count);

            // Compute min and max deviation from requested frequency
            double minDiff = double.MaxValue;
            double maxDiff = double.MinValue;
            for (int i = 1; i < tickTimes.Count; ++i)
            {
                double diff = (tickTimes[i] - tickTimes[i - 1]) - TickFrequency;
                minDiff = Math.Min(diff, minDiff);
                maxDiff = Math.Max(diff, maxDiff);
            }

            Console.WriteLine("min diff = {0:N4} ms", minDiff);
            Console.WriteLine("max diff = {0:N4} ms", maxDiff);

            Console.WriteLine("Test complete.  Press Enter.");
            Console.ReadLine();
        }
    }
}
.Net Solutions

Solution 1 - .Net

Perhaps the document linked here explains it a bit. It's kinda dry so I only browsed it quickly :)

Quoting the intro:

> The system timer resolution determines how frequently Windows performs two main actions:
>
> - Update the timer tick count if a full tick has elapsed.
> - Check whether a scheduled timer object has expired.
>
> A timer tick is a notion of elapsed time that Windows uses to track the time of day and thread quantum times. By default, the clock interrupt and timer tick are the same, but Windows or an application can change the clock interrupt period.
>
> The default timer resolution on Windows 7 is 15.6 milliseconds (ms). Some applications reduce this to 1 ms, which reduces the battery run time on mobile systems by as much as 25 percent.

Originally from: Timers, Timer Resolution, and Development of Efficient Code (docx).

Solution 2 - .Net

The timer resolution is given by the system heartbeat. This typically defaults to 64 beats/s, which is 15.625 ms. However, there are ways to modify these system-wide settings to achieve timer resolutions down to 1 ms, or even 0.5 ms on newer platforms:

1. Going for 1 ms resolution by means of the multimedia timer interface:

The multimedia timer interface can provide resolution down to 1 ms. See About Multimedia Timers (MSDN), Obtaining and Setting Timer Resolution (MSDN), and this answer for more details about timeBeginPeriod. Note: Don't forget to call timeEndPeriod to switch back to the default timer resolution when done.

How to do:

#define TARGET_RESOLUTION 1         // 1-millisecond target resolution

TIMECAPS tc;
UINT     wTimerRes;

if (timeGetDevCaps(&tc, sizeof(TIMECAPS)) != TIMERR_NOERROR)
{
    // Error; application can't continue.
}

// Clamp the target to the range the hardware supports, then request it.
wTimerRes = min(max(tc.wPeriodMin, TARGET_RESOLUTION), tc.wPeriodMax);
timeBeginPeriod(wTimerRes);

//       do your stuff here at approx. 1 ms timer resolution

timeEndPeriod(wTimerRes);


Note: This procedure is available to other processes as well, and the obtained resolution applies system-wide. The highest resolution requested by any process will be active; mind the consequences.

2. Going to 0.5 ms resolution:

You may obtain 0.5 ms resolution by means of the hidden API NtSetTimerResolution(). NtSetTimerResolution is exported by the native Windows NT library NTDLL.DLL. See How to set timer resolution to 0.5ms ? on MSDN. Nevertheless, the true achievable resolution is determined by the underlying hardware; modern hardware does support 0.5 ms resolution. Even more details can be found in Inside Windows NT High Resolution Timers. The supported resolutions can be obtained by a call to NtQueryTimerResolution().

How to do:


// after loading NtSetTimerResolution from ntdll.dll:

#define STATUS_SUCCESS                  ((NTSTATUS)0x00000000L)
#define STATUS_TIMER_RESOLUTION_NOT_SET ((NTSTATUS)0xC0000245L)

// The requested resolution in 100 ns units:
ULONG DesiredResolution = 5000;
// Note: The supported resolutions can be obtained by a call to NtQueryTimerResolution()

ULONG CurrentResolution = 0;

// 1. Requesting a higher resolution
// Note: This call is similar to timeBeginPeriod.
// However, it takes the resolution in 100 ns units.
if (NtSetTimerResolution(DesiredResolution, TRUE, &CurrentResolution) != STATUS_SUCCESS) {
    // The call has failed
}

printf("CurrentResolution [100 ns units]: %d\n", CurrentResolution);
// this will show 5000 on more modern platforms (0.5 ms!)

//       do your stuff here at 0.5 ms timer resolution

// 2. Releasing the requested resolution
// Note: This call is similar to timeEndPeriod
switch (NtSetTimerResolution(DesiredResolution, FALSE, &CurrentResolution)) {
    case STATUS_SUCCESS:
        printf("The current resolution has returned to %d [100 ns units]\n", CurrentResolution);
        break;
    case STATUS_TIMER_RESOLUTION_NOT_SET:
        printf("The requested resolution was not set\n");
        // the resolution can only return to a previous value by means of FALSE
        // when the current resolution was set by this application
        break;
    default:
        // The call has failed
        break;
}

Note: The functionality of NtSetTimerResolution is basically mapped to the functions timeBeginPeriod and timeEndPeriod by using the bool value Set (see Inside Windows NT High Resolution Timers for more details about the scheme and all its implications). However, the multimedia suite limits the granularity to milliseconds, while NtSetTimerResolution allows setting sub-millisecond values.

Solution 3 - .Net

All the replies here are about the system timer resolution, but .NET timers don't respect it. As the author noticed himself:

> that the runtime implements its own timer queue that is similar to the Timer Queue Timers, but never actually calls the Win32 functions.

And as Jan pointed out in a comment.

So the answers above are good background, but they don't directly apply to .NET timers and can therefore mislead people :(

The short answer to both of the author's questions is: by design. Why did they decide to go this way? Concern about whole-system performance? Who knows...
To avoid duplication, see more on both questions (and ways to implement precise timers in .NET) in Jan's related topic.


All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

| Content Type | Original Author | Original Content on Stackoverflow |
| --- | --- | --- |
| Question | Jim Mischel | View Question on Stackoverflow |
| Solution 1 - .Net | Arnold Spence | View Answer on Stackoverflow |
| Solution 2 - .Net | Arno | View Answer on Stackoverflow |
| Solution 3 - .Net | Kirsan | View Answer on Stackoverflow |