This simulator estimates worst-case latency when presenting, assuming that:
- The CPU takes cpu_time +/- cpu_time_variance to prepare the commands (randomized)
- The GPU takes gpu_time +/- gpu_time_variance to render (randomized)
- VSync is enabled and in FIFO mode
Please note that vblank_interval is 16 ms, not 16.667 ms, which corresponds to a 62.50 Hz monitor instead of 60 Hz.
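For reference, here is a minimal sketch (in Python, not the actual simulator code) of the kind of model described above. The uniform randomization, the simplified swapchain back-pressure rule, and the fact that buffer_count is ignored are all assumptions made for brevity:

```python
import random

def simulate(frames=2000,
             cpu_time=4.0, cpu_time_variance=1.0,    # milliseconds
             gpu_time=10.0, gpu_time_variance=2.0,   # milliseconds
             vblank_interval=16.0,                   # 16 ms, i.e. 62.5 Hz
             swapchain_count=2):
    lags = []            # worst-case per-frame lag (frame start -> scanout), in ms
    present_times = []   # vblank at which each finished frame is displayed
    cpu_free = 0.0       # when the CPU may begin the next frame
    gpu_free = 0.0       # when the GPU finishes its current work

    for i in range(frames):
        # FIFO back-pressure: assume the CPU may run at most swapchain_count
        # frames ahead of the display (simplified acquire rule; buffer_count
        # is not modelled here).
        if i >= swapchain_count:
            cpu_free = max(cpu_free, present_times[i - swapchain_count])

        start = cpu_free
        cpu_end = start + random.uniform(cpu_time - cpu_time_variance,
                                         cpu_time + cpu_time_variance)
        cpu_free = cpu_end

        # The GPU renders once it is idle and the commands were submitted.
        gpu_start = max(cpu_end, gpu_free)
        gpu_free = gpu_start + random.uniform(gpu_time - gpu_time_variance,
                                              gpu_time + gpu_time_variance)

        # VSync/FIFO: the frame is scanned out at the next vblank after the
        # GPU finishes, and never before the previously queued frame.
        vblank = (int(gpu_free // vblank_interval) + 1) * vblank_interval
        if present_times:
            vblank = max(vblank, present_times[-1] + vblank_interval)
        present_times.append(vblank)
        lags.append(vblank - start)

    return lags, present_times
```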
The main purpose of this simulator is to understand the effects of buffer_count & swapchain_count and how they behave under different conditions.
Actual lag can be reduced if the CPU waits before preparing commands instead of starting as soon as possible. For more info, see my reply on Ask Ubuntu and
Controller to Display Latency in Call of Duty.
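As a rough illustration of that idea, the hypothetical helper below (not part of the simulator) computes how late the CPU could start and still expect to hit the next reachable vblank. The `margin` parameter and the use of average cpu_time/gpu_time as predictions are assumptions; with large variances, starting late becomes a gamble between lower lag and missed vblanks:

```python
def delayed_start(cpu_free, gpu_free, cpu_time, gpu_time,
                  vblank_interval, margin=1.0):
    """Return the time at which the CPU should start preparing commands so the
    frame is expected to finish shortly before the next reachable vblank.
    cpu_time / gpu_time are treated as predictions (e.g. recent averages)."""
    # Earliest the frame could possibly finish if we started right now.
    earliest_finish = max(cpu_free + cpu_time, gpu_free) + gpu_time
    # The vblank we are aiming for.
    target_vblank = (int(earliest_finish // vblank_interval) + 1) * vblank_interval
    # Start as late as possible while still (expectedly) hitting that vblank.
    latest_start = target_vblank - margin - gpu_time - cpu_time
    return max(cpu_free, latest_start)
```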
It is also not meant to be 100% accurate. GPU drivers can use tricks (often marketed as "Anti Lag" or "Low Latency") to forcefully induce CPU sleeping, or may have implementation quirks that the simulator does not account for.
Observe that in general:
- Double Buffer (swapchain_count = 2) reduces latency unless we can't hit VBLANK.
- Triple Buffer (swapchain_count = 3) only reduces latency if we frequently miss VBLANK.
- Higher buffer_count can increase framerate considerably if gpu_time is much higher than cpu_time, but latency suffers a lot.
- Higher buffer_count dampens the stutter caused by very large cpu_time_variance, so fewer VBLANKs are missed. This reduces the Lag Std. Deviation.
- Higher swapchain_count has the same effect when gpu_time_variance is large, also resulting in fewer missed VBLANKs.
- Don't just look at average framerate & lag. The min 1% framerate and the lag variance are very important (see the sketch after this list for one way to compute them). Plot them in a graph if you have to.
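One possible way to compute those metrics from the simulation sketch above; the "min 1% framerate" is taken here as the average framerate over the worst 1% of frames, which is one common definition:

```python
import statistics

def summarize(lags, present_times):
    # Frame times are the intervals between consecutive presented vblanks.
    frame_times = [b - a for a, b in zip(present_times, present_times[1:])]
    worst_frames = sorted(frame_times, reverse=True)
    n = max(1, len(worst_frames) // 100)   # worst 1% of frames
    return {
        "avg_fps": 1000.0 / statistics.fmean(frame_times),
        "min_1pct_fps": 1000.0 / statistics.fmean(worst_frames[:n]),
        "avg_lag_ms": statistics.fmean(lags),
        "lag_stddev_ms": statistics.pstdev(lags),
    }

# Example usage with the sketch above:
lags, presents = simulate(swapchain_count=3)
print(summarize(lags, presents))
```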
The source code can be found on GitHub, where you can also find more info.