Exploring “Frame time” measurement – Part 1 – Is Fraps a good tool?
Test Configuration
Test Configuration – Hardware
- Intel Core i7 3770K (overclocked to 4.5GHz); Turbo is on.
- EVGA Z77 FTW motherboard (Intel Z77 chipset, latest beta BIOS, PCIe 3.0; CrossFire/SLI at x16+x16 via PLX bridge chip)
- 8GB Kingston DDR3-1866 RAM (4×2GB, dual-channel at 1866MHz; supplied by Kingston)
- PowerColor Radeon HD 7970 (3GB, overclocked to GHz Edition speeds, 1050/6000MHz)
- Nvidia GTX 680 (2GB, 1006/6008MHz, reference clocks), supplied by Nvidia
- Onboard Realtek Audio
- Two identical 500 GB Seagate Barracuda 7200.10 hard drives, set up identically from the same drive image; one partition for Nvidia GeForce drivers and one for ATI Catalyst drivers
- Kingston 240GB HyperX SSD used for AMD partition, supplied by Kingston.
- Kingston 240GB HyperX 3K SSD used for Nvidia partition, supplied by Kingston
- OWC 240GB Mercury EXTREME Pro 4G used for both AMD and Nvidia, on loan from OWC.
- Thermaltake ToughPower 775 W power supply unit supplied by Thermaltake
- Thermaltake Overseer RX-I full tower case, supplied by Thermaltake
- Sapphire Vapor-X CPU cooler on loan from Sapphire
- Philips DVD SATA writer
- ASUS VG278 27″ 120Hz display, LightBoost-enabled and 3D Vision 2-ready
Test Configuration – Software
- Fraps 3.5.9 full version
- ATI Catalyst 12.11 Beta 11 drivers; highest-quality mip-mapping set in the driver; "use application settings"; surface performance optimizations off. Latest CAPs used.
- Catalyst Control Center used to set power draw to maximum for the HD 7970; clocks, voltage and fan profiles are stock
- EVGA PrecisionX used to set power draw to maximum for Nvidia cards; clocks, voltage and fan profiles are stock
- NVIDIA GeForce 310.70 WHQL drivers; High Quality set in the control panel
- Windows 7 64-bit with the latest updates
- Latest DirectX
- All games are patched to their latest versions.
- vsync is forced off in the control panels.
- Varying AA enabled as noted in games; all in-game settings are specified with 16xAF always applied; 16xAF forced in control panel for Crysis.
- All results show average frame rates and frame times
- Highest quality sound (stereo) used in all games.
- All DX9 titles were run under the DX9 render path, DX10 titles under the DX10 path, and DX11 titles under the DX11 path.
The Benchmarks
DX11
- S.T.A.L.K.E.R.: Call of Pripyat
- Metro 2033
- Far Cry 2
Let’s head to the charts and see the frame time comparison between Fraps results and the Call of Pripyat and Metro 2033 engine results.
Wow. These results are extremely similar, more so than I ever expected. The Fraps results do differ by a tiny amount, but that is because the game engine and Fraps grab their data in completely different ways. Fraps monitors DirectX/OpenGL calls after the fact, and being an external program it adds some latency of its own, however small. Yet through all this, the trend graphs for Fraps and the internal game engines are near identical. It is remarkably accurate, and the results are like mirror images.
Some people may not know the point of these tests, so to be clear I'll spell it out. Most games have no way to report frame times from the engine at all; Fraps can capture them easily at any time, whether playing or benchmarking. The question was whether Fraps was accurate enough to rely on, and whether the spikes meant what they were represented to mean. Having these tests with the game engines produce near-identical trend graphs lends strong credence to Fraps being an acceptable tool for capturing frame times. We also have the reviewer's experience, which aligned with the frame-time data very nicely: he saw the small hitching and stuttering that the trend charts represent.
It's remarkably easy, actually. Fraps and in-game timing both work by timestamping the 'swap buffer' command, which marks the end of a frame. Since the timestamp comes from an internal CPU counter, the only difference is which side of the procedure call it is taken on, so the two would differ by at most a few nanoseconds. Fraps itself would be at most a few kilobytes of code, mostly a stub that forwards calls to the 'real' driver.
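The mechanism Pancake describes reduces to very little code: record a high-resolution timestamp at each swap-buffer call and diff consecutive timestamps. A minimal sketch (in Python rather than the injected C hook Fraps actually uses; the loop body stands in for real rendering):

```python
import time

def frame_times(swap_timestamps):
    """Frame time = gap between consecutive swap-buffer timestamps, in ms."""
    return [(b - a) * 1000.0 for a, b in zip(swap_timestamps, swap_timestamps[1:])]

# Simulate a render loop: a timestamp is taken at each 'swap buffer' call,
# exactly as Fraps (just outside the call) or the engine (just inside it) would.
timestamps = []
for _ in range(5):
    # ... render work would happen here ...
    timestamps.append(time.perf_counter())  # high-resolution CPU counter

print(frame_times(timestamps))  # four tiny frame-time values, in ms
```

Whether the timestamp lands just before or just after the swap call only shifts every sample by the same few nanoseconds, which is why the two trend graphs overlap.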
The question of whether Fraps was "accurate enough to rely on at all" was not answered "at all", because Fraps and the games use the same measure of "frame time": the time between calls to "swap buffer", as Pancake mentioned here.
But the real question is: does it correctly measure all the stuttering we see on screen? The answer is almost certainly no, because after the call to Present() the driver can do quite a lot of work before finally presenting the frame to the display device.
And although that doesn't matter for average frame rates, it does matter when you want to measure "jitter".
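To see why averages hide jitter, consider two hypothetical frame-time traces with the same mean but very different smoothness (illustrative numbers, not measured data):

```python
def mean(xs):
    return sum(xs) / len(xs)

def jitter(ft_ms):
    """A simple jitter metric: mean absolute change between consecutive frame times."""
    return sum(abs(b - a) for a, b in zip(ft_ms, ft_ms[1:])) / (len(ft_ms) - 1)

smooth  = [16.7, 16.7, 16.7, 16.7, 16.7, 16.7]  # steady ~60 fps
stutter = [8.0, 25.4, 8.0, 25.4, 8.0, 25.4]     # same average, visible stutter

print(mean(smooth), mean(stutter))      # both ~16.7 ms: averages look identical
print(jitter(smooth), jitter(stutter))  # 0.0 vs ~17.4 ms: jitter tells them apart
```

An average-FPS chart would score both traces identically, while the second one would look noticeably jerky on screen.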
Sadly, the only way to measure frame time correctly is to use profiling tools from the vendor (Nvidia or AMD), and the game needs to support this "profiling" mode (none of them ships with it on, obviously), or to find some clever way around it by making the game run against a "hacked" D3D library.
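The "hacked D3D library" idea is essentially the proxy pattern: a drop-in replacement that forwards every call to the real library while timestamping the ones you care about. A language-agnostic sketch (Python stands in for the C++ DLL; `RealDevice` and `present` are illustrative names, not actual D3D symbols):

```python
import time

class RealDevice:
    """Stand-in for the real D3D device the game talks to."""
    def present(self):
        pass  # the real driver would queue the frame here

class TimingProxy:
    """Forwards every call to the wrapped device, timestamping present()."""
    def __init__(self, device):
        self._device = device
        self.timestamps = []

    def present(self):
        self._device.present()                        # forward to the 'real' driver
        self.timestamps.append(time.perf_counter())   # log after the call returns

    def __getattr__(self, name):
        return getattr(self._device, name)  # every other call passes straight through

game_device = TimingProxy(RealDevice())
for _ in range(3):
    game_device.present()
print(len(game_device.timestamps))  # 3 swap timestamps captured
```

This is the same stub-that-forwards structure attributed to Fraps above; the "hack" is only in getting the game to load your wrapper instead of the system library.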