Exploring “Frame time” measurement – Part 1 – Is Fraps a good tool?

3 Responses

  1. ocre says:

    wow. These results are extremely similar, more so than I ever thought. Fraps results do differ by a tiny amount, but that is because the game engine and Fraps grab their data in completely different ways. Fraps monitors DX/OpenGL calls after the fact, and as an external program it throws in its own latencies, however small they may be. But through all this, the trend graphs show near-identical trends for Fraps vs. the internal game engines. It’s amazingly accurate, and the results are like mirror images.

    Some people may not know the point of these tests, so to be clear I might add it. Most games have no way to gather frame times from the engine at all. Fraps can do this rather easily at any time while playing or benchmarking. The question was whether Fraps was accurate enough to rely on at all, and whether the spikes meant what they were being represented as. Having these tests with the game engines produce near-identical trend graphs gives high credence to Fraps being an acceptable tool for capturing frame times. We also have the reviewer’s experience, which seemed to align with the frame time data very nicely: he saw the small hitching or stuttering that the trend charts represent.

  2. Pancake says:

    It’s remarkably easy, actually. Fraps and in-game timing both work by timestamping the ‘swap buffer’ command, which marks the end of a frame. Since the timestamp comes from an internal CPU counter, the only difference is which side of the procedure call it is taken on, so the two would differ by at most a few nanoseconds. Fraps itself would be at most a few kilobytes of code, mostly a stub that forwards calls to the ‘real’ driver.
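    For illustration, here is a minimal sketch of such a stub, assuming a Direct3D 9 game and assuming some injection mechanism has already redirected the device’s Present() to the hook. Fraps’s actual internals are not public, and names like HookedPresent and g_realPresent are hypothetical:

        // Hypothetical Fraps-style hook: timestamp each Present() call with the
        // CPU's high-resolution counter, then forward to the real entry point.
        // Thread safety and the hooking machinery itself are omitted.
        #include <windows.h>
        #include <d3d9.h>
        #include <vector>

        using PresentFn = HRESULT(WINAPI*)(IDirect3DDevice9*, const RECT*,
                                           const RECT*, HWND, const RGNDATA*);

        static PresentFn g_realPresent = nullptr;   // the 'real' driver-side Present
        static std::vector<LONGLONG> g_timestamps;  // raw QPC ticks, one per frame

        HRESULT WINAPI HookedPresent(IDirect3DDevice9* dev, const RECT* src,
                                     const RECT* dst, HWND wnd, const RGNDATA* dirty) {
            LARGE_INTEGER now;
            QueryPerformanceCounter(&now);        // internal CPU counter, as above
            g_timestamps.push_back(now.QuadPart); // one timestamp per frame
            return g_realPresent(dev, src, dst, wnd, dirty);
        }

    The per-frame overhead is just a counter read and a store, which is consistent with the Fraps and in-engine traces tracking each other so closely.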

  3. sorcerer_ says:

    The question of “whether fraps was accurate enough to rely on at all” was not answered “at all”, because Fraps and the games used the same measure of “frame time”: the time between calls to “swap buffer”, as Pancake mentioned here.
    But the real question is: does it correctly measure all the stuttering we see on the screen? The answer is almost certainly “no”, because after the call to Present() the driver can do quite a lot of work before finally presenting the frame to the display device.
    And although that doesn’t matter in terms of average frame rates, it does matter when you want to measure “jitter” (see the sketch below).
    Sadly, the only way to measure frame time correctly is to use profiling tools from the vendor (Nvidia or AMD), and the game needs to support this “profiling” mode (obviously, none of them ships with it “on”), or you need some clever workaround, such as making the game run with a “hacked” D3D library.
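    To make the jitter point concrete, here is a small standalone sketch (all numbers invented): two frame-time traces with the same average frame rate, where a simple jitter metric, the mean absolute change between consecutive frame times, separates the smooth trace from the stuttering one even though an average-FPS figure cannot:

        // Two hypothetical frame-time traces with identical averages (16.7 ms,
        // i.e. ~60 fps) but very different smoothness.
        #include <cstdio>
        #include <vector>
        #include <cmath>

        // Mean absolute change between consecutive frame times, in ms.
        static double Jitter(const std::vector<double>& frameMs) {
            double sum = 0.0;
            for (size_t i = 1; i < frameMs.size(); ++i)
                sum += std::fabs(frameMs[i] - frameMs[i - 1]);
            return sum / (frameMs.size() - 1);
        }

        int main() {
            std::vector<double> smooth  = {16.7, 16.6, 16.7, 16.8, 16.7, 16.7};
            std::vector<double> stutter = {10.0, 23.4, 10.1, 23.3, 10.0, 23.4};
            std::printf("smooth  jitter: %.2f ms\n", Jitter(smooth));   // ~0.08 ms
            std::printf("stutter jitter: %.2f ms\n", Jitter(stutter));  // ~13.32 ms
            return 0;
        }

    And, per the caveat above, even this only characterises the cadence of Present() calls on the CPU; work the driver does afterwards can reshape what actually reaches the display.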
