SLI vs. CrossFire, Part 1 – mid-range multi-GPU scaling & value
Crysis Warhead
Crysis Warhead is a science fiction first-person shooter developed by the Hungarian studio Crytek and published by Electronic Arts. It is a stand-alone expansion to Crysis that was released in 2008, and it is better optimized than the original Crysis, looking just as good while requiring fewer hardware resources to render it.
We test first at 1920×1200 with 2xAA/16xAF with maxed-out in-game “Enthusiast” (very high) settings:
And now the same settings at 1680×1050:
The GTX 560 is slightly faster than the HD 6870, and as in Crysis, we see all of our cards scaling well. The GTS 450 gets a nice performance boost in SLI but is still too weak to manage Warhead. Even GTX 460 SLI manages to beat the GTX 580, and GTX 560 Ti SLI is much faster still, although it is surpassed by HD 5870 CrossFire.
I’m not 100% certain, but to analyze microstuttering, place a check in the box next to “Frametimes” in Fraps. Then, when you press the hotkey, it will create a log file with a timestamp for every single frame that is output. Even a few seconds of capture makes the log file really, really long. Take a portion of the log and make a chart of the time between consecutive timestamps, to see whether frames arrive at consistent, similar intervals or whether every other frame lands too close to the one before it.
If a game runs at, say, 45 fps with your SLI or CF setup but feels more like 23-30 fps, then definitely analyze it with FRAPS.
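For what it’s worth, a minimal sketch of that kind of chart prep in Python, assuming the Fraps frametimes CSV stores a cumulative timestamp in milliseconds in its second column (the frametimes.csv file name is just a placeholder):

# Minimal sketch: turn a Fraps frametimes log into frame-to-frame intervals.
# Assumes the CSV's second column is a cumulative timestamp in milliseconds.
import csv

def frame_deltas(path):
    """Return the time between consecutive frames, in milliseconds."""
    with open(path, newline="") as f:
        rows = list(csv.reader(f))[1:]        # skip the header row
    times = [float(row[1]) for row in rows]   # cumulative timestamps (ms)
    return [b - a for a, b in zip(times, times[1:])]

deltas = frame_deltas("frametimes.csv")       # placeholder file name
for i, d in enumerate(deltas[:60], start=1):  # first 60 intervals is plenty for a chart
    print(f"frame {i:3d}: {d:6.2f} ms")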
Great review so far.
How do the numbers change, if at all, if Split Frame Rendering is used instead of Alternate Frame Rendering?
The last time I used SLI was with my Voodoo2 3000s. It was a gigantic waste of $200, in 1996 dollars.
If SFR eliminates micro-stutter without too much of a performance penalty I might have to try SLI again.
Why don’t they add BF:BC2?
And also 6950 and 6970 in CrossFire?
Concerning the microstutter, frame times (using that Fraps option) are supposed to fluctuate more erratically in CrossFire/SLI than they would on a single card. I think instead of testing a moving scene, it would make more sense to capture a completely still scene for a few seconds and compare the results in the Excel output file. You don’t want a moving scene because then you won’t be able to differentiate the variation caused by the scene’s motion from the variation caused by microstutter.
Another interesting option would be to downclock an SLI/CrossFire setup to the point where it matches the average frame rate of the single card. That way you could see whether the multi-GPU setup looks choppier than the single card despite having the same average frame rate.
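A minimal sketch of how such matched captures could be compared numerically, again assuming the second column of the Fraps frametimes CSV is a cumulative timestamp in milliseconds (single.csv and multi.csv are just placeholder names):

# Minimal sketch: compare frame-interval variability between a single-card
# log and an SLI/CrossFire log captured at roughly the same average frame rate.
import csv
import statistics

def interval_stats(path):
    """Return (mean, standard deviation) of frame-to-frame intervals in ms."""
    with open(path, newline="") as f:
        rows = list(csv.reader(f))[1:]            # drop the header row
    times = [float(row[1]) for row in rows]       # cumulative timestamps (ms)
    deltas = [b - a for a, b in zip(times, times[1:])]
    return statistics.mean(deltas), statistics.stdev(deltas)

for label, log in (("single GPU", "single.csv"), ("SLI/CF", "multi.csv")):  # placeholder names
    mean_ms, stdev_ms = interval_stats(log)
    print(f"{label}: {mean_ms:.2f} ms average interval, {stdev_ms:.2f} ms spread")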
Excellent work! At the end, simple recommendations would have been nice. =)
Please include Civilization 5 if possible the next time you benchmark.
It is an important game that will test the tessellation feature and its scaling in multi-GPU configurations.
Civilization 5 has been added to my benching suite, along with DiRT 3 and Total War: Shogun 2.
You’ve done a great job of benchmarking gaming performance, but including charts of FPS vs. $$ and $$ vs. wattage would be much more useful.
The wattage (both idle and load) figures can be especially important, as some of these cards can easily draw more juice than all but the most powerful (and expensive) power supplies can provide — and that definitely factors into the cost analysis.