SLI vs. CrossFire, Part 1 – mid-range multi-GPU scaling & value
Unreal Tournament 3 (UT3)
Unreal Tournament 3 (UT3) is the fourth game in the Unreal Tournament series, a first-person shooter and online multiplayer title from Epic Games. It strikes a good balance between image quality and performance, rendering complex scenes well even on lower-end PCs, while high-end graphics cards let you turn the detail all the way up. UT3 is primarily an online multiplayer game offering several modes, though it also includes an offline single-player campaign. For our tests, we used the very latest game patch. The game has no built-in benchmarking tool, so we used FRAPS and ran a fly-by of a chosen level; note that the fly-by reports somewhat higher numbers than you would see in actual gameplay. The map we use, “Containment,” is one of the most demanding of the fly-bys.
Our tests were run at 2560×1600 and 1920×1200 with UT3’s in-game graphics options set to their maximum values. One drawback of the UT3 engine is that it has no built-in support for anti-aliasing, so we forced 4xAA at 2560×1600 and 8xAA at 1920×1200 in each vendor’s control panel, using Nvidia’s 8xQ mode to match AMD’s 8xMSAA. We record a demo in the game, which saves a set number of frames to a file for playback; when the demo is played back, the engine renders those frames as quickly as possible, which is why the playback often runs faster than actual gameplay. Here is the Containment demo, first at 2560×1600 with 4xAA forced in each vendor’s control panel:
Now at 1920×1200 with 8xAA (8xQ in Nvidia’s control panel) forced:
There is absolutely no problem playing this game fully maxed out with any of our graphics configurations, except for the GTS 450 at the highest resolution, and even then adding a second card in SLI makes it playable. The HD 5870 catches and passes even the GTX 480 at 1920×1200, although the GTX 580 puts in the best showing at 2560×1600, and the GTX 570 and GTX 480 trade blows in a fairly even match-up. The GTX 560 Ti has no trouble handling the HD 6870, and every SLI and CrossFire configuration scales well enough to beat the flagship single cards.
I’m not 100% certain, but to analyze microstuttering, place a check in the box next to “Frametimes” in FRAPS. Then when you press the hotkey, it creates a log file with a timestamp for every single frame that is output. Even a few seconds of capture makes the log file really, really long. Take a portion of the log, chart the time between each pair of timestamps, and see whether the frames arrive at consistent intervals, or whether every other frame lands too close to the one before it.
If a game runs at, say, 45 fps with your SLI or CF setup but feels more like 23-30 fps, then definitely analyze it with FRAPS.
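As a minimal sketch of the frame-time check described in the comment above, the snippet below parses a frametimes log and prints the interval between consecutive frames. It assumes the log is a two-column CSV (frame number, cumulative time in milliseconds), which is how FRAPS’ frametimes export is commonly laid out; the file name is purely hypothetical, so check your own log’s header before relying on this layout.

```python
# Sketch: compute per-frame intervals from a FRAPS-style frametimes CSV.
# Assumption: two columns per row -- frame number, cumulative time in ms.
import csv

def load_frame_intervals(path):
    """Return the time between consecutive frames, in milliseconds."""
    timestamps = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        for row in reader:
            timestamps.append(float(row[1]))
    return [b - a for a, b in zip(timestamps, timestamps[1:])]

intervals = load_frame_intervals("ut3_frametimes.csv")  # hypothetical file name
for i, ms in enumerate(intervals[:200], start=1):
    print(f"frame {i}: {ms:.2f} ms")
# Consistent intervals suggest smooth frame delivery; a pattern where every
# other interval is much shorter than its neighbour is the classic
# microstutter signature the commenter describes.
```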
Great review so far.
How do the numbers change, if at all, if Split Frame Rendering is used instead of Alternate Frame Rendering?
The last time I used SLI was with my Voodoo2 3000s. It was a gigantic waste of $200, in 1996 dollars.
If SFR eliminates micro-stutter without too much of a performance penalty I might have to try SLI again.
Why don’t they add BF:BC2?
And also 6950 and 6970 CrossFire?
Concerning the microstutter, frame times (using that FRAPS option) are supposed to fluctuate more erratically on CrossFire/SLI than they would on a single card. I think instead of testing a moving scene, it would make more sense to test a completely still scene for a few seconds and compare the results in the Excel output file. You don’t want a moving scene because then you can’t differentiate between the erraticness that comes from the scene itself and the erraticness that comes from microstutter.
Another interesting option would be to downclock an SLI/CrossFire setup to the point where it matches the average frame rate of a single card. That way you could see whether the multi-GPU setup looks choppier than the single card despite having the same average frame rate.
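Building on that suggestion, here is a rough sketch of one way to compare how erratic two captures are once their average frame rates match. The “stutter index” below is just one possible metric, not an established standard: the mean absolute change between consecutive frame intervals, normalised by the mean interval. The numbers are hypothetical placeholders, not measurements from this review.

```python
# Sketch: compare frame-time erraticness between a single card and a
# multi-GPU setup captured on the same still scene.

def stutter_index(intervals_ms):
    """Mean absolute change between consecutive intervals / mean interval."""
    deltas = [abs(b - a) for a, b in zip(intervals_ms, intervals_ms[1:])]
    mean_interval = sum(intervals_ms) / len(intervals_ms)
    return (sum(deltas) / len(deltas)) / mean_interval

# Hypothetical numbers purely for illustration: a steady single card vs.
# a multi-GPU setup alternating between short and long frames, both
# averaging roughly 45 fps.
single_card = [22.0, 22.5, 21.8, 22.2, 22.1, 22.4]
multi_gpu   = [10.0, 34.0, 11.0, 33.0, 10.5, 33.5]

print(f"single card: {stutter_index(single_card):.2f}")
print(f"multi-GPU:   {stutter_index(multi_gpu):.2f}")
# The multi-GPU trace scores far higher despite the same average frame
# rate, matching the "45 fps that feels like 23-30 fps" effect.
```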
Excellent work! At the end, simple recommendations would have been nice. =)
Please include Civilization 5 if possible the next time you benchmark.
It is an important game that will test the tessellation feature and its scaling ability in multi-GPU configurations.
Civilization 5 has been added to my benching suite along with DiRT 3 and Total War: Shogun 2.
You’ve done a great job of benchmarking gaming performance, but including charts with FPS vs $$, and $$ vs wattage would be much more useful.
The wattage (both idle and load) figures can be especially important, as some of these cards can easily draw more juice than all but the most powerful (and expensive) power supplies can provide — and that definitely factors into the cost analysis.
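For what the commenter is asking, the value charts reduce to simple ratios once the averages are tabulated. The sketch below computes FPS per dollar and FPS per watt; all card names and figures are hypothetical placeholders, not data from this review, and would be replaced with the article’s own averages, street prices, and load wattages.

```python
# Sketch: performance-per-dollar and performance-per-watt from a small table.
cards = {
    # name: (avg_fps, price_usd, load_watts) -- hypothetical values only
    "Card A":          (60.0, 250.0, 300.0),
    "Card B (SLI/CF)": (95.0, 500.0, 560.0),
}

for name, (fps, price, watts) in cards.items():
    print(f"{name}: {fps / price:.3f} FPS per dollar, "
          f"{fps / watts:.3f} FPS per watt")
```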