Big GPU Shootout, Part III: PCIe 1.0 vs. PCIe 2.0
CRYSIS
Now we move on to Crysis, the most demanding PC game released to date. Crysis is a sci-fi first-person shooter developed by Crytek and published by Electronic Arts in November 2007. It is set in a fictional near future in which an ancient alien spacecraft is discovered buried on an island off the coast of Korea. The single-player campaign has you assume the role of a US Delta Force operator, call sign ‘Nomad’. He is armed with various futuristic weapons and equipment, including a “Nano Suit” which enables the player to perform extraordinary feats. Crysis uses DirectX 10 for graphics rendering.
A standalone but related game, Crysis Warhead, was released in September 2008. It is notable for providing a graphical experience similar to Crysis, but with lower demands on the PC at its highest ‘Enthusiast’ settings. CryEngine2, the game engine that powers both Crysis and Warhead, is an extended version of the CryEngine that powered Far Cry. As well as supporting Shader Model 2.0, 3.0, and DirectX 10’s Shader Model 4.0, CryEngine2 is multi-threaded to take advantage of SMP-aware systems. Crysis also comes in 32-bit and 64-bit versions, and Crytek has developed its own proprietary physics system, called CryPhysics. There are three built-in demos that are very reliable for comparing video card performance, although actually playing the game runs a bit slower than the demos imply.
GPU Demo, Island
All of our settings are set to ‘maximum’, but we do NOT apply any AA/AF in the game. Here are Crysis’ Island demo benchmarks, first at 1920×1200 resolution, then at 1680×1050:
There is not much difference for the GTX 280 at 1920×1200, but a small improvement is noted at 1680×1050 with our X48 motherboard over the P35. The 4870-512MB shows very little difference between the two motherboards, and the 1GB version is faster at all settings. This time, the 4870-X2 picks up solid performance in the PCIe 2.0 slot: its minimum at 1920×1200 rises from 19 FPS to an almost playable 25 FPS! CrossFireX-3 also gets smaller performance gains in our newer motherboard; again, the amount of gain is probably still held back by driver immaturity.
Curious, why’d you set Catalyst A.I. to “Advanced”?
How about a few links to explanations of Catalyst AI and what “advanced” really does? Here is an old article on it:
http://www.hardocp.com/article.html?art=NjY2LDI=
Here is the tweak guide which supports my own research:
http://www.tweakguides.com/ATICAT_7.html
“Catalyst A.I. allows users to determine the level of ‘optimizations’ the drivers enable in graphics applications. These optimizations are graphics ‘short cuts’ which the Catalyst A.I. calculates to attempt to improve the performance of 3D games without any noticeable reduction in image quality. In the past there has been a great deal of controversy about ‘hidden optimizations’, where both Nvidia and ATI were accused of cutting corners, reducing image quality in subtle ways by reducing image precision for example, simply to get higher scores in synthetic benchmarks like 3DMark. In response to this, both ATI and Nvidia have made the process transparent to a great extent. You can select whether you want to enable or disable Catalyst A.I. for a further potential performance boost in return for possibly a slight reduction in image quality in some cases. If Catalyst AI is enabled, you can also choose the aggressiveness of such optimizations, either Standard or Advanced on the slider. The Advanced setting ensures maximum performance, and usually results in no problems or any noticeable image quality reduction. If on the other hand you want to always ensure the highest possible image quality at all costs, disable Catalyst A.I. (tick the ‘Disable Catalyst A.I.’ box). I recommend leaving Catalyst A.I enabled unless you experience problems. ATI have made it clear that many application-specific optimizations for recent games such as Oblivion are dependent on Catalyst AI being enabled.
Note: As of the 6.7 Catalysts, Crossfire users should set Catalyst A.I. to Advanced to force Alternate Frame Rendering (AFR) mode in all Direct3D games for optimal performance. Once again, Catalyst A.I. should only be disabled for troubleshooting purposes, such as if you notice image corruption in particular games”
In other words, you can choose the aggressiveness of the optimizations, either “Standard” or “Advanced”. The Advanced setting ensures maximum performance, which is what you want for benchmarking games, usually with no noticeable image quality reduction. However, if you are doing IQ comparisons as BFG10K did, and want to guarantee the very highest image quality, then disable Catalyst A.I. [but not for CrossFire; set it to “Standard” instead]. I have always recommended leaving Catalyst A.I. enabled unless you experience glitches in games.
You have to realize that Catalyst A.I. is not necessarily supposed to give you a boost in every single game. It tries to apply optimizations where possible, but often they either are not possible for a particular game, or the settings you have chosen in the game are too low for them to make any noticeable impact.
That is why I recommend leaving it on “Advanced”: you get a possible performance boost, and if not, you lose nothing. You can always set it to “Standard” or turn it off if you feel your image quality is being degraded.
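The tweak-guide note above also mentions that “Advanced” forces Alternate Frame Rendering (AFR) for CrossFire. As a rough conceptual sketch only (this is not ATI driver code, just a small Python illustration of the idea), AFR hands whole frames to the GPUs in turn, so with two GPUs each one renders every other frame:

    # Conceptual illustration of Alternate Frame Rendering (AFR), not driver code:
    # whole frames are assigned to the GPUs round-robin, then presented in order.
    def assign_frames_afr(num_frames, num_gpus=2):
        """Return (frame, gpu) pairs showing which GPU renders which frame."""
        return [(frame, frame % num_gpus) for frame in range(num_frames)]

    for frame, gpu in assign_frames_afr(6):
        print("frame %d -> GPU %d" % (frame, gpu))
    # frame 0 -> GPU 0, frame 1 -> GPU 1, frame 2 -> GPU 0, and so on

Because the GPUs alternate whole frames, AFR raises throughput (average and minimum frame rates) rather than making any single frame render faster, which is why CrossFire gains show up in FPS numbers like the ones above.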
Hope that explains it.
Very interesting; I’ll definitely be checking your site on a regular basis now.