Big GPU Shootout, Part III: PCIe 1.0 vs. PCIe 2.0
Let’s begin with the Part III round of testing. We are still using Catalyst 8.9 and GeForce 178.13, as in our last article; only final certified drivers have been used consistently throughout this series. Identical 250 GB hard drives are set up with the latest version of Vista 32-bit SP1, each with identical programs, updates and patches – the only differences are the video cards and the motherboards. The testing hardware is detailed in the following chart:
Test Configuration
Test Configuration – Hardware
* Intel Core 2 Duo E8600 (reference 3.33 GHz, overclocked to 3.99 GHz)
* Gigabyte P35-DS3P (Intel P35 chipset, latest BIOS; PCIe 1.0 specification; CrossFire x16+x4)
* ASUS P5E Deluxe (Intel X48 chipset, latest BIOS; PCIe 2.0 specification; CrossFire x16+x16)
* 4 GB DDR2-PC8500 RAM (2×2 GB, dual-channel at PC6400 speeds).
* Nvidia GeForce GTX 280 (1 GB, Nvidia reference clocks) by BFG Tech
* ATi Radeon 4870 (512 MB, reference clocks) by Sapphire
* ATi Radeon 4870 (1 GB, reference clocks) by ASUS
* ATi Radeon 4870X2 (2 GB, reference clocks) by VisionTek
* Onboard RealTek audio
* 2 – Seagate Barracuda 7200.10 hard drives [set up identically, except for the graphics cards]
Test Configuration – Software
* ATi Catalyst 8.9, highest quality mip-mapping set in the driver; Catalyst AI set to “advanced”
* Nvidia GeForce 178.13, high quality driver setting, all optimizations off, LOD clamp enabled.
* Windows Vista 32-bit SP1; very latest updates
* DirectX August 2008.
* All games patched to their latest versions.
Test Configuration – Settings
* As noted, vsync is off: the driver is set to “application decides” and vsync is never enabled in-game.
* 4xAA enabled only in-game; all settings at maximum; 16xAF applied in-game [or in the control panel] except as noted: no AA/AF for Crysis and no AA for UT3
* All results show average, minimum and maximum frame rates
* Highest quality sound (stereo) used in all games.
* Under Vista 32-bit, all DX10 titles were run in their DX10 render paths
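Since the results report average, minimum and maximum frame rates, here is a minimal sketch of how such figures are typically derived from a per-frame render-time log (as produced by FRAPS or a similar logger). This is only an illustration of the math, not the actual tool or log format used in this article; the sample frame times are hypothetical.

```python
def fps_stats(frame_times_ms):
    """Derive average, minimum and maximum FPS from per-frame render times (ms).

    The average is time-weighted (total frames / total seconds), which is the
    figure benchmark tools usually report; instantaneous FPS for one frame is
    1000 / frame_time.
    """
    instantaneous = [1000.0 / t for t in frame_times_ms]
    average = len(frame_times_ms) * 1000.0 / sum(frame_times_ms)
    return average, min(instantaneous), max(instantaneous)

# Hypothetical frame-time log, in milliseconds per frame:
times = [16.7, 22.1, 40.0, 18.3, 12.5]
avg, lo, hi = fps_stats(times)
print("avg %.1f / min %.1f / max %.1f fps" % (avg, lo, hi))
```

Note that the minimum FPS corresponds to the slowest (longest) frame, which is why a single stutter can drag the minimum down even when the average looks healthy.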
Curious, why’d you set Catalyst A.I. to “Advanced”?
How about a few links to explanations of Catalyst AI and what “advanced” really does? Here is an old article on it:
http://www.hardocp.com/article.html?art=NjY2LDI=
Here is the tweak guide which supports my own research:
http://www.tweakguides.com/ATICAT_7.html
“Catalyst A.I. allows users to determine the level of ‘optimizations’ the drivers enable in graphics applications. These optimizations are graphics ‘short cuts’ which the Catalyst A.I. calculates to attempt to improve the performance of 3D games without any noticeable reduction in image quality. In the past there has been a great deal of controversy about ‘hidden optimizations’, where both Nvidia and ATI were accused of cutting corners, reducing image quality in subtle ways by reducing image precision for example, simply to get higher scores in synthetic benchmarks like 3DMark. In response to this, both ATI and Nvidia have made the process transparent to a great extent. You can select whether you want to enable or disable Catalyst A.I. for a further potential performance boost in return for possibly a slight reduction in image quality in some cases. If Catalyst AI is enabled, you can also choose the aggressiveness of such optimizations, either Standard or Advanced on the slider. The Advanced setting ensures maximum performance, and usually results in no problems or any noticeable image quality reduction. If on the other hand you want to always ensure the highest possible image quality at all costs, disable Catalyst A.I. (tick the ‘Disable Catalyst A.I.’ box). I recommend leaving Catalyst A.I enabled unless you experience problems. ATI have made it clear that many application-specific optimizations for recent games such as Oblivion are dependent on Catalyst AI being enabled.
Note: As of the 6.7 Catalysts, Crossfire users should set Catalyst A.I. to Advanced to force Alternate Frame Rendering (AFR) mode in all Direct3D games for optimal performance. Once again, Catalyst A.I. should only be disabled for troubleshooting purposes, such as if you notice image corruption in particular games”
In other words, you can choose the aggressiveness of the optimizations: either “Standard” or “Advanced”. The Advanced setting ensures maximum performance – ideal for benchmarking games – with no noticeable image quality reduction. However, if you are doing IQ comparisons as BFG10K did, and want to guarantee the very highest image quality, then disable Catalyst A.I. [but not for CrossFire; set it to “Standard” there]. I have always recommended leaving Catalyst A.I. enabled unless you experience glitches in games.
You have to realize that Cat AI is not necessarily supposed to give you a boost in every single game. It tries to do optimizations, if possible, but many times these are either not possible with a particular game, or the settings you’ve chosen in the game may be too low for it to make any noticeable impact.
That is why I recommend leaving it on “Advanced”: you get a possible performance boost, and if not, you lose nothing. Or you can set it to “Standard” or off if you feel your image quality is being degraded.
Hope that explains it.
Very interesting – I’ll definitely be checking your site on a regular basis now.