Big GPU-Shootout, Part III: PCIe 1.0 vs. PCIe 2.0
Lost Planet DX10 benchmark
Lost Planet: Extreme Condition is Capcom's port of an Xbox 360 game and one of the first DX10 titles. It takes place on the icy planet of E.D.N. III, which is filled with monsters, pirates, big guns, and huge bosses. This frigid world makes a great environment to highlight the benefits of high dynamic range (HDR) lighting, as the snow-white environment reflects blinding sunlight while DX10 particle systems toss snow and ice all around. The game looks great in both DirectX 9 and DirectX 10, and there isn't much of a difference between the two versions except perhaps the shadows. Unfortunately, the DX10 version doesn't look much better when you're actually playing the game, and it still runs slower than the DX9 version.
There are two versions of this benchmark: one was released as a stand-alone demo and the other is built into the game. For our benchmark runs we chose the in-game demo from the retail copy of Lost Planet released on June 26, 2007, updated through Steam to the latest version. The run isn't completely scripted, as the bugs spawn and act a little differently each time you run the demo; it is essentially a scripted flyby of the level with "noclip" turned on. This means the benchmark won't make an absolutely perfect comparison between different hardware setups, even with identical game settings, so we ran it many times. Lost Planet's Snow and Cave demos run continuously in-game and blend into each other. All settings are fully maxed out with 4xAA and 16xAF applied.
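Because no two passes are identical, we don't treat any single number as gospel; we simply average the frames per second over several passes of each demo. As a rough sketch of the bookkeeping only (the script and the numbers below are hypothetical, not our actual results), the math is nothing more than a mean per demo:

# Hypothetical example: averaging FPS over repeated Lost Planet demo passes.
# The numbers are placeholders, not results from our test systems.
runs = {
    "Snow": [54.2, 52.8, 53.6, 55.1, 53.9],
    "Cave": [61.0, 60.3, 62.1, 60.8, 61.5],
}

for demo, fps in runs.items():
    average = sum(fps) / len(fps)
    spread = max(fps) - min(fps)
    print(f"{demo}: {average:.1f} FPS average over {len(fps)} passes "
          f"(run-to-run spread {spread:.1f} FPS)")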
Here are our benchmark results for the Snow and Cave demos. All settings are fully maxed out in-game, including AA/AF, first at 1920×1200 resolution:
Lost Planet Benchmarks
And now at 16×10 (1680×1050):
Lost Planet shows a huge performance increase for the GTX 280 when it is unfettered by PCIe 1.0 bandwidth constraints; in some cases it goes from barely playable to solidly playable on an X48 motherboard. The 4870 X2 and even the single 4870 also see a nice increase on the X48 board over the P35. CrossFireX-3 gets a nice boost from PCIe 2.0 as well, but this time there isn't much performance difference between 16x+4x and 16x+16x PCIe lanes; future drivers will probably show one.
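For those wondering how big the pipe difference actually is, here is a back-of-the-envelope look at theoretical per-direction bandwidth (these are spec figures, not something we measured): PCIe 1.0 moves roughly 250 MB/s per lane after 8b/10b encoding overhead, and PCIe 2.0 doubles that to roughly 500 MB/s per lane.

# Rough theoretical per-direction PCIe bandwidth; spec figures, not measurements.
PER_LANE_MB_S = {"PCIe 1.0": 250, "PCIe 2.0": 500}  # MB/s per lane after 8b/10b overhead

slots = [("PCIe 1.0", 16), ("PCIe 2.0", 16), ("PCIe 2.0", 4)]

for generation, lanes in slots:
    gb_s = PER_LANE_MB_S[generation] * lanes / 1000
    print(f"{generation} x{lanes}: ~{gb_s:.1f} GB/s per direction")

That works out to about 4 GB/s for a PCIe 1.0 x16 slot, 8 GB/s for PCIe 2.0 x16, and only 2 GB/s for a PCIe 2.0 x4 slot, which makes the current lack of scaling between 16x+4x and 16x+16x a little surprising.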
Curious, why’d you set Catalyst A.I. to “Advanced”?
How about a few links to explanations of Catalyst AI and what “advanced” really does? Here is an old article on it:
http://www.hardocp.com/article.html?art=NjY2LDI=
Here is the tweak guide which supports my own research:
http://www.tweakguides.com/ATICAT_7.html
“Catalyst A.I. allows users to determine the level of ‘optimizations’ the drivers enable in graphics applications. These optimizations are graphics ‘short cuts’ which the Catalyst A.I. calculates to attempt to improve the performance of 3D games without any noticeable reduction in image quality. In the past there has been a great deal of controversy about ‘hidden optimizations’, where both Nvidia and ATI were accused of cutting corners, reducing image quality in subtle ways by reducing image precision for example, simply to get higher scores in synthetic benchmarks like 3DMark. In response to this, both ATI and Nvidia have made the process transparent to a great extent. You can select whether you want to enable or disable Catalyst A.I. for a further potential performance boost in return for possibly a slight reduction in image quality in some cases. If Catalyst AI is enabled, you can also choose the aggressiveness of such optimizations, either Standard or Advanced on the slider. The Advanced setting ensures maximum performance, and usually results in no problems or any noticeable image quality reduction. If on the other hand you want to always ensure the highest possible image quality at all costs, disable Catalyst A.I. (tick the ‘Disable Catalyst A.I.’ box). I recommend leaving Catalyst A.I enabled unless you experience problems. ATI have made it clear that many application-specific optimizations for recent games such as Oblivion are dependent on Catalyst AI being enabled.
Note: As of the 6.7 Catalysts, Crossfire users should set Catalyst A.I. to Advanced to force Alternate Frame Rendering (AFR) mode in all Direct3D games for optimal performance. Once again, Catalyst A.I. should only be disabled for troubleshooting purposes, such as if you notice image corruption in particular games”
In other words, you can choose the aggressiveness of the optimizations: either "Standard" or "Advanced". The Advanced setting ensures maximum performance – which is what you want when benchmarking games – with no noticeable image quality reduction. However, if you are doing IQ comparisons as BFG10K did and want to guarantee the very highest image quality, then disable Catalyst A.I. [but not for CrossFire; set it to "Standard" instead]. I have always recommended leaving Catalyst A.I. enabled unless you experience any glitches in games.
You have to realize that Catalyst A.I. is not necessarily supposed to give you a boost in every single game. It applies optimizations where possible, but many times they either aren't possible for a particular game, or the settings you've chosen in the game are too low for them to make any noticeable impact.
That is why I recommend leaving it on "Advanced": you get a possible performance boost, and if not, you lose nothing. You can always set it to "Standard" or disable it if you feel your image quality is being degraded.
Hope that explains it.
Very interesting. I’ll definitely be checking your site on a regular basis now.