Part IV: Big GPU Shootout – Bringing it all Together – the Summary
S.T.A.L.K.E.R.
This is the last time we are running the S.T.A.L.K.E.R.: Shadow of Chernobyl benchmark. It is DX9c, and our “Shootout Series” aims to present the latest games and DX10 benchmarks whenever possible. GSC Game World released a prequel, S.T.A.L.K.E.R.: Clear Sky, on September 5, 2008, and it has just become a brand-new DX10 benchmark for us. Both games have a non-linear storyline and feature role-playing gameplay elements such as trading and allying with NPC factions. In S.T.A.L.K.E.R., the player assumes the identity of “The Marked One” – an amnesiac illegal artifact scavenger in “The Zone”, which encompasses roughly 30 square kilometers. It is the setting of an alternate-reality story surrounding the Chernobyl Power Plant after another (fictitious) explosion.
S.T.A.L.K.E.R. & Clear Sky feature “a living breathing world” with highly developed NPC creature AI. S.T.A.L.K.E.R. uses the X-ray Engine – a DirectX8.1/9 Shader model 3.0 graphics engine featuring HDR, parallax and normal mapping, soft shadows, motion blur, weather effects and day-to-night cycles. As with other engines using deferred shading, the X-ray Engine does not support anti-aliasing with dynamic lighting enabled. However, a form of anti-aliasing can be enabled that uses a technique to blur the image to give an impression of it. We set all the graphical options – including “AA” – to their maximum values.
Our benchmark for this DX9c game is a single timedemo run called “short”. Its flaw is that the maximum frame rates are skewed far too high as the camera pans across the sky. The maximums should mostly be disregarded, although the minimums and averages are fairly representative of what you actually encounter in game. Even the best video cards will suffer occasional stutters, although general gameplay is better than the minimums suggest. The Clear Sky benchmark has no such issue, and overall it is a more detailed and even more stressful test for any PC.
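As a side note on how we read these numbers, here is a minimal sketch – with a made-up FPS log and function, not our actual tools – of why the minimum and average tell the story while the maximum mostly reflects those sky-pan frames:

```python
# Hypothetical FPS samples from a timedemo run; the 120+ values are sky-pan frames.
samples = [28, 31, 34, 36, 38, 40, 41, 43, 120, 135]

def summarize_fps(fps: list[float]) -> dict[str, float]:
    ordered = sorted(fps)
    return {
        "min": ordered[0],                     # worst-case stutter you may feel
        "median": ordered[len(ordered) // 2],  # typical frame, robust to sky pans
        "avg": sum(ordered) / len(ordered),    # overall feel of the run
        "max": ordered[-1],                    # inflated by the sky-pan frames
    }

print(summarize_fps(samples))
# min 28, median 40, avg 54.6, max 135 – the max says little about real gameplay
```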
S.T.A.L.K.E.R. Short Benchmark
Now at 1680×1050:
The results are almost too mixed to draw conclusions at first glance. However, at the average and minimum, the GTX 280 has improved a little, while the 4870-X2’s frame rates took a nosedive with the Catalyst 8.12 driver set. And CrossFireX-3 is still too flaky to run reliably. We are hoping for better results with the “official” S.T.A.L.K.E.R.: Clear Sky DX10 benchmark.
Man, there’s a *ton* of data here. Nice work.
Could have sworn I already posted this here, but it seems it’s gone. Anyway, why’d you use Catalyst A.I. “Advanced”?
Heck .. I posted a really detailed response and it is gone also. =(
Let me check into this .. basically, Cat AI set to “Advanced” maximizes performance – ideal for benchmarking – without impacting IQ.
However, if you are doing an IQ comparison – like BFG10K’s – you set Cat AI to Off, or to Standard for CrossFire.
I have zero idea what happened, but I found what I posted in my notes [thankfully]; here is a C&P of what I submitted originally – right after you asked:
How about a few links to explanations of Catalyst AI and what “advanced” really does? Here is an old article on it:
http://www.hardocp.com/article.html?art=NjY2LDI=
Here is the tweak guide which supports my own research:
http://www.tweakguides.com/ATICAT_7.html
“Catalyst A.I. allows users to determine the level of ‘optimizations’ the drivers enable in graphics applications. These optimizations are graphics ’short cuts’ which the Catalyst A.I. calculates to attempt to improve the performance of 3D games without any noticeable reduction in image quality. In the past there has been a great deal of controversy about ‘hidden optimizations’, where both Nvidia and ATI were accused of cutting corners, reducing image quality in subtle ways by reducing image precision for example, simply to get higher scores in synthetic benchmarks like 3DMark. In response to this, both ATI and Nvidia have made the process transparent to a great extent. You can select whether you want to enable or disable Catalyst A.I. for a further potential performance boost in return for possibly a slight reduction in image quality in some cases. If Catalyst AI is enabled, you can also choose the aggressiveness of such optimizations, either Standard or Advanced on the slider. The Advanced setting ensures maximum performance, and usually results in no problems or any noticeable image quality reduction. If on the other hand you want to always ensure the highest possible image quality at all costs, disable Catalyst A.I. (tick the ‘Disable Catalyst A.I.’ box). I recommend leaving Catalyst A.I enabled unless you experience problems. ATI have made it clear that many application-specific optimizations for recent games such as Oblivion are dependent on Catalyst AI being enabled.
Note: As of the 6.7 Catalysts, Crossfire users should set Catalyst A.I. to Advanced to force Alternate Frame Rendering (AFR) mode in all Direct3D games for optimal performance. Once again, Catalyst A.I. should only be disabled for troubleshooting purposes, such as if you notice image corruption in particular games”
In other words, you can choose the aggressiveness of the optimizations, either “Standard” or “Advanced”. The Advanced setting ensures maximum performance – ideal for benchmarking games – with no noticeable image quality reduction. However, if you are doing IQ comparisons as BFG10K did, and want to guarantee the very highest image quality, then disable Catalyst A.I. [but not for CrossFire; set it to “Standard”]. I have always recommended leaving Catalyst A.I. enabled unless you experience any glitches in games.
You have to realize that Cat AI is not necessarily supposed to give you a boost in every single game. It tries to do optimizations, if possible, but many times these are either not possible with a particular game, or the settings you’ve chosen in the game may be too low for it to make any noticeable impact.
That is why I recommend leaving it on “Advanced”: you get a possible performance boost, and if not, you lose nothing. Or you can set it to Standard or Off if you feel your image quality is being degraded.
Hope that explains it.
I figured it out =P
There are four parts so far to the “Big GPU Shootout”; this is ‘Part 4, The Summary’.
You posted your question in ‘Part 3, PCIe 1.0 vs 2.0’ yesterday:
http://alienbabeltech.com/main/?p=2249#comments
Ah, got ya.
Sorry about posting the same question twice.
I just thought it was a little different since most performance reviews I’ve seen leave it at “Standard”, but they may just do so to simulate a straight-out-of-the-box scenario.
No problem whatsoever. It took me a bit to figure out what happened myself.
I am trying to simulate how some of us actually play games. I usually use “Advanced” in my own games – and always play with settings as maxed out as the frame rates allow. When there are issues, I note them. My results may vary slightly from other sites’ since there are no recognized standard or universal settings for all of us to use. Each of the games that I pick for my benchmarking is well tested to make sure that “Advanced” does not cheat, IQ-wise, from driver release to driver release. There is also not a big difference even in still close-ups, nor is the performance increase very large. I doubt anyone can tell any real difference while actually playing the game, in most cases. This “Standard vs. Advanced” setting may also be the focus of my own IQ article in the future – if there is interest. I also play through most of these games myself and will update Source Engine’s HL2 to L4D, for example, soon enough.
In fact, there are three new game benches that I am validating and adding for the upcoming “Quad-core vs. Dual-Core Shootout – Q9550S vs. E8600”, to be published this Sunday or Monday. Planned for next week is a much-expanded test, including CrossFireX-3, covering the same ground so as to compare both CPUs at 3.99 GHz, and also with the E8600 overclocked further to 4.25 GHz. It will also examine CPU “scaling” from stock speeds on up, to try to find an elusive “sweet spot” for top video cards.