Part IV: Big GPU Shootout – Bringing it all Together – the Summary
Our “Shoot-out Series” has been a steady progression through Intel’s “Penryn” platform, one of the most popular platforms for gaming. We have been upgrading it as necessary to maximize our PC’s gaming performance and to chart those improvements for you. Part IV, the Summary, continues our new tradition of comparing drivers, and you can follow our progress back to August, to Part I, when we began this benchmarking, even before ABT launched on October 1st, 2008. For this fourth review in the series we again start with Catalyst 8.9 vs. GeForce 178.13, as in our last two shoot-out reviews, No. II and No. III. This time, however, we focus on the progress each vendor has made since then, right through the beginning of 2009. Since our Part III motherboard comparison was benched, each vendor has released four sets of drivers, and we are going to compare all of them against each other. Of course, that involves a lot of charts, nearly one hundred of them, and we will mostly let them speak for themselves.
In our last installment, Part III of the Big GPU Shootout, PCIe 1.0 vs. PCIe 2.0, we focused on the motherboard’s effect on video card performance. We used the extremes: the X48 with PCIe 2.0, double the bandwidth, and full 16x + 16x CrossFire lanes, versus the much more bandwidth-constricted 16x + 4x CrossFire lanes of the older P35 with PCIe 1.0. We saw how limiting the older motherboard’s PCIe bandwidth can be in certain situations, and so we upgraded to the X48.
Part II – The Big GPU Shoot-Out – Setting New Benches – also covered Catalyst 8.9 vs. GeForce 178.13 and was likewise tested on our P35 motherboard (PCIe 1.0, CrossFire 16x + 4x). It demonstrated the need to overclock our E8600 CPU from its stock 3.33 GHz to nearly 4.0 GHz to take full advantage of our new video cards. We also set new benchmarks with those drivers, which we are still using in Part IV. Part II added a couple more games over Part I and refined our testing slightly. We noted that the ranking of the new video cards remained the same: 4870-X2, GTX280 and 4870, while CrossFireX-3 got more mature drivers than the earlier Catalyst 8.8 set.
Part I, The Big GPU Shootout: Upgrade Now or Wait?, covered Catalyst 8.8 vs. GeForce 177.41 on Intel’s P35 platform as we examined the performance of five video cards. The new cards we tested were the HD4870-512MB, HD4870X2-2GB and GTX280, while the 8800GTX and 2900XT represented the top and mid-range cards of the last generation. In our conclusions, we found that last generation’s video cards are not sufficient for today’s Vista DX10 maxed-out gaming, even at 1680×1050 resolution. We also began by comparing a Core 2 Duo E4300 overclocked to 3.33 GHz against an E8600 at its stock 3.33 GHz and found the older CPU rather lacking in comparison. We then continued with our E8600 for the rest of the series, later overclocking it to nearly 4.0 GHz for the next three reviews. That changes in the next review, when we use the Core 2 Quad Q9550S. We also started benching CrossFireX-3 in Part I, which ran on fairly immature drivers then, and we have continued to chart its progress until now.
Finally, as we conclude this series of “GPU Shootouts”, we set up for the next shootout series, “Quad Core vs. Dual Core Gaming”. There we will begin with this article’s concluding benches and compare Intel’s brand new 65 watt TDP Core 2 Quad Q9550S with our Core 2 Duo E8600, which we have been testing against each other while writing this very article. It takes quite a bit of time to make sense of the benchmarking, to create the images, charts and graphs that make it easier to explain, and then to summarize it all and tie it in with the next article; expect it this week. At any rate, we are going to chart each vendor’s driver progress over their last four releases. Let’s begin without delay with our testing setup and full disclosure.
Man, there’s a *ton* of data here. Nice work.
Could have sworn I already posted this here, but it seems it’s gone. Anyway, why’d you use Catalyst A.I. “Advanced”?
Heck .. I posted a really detailed response and it is gone also. =(
Let me check into this .. basically, Cat AI set to “Advanced” maximizes performance, which is what you want for benchmarking, without impacting IQ.
However, if you are doing an IQ comparison, like BFG10K’s, you set Cat AI to off, or to “Standard” for CrossFire.
I have zero idea what happened, but I found what I posted in my notes [thankfully]; here is a C&P of what I submitted originally, right after you asked:
How about a few links to explanations of Catalyst AI and what “advanced” really does? Here is an old article on it:
http://www.hardocp.com/article.html?art=NjY2LDI=
Here is the tweak guide which supports my own research:
http://www.tweakguides.com/ATICAT_7.html
“Catalyst A.I. allows users to determine the level of ‘optimizations’ the drivers enable in graphics applications. These optimizations are graphics ’short cuts’ which the Catalyst A.I. calculates to attempt to improve the performance of 3D games without any noticeable reduction in image quality. In the past there has been a great deal of controversy about ‘hidden optimizations’, where both Nvidia and ATI were accused of cutting corners, reducing image quality in subtle ways by reducing image precision for example, simply to get higher scores in synthetic benchmarks like 3DMark. In response to this, both ATI and Nvidia have made the process transparent to a great extent. You can select whether you want to enable or disable Catalyst A.I. for a further potential performance boost in return for possibly a slight reduction in image quality in some cases. If Catalyst AI is enabled, you can also choose the aggressiveness of such optimizations, either Standard or Advanced on the slider. The Advanced setting ensures maximum performance, and usually results in no problems or any noticeable image quality reduction. If on the other hand you want to always ensure the highest possible image quality at all costs, disable Catalyst A.I. (tick the ‘Disable Catalyst A.I.’ box). I recommend leaving Catalyst A.I enabled unless you experience problems. ATI have made it clear that many application-specific optimizations for recent games such as Oblivion are dependent on Catalyst AI being enabled.
Note: As of the 6.7 Catalysts, Crossfire users should set Catalyst A.I. to Advanced to force Alternate Frame Rendering (AFR) mode in all Direct3D games for optimal performance. Once again, Catalyst A.I. should only be disabled for troubleshooting purposes, such as if you notice image corruption in particular games”
In other words, you can choose the aggressiveness of the optimizations, either “Standard” or “Advanced”. The Advanced setting ensures maximum performance, which is what you want when benchmarking games, with no noticeable image quality reduction. However, if you are doing IQ comparisons as BFG10K did and want to guarantee the very highest image quality, then disable Catalyst A.I. [but not for CrossFire; set it to “Standard” there]. I have always recommended leaving Catalyst A.I. enabled unless you experience glitches in games.
You have to realize that Cat AI is not necessarily supposed to give you a boost in every single game. It tries to do optimizations, if possible, but many times these are either not possible with a particular game, or the settings you’ve chosen in the game may be too low for it to make any noticeable impact.
That is why I recommend leaving it on “Advanced”: you get a possible performance boost, and if not, you lose nothing. Or you can set it to Standard or off if you feel your image quality is being degraded.
Hope that explains it.
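If it helps to picture what that AFR mode from the tweak guide quote actually does, here is a rough conceptual sketch in Python. This is not driver code, just an illustration of whole frames alternating between two GPUs in a CrossFire pair; the function names are made up for the example.

```python
# Conceptual sketch of Alternate Frame Rendering (AFR) -- not real driver code.
# With AFR, each GPU in a CrossFire pair renders complete frames in turn:
# GPU 0 takes frames 0, 2, 4, ... and GPU 1 takes frames 1, 3, 5, ...

def render_frame(gpu_id, frame_number):
    """Stand-in for the work one GPU would do on one whole frame."""
    return f"frame {frame_number} rendered on GPU {gpu_id}"

def afr_render(total_frames, gpu_count=2):
    """Hand out whole frames round-robin across the available GPUs."""
    return [render_frame(frame % gpu_count, frame) for frame in range(total_frames)]

if __name__ == "__main__":
    for line in afr_render(6):
        print(line)
```

The point is simply that both cards stay busy on alternating frames, which is why the tweak guide recommends forcing AFR via Catalyst A.I. “Advanced” for CrossFire performance.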
I figured it out =P
There are 4 parts so far to the “Big GPU Shootout”; this is ‘Part 4, The Summary’.
You posted your question yesterday in ‘Part 3, PCIe 1.0 vs 2.0’:
http://alienbabeltech.com/main/?p=2249#comments
Ah, got ya.
Sorry about posting the same question twice.
I just thought it was a little different, since most performance reviews I’ve seen leave it at “Standard”, but they may just do so to simulate a straight-out-of-the-box scenario.
No problem whatsoever. It took me a bit to figure out what happened myself.
I am trying to simulate how some of us actually play games. I usually use “Advanced” in my own games and always play with settings as maxed out as the frame rates allow. When there are issues, I note them. My results may vary slightly from other sites’, as there are no recognized standard or universal settings for all of us to use. Each of the games I pick for benchmarking is well tested to make sure that “Advanced” does not cheat on IQ from driver release to driver release. There is not a big difference even in still close-ups, nor is the performance increase very large; I doubt anyone can tell any real difference while actually playing the game, in most cases. This “Standard vs. Advanced” setting may also become the focus of its own IQ article in the future, if there is interest. I also play through most of these games myself and will update the Source engine benchmark from HL2 to L4D, for example, soon enough.
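For what it’s worth, here is a rough sketch of how one could sanity-check that “Advanced” is not quietly degrading image quality between driver releases: capture the same scene with “Standard” and with “Advanced” and diff the screenshots. The file names are placeholders and this is only an illustration, not the exact workflow used for these articles.

```python
# Rough sketch: diff a "Standard" screenshot against an "Advanced" one of the same scene.
# Requires Pillow; the file names below are placeholders.
from PIL import Image, ImageChops

def mean_pixel_difference(path_a, path_b):
    """Return the mean per-channel pixel difference (0.0 means identical images)."""
    img_a = Image.open(path_a).convert("RGB")
    img_b = Image.open(path_b).convert("RGB")
    diff = ImageChops.difference(img_a, img_b)
    pixels = list(diff.getdata())
    return sum(sum(px) for px in pixels) / (len(pixels) * 3)

if __name__ == "__main__":
    score = mean_pixel_difference("scene_standard.png", "scene_advanced.png")
    print(f"Mean per-channel difference: {score:.3f}")
```

A score near zero on identical scenes suggests the more aggressive optimizations are not visibly changing the rendered image, which matches what the still close-ups show.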
In fact, there are 3 new game benches that I am validating and adding for the upcoming “Quad-core vs. Dual-Core Shootout – Q9550S vs. E8600”, to be published this Sunday or Monday. For next week, a really expanded test is planned, including CrossFireX-3, covering the same ground so as to compare both CPUs at 3.99 GHz, and also with the E8600 OC’d further to 4.25 GHz. It will also examine CPU “scaling” from stock speeds and in between to try to find an elusive “sweet spot” for top video cards.