Part IV: Big GPU Shootout – Bringing it all Together – the Summary

Our “Shoot-out Series” has been a steady progression through Intel’s “Penryn” platform, one of the most popular platforms for gaming. We have been upgrading it as necessary to maximize our PC’s gaming performance and to chart those improvements for you. This Part IV, the Summary, continues our tradition of comparing drivers, and you can follow our progress back to August, to Part I, when we began this benchmarking – even before ABT launched on October 1st, 2008.  For this fourth review in the series, we again start with Catalyst 8.9 vs. Geforce 178.13, as in shoot-outs II and III.  This time, however, we focus on the progress the vendors have made since then, right through the beginning of 2009.  Since our Part III motherboard comparison was benched, each vendor has released four sets of drivers, and we are going to compare all of them against each other.  Of course, that involves a lot of charts – nearly one hundred – and we will mostly let them speak for themselves!

In our last installment, Part III, Big GPU Shootout: PCIe 1.0 vs. PCIe 2.0, we focused especially on the motherboard’s effect on video card performance.  We used the extremes – the X48’s PCIe 2.0, with double the bandwidth and full 16x + 16x PCIe crossfire lanes, versus the much more bandwidth-constricted 16x + 4x crossfire lanes of the older P35 PCIe 1.0 motherboard.  We saw how limiting the older motherboard’s PCIe bandwidth can be in certain situations, and so we upgraded to X48.

Part II – The Big GPU Shoot-Out: Setting New Benches – also covered Catalyst 8.9 vs. Geforce 178.13, likewise tested on our P35 motherboard (PCIe 1.0, crossfire 16x+4x), and demonstrated the need to overclock our E8600 CPU from its stock 3.33 GHz to nearly 4.0 GHz to take full advantage of our new video cards.  We also set new benchmarks with these drivers that we continue to use in Part IV.  Part II added a couple more games over Part I and refined our testing slightly.  We noted that the ranking of the new video cards remained the same – 4870-X2, GTX280 and 4870 – while crossfireX-3 got more mature drivers than the last Catalyst 8.8 set.

Part I, The Big GPU Shootout: Upgrade Now or Wait?, covered Catalyst 8.8 vs. Geforce 177.41 on Intel’s P35 platform as we examined the performance of five video cards.  The new cards we tested were the HD4870-512MB, HD4870X2-2GB and GTX280, while the 8800GTX and 2900XT represented the top and mid-range cards of the last generation.  In our conclusions, we realized that last generation’s video cards are not sufficient for today’s Vista DX10 maxed-out gaming – even at 1680×1050 resolution.  We also began by comparing a Core 2 Duo E4300 overclocked to 3.33 GHz against an E8600 at its stock 3.33 GHz, and found the older CPU rather lacking.  We then continued with our E8600, later overclocked to nearly 4.0 GHz, for the next three reviews.  This changes in the next review, when we use the Core 2 Quad Q9550S.  We also started benching crossfireX-3 in Part I, which then ran on fairly immature drivers, and we have charted its progress ever since.

Finally, as we conclude this series of “GPU Shootouts”, we set up for another shootout series – “Quad Core vs. Dual Core Gaming”.   There we will begin with this article’s concluding benches and compare Intel’s brand new 65 watt TDP Core 2 Quad Q9550S with our Core 2 Duo E8600, which we have been testing against each other while writing this very article.  It takes quite a bit of time to make sense of the benchmarking, to create the images, charts and graphs that make it easier to explain, and then to summarize it all and tie it in with the next article; expect it this week.  At any rate, we are going to chart the progress of each vendor’s drivers over their last four sets.  Let’s begin without delay with our testing setup and full disclosure.


As we begin Part IV’s round of testing, we start with Catalyst 8-9 and Geforce 178.13, as in our last two articles, and compare them with the next three consecutive sets from each vendor: from ATi, Catalyst 8-10, 8-11 and 8-12; from Nvidia, Geforce 178.24, 180.48 and 181.20. What is annoying and embarrassing to this writer is that some of our charts have the last two digits reversed for Nvidia’s November driver: wherever a chart reads 180.84, it should read 180.48.  We will correct these mislabeled charts later on, but for now the corrections are noted below them.  The next part, Part V, with Q9550S vs. E8600, will take it from there.  Only final certified drivers are used for our regular testing, consistently through all of these reviews so far. Identical 250 GB hard drives are set up with the latest version of Vista 32-bit SP1, each with identical programs, updates and patches – the only differences are the video cards. The testing hardware is detailed in the following chart:

Test Configuration

Test Configuration – Hardware

* Intel Core 2 Duo E8600 (reference 3.33 GHz, overclocked to 3.99 GHz).
* ASUS P5e-Deluxe (Intel X48 chipset, latest BIOS. PCIe 2.0 specification; crossfire 16x+16x).
* 4 GB DDR2-PC8500 RAM (2×2 GB, dual-channel at PC6400 speeds).
* Nvidia GeForce GTX280 (1 GB, nVidia reference clocks) by BFGTech
* ATi Radeon 4870 (1GB, reference clocks) by ASUS
* ATi Radeon 4870X2 (2 GB, reference clocks) by VisionTek
* Onboard SupremeFX-II audio (ASUS P5e Deluxe motherboard’s daughter-card)
* 2 – Seagate Barracuda 7200.10 Hard Drives [setup identically, except for the graphics cards]

Test Configuration – Software

* ATi Catalyst 8-9, 8-10, 8-11, and 8-12; highest quality mip-mapping set in the driver; Catalyst AI set to “advanced”
* nVidia Geforce 178.13, 178.24, 180.48 and 181.20; high quality driver setting, all optimizations off, LOD clamp enabled.
* Windows Vista 32-bit SP1; very latest updates
* The DirectX August 2008 and November 2008 runtimes are used, respectively, as they were released
* All games are patched to their latest versions at the time of testing

Test Configuration – Settings

* As noted, vsync is set to “let the application decide” in the driver and is never enabled in game.
* 4xAA enabled in-game only; all settings at maximum; 16xAF applied in game [or in the control panel, except as noted; no AA/AF for Crysis and no AA for UT3]
* All results show average, minimum and maximum frame rates
* Highest quality sound (stereo) used in all games.
* Vista32, SP1; all DX10 titles were run under DX10 render paths



3DMark06 still remains the number one utility used for system benchmarking. The numbers it produces aren’t indicative of real-world gameplay – or any gameplay at all – and for that reason we really dislike using it to compare different systems. However, as long as the rest of the tech world uses it to evaluate gaming performance, we will too. We find it is most useful for tracking changes in a single system, which is mostly what we are doing now. It uses four “mini-games” for benchmarking graphics, as well as two CPU tests. The scores are weighted and added together to give an overall number, and a further breakdown of these mini-games is possible, which we chart for you.
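Futuremark does not publish its exact weighting formula, but the idea of a weighted composite score can be sketched as follows; the weights, sample frame rates and function name below are purely illustrative assumptions, not Futuremark’s actual math:

```python
# Hypothetical sketch of how an overall score can be assembled from
# sub-test results; the weights and formula here are illustrative
# assumptions, not Futuremark's actual (unpublished) formula.
def overall_score(gpu_fps, cpu_fps, gpu_weight=0.75, cpu_weight=0.25):
    """Combine average frame rates from the graphics and CPU tests into
    one weighted number, the way composite benchmarks typically do."""
    gpu_component = sum(gpu_fps) / len(gpu_fps)
    cpu_component = sum(cpu_fps) / len(cpu_fps)
    return round(100 * (gpu_weight * gpu_component + cpu_weight * cpu_component))

# Four graphics "mini-games" and two CPU tests, as in 3DMark06:
print(overall_score([28.1, 31.4, 35.0, 29.8], [1.2, 1.9]))  # → 2369
```

This is also why two setups can post similar overall numbers while differing noticeably in individual mini-game frame rates – the composite hides the per-test spread that the breakdown charts expose.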


Above is a scene from one of the four “mini-games” used to benchmark GPU performance. It will give your PC a real workout even though the default resolution is only 1280×1024 (as pictured). Here are the results of our 3DMark06 comparison using the benchmark at its default settings:





Not too much to report from either vendor.  We see the scores adjusting slightly from one driver set to the next; in both cases, there is a very slight drop in the overall scores.

Well, now let’s look at the mini-game frame rates:





(Geforce 180.48/181.20, not 181.84/181.40)

Another small surprise: there is very little change, except for very slight drops.  The ranking has not changed, and we note that both vendors’ driver sets are mature. The HD4870-X2 scales well, although crossfireX-3 barely scales with this combination of drivers and hardware in this synthetic benchmark, showing diminishing returns in frame rates over the 4870-X2.  The 3DMark06 mini-games compare the video cards’ relative performance from one driver set to the next more specifically, and the frame rates are more significant than the final overall 3DMark score might suggest. We also note that crossfireX-3 outperforms the 4870-X2 in almost every test, although not by very much in most cases.



Next we move on to Vantage, Futuremark’s latest test. Of course, we feel the same way about Vantage as we do about 3DMark06 – it is really useful for tracking changes in a single system, especially driver changes, which makes it particularly useful for this Summary review.  As it has become the new de facto standard for measuring PC video gaming performance, we will use it also. There are two mini-game tests, Jane Nash and Calico, as well as two CPU tests, but we are still focusing on graphics performance.

Below is a scene from Vantage at 1920×1200 resolution with every setting fully maxed out, as Jane Nash flees in her flying Sapphire super-boat.


Let’s go right to the graphs and first check the Basic Tests with the default benchmark scores:





Here we see that, unlike with 3DMark06, the trend is generally toward progressively higher scores.  We also note that Vantage appears to be a better test of graphics than 3DMark06.  So, let’s look at Vantage’s mini-game frame rates:





(Geforce 180.48/181.20, not 181.84/181.40)

There are ups and downs, but mostly higher frame rates with later Catalyst drivers until 8-12, where performance slips a bit.  Still, it is nothing you would notice if you were actually playing these benchmarks as games.


Fur Benchmark

Fur Benchmark v1.2 is a synthetic OpenGL benchmark. We picked 1920×1200 resolution and 4xMSAA; the results are mostly meaningless, but here goes:


Fur OpenGL benchmark


(Geforce 180.48/181.20, not 181.84/181.40)

Ups and downs.  Nobody will ever play this “title”. Enough of the synthetics – on to the PC games!


Call of Juarez








Call of Juarez, a fast-paced Wild West epic adventure shooter from Techland, gave us the earliest DX10 benchmark, released in June 2007. The game is loosely based on the Spaghetti Westerns that became popular around 1970. Call of Juarez features the Chrome Engine using Shader Model 4 with DirectX 10, so Vista is mandatory for this benchmark. The benchmark isn’t built into Call of Juarez; it is a stand-alone that runs a simple fly-through of a level built to showcase the game’s new DX10 effects. It offers great repeatability and is a good stress test for the DX10 features of today’s graphics cards, although it is not quite the same as actual gameplay, since the game logic and AI are stripped out of the demo. Still, it is very useful for comparing video card performance.

Running the Call of Juarez benchmark is easy: you are presented with a simple menu to choose resolution, anti-aliasing and two shadow quality options. We set shadow quality to “high” and the shadow map resolution to its maximum, 2048×2048. At the end, the demo presents the minimum, maximum and average frame rates, along with the option to quit or run the benchmark again. We always ran the benchmark at least a second time and recorded that generally higher score.
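Our run-it-at-least-twice policy can be sketched like this; the first pass warms caches, so a later pass generally scores higher. The function and the sample numbers are our own illustration, not part of the benchmark itself:

```python
# A sketch of the "run it at least twice" policy: the first pass warms
# caches, so we keep the later (generally higher) result. Names and
# values here are illustrative, not from the actual benchmark.
def best_run(runs):
    """Each run is a (minimum, average, maximum) FPS tuple; keep the
    run with the highest average, which is almost always a later pass."""
    return max(runs, key=lambda r: r[1])

runs = [(22, 34.6, 61), (24, 36.1, 63)]   # first pass vs warmed-up pass
print(best_run(runs))                     # the second, higher-average run
```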

Call of Juarez DX10 benchmarks


We have seen mostly impressive gains with crossfireX-3 and decent gains with the GTX280.  Let’s check out 1680×1050 next:


Again, crossfireX-3 and the GTX280 show the best gains over four months.



Crysis DX10





Now we move on to Crysis, one of the most demanding PC games released to date. Crysis is a sci-fi first person shooter developed by Crytek and published by Electronic Arts in November 2007. It is set in a fictional near-future where an ancient alien spacecraft is discovered buried on an island near the coast of Korea. In the single-player campaign you assume the role of US Delta Force operator ‘Nomad’, armed with various futuristic weapons and equipment, including a “Nano Suit” that enables the player to perform extraordinary feats. Crysis uses DirectX 10 for graphics rendering.

A stand-alone but related game, Crysis Warhead, was released in September 2008. It is notable for providing a similar graphical experience to Crysis, but with lower graphical demands on the PC at its highest ‘enthusiast’ settings. CryEngine2, the game engine powering Crysis and Warhead, is an extended version of the CryEngine that powers FarCry. As well as supporting Shader Model 2.0, 3.0, and DirectX 10’s 4.0, CryEngine2 is multi-threaded to take advantage of SMP-aware systems. Crysis also comes in 32-bit and 64-bit versions, and Crytek has developed its own proprietary physics system, called CryPhysics. There are three built-in demos that are very reliable for comparing video card performance, although we note that actually playing the game is a bit slower than the demos imply.

GPU Demo, Island

All of our settings are set to ‘maximum’ but we do NOT apply any AA/AF in this game. Here are Crysis’ Island demo benchmarks, first at 1920×1200 resolution, then at 1680×1050:


We see the 4870-1GB take a jump with the September and October driver releases, while the 4870-X2 and crossfireX-3 configurations mostly hold steady.  Nvidia matches with a decent increase over the same months, although performance slips very slightly with its latest release.


(4870/Cat 8-9 correction – 31 Maximum, 24.80 Average and 20 Minimum)


(Geforce 180.48/181.20, not 181.84/181.40)

Not much difference here compared to 1920×1200 resolution, but a little improvement is generally noted at 1680×1050.  There is also no real advantage to playing with crossfireX-3 over the 4870-X2.


Quake Wars: Enemy Territory


Quake Wars: Enemy Territory is an objective-driven, class-based first person shooter set in the Quake universe, developed by id Software and Splash Damage for Windows and published by Activision. Quake Wars pits the combined human armies of the Global Defense Force (GDF) against the technologically superior Strogg, an alien race that has come to Earth to use humans for spare parts and food. It lets you play a part – probably best as an online multiplayer experience – in the desperate battles waged around the world in mankind’s war to survive.

Quake Wars is an OpenGL game based on id’s Doom3 game engine with the addition of their MegaTexture technology. It also supports some of the latest 3D effects seen in today’s games, including soft particles, although it is somewhat dated and less demanding on video cards than DX10 games.  id’s MegaTexture technology is designed to provide very large maps without having to reuse the same textures over and over again. For our benchmark we chose the flyby, Salvage Demo, from Quake Wars: Enemy Territory. It is one of the most graphically demanding of all the flybys and is very repeatable and reliable in its results. It is fairly close to what you will experience in-game. All of our settings are set to ‘maximum’ and we also apply 4xAA/16xAF in game.

Salvage Demo fly-by:


The drivers seem to trade performance from one set to the next.  Only crossfireX-3 has really improved substantially.


At 1680×1050 resolution we actually see the 4870 overtaking and passing the GTX280 with the latest driver sets from each vendor, where it was a bit closer before.  The HD4870-X2 is the biggest performance gainer, followed by crossfireX-3.




F.E.A.R.

F.E.A.R. – First Encounter Assault Recon – is a DX9c game by Monolith Productions, originally released in October 2005 by Vivendi Universal. It was followed by two expansions, the latest of which, Perseus Mandate, was released in 2007. Although the game engine is aging a bit, it still has some of the most spectacular effects of any game. F.E.A.R. showcases a powerful particle system, complete with sparks and smoke for collisions, as well as bullet marks and other effects including “soft shadows”. This is highlighted by the built-in performance test, although it was never updated. The performance test will tell you how F.E.A.R. itself will run, but its two expansions, Extraction Point and Perseus Mandate, are more demanding on your PC’s graphics and will run slower than the demo. We always run at least two sets of tests with all in-game features at ‘maximum’, using 4xAA instead of soft shadows, as they do not run well together and 4xAA is actually more demanding. F.E.A.R. uses the Jupiter Extended Technology engine from Touchdown Entertainment.

We test with the most demanding settings: fully maxed details with 4xAA/16xAF and soft shadows ‘off’, as they do not play well with AA; first at 1920×1200:






Finally we test at 1680×1050 with 4xAA/16xAF and with soft shadows again disabled:


This title seems to have been mostly ignored by both vendors, although the 4870-X2’s low minimums were fixed in the last two driver sets.  It doesn’t really matter, as all of these cards play F.E.A.R. really well, with only the 4870 slowing down a bit in the minimums.  F.E.A.R. 2 has been released and we will keep an eye on its performance for you.


Half-Life2: Lost Coast






Half-Life2 is still a popular game and it is the oldest game we review in this series. Half-Life2: Lost Coast is an additional level for this 2004 game, released in October 2005 as a free download to all purchasers of Half-Life2. Lost Coast was developed as a playable tech demo, intended to showcase the newly added high dynamic range lighting (HDR) features of the Source engine. It features some very minor storyline details that were scrapped from Half-Life2. A flyby of this level is played during the HL2 video stress test, and it is very repeatable and quite accurate. All in-game settings are fully maxed out, including 4xAA/16xAF.





Now at 1680×1050 resolution:





This old Source engine barely benefits from any driver updates, as all of our video cards breeze through this benchmark.  Note that the game’s 300 FPS cap will also influence the average scores, as many setups are hitting that ceiling.  We see definite ups and downs in performance, but nothing you would notice in the game to diminish your enjoyment from one driver set to another.
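The effect of that frame cap on averages can be sketched quickly; the sample frame rates below are invented for illustration. Any frame the engine would render above 300 FPS is clamped, so the reported average understates what an uncapped card could do:

```python
# Illustrative sketch of how a 300 FPS engine cap compresses averages:
# clamp per-frame rates at the cap and compare the two averages.
FPS_CAP = 300.0

def capped_average(fps_samples, cap=FPS_CAP):
    """Average frame rate after clamping every sample at the engine cap."""
    capped = [min(f, cap) for f in fps_samples]
    return sum(capped) / len(capped)

samples = [180.0, 420.0, 360.0, 240.0]    # what an uncapped card could do
print(sum(samples) / len(samples))         # uncapped average: 300.0
print(capped_average(samples))             # capped average: 255.0
```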


Lost Planet DX10 benchmark






Lost Planet: Extreme Condition is a Capcom port of an Xbox 360 game and was the first DX10 game. It takes place on the icy planet of E.D.N. III, which is filled with monsters, pirates, big guns and huge bosses. This frigid world makes a great environment to highlight the benefits of high dynamic range lighting (HDR), as the snow-white environment reflects blinding sunlight while DX10 particle systems toss snow and ice all around. The game looks great in both DirectX 9 and 10 and there isn’t much of a difference between the two versions except perhaps the shadows. Unfortunately, the DX10 version doesn’t look that much better when you’re actually playing the game, and it still runs slower than the DX9 version.

There are two versions of this benchmark: one released as a stand-alone demo, the other in-game. We chose the in-game demo from the retail copy of Lost Planet released on June 26, 2007, updated through Steam to the latest version. The run isn’t completely scripted, as the bugs spawn and act a little differently each time; the benchmark is essentially a scripted flyby of the level with “noclip” turned on. This means it won’t make an absolutely perfect comparison between different hardware setups, even with identical game settings, so we ran it many times. Lost Planet’s Snow and Cave demos run continuously in-game and blend into each other. All settings are fully maxed out with 4xAA and 16xAF applied.
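Because no single pass of this semi-scripted run is authoritative, we lean on repetition. A sketch of the idea, with made-up run numbers: report the mean across passes plus a spread figure, rather than trusting any one result.

```python
# Since the Lost Planet run is not fully scripted (bug spawns vary),
# single results are noisy; this sketch shows the many-runs approach,
# reporting a mean and a spread instead of trusting one pass.
from statistics import mean, stdev

def summarize(runs):
    """Return (mean average-FPS, sample standard deviation) across
    repeated runs; the stdev is a rough estimate of run-to-run noise."""
    return round(mean(runs), 2), round(stdev(runs), 2)

snow_runs = [41.2, 39.8, 42.5, 40.1, 41.4]   # average FPS of five passes
print(summarize(snow_runs))                   # → (41.0, 1.08)
```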

Here are our benchmark results with Snow, the more demanding of the two benches. All settings are fully maxed out in game, including 4xAA/16xAF – first at 1920×1200 resolution:

Lost Planet Benchmarks










And now at 1680×1050:


Lost Planet shows slight performance increases for the GTX280 and HD4870, although the 4870 still struggles with minimums. The 4870-X2 and even crossfireX-3 show a nice increase, as ATi apparently fixed the multi-GPU issues and low minimums from October.


Unreal Tournament3






Unreal Tournament3 (UT3) is the fourth game in the Unreal Tournament series, a first-person shooter and online multiplayer video game by Epic Games, released for Windows in November 2007. While many games share the same Unreal3 engine, developers decide how high the system requirements will be by adjusting the level of detail. Unreal Tournament3 provides a good balance between image quality and performance, rendering complex scenes even on lower-end PCs; of course, on high-end graphics cards you can really turn up the detail. UT3 is primarily an online multiplayer title offering several game modes, and it also includes an offline single-player campaign.

For our tests, we used the latest game patch for Unreal Tournament3. The game doesn’t have a built-in benchmarking tool, so we used FRAPS as well as HardwareOC’s benchmark tool, which does a fly-by of a chosen level. We note that the performance numbers reported are a bit higher than in-game. The map we use, “Containment”, is one of the most demanding of the fly-bys. Our tests were run at 1920×1200 and 1680×1050 with UT3’s in-game graphical options set to their maximum values. One drawback of the UT3 engine’s design is that there is no built-in support for anti-aliasing; while video card vendors have found ways to force it in their drivers’ control panels, we did not force it. Again, we use a “timedemo-style” benchmark for UT3: a “demo” is recorded in the game and a set number of frames are saved to a file for playback. When playing back the demo, the game engine renders the frames as quickly as possible, which is why you will often see it playing back more quickly than you would actually play the game.
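For the FRAPS runs, the charted min/avg/max figures boil down to per-frame times. A sketch of the conversion, with invented sample values: note that the true average comes from total frames over total time, not from averaging the per-frame FPS values.

```python
# FRAPS records per-frame times; a sketch of turning a frametimes log
# (milliseconds per frame) into the min/avg/max FPS figures we chart.
# The sample values are made up for illustration.
def fps_stats(frametimes_ms):
    """Return (min, average, max) FPS from a list of frame times in ms."""
    fps = [1000.0 / t for t in frametimes_ms]
    # True average: total frames divided by total elapsed time.
    avg = len(frametimes_ms) * 1000.0 / sum(frametimes_ms)
    return min(fps), avg, max(fps)

lo, avg, hi = fps_stats([10.0, 20.0, 25.0, 10.0, 15.0])
print(lo, avg, hi)   # → 40.0 62.5 100.0
```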

Containment Demo


… On to 1680×1050 resolution:


(Geforce 180.48/181.20, not 181.84/181.40)

We note some real inconsistency from one driver set to another, with the GTX280 taking a nosedive in frame rates with the latest driver.  We see many ups and downs, although all of our GPUs can easily handle this game with fully maxed-out details.  We also realize that UT3 is CPU-dependent, and soon we will force AA in the vendors’ respective control panels with our new Core 2 Quad Q9550S running the benchmarks, to see if we can get some better results for you.



The railing shows the in-game "AA"

S.T.A.L.K.E.R.: Shadow of Chernobyl

This is the last time we are running the S.T.A.L.K.E.R.: Shadow of Chernobyl benchmark. It is DX9c, and our “Shootout Series” aims to present the latest games and DX10 benchmarks whenever possible. GSC Game World released a prequel expansion on September 5, 2008, S.T.A.L.K.E.R.: Clear Sky, and it has just given us a brand new DX10 benchmark. Both games have a non-linear storyline and feature role-playing gameplay elements such as trading and allying with NPC factions. In S.T.A.L.K.E.R., the player assumes the identity of “The Marked One”, an amnesiac illegal artifact scavenger in “The Zone”, which encompasses roughly 30 square kilometers. It is the location of an alternate-reality story surrounding the Chernobyl Power Plant after another (fictitious) explosion.

S.T.A.L.K.E.R. and Clear Sky feature “a living, breathing world” with highly developed NPC and creature AI.  S.T.A.L.K.E.R. uses the X-ray Engine, a DirectX 8.1/9 Shader Model 3.0 graphics engine featuring HDR, parallax and normal mapping, soft shadows, motion blur, weather effects and day-night cycles. As with other engines using deferred shading, the X-ray Engine does not support anti-aliasing with dynamic lighting enabled. However, a form of “anti-aliasing” can be enabled that blurs the image to give the impression of it. We set all the graphical options – including this “AA” – to their maximum values.

Our benchmark for this DX9c game is a single timedemo run called “short”. Its flaw is that the maximum frame rates are skewed far too high as the camera pans the sky. The maximums should mostly be disregarded, although the minimums and averages are fairly representative of what you actually encounter in game. Even the best video cards will suffer occasional stutters, although general gameplay is better than the minimums suggest.  The Clear Sky benchmark will have no such issues, and overall it is a more detailed and even more stressful test for any PC.
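A sketch of why we disregard the raw maximum here: reporting a high percentile instead of the absolute peak screens out the sky-pan spikes. The percentile threshold and the sample numbers below are our own illustration, not part of the timedemo.

```python
# The raw maximum is skewed by sky pans, so this sketch shows a more
# robust reading: ignore the top few percent of frames and report a
# high percentile instead. Threshold choice is our own assumption.
def robust_max(fps_samples, percentile=95):
    """Return the FPS value below which `percentile`% of samples fall."""
    ordered = sorted(fps_samples)
    index = int(len(ordered) * percentile / 100) - 1
    return ordered[max(index, 0)]

samples = [35, 38, 40, 42, 41, 39, 37, 240, 250, 36]  # two sky-pan spikes
print(max(samples), robust_max(samples, 80))           # → 250 42
```

The raw maximum (250) says nothing about gameplay, while the 80th-percentile figure (42) is close to what the card actually sustains.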

S.T.A.L.K.E.R. Short Benchmark





Now at 1680×1050:


The results are almost too mixed to draw conclusions at first look. However, in the averages and minimums the GTX280 has improved a little, while the 4870-X2’s frame rates took a nosedive with the Catalyst 8-12 driver set.  And crossfireX-3 is still too flaky to run reliably.  We are hoping for better results with the S.T.A.L.K.E.R.: Clear Sky “official” DX10 benchmark.


PT Boats: Knights of the Sea DX10 benchmark


PT Boats: Knights of the Sea is a stand-alone DX10 benchmark utility released by Akella last year. It is a benchmark-friendly tech demo of their upcoming simulation-action game. This DX10 benchmark runs reliably and apparently provides very accurate and repeatable results.

We set the only settings options available to us as follows:

DirectX Version: DirectX 10
Resolution: 1920×1200 and 1680×1050 at 60 Hz
Image Quality: High
Anti aliasing: 4x

PT Boats DX10 benchmark


And now the results at 1680×1050 resolution:


We see really mixed results; probably no one is optimizing for this title yet, as the game is unreleased.  We are really looking forward to its release later this year.




FarCry2

We remember the missing textures in FarCry2 with the first Catalyst 8-12 hotfix driver, which had to be fixed with a second 8-12 hotfix almost immediately after its release.  FarCry2 is a new game, and we only started testing it with the Catalyst 8-10 and Geforce 178.24 drivers.

FarCry2 uses the name of the original FarCry, but it is not connected to the first game; it brings you a new setting and a new story. Ubisoft created it on their Dunia Engine.  The game takes place in an unnamed African country during an uprising between two rival warring factions: the United Front for Liberation and Labor and the Alliance for Popular Resistance. Your mission is a simple one: kill “The Jackal”, the mercenary arming both sides of the conflict you are sent into.

The FarCry2 game world is loaded in the background and on the fly to create a completely seamless open world.  The Dunia engine provides good visuals that scale well, and it runs on a wide range of PC hardware.  The FarCry2 design team actually went to Africa to add realism to the game, and it works very well. One thing to note is FarCry2’s very realistic fire propagation, which is a far cry from the scripted fire and explosions we have been used to seeing until now.




And now at 1680×1050:




(Geforce 180.48/181.20, not 181.84/181.40)

We notice steady improvement in performance by both vendors since this AAA title was released.  However, Catalyst 8-12 appeared to have real issues with FarCry2 and required ATi to release two hotfixes for it.  We will let you know if Catalyst 9-1 addresses FarCry2’s performance issues.  The ‘stutter’ is minimized in the latest hotfix, but it still needs some work for the 4870.



Conclusion

Here we have to go back to our ranking to see that it is basically unchanged since we started benching back in August 2008 with these same cards, although there has been some shifting of rank in a few individual benchmarks:

  1. CrossfireX-3
  2. 4870-X2
  3. GTX280
  4. 4870-1GB

We also note some shifting about of game performance as minimal losses and gains are adjusted and fine-tuned with each subsequent driver set.  There are generally no significant losses that would impact gameplay with the cards and games we tested, and in some cases games that were barely playable in August now play much better with driver improvements alone.  Catalyst 8-12 was a slight disappointment performance-wise, as ATi appeared focused on unlocking the potential of their new “Stream” drivers.  Even Nvidia’s latest driver offering seemed slightly weaker than the set before it, as they continually fine-tune their drivers for PhysX, 3D Vision and CUDA.

We have completed Part Two’s testing for our new shootout: “Quad Core vs. Dual Core Gaming”.   There we will begin with this article’s concluding benches and compare Intel’s brand new 65 watt TDP Core 2 Quad Q9550S with our Core 2 Duo E8600 – both at 3.99 GHz – which we have been testing against each other for you.  You can expect it this week.  After that, we will have much more detailed testing of the same platform emphasizing multi-GPU performance, and then we expect to compare the maturing Intel Core i7 platform with our currently maxed-out Penryn system.  Eventually, we also expect to explore Nvidia GTX280/285 SLI on an X58 motherboard, and we will prepare for it by upgrading to Vista 64-bit and giving you a comparison against gaming on Vista 32-bit.

Stay tuned.  We think we will have some very interesting articles for you to read very shortly as you plan your own coming upgrades.  Now that we are done with our benches and this part of our “Shootout” Series, we would like to give you a sneak preview of the new game benchmarks we will be adding in our next article, “Quad Core vs. Dual Core Shootout: Q9550S vs. E8600”:

S.T.A.L.K.E.R.: Clear Sky


This 12-minute, stand-alone “official” benchmark by Clear Sky’s creators will replace S.T.A.L.K.E.R. in this ABT “Shootout” series; it is far more detailed and much better than the one we are currently using.  Best of all, Clear Sky appears to make good use of multi-core CPUs – we will test it for you.  As an expansion to the original game, Clear Sky is top-notch and a worthy successor to S.T.A.L.K.E.R., with even more awesome DX10 effects that help create and enhance the game’s incredible atmosphere.  But DX10 comes at a steep hardware requirement, and this new benchmark really needs multi-GPU to run at its maximum settings – even below 1680×1050!

X3-Terran Conflict


This is another beautiful stand-alone benchmark that runs multiple tests and will really strain a lot of video cards with its extra-high polygon count.  We look forward to adding it to our benchmark suite for you.

World in Conflict


Here is yet another amazing, very customizable and detailed DX10 benchmark, available in-game or as a stand-alone.  The particle effects and explosions in World in Conflict have finally taken first place in my heart away from F.E.A.R.!  It appears to take particular advantage of more than two cores, so we will be testing with it extensively.  Look for our “Quad Core vs. Dual Core Shootout: Q9550S vs. E8600” later this week.

Mark Poppin

ABT Editor




Founder and Senior Editor of ABT.

7 Responses

  1. BFG10K says:

    Man, there’s a *ton* of data here. Nice work. :)

  2. Josh6079 says:

    Could have swore I already posted this here, but seems it’s gone. But anyways, why’d you use Catalyst A.I. “Advanced”?

  3. apoppin says:

    heck .. i posted a really detailed response and it is gone also. =(

    Let me check into this .. basically Cat AI set to “Advanced” maximizes performance – as for benchmarking – without impacting IQ

    However, if you are doing an IQ comparison – like BFG10K’s – you set Cat AI to off or Standard for Crossfire.

  4. apoppin says:

    I have zero idea what happened; but I found what i posted in my notes [thankfully]; here is a C&P of what I submitted originally – right after you asked:

    How about a few links to explanations of Catalyst AI and what “advanced” really does? Here is an old article on it:

    Here is the tweak guide which supports my own research:

    “Catalyst A.I. allows users to determine the level of ‘optimizations’ the drivers enable in graphics applications. These optimizations are graphics ’short cuts’ which the Catalyst A.I. calculates to attempt to improve the performance of 3D games without any noticeable reduction in image quality. In the past there has been a great deal of controversy about ‘hidden optimizations’, where both Nvidia and ATI were accused of cutting corners, reducing image quality in subtle ways by reducing image precision for example, simply to get higher scores in synthetic benchmarks like 3DMark. In response to this, both ATI and Nvidia have made the process transparent to a great extent. You can select whether you want to enable or disable Catalyst A.I. for a further potential performance boost in return for possibly a slight reduction in image quality in some cases. If Catalyst AI is enabled, you can also choose the aggressiveness of such optimizations, either Standard or Advanced on the slider. The Advanced setting ensures maximum performance, and usually results in no problems or any noticeable image quality reduction. If on the other hand you want to always ensure the highest possible image quality at all costs, disable Catalyst A.I. (tick the ‘Disable Catalyst A.I.’ box). I recommend leaving Catalyst A.I enabled unless you experience problems. ATI have made it clear that many application-specific optimizations for recent games such as Oblivion are dependent on Catalyst AI being enabled.

    Note: As of the 6.7 Catalysts, Crossfire users should set Catalyst A.I. to Advanced to force Alternate Frame Rendering (AFR) mode in all Direct3D games for optimal performance. Once again, Catalyst A.I. should only be disabled for troubleshooting purposes, such as if you notice image corruption in particular games”

    In other words, one can choose the aggressiveness of your optimizations, either “Standard” or “Advanced”. The Advanced setting ensures maximum performance – as for benchmarking games – and with no noticeable image quality reduction. However, if you are doing IQ comparisons as BFG10K did, and want to guarantee the very highest image quality, then disable Catalyst A.I. [but not for crossfire; set it to "Standard"]. I have always recommended leaving Catalyst A.I enabled unless you experience any glitches in games.

    You have to realize that Cat AI is not necessarily supposed to give you a boost in every single game. It tries to do optimizations, if possible, but many times these are either not possible with a particular game, or the settings you’ve chosen in the game may be too low for it to make any noticeable impact.

    That is why I recommend leaving it on “Advanced”; you get a possible performance boost; if not then you lose nothing. Or you can set it to standard or off if you feel your image quality is being degraded.

    Hope that explains it.

  5. apoppin says:

    I figured it out =P

    There are 4 parts so far to the “Big GPU Shootout”; this is ‘Part 4, The Summary’.
    You posted your question in ‘Part 3, PCIe 1.0 vs 2.0’ yesterday.

  6. Josh6079 says:

    Ah, got ya.

    Sorry about posting the same question twice.

    I just thought it was a little different since most performance reviews I’ve seen leave it at “Standard”, but, they may just do so to simulate a straight-out-of-the-box scenario.

  7. apoppin says:

    No problem whatsoever. It took me a bit to figure out what happened myself.

    I am trying to simulate how some of us actually play games. I usually use “Advanced” in my own games – and always play with settings as maxed-out as the frame rates support. When there are issues, I note them. My results may vary slightly from other sites as there are no recognized standard or universal settings for all of us to use. Each of the games that I pick for my benchmarking is well-tested to make sure that “Advanced” does not cheat, IQ-wise from driver release to driver release. There is also not a big difference even in still closeups, nor is the performance increase very large. I doubt anyone can tell any real difference while actually playing the game, in most cases. This “Standard vs. Advanced” setting also may be the focus for my own IQ article in the future – if there is interest. I also play through most of these games myself and will update Source Engine’s HL2, for example to L4D, soon enough.

    In fact, there are 3 new game benches that I am validating and adding for the upcoming “Quad-Core vs. Dual-Core Shootout – Q9550S vs. E8600”, to be published this Sunday or Monday. Next week we plan a really expanded test, including CrossFireX-3, covering the same ground so as to compare both CPUs at 3.99 GHz, and also with the E8600 OC’d further to 4.25 GHz. It will also examine CPU “scaling” from stock speeds and in-between to try to find an elusive “sweet spot” for top video cards.
