Big GPU Shootout – Revisited
Welcome to our Big GPU Shootout, Revisited. We are updating our first “Big GPU Shootout” review, which we published back in November. We now feature ten video card configurations, four more than last time, and we expect to add more as we continue benchmarking for you. In our last review, CrossFire-X eXplored, we tested AMD hardware with Catalyst 9-4, which you can now compare directly against the Catalyst 9-5 performance in this review. To compare Nvidia drivers, refer to the article before that one, the Diamond HD 4890-XOC Review, and compare its 182.08 driver set with the 185.85 set used now. In this way, you can judge directly the progress ATi and Nvidia are making with their drivers.
Do I need to upgrade? This is perhaps the question today’s gamers ask most often. This review showcases the performance of last generation’s top and midrange cards against today’s latest and greatest, as well as today’s midrange. When we started benchmarking well over nine months ago with Catalyst 8-8 and GeForce 175.19, we were not quite certain where we would go or what we would find. We did have a clear idea that we would try to determine whether video cards from the last generation or so need upgrading; we are referring particularly to the Radeon 2900/3800 series and the GeForce 8800/9800 series.
We decided to test the 2900 XT, which is largely unsurpassed by the 3800 series and is also the near-equivalent of the 8800 GTS series, as representative of the upper-midrange cards of that generation. We also picked the venerable 8800 GTX, which was a top choice for about two years and was mostly unsurpassed by Nvidia’s offerings, except for the higher-clocked Ultra, until the GTX 280 launched last summer; we are testing that card as well. We also added the 9800 GT and the GTS 250, the less expensive midrange follow-ups to the 8800 series, and the HD 4890, ATi’s follow-up to the HD 4870 series.
Each card has been tested for weeks and you will also get impressions of relative performance beyond benchmarking, from actual game play; these are subjective and will be clearly stated as personal opinion or preference. We will also be testing CrossFire X-3 (an HD 4870-X2 plus an HD 4870), HD 4890 CrossFire, and an HD 4870-X2 standing in for HD 4870 CrossFire. We use Intel’s Q9550S at 4.0 GHz with each of our video cards and CrossFire combinations. We are using Catalyst 9-5 and GeForce 185.85; final certified drivers are used throughout this review series. Identical 250 GB hard drives are set up with the latest version of Vista 64, each with identical programs, updates and patches; the only differences are the video cards.
We are continuing to test at two of the most popular demanding wide-screen resolutions, 1680×1050 and 1920×1200, with 4xAA plus 16xAF and with maximum DX10 details whenever available. Our four older and midrange video cards will also be tested at 1440×900, but not at 1920×1200, as they generally have a difficult time managing that resolution.
Our single GPU reference cards are:
- Radeon HD 2900 XT 512MB (743/825)
- GeForce 8800 GTX 768MB (575/900)
- GeForce 9800 GT 1GB (600/900)
- GeForce GTS 250 512MB (738/1100)
- GeForce GTX 280 1GB (602/1107)
- Radeon HD 4870 1GB (750/900)
- Radeon HD 4890 1GB (850/975)
CrossFire/TriFire is represented by:
- HD 4870-X2 (750/900) which is very similar to 4870 CrossFire at stock speeds
- HD 4890-CrossFire (850/975)
- HD 4870-X2 plus HD 4870 in CrossFire-X3 (750/900)
We are paying attention to how the drivers have changed overall in relation to each other and we are also setting new benches for you. The old ones we ran back in November cannot be exactly compared as they were run on Vista 32.
Test Configuration
Test Configuration – Hardware
- Intel Core 2 Quad Q9550S (engineering sample; reference 2.83 GHz, overclocked to 4.0 GHz on a 470 MHz FSB)
- ASUS Rampage Formula (Intel X48 chipset, latest BIOS, PCIe 2.0 specification; CrossFire 16x+16x).
- 4 GB DDR2-PC8500 RAM (2×2 GB, dual-channel at PC 8500 speeds)
- ATi Radeon HD 4890-XOC (1 GB, Diamond factory clocks 925/1050, underclocked to reference 850/975) by Diamond
- ATi Radeon HD 4890-XOC (1 GB, reference clocks 850/975) by HIS
- ATi Radeon HD 4870 (1GB, reference clocks 750/900) by ASUS
- ATi Radeon HD 4870-X2 (2GB, reference clocks 750/900) by VisionTek
- Nvidia GeForce 9800 GT (1 GB, reference clocks) by Palit
- Nvidia GeForce 8800 GTX (768MB, reference clocks)
- Nvidia GeForce GTS 250 (512MB, reference clocks) by Galaxy
- Nvidia GeForce GTX280 (1GB, reference clocks) by BFG Tech
- Onboard SupremeFX-II (ASUS Rampage Formula motherboard daughter-card)
- 2 × 250 GB Seagate Barracuda 7200.10 hard drives
- OCZ 850 watt power supply
Test Configuration – Software
- ATi Catalyst 9-5; highest quality mip-mapping set in the driver, Catalyst AI set to “Standard”
- GeForce 185.85; high quality filtering; optimizations off and LOD clamp enabled
- Windows Vista 64-bit SP1; very latest updates
- DirectX March 2008.
- All games are patched to their latest versions.
Test Configuration – Settings
- vsync is off in the control panel and is never set in-game.
- 4xAA enabled in all games and “forced” in Catalyst Control Center for UT3; all in-game settings at “maximum” or “ultra” with 16xAF always applied
- All results show average, minimum and maximum frame rates except as noted.
- Highest quality sound (stereo) used in all games.
- Under Vista 64, all DX10 titles were run using their DX10 render paths
3DMark06
3DMark06 remains the number one utility used for system benchmarking. We find it most useful for tracking changes within a single system. It uses four “mini-games” to benchmark graphics, plus two CPU tests. The scores are weighted and added together to give an overall “score”, and a further frame rate breakdown of the mini-games is possible, which we are charting for you.
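To make that weighting idea concrete, here is a minimal Python sketch of how per-test frame rates could be folded into sub-scores and an overall figure. The multipliers and weights are illustrative placeholders of our own, not Futuremark’s published formula.

```python
# Illustrative only: fold per-test frame rates into sub-scores and an overall
# figure, in the spirit of 3DMark06's weighted scoring. The multipliers and
# weights below are placeholders, not Futuremark's published formula.

def graphics_score(test_fps, multiplier=100.0):
    """Average the frame rates of a pair of graphics tests and scale the result."""
    return multiplier * sum(test_fps) / len(test_fps)

def overall_score(sm2_score, sm3_score, cpu_score,
                  gfx_weight=0.85, cpu_weight=0.15):
    """Blend graphics and CPU sub-scores with a weighted harmonic mean, so a
    weak GPU or CPU drags the total down more than a simple average would."""
    gfx = 0.5 * (sm2_score + sm3_score)
    return 1.0 / (gfx_weight / gfx + cpu_weight / cpu_score)

if __name__ == "__main__":
    sm2 = graphics_score([30.2, 28.7])   # hypothetical GT1/GT2 frame rates
    sm3 = graphics_score([29.5, 41.3])   # hypothetical GT3/GT4 frame rates
    cpu = 3800.0                         # hypothetical CPU sub-score
    print(f"SM2.0: {sm2:.0f}  HDR/SM3.0: {sm3:.0f}  Overall: {overall_score(sm2, sm3, cpu):.0f}")
```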
Here is a scene from “Canyon Flight” in 3DMark06, a mini-game which is used to benchmark performance. These tests will still give your PC a real workout even though its default resolution is only 1280×1024.
Here are the results of our 3DMark06 benchmark comparison using the benchmark at its default settings:
The results are just scores, and a little variability occurs simply from running the tests over and over. Now the mini-game frame rates:
Treating the 3DMark06 mini-games as if we were ‘playing’ them, we note the frame rate rankings. Let’s move on to our second synthetic benchmark, Vantage.
Vantage
Vantage is Futuremark’s latest test. It is really useful for tracking changes in a single system, especially driver changes. There are two mini-game tests, Jane Nash and New Calico, and also two CPU tests, but we are still focusing on the graphics performance.
Here is a scene from Vantage’s second mini-game.
Let’s go right to the graphs and first check the Basic Tests with the default benchmark scores:
We see more scores. Now let’s look at the mini-game frame rates:
Our HD 2900 XT refused to complete this synthetic test. The 9800 GT sits at the bottom of the rankings, with the GTS 250 edging it out, while the 8800 GTX of the generation before is a stronger performer. The HD 4870 is beaten solidly by the HD 4890, which in turn is beaten by the GTX 280. In the top three spots, TriFire (4870-X2 + 4870) is the king here, followed by HD 4890 CrossFire and our HD 4870-X2. Enough of the synthetics; let’s move on to PC games and real-world situations!
Call of Juarez
Call of Juarez is one of the very earliest DX10 games. Techland’s Call of Juarez is loosely based on the Spaghetti Westerns that became popular in the early 1970s. It uses Techland’s Chrome Engine with Shader Model 4 under DirectX 10. Our benchmark isn’t built into Call of Juarez; it is an official stand-alone that runs a simple flyby of a level created to showcase the game’s DX10 effects. It offers good repeatability and is a good stress test of DX10 features in graphics cards, although it is not quite the same as actual gameplay because the game logic and AI are stripped out of this demo.
Running the Call of Juarez benchmark is easy. You are presented with a simple menu to choose resolution, anti-aliasing, and shadow quality options. We set the shadow quality to “high” and the shadow map resolution to the maximum, 2048×2048. At the end of the run, the demo presents you with the minimum, maximum, and average frame rates, along with the option to quit or run the benchmark again. We always ran the benchmark at least a second time and recorded that generally higher score.
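Since we always re-run the benchmark and keep the better result, here is a small helper sketch of our own (not part of the Call of Juarez demo) that collects several runs and reports the one with the highest average frame rate.

```python
# A small helper of our own (not part of the Call of Juarez demo): keep the
# best of several benchmark runs, judged by the average frame rate.
from dataclasses import dataclass
from typing import List

@dataclass
class RunResult:
    minimum: float   # lowest frame rate during the run, in FPS
    average: float   # average frame rate, in FPS
    maximum: float   # highest frame rate, in FPS

def best_run(runs: List[RunResult]) -> RunResult:
    """Return the run with the highest average FPS, the number we would chart."""
    if not runs:
        raise ValueError("need at least one benchmark run")
    return max(runs, key=lambda r: r.average)

# Two hypothetical passes at the same settings; the second, warmed-up pass is
# usually a little faster and is the one we record.
passes = [RunResult(14.2, 26.8, 41.5), RunResult(15.0, 27.9, 42.1)]
print(best_run(passes))
```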
Call of Juarez DX10 benchmark at 1920×1200:
Call of Juarez holds no surprises. It really takes multi-GPU to play this game fully maxed out. The HD 4890 beats the HD 4870 significantly, and the HD 4870-X2’s frame rates are completely satisfactory at 1920×1200 with completely maxed out details and 4xAA/16xAF applied. The GTX 280 is generally slower than the HD 4870 and the HD 4890, and all three struggle with the minimums even at 1680×1050. The HD 4870-X2 is consistently beaten by HD 4890 CrossFire, and TriFire wins out over everything else. We also see that it takes at least an HD 4870-X2 to get the minimums out of the teens and 20s, the domain of the single card.
Even at our lowest tested resolution the HD 2900 XT is not really playable. The 9800 GT is faster, the 8800 GTX beats it handily, and the GTS 250 leads the midrange pack, but you would still have to eliminate AA/AF and turn down details to get satisfactory frame rates with these older and midrange video cards.
CRYSIS
Next we move on to Crysis, a science fiction first person shooter by Crytek. It remains one of the most demanding games for any PC and is also still one of the most beautiful games released to date. Crysis is set in a fictional near-future where an alien spacecraft is discovered buried on an island near the coast of Korea. In the single-player campaign you assume the role of a US Delta Force operator, ‘Nomad’, armed with futuristic weapons and equipment. Crysis uses DirectX 10 for graphics rendering.
A standalone but related game, Crysis Warhead, was released last year. CryEngine 2 is the game engine that powers Crysis and Warhead; it is an extended version of the CryEngine that powers Far Cry. As well as supporting Shader Model 2.0, 3.0, and DirectX 10’s 4.0, CryEngine 2 is multi-threaded to take advantage of SMP-aware systems, and Crytek has developed its own proprietary physics system, called CryPhysics. Note, however, that actually playing this game is a bit slower than the demo implies.
GPU Demo, Island
All of our settings are set to ‘maximum’ including 4xAA, and we force 16xAF in the control panel. Here is Crysis’ Island Demo benchmark, first at 1920×1200 resolution:
Every extra frame helps in Crysis, for any configuration. Now at 1680×1050:
We see that the HD 4890’s performance is quite a bit better than the HD 4870’s, but no single-GPU video card is very satisfactory at any tested resolution with maxed settings. However, Crysis is quite playable on the HD 4870-X2, even with 4xAA/16xAF, if you are willing to tweak your settings downward a bit. TriFire is faster overall in the averages and the maximums but still hangs in the mid 20s, swapping performance with 4890 CrossFire. Our midrange video cards really struggle with Crysis.
S.T.A.L.K.E.R., Clear Sky
S.T.A.L.K.E.R.: Clear Sky became a brand new DX10 benchmark for us when GSC Game World released this prequel to the original Shadow of Chernobyl last year. Both games have non-linear storylines which feature role-playing game elements. In both games, the player assumes the identity of a S.T.A.L.K.E.R., an illegal artifact scavenger in “The Zone”, which encompasses about 30 square kilometers. It is the location of an alternate-reality story surrounding the Chernobyl Power Plant after another (fictitious) explosion.
S.T.A.L.K.E.R. and Clear Sky feature “a living, breathing world” with highly developed NPC creature AI. Clear Sky uses the X-ray Engine, a DirectX 8.1/9/10/10.1, Shader Model 3.0 and 4.0 graphics engine featuring HDR, parallax and normal mapping, soft shadows, motion blur, weather effects and day-to-night cycles. As with other engines using deferred shading, the original DX9c X-ray Engine does not support anti-aliasing with dynamic lighting enabled, although the DX10 version does.
We are using the stand-alone “official” benchmark from Clear Sky’s creators. Clear Sky is top-notch and a worthy successor to S.T.A.L.K.E.R., with even more impressive DX10 effects which help to create and enhance the game’s already incredible atmosphere. Unfortunately, DX10 comes with steep hardware requirements and this game really needs multi-GPU to run at its maximum settings. We picked the most stressful of the four tests, “Sun shafts”. It brings the heaviest penalty due to its extreme use of shaders to create DX10/DX10.1 effects. We ran this benchmark fully maxed out in DX10.0 with “ultra” settings plus 4xAA, but did not apply edge-detect MSAA, which chokes performance even further.
S.T.A.L.K.E.R., Clear Sky DX10 benchmark “Sun shafts” at 1920×1200:
Here we see something strange: the 9800 GT beats the GTS 250 by 1.5 FPS in the averages at 1680×1050. This may be one of the times the GTS 250 is limited by having only 512 MB of video RAM, compared with the 9800 GT’s 1 GB, as the gap does not show up at 1440×900 resolution; or it could be a driver issue.
Again we see the HD 4890 beating the HD 4870 and staying well ahead of the midrange and last generation’s video cards. TriFire takes off in the maximums and averages but chokes a bit on the minimums, losing overall to HD 4890 CrossFire. However, even the HD 4870-X2 cannot climb out of the teens for the minimum frame rates, and the single-GPU video cards are held to single digits. We also note that our HD 4870 and HD 4870-X2 suffered a couple of noticeable losses with Catalyst 9-5 compared to the earlier drivers, while our HD 4890 appeared to gain performance, though not as significantly as in Far Cry 2.
PT Boats: Knights of the Sea DX10 benchmark
PT Boats: Knights of the Sea is a stand-alone DX10 benchmark utility released by Akella last year. It is actually a tech demo of their upcoming simulation-action game. This DX10 benchmark test runs reliably and apparently provides very accurate and repeatable results.
We set the only settings options available to us as follows:
DirectX Version: DirectX 10
Resolution: 1920×1200 and 1680×1050 at 60 Hz
Image Quality: High
Anti aliasing: 4x
PT Boats DX10 benchmark, first at 1920×1200:
Let’s look at 1440×900 with the midrange cards:
Our HD 2900 XT gave a genuine slide show, and it is probably unrelated to its having only 512 MB of video RAM, as our 512 MB GTS 250 has no such issues. We see a repeat performance where the HD 4890 clearly beats the HD 4870. All of these cards drop into the 20s at 1920×1200 resolution except for our CrossFire and TriFire configurations, and the HD 4870-X2 is held to mid/low 20s for its minimums, trading blows with the GTX 280.
Far Cry 2
Far Cry 2 uses the name of the original Far Cry but it is not connected to the first game as it brings you a new setting and a new story. Ubisoft created it based on their Dunia Engine. The game setting takes place in an unnamed African country, during an uprising between two rival warring factions. Your mission is to kill “The Jackal”; the Nietzsche-quoting mercenary that arms both sides of the conflict that you are dropped into.
The Far Cry 2 game world is loaded in the background and on the fly to create a completely seamless open world. The Dunia game engine provides good visuals that scale well. The Far Cry 2 design team actually went to Africa to give added realism to this game. One thing to especially note is Far Cry 2’s very realistic fire propagation by their engine that is a far cry from the scripted fire and explosions that we are used to seeing.
Far Cry 2 benchmark at 1920×1200 – all resolutions tested with AI enabled:
All of our video cards can play Far Cry 2 very satisfactorily at the resolutions chosen for them, except for our HD 2900 XT. The midrange GPUs all do nicely even at 1680×1050, which appears to be their playable resolution limit. The 8800 GTX leads where the GTS 250’s minimums take a nosedive; it is even beaten by the 9800 GT, perhaps because the GTS only has 512 MB of video RAM.
Unfortunately, here we saw the biggest performance loss going from the earlier Catalyst driver to 9-5 with our HD 4870 and HD 4870-X2. Conversely, the HD 4890 had a nice performance gain from the new drivers. A note about Nvidia’s GeForce 185.85 drivers that we are testing in this review: there were generally nice performance gains overall compared with the previous 182.08 set.
The GTX 280 leads the single-GPU cards and our HD 4890 clearly beats the HD 4870. Our HD 4870-X2 still puts in a good performance, and TriFire scales nicely.
World in Conflict
World in Conflict is set in an alternate-history Earth where the Cold War did not end: the Soviet Union invades the USA in 1989 and the Americans strike back. World in Conflict (WiC) is a real-time tactical/strategy video game developed by Massive Entertainment. Although it is generally considered a real-time strategy (RTS) game, it includes gameplay typical of real-time tactical (RTT) games. WiC is filled with real vehicles from both the Soviet and American militaries. There are also tactical aids, including calling in massive bombing raids, access to chemical warfare, nuclear weapons, and far more.
Here is yet another amazing and very customizable and detailed DX10 benchmark that is available in-game or as a stand-alone. The particle effects and explosions in World in Conflict are truly spectacular! Every setting is fully maxed out. First at 1920×1200 resolution:
World in Conflict at 1680×1050:
This is getting repetitious. The midrange video cards struggle, with the older-architecture 8800 GTX leading that pack, and our HD 2900 XT badly trails the 9800 GT and GTS 250, in that order. Again the HD 4890 beats the HD 4870 but is still insufficient to run maxed out even at 1680×1050, while the GTX 280 clearly rules the single-GPU video cards. Only the GTX 280 and the multi-GPU configurations have minimum frame rates that are playable without lowering many details or filtering options.
X3-Terran Conflict
X3: Terran Conflict (X3:TC) is a space trading and combat simulator from Egosoft, the most recent of their X-series of computer games, and it comes with another beautiful stand-alone benchmark that runs multiple tests and will really strain a lot of video cards. X3:TC is a standalone expansion of X3: Reunion, based in the same universe and on the same engine. It complements the story of the previous games in the X-Universe and continues the events after the end of X3: Reunion.
Compared to Reunion, Terran Conflict features a larger universe, more ships, and of course, new missions. The X-Universe is huge. The Terran faction was added with their own set of technology including powerful ships and stations. Many new weapons systems were developed for the expansion and it has generally received good reviews. It has a rather steep learning curve.
First at 1920×1200:
X3:Terran Conflict at 1680×1050:
And now at 1440×900:
There is no reason to be dissatisfied with any configuration tested, except at the minimums, which vary only between 15 and 21 FPS across all of our tested configurations. TriFire leads, but not by much, and in some cases the HD 4890s in CrossFire are the better performers.
Enemy Territory: Quake Wars
Enemy Territory: Quake Wars is an objective-driven, class-based first person shooter set in the Quake universe. It was developed by id Software and Splash Damage for Windows and published by Activision. Quake Wars pits the combined human armies of the Global Defense Force against the technologically superior Strogg, an alien race who has come to Earth to use humans for spare parts and food. It allows you to play a part, probably best as an online multiplayer experience, in the battles waged around the world in mankind’s desperate war to survive.
Quake Wars is an OpenGL game based on id’s Doom3 game engine with the addition of their MegaTexture technology. It also supports some of the latest 3D effects seen in today’s games, including soft particles, although it is somewhat dated and less demanding on video cards than many DX10 games. id’s MegaTexture technology is designed to provide very large maps without having to reuse the same textures over and over again. For our benchmark we chose the flyby, Salvage Demo. It is one of the most graphically demanding of all the flybys and it is very repeatable and reliable in its results. It is fairly close to what you will experience in-game. All of our settings are set to ‘maximum’ and we also apply 4xAA/16xAF in game.
First we test at 1920×1200 resolution:
Salvage Demo fly-by at 1680×1050 resolution:
Here we see serious artifacting issues with all of our multi-GPU configurations. It must be a driver issue, as it was not present with earlier Catalyst drivers. Still, all of these video cards, including the HD 2900 XT at 1440×900, have no trouble handling this game fully maxed out.
F.E.A.R.
F.E.A.R. (First Encounter Assault Recon) is a DX9c game by Monolith Productions, originally released in October 2005 by Vivendi Universal. Two expansions followed, the latest being Perseus Mandate, released in 2007. Although the game engine is aging a bit, it still has some of the most spectacular effects of any game. F.E.A.R. showcases a powerful particle system, complete with sparks and smoke for collisions, as well as bullet marks and other effects including “soft shadows”. This is highlighted by the built-in performance test, although it was never updated. This performance test will tell you how F.E.A.R. will run, but both of its expansions are progressively more demanding on your PC’s graphics and will run slower than the demo. We always run at least two sets of tests with all in-game features at ‘maximum’. F.E.A.R. uses the Jupiter Extended Technology engine from Touchdown Entertainment.
We test with the most demanding settings: fully maxed details with 4xAA/16xAF, and soft shadows ‘off’, as they do not play well with AA. Let’s start again, first at 1920×1200:
We see a couple of driver stumbles and irregularities with CrossFire here, but it is a pretty similar picture to all of our previous testing.
Now at 1680×1050:
Now at 1440×900 with the midrange cards:
In this case, our GTX 280 has the best minimum frame rates. The HD 4870-X2 appears to be having driver issues, with hiccups in its minimums as it did with Catalyst 9-4; even the single-GPU HD 4870 and HD 4890 beat its minimums, although the X2 still excels in the averages and maximums. Clearly, the HD 4890 is solidly faster than the HD 4870, but in practice there is no difference playing with any of our video card configurations, except perhaps the aging HD 2900 XT.
Lost Planet DX10 benchmark
Lost Planet: Extreme Condition is a Capcom port of an Xbox 360 game. It takes place on the icy planet of E.D.N. III, which is filled with monsters, pirates, big guns, and huge bosses. This frozen world highlights high dynamic range (HDR) lighting as the snow-white environment reflects blinding sunlight and DX10 particle systems toss snow and ice all around. The game looks great in both DirectX 9 and 10, and there isn’t much of a difference between the two versions except perhaps the shadows. Unfortunately, the DX10 version doesn’t look that much better when you’re actually playing the game, and it still runs slower than the DX9 version.
We use the in-game performance test from the retail copy of Lost Planet, updated through Steam to the latest version, for our runs. This test isn’t completely scripted, as the creatures act a little differently each time you run it, which requires multiple runs. Lost Planet’s Snow and Cave demos are run continuously by the performance test and blend into each other.
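Because each pass varies a little, one simple way to smooth out the run-to-run differences is to average the numbers from several passes. Here is a minimal sketch of that idea, assuming we jot down the min/avg/max FPS from each pass; it is our own illustration, not a feature of the game’s performance test.

```python
# Minimal sketch: average the min/avg/max FPS noted down from several passes of
# a test whose runs vary slightly (the Lost Planet creatures are not scripted).
def average_runs(runs):
    """runs: a list of (min_fps, avg_fps, max_fps) tuples from repeated passes."""
    n = len(runs)
    mins, avgs, maxs = zip(*runs)
    return (sum(mins) / n, sum(avgs) / n, sum(maxs) / n)

# Hypothetical numbers from three passes of the Snow test:
snow_passes = [(18.0, 31.2, 55.4), (17.5, 30.8, 54.9), (18.3, 31.6, 56.0)]
print("Snow, averaged:", average_runs(snow_passes))
```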
Here are our benchmark results from the more demanding test, Snow. All settings are fully maxed out in-game, including 4xAA/16xAF.
First at 1920×1200 resolution:
TriFire rules the multi-GPU category, beating out HD 4890 CrossFire and our HD 4870-X2. However, all of our single-GPU video cards have trouble with the minimum frame rates and are unsatisfactory even at 1680×1050, except for the GTX 280, which makes a very strong showing. Again, the HD 4890 is a nice improvement over the HD 4870, but not enough to make much difference at the bottom.
Unreal Tournament 3
Unreal Tournament 3 (UT3) is the fourth game in the Unreal Tournament series. UT3 is a first-person shooter and online multiplayer video game by Epic Games. Unreal Tournament 3 provides a good balance between image quality and performance, rendering complex scenes well even on lower-end PCs. Of course, on high-end graphics cards you can really turn up the detail. UT3 is primarily an online multiplayer title offering several game modes and it also includes an offline single-player game with a campaign.
For our tests, we used the very latest Unreal Tournament 3 game patch, released after the ‘Titan’ pack. The game doesn’t have a built-in benchmarking tool, so we used FRAPS and did a fly-by of a chosen level. Note that the performance numbers reported this way are a bit higher than in-game performance. The map we use is called “Containment” and it is one of the most demanding of the fly-bys. Our tests were run at resolutions of 1920×1200 and 1680×1050 with UT3’s in-game graphics options set to their maximum values.
One drawback of the UT3 engine’s design is that there is no built-in support for anti-aliasing, so we forced 4xAA in each vendor’s control panel. We record a demo in the game and a set number of frames are saved to a file for playback. When playing back the demo, the game engine renders the frames as quickly as possible, which is why you will often see it play back more quickly than you would actually play the game.
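FRAPS itself only captures raw timing data, so the min/average/max figures have to be worked out from its log. Below is a hedged sketch of that calculation; it assumes a CSV with a header row and one cumulative millisecond timestamp per frame, which is an assumption about the log layout that you would adjust to whatever your capture tool actually writes.

```python
# Sketch: derive min/avg/max FPS from a frame-time log such as the CSV a tool
# like FRAPS can record. We assume a header row plus one cumulative timestamp
# in milliseconds per frame; adjust the column handling to your tool's format.
import csv

def fps_stats(path):
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)                                  # skip the header row (assumed)
        times_ms = [float(row[1]) for row in reader]  # cumulative ms per frame (assumed column)
    deltas = [b - a for a, b in zip(times_ms, times_ms[1:]) if b > a]
    per_frame_fps = [1000.0 / d for d in deltas]      # instantaneous FPS per frame
    average = len(deltas) / (sum(deltas) / 1000.0)    # frames divided by elapsed seconds
    return min(per_frame_fps), average, max(per_frame_fps)

# Hypothetical usage for a capture of our Containment fly-by:
# print(fps_stats("ut3_containment_frametimes.csv"))
```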
Containment Demo at 1920×1200:
There is absolutely no problem playing this game fully maxed out with any of our video cards at our chosen resolutions. We note that CrossFire doesn’t scale well except for the maximum frame rates.
Call Of Duty 4: Modern Warfare
Call of Duty 4: Modern Warfare (CoD4) is a first person shooter running on a custom engine. The graphics are nice, though the engine is somewhat dated compared to others, and it runs well on modern PCs. It is the first CoD installment to take place in a modern setting instead of World War II. It differs from the previous Call of Duty games by having a more film-like plot that intermixes story lines from two perspectives: that of a USMC sergeant and a British SAS sergeant. There is also a variety of short missions where players control other characters in flashback sequences to advance the story. Call of Duty 4’s move to modern warfare introduced a variety of modern conventional weapons and technologies, including plastic explosives.
There are currently 20 multiplayer maps in CoD4. It is very popular and there is a new expansion for it. For multiplayer, it includes five preset classes and introduces the Perks system. Perks are special abilities which allow users to further customize their character to suit their personal style. Our timedemo benchmark was created by ABT’s own Senior Editor and lead reviewer, BFG10K. It is very accurate and totally repeatable.
And now at 1680×1050:
We see results similar to Unreal Tournament 3. This popular multiplayer game is very playable even on midrange graphics cards from the last generation if you are willing to make some detail sacrifices. It seems strange that the GTS 250 is much faster than the 9800 GT, and it could be a driver issue with our configuration, but we see the GTS 250 trade blows with the 8800 GTX.
Half-Life 2: Lost Coast
Half-Life 2 is still a popular game and it is the oldest game we review in this series. Half-Life 2: Lost Coast is an additional level for this 2004 game, released in October 2005 as a free download to all purchasers of Half-Life 2. Lost Coast was developed as a playable tech demo intended to showcase the newly added high dynamic range (HDR) lighting features of the Source engine. A flyby of this level is played during the HL2 video stress test and it is very repeatable. All in-game settings are maxed out, including 4xAA/16xAF. We had issues with the Source engine stopping responding on our Nvidia partition that we did not resolve during our testing of the GTX 280 and GTS 250, so there are no results for those two video cards.
Although all of our configurations breeze through this benchmark, the HD
Conclusion
It is interesting to return to an older review and update it many months later. We added four video cards that helped to fill in the picture from our first GPU Shootout: last generation’s top cards can barely manage maxed-out DX10 gaming at 1440×900, never mind the 8800 GTS, 9800 GTS, HD 2900 and HD 3800 series. We also noted that the 8800 GTX is still very capable as a midrange video card; it is often able to trade blows with, and even beat, the later-released 9800 GT and GTS 250.
Since the HD 4890 is now often priced below $200, there is little reason to hold back on upgrading from an older series card for today’s games. Two HD 4890s paired in CrossFire offer a great value and even the older series HD 4870-X2 has always been able to easily top Nvidia’s GTX 280 – now eclipsed by the GTX 285.
In the last review, with Catalyst 9-4, we examined CrossFire-X scaling. Two “mismatched” cards in ‘FrankenFire’ do not necessarily default to the slower card’s speeds; the slower card generally contributes to overall performance, with the load balanced between them as well as the drivers allow for that particular game. Our Diamond HD 4890-XOC has proved worthy of our “Editor’s Award” as it shows excellent scaling over the stock HIS HD 4890.
Our results are very consistent and we carried on where our Diamond HD 4890-XOC Preview, Part 1 and Part 2, left off. We saw Diamond’s overclocked HD 4890-XOC come on very strongly to replace the HD 4870 as AMD’s top single-GPU card. We saw the HD 4890-XOC even trade blows with Nvidia’s now second-fastest single GPU, the GTX 280. We are most impressed and highly recommend Diamond’s HD 4890-XOC as great bang for the buck!
If you are going to pick the best bang for the buck in a multi-GPU configuration, overclocked HD 4890 CrossFire should be your first choice. If you already have an HD 4870, and especially if it is overclockable, you might consider pairing it with the fastest HD 4890 you can find, as this mismatched CrossFire-X pair comes reasonably close in a lot of the benches to “true” HD 4890 CrossFire and is faster than an HD 4870-X2 or HD 4870 CrossFire.
Our “Shoot-out Series” has been a steady progression examining Intel’s Penryn platform, and we have been upgrading it as necessary to maximize our PC’s gaming performance and to chart those improvements for you. Part IV, The Summary, showed this by comparing drivers all the way back to August 2008, when we first began benchmarking, and focusing on the progress each vendor has made since then.
In Part III, Big GPU Shootout: PCIe 1.0 vs. PCIe 2.0, we focused especially on the motherboard’s effect on video card performance, using the extremes of P35 PCIe 1.0 and X48 PCIe 2.0. We saw how limiting the older motherboard’s PCIe bandwidth can be in certain situations, and so we upgraded to X48.
Part II – The Big GPU Shoot-Out – Setting New Benches – demonstrated the need for overclocking our E8600 CPU from its stock 3.33 GHz to 4.0 GHz to take full advantage of our new video cards.
In Part I, The Big GPU Shootout: Upgrade Now or Wait?, we examined the performance of five video cards. We realized that the last generation’s video cards are not sufficient for today’s maxed-out DX10 gaming. Since our Q9550S review article, we now use the Core 2 Quad Q9550S and recommend it highly! We also started to bench with CrossFireX-3 in Part I, which ran on the fairly immature Catalyst 8-8 drivers at the time, and we have continued to chart its progress until now.
That wraps up this second part of our series, “Big GPU Shootout – Revisited”; we plan to update it in the future with many more video cards and game benchmarks. Stay tuned, as we think we will have some very interesting articles for you to read as you plan your own coming upgrades.
Our next project is to build a new “value” quad core PC with Cooler Master and AMD and to compare it with our current Intel Penryn platform featuring the Q9550S at 3.1 and 4.0 GHz. You can expect a preview of our new value series of articles in a few days, and there is already a topic in our forums: New AMD Build – Unlocking and Overclocking Phenom II 550 X-2 to X4 – 4 GHz or Bust!! Feel free to participate and to make requests for what you would like to see us focus on in this next series.
In the meantime, feel free to comment below, ask questions, or have a detailed discussion in our ABT forum. We want you to join us and Live in Our World. It is expanding fast and we think you will like what you discover here.
Mark Poppin
ABT editor