Big GPU-Shootout, Part III: PCIe 1.0 vs. PCIe 2.0
INTRODUCTION:
Here is the third installment of our “Big GPU-Shootout” series. Part I covered Catalyst 8.8 vs. GeForce 177.41 on Intel’s P35 motherboard platform as we examined the performance of five video cards. The new cards we tested were the HD4870-512MB, HD4870X2-2GB, and GTX280, while the 8800GTX and 2900XT represented the top and mid-range cards of the last generation. The resolutions we test at for these reviews are fairly demanding for any PC – 1920×1200 and 1680×1050. We used a combination of ten benchmarks, including PC games’ built-in performance benchmarks, custom timedemos, and synthetic tests, and we came to some interesting conclusions. We found that last generation’s video cards are not sufficient for today’s maxed-out Vista DX10 gaming at 1920×1200, or even at 1680×1050 with 4xAA/16xAF, and that a modern gamer wanting to play the latest DX10 PC games needs to upgrade to the current generation of video cards. For the first article of our series, we compared an E4300 overclocked to 3.33 GHz with an E8600 at its stock 3.33 GHz and found the older CPU lacking. We continued with the E8600 for the rest of the series, later overclocking it to nearly 4.0 GHz for the next three reviews, and we always use the DX10 pathway in Vista 32 whenever possible. We also began benchmarking CrossFireX-3 in Part I, which ran on fairly immature drivers at the time, and we will continue to chart its progress.
Part II – Catalyst 8.9 vs. GeForce 178.13 – was also tested on our P35 motherboard (PCIe 1.0, x16+x4 CrossFire) and demonstrated the need to overclock our E8600 CPU from its stock 3.33 GHz to nearly 4.0 GHz to take full advantage of our new video cards. We also set new benchmark baselines with these drivers, which we continue to use in Part III, this article, to compare the performance of two different motherboard platforms. For Part II, we added a couple more games and refined our testing slightly. We also noted that the ranking of the new video cards remained the same – HD4870X2, GTX280, and HD4870 – while CrossFireX-3 benefited from drivers more mature than the Catalyst 8.8 set.
For Part III, this review – PCIe 1.0 vs. PCIe 2.0 – we are again using Catalyst 8.9 vs. GeForce 178.13, the same drivers as in Part II. This time, however, we are specifically comparing single video card and CrossFireX-3 performance on our new X48 motherboard, with its double-the-bandwidth PCIe 2.0 slots, against our old P35 motherboard’s PCI Express (PCIe) 1.0 specification. We look at the potential performance increase from upgrading a P35 motherboard’s PCIe 1.0 bandwidth to an X48 motherboard’s PCIe 2.0 specification with our top three video cards: HD4870, GTX280, and HD4870X2. We will also note the possible benefits of the X48 motherboard’s full x16+x16 CrossFire slots over the P35 motherboard’s x16+x4 CrossFire slots.
This time, there are many comparisons to be made between these two motherboard platforms as we look for reasons a PC gamer would – or would not – upgrade his motherboard platform. AMD’s CrossFire and Nvidia’s SLI technology are a solid way to transform an ordinary gaming machine into a gamer’s powerhouse. However, with several Intel-based platforms supporting CrossFire across different PCI Express lane configurations, many gamers are left wondering whether their motherboard can provide enough bandwidth to realize the full potential of CrossFire – or even of a single fast GPU like the GTX280. So we are going to take the two extremes – the P35’s PCIe 1.0 with x16+x4 CrossFire slots vs. the X48’s PCIe 2.0 with x16+x16 CrossFire slots – and examine their performance differences.
Since Intel has halted Core 2 chipset development in favor of Core i7, now is a good time to analyze how CrossFire scales on these two motherboard chipsets, as a guide for those looking to upgrade for the best bang for the buck. We assume that owners of high-end P35-based systems have probably already purchased Intel’s fastest Core 2 Duo processor and four gigabytes of high-speed memory, and so still have a decent gaming PC. In this review, we will attempt to determine whether P35 motherboards are suitable for CrossFire, SLI, or single-card HD4870, HD4870X2, or GTX280 upgrades.
P35 Express
Intel released its P35 Express chipset in mid-2007, bringing support for FSB-1333 processors and DDR3 SDRAM while leaving DDR2 support intact. Its spec is PCIe 1.0/1.1, and we chose a Gigabyte P35-DS3P motherboard, which features PCIe 1.0, x16+x4 CrossFire PCI Express lanes, and DDR2 memory.
X48 Express
The X48 and X38 chipsets are basically the same; both have built-in support for FSB-1600, but the X48 is the one Intel officially approved for 1,600 MHz FSB speeds. The main features Intel introduced with the X38/X48 chipsets were aimed at high-end graphics. First was full support for AMD’s CrossFire: sixteen additional lanes provide full x16 transfer mode to both cards, instead of the x16+x4 split of P35 motherboards or the halved x8+x8 CrossFire lanes of the later P45 chipset. In addition to the added pathways, PCI Express 2.0 transfer rates doubled peak slot bandwidth. We purchased a DDR2-supporting, X48-based ASUS P5E Deluxe for our tests to compare more directly with our older DDR2-based Gigabyte P35-DS3P motherboard. Overclocked to nearly 4.0 GHz on both platforms, our Core 2 Duo E8600 required good RAM to achieve optimum performance. Unfortunately, our older P35 chipset couldn’t run the DDR2-1000 setting without instability, so we settled on identical RAM speeds and latencies – DDR2-800 – for both motherboards to keep the comparison apples-to-apples.
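To put those slot differences in perspective, here is a minimal back-of-the-envelope sketch – our own illustration, not part of the review’s test suite – of theoretical peak per-direction PCIe bandwidth, assuming the standard per-lane rates of 2.5 GT/s for PCIe 1.0 and 5.0 GT/s for PCIe 2.0 with 8b/10b encoding:

```python
# Theoretical peak PCIe bandwidth per slot, one direction.
# PCIe 1.0 and 2.0 both use 8b/10b encoding: 10 bits on the wire
# carry 8 bits of data, so usable bytes/s = GT/s * lanes * 0.8 / 8.

GT_PER_LANE = {"PCIe 1.0": 2.5, "PCIe 2.0": 5.0}  # gigatransfers/s per lane

def slot_bandwidth_gbps(gen, lanes):
    """Peak usable bandwidth in GB/s for one direction of a slot."""
    return GT_PER_LANE[gen] * lanes * 0.8 / 8

configs = [
    ("P35 primary slot  ", "PCIe 1.0", 16),  # -> 4.0 GB/s
    ("P35 secondary slot", "PCIe 1.0", 4),   # -> 1.0 GB/s
    ("X48 either slot   ", "PCIe 2.0", 16),  # -> 8.0 GB/s
]

for name, gen, lanes in configs:
    print(f"{name} {gen} x{lanes:<2} = {slot_bandwidth_gbps(gen, lanes):.1f} GB/s")
```

In other words, the X48 offers twice the theoretical bandwidth of the P35 on the primary slot, and eight times the bandwidth on the second CrossFire slot – which is exactly where we would expect any platform differences to show up.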
Curious, why’d you set Catalyst A.I. to “Advanced”?
How about a few links to explanations of Catalyst AI and what “advanced” really does? Here is an old article on it:
http://www.hardocp.com/article.html?art=NjY2LDI=
Here is the tweak guide which supports my own research:
http://www.tweakguides.com/ATICAT_7.html
“Catalyst A.I. allows users to determine the level of ‘optimizations’ the drivers enable in graphics applications. These optimizations are graphics ‘short cuts’ which the Catalyst A.I. calculates to attempt to improve the performance of 3D games without any noticeable reduction in image quality. In the past there has been a great deal of controversy about ‘hidden optimizations’, where both Nvidia and ATI were accused of cutting corners, reducing image quality in subtle ways by reducing image precision for example, simply to get higher scores in synthetic benchmarks like 3DMark. In response to this, both ATI and Nvidia have made the process transparent to a great extent. You can select whether you want to enable or disable Catalyst A.I. for a further potential performance boost in return for possibly a slight reduction in image quality in some cases. If Catalyst AI is enabled, you can also choose the aggressiveness of such optimizations, either Standard or Advanced on the slider. The Advanced setting ensures maximum performance, and usually results in no problems or any noticeable image quality reduction. If on the other hand you want to always ensure the highest possible image quality at all costs, disable Catalyst A.I. (tick the ‘Disable Catalyst A.I.’ box). I recommend leaving Catalyst A.I enabled unless you experience problems. ATI have made it clear that many application-specific optimizations for recent games such as Oblivion are dependent on Catalyst AI being enabled.
Note: As of the 6.7 Catalysts, Crossfire users should set Catalyst A.I. to Advanced to force Alternate Frame Rendering (AFR) mode in all Direct3D games for optimal performance. Once again, Catalyst A.I. should only be disabled for troubleshooting purposes, such as if you notice image corruption in particular games”
In other words, you can choose the aggressiveness of the optimizations: either “Standard” or “Advanced”. The Advanced setting ensures maximum performance – appropriate for benchmarking games – with no noticeable image quality reduction. However, if you are doing IQ comparisons as BFG10K did and want to guarantee the very highest image quality, then disable Catalyst A.I. [but not for CrossFire; set it to “Standard” instead]. I have always recommended leaving Catalyst A.I. enabled unless you experience glitches in games.
Keep in mind that Catalyst A.I. is not necessarily supposed to give you a boost in every single game. It applies optimizations where possible, but often they either aren’t applicable to a particular game, or the settings you’ve chosen in the game are too low for them to make any noticeable impact.
That is why I recommend leaving it on “Advanced”: you get a possible performance boost, and if not, you lose nothing. Or you can set it to Standard or off if you feel your image quality is being degraded.
Hope that explains it.
Very interesting. I’ll definitely be checking your site on a regular basis now.