Big GPU Shootout, Part III: PCIe 1.0 vs. PCIe 2.0
Call of Juarez
Call of Juarez, Techland's fast-paced Wild West epic adventure shooter released in June 2007, gave us the first ever DX10 benchmark. The game is loosely based on the Spaghetti Westerns that became popular in the early 1970s. Call of Juarez features Techland's Chrome Engine using Shader Model 4 with DirectX 10, so Windows Vista is mandatory. The benchmark is not built into Call of Juarez itself; it is a stand-alone demo that runs a simple fly-through of a level built to showcase the game's new DX10 effects. It offers great repeatability and is a good stress test for the DX10 features of today's graphics cards, although it is not quite the same as actual gameplay, since the game logic and AI are stripped out of the demo. Still, it is very useful for comparing video card performance.
Running the Call of Juarez benchmark is easy: you are presented with a simple menu to choose the resolution, anti-aliasing, and two shadow quality options. We set shadow quality to "high" and the shadow map resolution to its maximum, 2048×2048. At the end, the demo presents you with the minimum, maximum, and average frame rates, along with the option to quit or run the benchmark again. We always ran the benchmark at least a second time and recorded that generally higher score, as sketched below.
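For readers who want to mirror that bookkeeping, here is a minimal Python sketch of the "run it at least twice, keep the higher score" procedure. The demo only reports its numbers on screen, so the values are typed in by hand after each run; the prompts and names below are our own illustration, not anything the benchmark itself provides.

    # Minimal sketch: record each run's results by hand and keep the
    # run with the higher average FPS. Nothing here drives the demo
    # itself; it only does the bookkeeping described above.
    def read_run(n):
        """Prompt for the three numbers the demo reports after run n."""
        print("Run %d:" % n)
        mn  = float(input("  minimum FPS: "))
        avg = float(input("  average FPS: "))
        mx  = float(input("  maximum FPS: "))
        return {"min": mn, "avg": avg, "max": mx}

    runs = [read_run(n) for n in (1, 2)]      # at least two runs
    best = max(runs, key=lambda r: r["avg"])  # keep the higher average
    print("Recorded: min %(min).1f / avg %(avg).1f / max %(max).1f" % best)

Ranking runs by the average frame rate matches how we record scores; you could just as easily keep the run with the better minimum if that is the number you care about.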
Call of Juarez DX10 benchmarks
Interesting. The GTX 280 shows mixed results on our motherboards, with more gain at the lower of the two resolutions we chose. Again, the 512 MB 4870 shows little difference between the motherboards, with the edge going to the X48. This time, the 512 MB 4870 beats the 1 GB version consistently! The 4870 X2 gains +7 FPS in maximum frame rate at 1920×1200, while CrossFireX-3 gains about +5 FPS on the PCIe 2.0 motherboard over the PCIe 1.0 P35. It is doubtful this makes a practical, playable difference in the minimum frame rates, as the 4870 X2 only picks up a few tenths of a frame at 1920×1200, but CrossFireX-3 gains more than +3 FPS, from 40 to 43.4. At the lower 16×10 resolution, perhaps there is more reason to upgrade your P35 motherboard, as we do see a bit more performance with the X48; but not for CrossFireX-3 with a 1 GB card over a 512 MB card as the second card in Call of Juarez.
Curious, why’d you set Catalyst A.I. to “Advanced”?
How about a few links to explanations of Catalyst AI and what “advanced” really does? Here is an old article on it:
http://www.hardocp.com/article.html?art=NjY2LDI=
Here is the tweak guide which supports my own research:
http://www.tweakguides.com/ATICAT_7.html
“Catalyst A.I. allows users to determine the level of ‘optimizations’ the drivers enable in graphics applications. These optimizations are graphics ‘short cuts’ which the Catalyst A.I. calculates to attempt to improve the performance of 3D games without any noticeable reduction in image quality. In the past there has been a great deal of controversy about ‘hidden optimizations’, where both Nvidia and ATI were accused of cutting corners, reducing image quality in subtle ways by reducing image precision for example, simply to get higher scores in synthetic benchmarks like 3DMark. In response to this, both ATI and Nvidia have made the process transparent to a great extent. You can select whether you want to enable or disable Catalyst A.I. for a further potential performance boost in return for possibly a slight reduction in image quality in some cases. If Catalyst AI is enabled, you can also choose the aggressiveness of such optimizations, either Standard or Advanced on the slider. The Advanced setting ensures maximum performance, and usually results in no problems or any noticeable image quality reduction. If on the other hand you want to always ensure the highest possible image quality at all costs, disable Catalyst A.I. (tick the ‘Disable Catalyst A.I.’ box). I recommend leaving Catalyst A.I enabled unless you experience problems. ATI have made it clear that many application-specific optimizations for recent games such as Oblivion are dependent on Catalyst AI being enabled.
Note: As of the 6.7 Catalysts, Crossfire users should set Catalyst A.I. to Advanced to force Alternate Frame Rendering (AFR) mode in all Direct3D games for optimal performance. Once again, Catalyst A.I. should only be disabled for troubleshooting purposes, such as if you notice image corruption in particular games”
In other words, you can choose the aggressiveness of the optimizations, either "Standard" or "Advanced". The Advanced setting ensures maximum performance, as for benchmarking games, usually with no noticeable image quality reduction. However, if you are doing IQ comparisons as BFG10K did, and want to guarantee the very highest image quality, then disable Catalyst A.I. [but not for CrossFire; set it to "Standard" instead]. I have always recommended leaving Catalyst A.I. enabled unless you experience glitches in games.
You have to realize that Catalyst A.I. is not necessarily supposed to give you a boost in every single game. It tries to apply optimizations where possible, but many times these are either not possible with a particular game, or the settings you have chosen in the game are too low for them to make any noticeable impact.
That is why I recommend leaving it on "Advanced": you get a possible performance boost, and if not, you lose nothing. You can always set it to Standard, or disable it, if you feel your image quality is being degraded.
Hope that explains it.
Very interesting. I'll definitely be checking your site on a regular basis now.