nVidia GTX260+ Bottleneck Investigation
Commentary
Starting with Far Cry 2, the built-in benchmark shows smaller clock-to-clock differences than the other titles, especially at 0xAA. This may be because the benchmark is CPU limited, and/or because it lacks the precision to register a 7% underclock as clearly as the other games do. Nevertheless, memory bandwidth appears to be the most important factor in this benchmark.
Moving on to Clive Barker’s Jericho, we notice a large performance drop with 0xAA when the shader clock is lowered, which then shifts to the memory clock when 4xAA is enabled. Clearly this title is shader bound without AA, and memory bandwidth bound with AA.
In Stalker Clear Sky we see the core clock being the most important without AA, and the memory clock becoming slightly more important when 4xAA is enabled.
And finally, in Unreal Tournament 3 both the core and memory clocks take larger performance hits when moving from 0xAA to 4xAA, and at both settings the memory clock is the most important factor in overall performance.
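If you want to run this kind of bottleneck check on your own card, the logic boils down to a simple comparison: underclock one domain at a time by roughly 7%, re-run the benchmark, and see which domain costs the most frames. The sketch below is only a hypothetical illustration of that comparison; the FPS figures in it are made up for demonstration and are not our measured results.

```python
# Hypothetical illustration of the per-game method used in this article:
# underclock one domain (core, shader, memory) by ~7%, re-run the benchmark,
# and treat the domain with the largest FPS drop as the likely bottleneck.
# All FPS numbers below are placeholders, not ABT's data.

baseline_fps = 52.0  # stock-clocked result for one game/AA setting (hypothetical)

underclocked_fps = {
    "core":   50.9,   # FPS with ~7% core underclock (hypothetical)
    "shader": 51.6,   # FPS with ~7% shader underclock (hypothetical)
    "memory": 49.8,   # FPS with ~7% memory underclock (hypothetical)
}

# Percentage performance lost relative to the stock result
drops = {
    domain: (baseline_fps - fps) / baseline_fps * 100
    for domain, fps in underclocked_fps.items()
}

for domain, drop in sorted(drops.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{domain:>6}: {drop:4.1f}% slower")

likely_bottleneck = max(drops, key=drops.get)
print(f"Largest drop -> performance is most likely {likely_bottleneck} bound")
```

With these example numbers the memory underclock costs the most frames, which is the same kind of signal that points to a memory bandwidth bottleneck in the games above.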
Conclusion
As far as overall results go, the 8800 Ultra suffered more than double the performance loss from the core clock compared to the other clocks. The GTX260+, however, is reasonably even across all three clocks, each showing an average performance drop of between 2% and 3%. This is likely because the GTX260+ is a more balanced design than the 8800 Ultra, and possibly also because a different selection of games was used.
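The "balanced versus core-bound" distinction can be expressed with the same numbers. The sketch below, again using placeholder figures rather than our measured results, averages the per-clock drops across games and looks at the spread between the most and least sensitive clock.

```python
# A minimal sketch of how the "balanced vs. core-bound" conclusion can be
# quantified: average the per-clock performance drops across all games/settings
# and look at the spread. All numbers are placeholders, not the article's data.

per_game_drops = {                      # % FPS lost per ~7% underclock (hypothetical)
    "core":   [2.5, 3.1, 2.2, 2.8],
    "shader": [2.0, 2.9, 1.8, 2.4],
    "memory": [2.9, 2.6, 2.1, 3.3],
}

averages = {clock: sum(v) / len(v) for clock, v in per_game_drops.items()}
spread = max(averages.values()) - min(averages.values())

for clock, avg in averages.items():
    print(f"{clock:>6}: {avg:.1f}% average drop")

# A small spread (as with the GTX260+'s ~2-3% across all three clocks) suggests
# a balanced design; a card that loses far more from one clock (like the
# 8800 Ultra's core) points to a single dominant bottleneck.
print(f"Spread between most and least sensitive clock: {spread:.1f} points")
```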
So while it was a given to increase the core as high as possible if you were overclocking the 8800 Ultra, things aren’t as clear-cut with the GTX260+. Some experimentation may be necessary to get optimum performance out of the games you play at the settings you use.