nVidia GeForce GT200 Series Anti-Aliasing Investigation
Introduction
This is part three of a trilogy of articles investigating current AA techniques on modern hardware. In part one I compared the image quality of ATi’s and nVidia’s latest hardware. In part two ATi’s 4000 series was covered in more detail. Now, with part three, I’ll focus on nVidia’s GT200 series, whose anti-aliasing algorithms are identical to those of the 8000 and 9000 series. For this article I’ll be using my GTX285.
While the purpose of this article is to compare nVidia’s AA modes to each other, great care has been taken to make the article’s format compatible with the ATi 4000 series investigation. That means you can compare the two articles if you wish, including the screenshots.
Hardware
- Intel Core 2 Duo E6850 (reference 3 GHz clock).
- nVidia GeForce GTX285 (1 GB, reference clocks).
- 4 GB DDR2-800 RAM (4×1 GB, dual-channel).
- Gigabyte GA-G33M-DS2R (Intel G33 chipset, F7 BIOS).
- Creative X-Fi XtremeMusic.
- 30” HP LP3065 (maximum resolution 2560×1600).
Software
- Windows XP 32-bit, SP3.
- nVidia driver 186.18, high quality filtering, all optimizations off, LOD clamp enabled.
- DirectX August 2009.
- All games patched to their latest versions.
Settings
- 16xAF forced in the driver; vsync forced off in the driver.
- AA forced either through the driver or enabled in-game, whichever works better (see the first sketch after this list for what the in-game path involves).
- Highest quality sound (stereo) used in all games.
- All results show an average framerate (computed as in the second sketch after this list).
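For context on the “enabled in-game” path above: here is a minimal sketch, assuming a fullscreen Direct3D 9 title (which matches this Windows XP setup), of how a game typically requests MSAA itself rather than having the driver force it. The helper name try_enable_4x_msaa and the fixed 4x mode are illustrative, not taken from any of the tested games.

    #include <windows.h>
    #include <d3d9.h>

    // Ask the runtime whether 4x MSAA is supported for the chosen back
    // buffer format, then request it in the presentation parameters.
    // Driver-forced AA bypasses this check and overrides the game's choice.
    bool try_enable_4x_msaa(IDirect3D9* d3d, D3DPRESENT_PARAMETERS& pp)
    {
        DWORD quality_levels = 0;
        HRESULT hr = d3d->CheckDeviceMultiSampleType(
            D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
            D3DFMT_X8R8G8B8, FALSE /* fullscreen */,
            D3DMULTISAMPLE_4_SAMPLES, &quality_levels);
        if (FAILED(hr))
            return false;                           // mode not supported: no AA

        pp.MultiSampleType    = D3DMULTISAMPLE_4_SAMPLES;
        pp.MultiSampleQuality = quality_levels - 1;    // highest reported quality
        pp.SwapEffect         = D3DSWAPEFFECT_DISCARD; // required for MSAA
        return true;
    }

When a game offers no such option, forcing AA in the driver’s control panel is the only alternative, which is why the settings above allow either route.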
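On the last point: an average framerate is conventionally computed as total frames divided by total elapsed time, not as the mean of per-frame instantaneous FPS values (which would overweight fast frames). A small sketch, assuming per-frame times captured by a FRAPS-style tool; the function name and sample data are mine:

    #include <cstdio>
    #include <vector>

    // Average FPS over a run: total frames divided by total elapsed time.
    double average_fps(const std::vector<double>& frame_times_ms)
    {
        double total_ms = 0.0;
        for (double t : frame_times_ms)
            total_ms += t;
        return 1000.0 * frame_times_ms.size() / total_ms;
    }

    int main()
    {
        // Hypothetical captured frame times in milliseconds.
        std::vector<double> frame_times_ms = { 16.7, 33.3, 16.7, 16.7 };
        std::printf("average: %.1f fps\n", average_fps(frame_times_ms));
        return 0;
    }

The distinction matters most in runs where the framerate swings widely between scenes.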
Comments
Reader comment: “I run 2560×1600 with two GTX280s in SLI, and the most noticeable change after upgrading from Vista 64-bit to Windows 7 64-bit was how much smoother every game ran with AA (all set in-game, no need for the nVidia control panel). Under Vista, games like Crysis and Company of Heroes at 2560×1600 with everything maxed and 8xCSAA wouldn’t even render a frame; after the upgrade, both ran at 16x without problems (Crysis at a low framerate, obviously). For SLI with GTX285s or 280s, Windows 7 is the way to go.”
Reply: Yes, Windows 7 handles SLI/CrossFire better than Vista does.