This article focuses on the image quality enhancements introduced by nVidia’s GeForce 400 series (GF100/Fermi). Since anisotropic filtering (AF) quality has remained unchanged since the GeForce 8000 series, the analysis concentrates on antialiasing (AA), and all testing uses a single GTX480.
For some background information, I’d also suggest reading the similar articles I’ve written in the past:
- ATi 5000 Series Image Quality Analysis.
- nVidia GT200 Series AA Investigation.
- ATi 4000 Series AA Investigation.
- Image Quality Comparison.
The test system and settings:
- Intel Core i7-870 (3.2 GHz, turbo on, HT off).
- 4 GB DDR3-1333 RAM (2×2 GB, dual-channel).
- Gigabyte GA-P55-UD3 (Intel P55 chipset, F6 BIOS).
- nVidia GeForce GTX480 (1.5 GB, reference clocks).
- Creative X-Fi XtremeMusic.
- 30” HP LP3065.
- Windows 7 (64 bit).
- nVidia driver 258.96; texture filtering set to High Quality, all filtering optimizations off, negative LOD bias clamped.
- DirectX June 2010.
- All games patched to their latest versions.
- 16xAF forced in the driver, vsync forced off in the driver.
- AA forced through the driver or enabled in-game, whichever produces the better result in that title.
- Highest quality sound (stereo) used in all games.
- All results show an average framerate.
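The article doesn’t spell out how the average framerate is computed, but the standard definition is total frames divided by total elapsed time, rather than a mean of instantaneous per-frame FPS values (which would overweight fast frames). A minimal sketch of that calculation from hypothetical per-frame render times:

```python
def average_fps(frame_times_ms):
    """Average framerate from per-frame render times in milliseconds.

    Uses total frames / total time, the usual definition of average
    framerate. Averaging each frame's instantaneous FPS instead would
    inflate the result whenever frame times vary.
    """
    total_seconds = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_seconds

# Hypothetical capture: 100 frames alternating between 10 ms and 30 ms.
times = [10.0, 30.0] * 50
print(average_fps(times))  # 50.0
```

Note that a naive mean of instantaneous FPS for the same capture (half the frames at 100 fps, half at 33.3 fps) would report about 66.7 fps, which is why the frames-over-time definition matters.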