The GTX 690 Arrives – Exotic Industrial Design takes the Performance Crown!
Nvidia has finally released its first “designer card”: the $999 GeForce GTX 690, based on its brand new 28nm Kepler DX11.1 architecture. This new dual-GPU flagship continues Nvidia’s strategy of offering an exotic card aimed at winning the hearts of the highest end of PC gamers.
The GTX 690 is the culmination of five years of Nvidia’s work on its new DX11.1 Kepler architecture. It is the company’s dual-GPU flagship and the replacement for the GTX 590, which launched at the beginning of 2011. This time, Nvidia is aiming for GTX 680 SLI performance on a single card, very different from the $750 GTX 590, which was considerably slower than GTX 580 SLI.
To properly bring you this review, we are comparing the performance of the GTX 680 and the HD 7970 to see how much faster this new dual-GPU GTX 690 is.
So you will see us pit the reference GTX 690 both stock and overclocked against our PowerColor reference design HD 7970 (above right) and against the stock GTX 680 (above left) using 18 modern games and 3 synthetic benchmarks mostly using 1920×1200 and 2560×1600 resolutions. We are also comparing the performance of our last generation reference dual-GPU video cards, HD 6990 and the GTX 590 (below) as they were – up until today – the fastest video cards of AMD’s and Nvidia’s last 40nm generation.
We shall also test Nvidia’s 3-panel Surround and 3D Vision Surround, now running off of a single GTX 690 at 5760×1080 resolution. Lastly, we also bench 3D Vision 2 and PhysX, ‘on’ versus ‘off’, at the very popular 1920×1080 resolution.
What’s New with Kepler’s GTX 690?
Nvidia’s marketing buzzwords for the GTX 680 launch were, “Faster. Smoother. Richer.” The GTX 690 is also designed for extreme efficiency and high performance.
Faster
The GTX 690’s Kepler architecture is now SMX-based with 2 x 1536 CUDA cores. It promises better geometry and texture processing than Fermi thanks to its improved instruction throughput and redesign. In addition, Nvidia brings “GPU Boost”, a dynamic way to boost clock speeds and maximize performance for each game.
Smoother
New kinds of anti-aliasing – FXAA and TXAA – are now said to compete with MSAA in terms of IQ while not sacrificing performance. And there is a new “Adaptive VSync” that is supposed to reduce tearing and stuttering associated with regular VSync. Great hardware needs great software to support it and Nvidia is also a software company.
Richer
For the first time, it is possible to play games spanning 3 displays in Surround or in 3D Surround off of a single GeForce GPU, the GTX 680. The GTX 690 goes further, with three dual-link DVI connectors (and a mini-DisplayPort for a 4th accessory display) so that no adapters are needed for any DVI-enabled displays. PhysX has also been improved.
How does the GTX 690 compare with its rivals, AMD’s cards?
This evaluation analyzes and compares GTX 690, GTX 590 and GTX 680 performance. We also include the HD 7970 as well as AMD’s fastest card, their dual-GPU HD 6990, and we will announce a performance winner. We will look at the details to see what this new Nvidia Kepler dual-GPU card brings to the table for a thousand dollars. We also believe that we have a good handle on how AMD is going to respond to the GTX 690 launch, and we will share our analysis and insights with you.
The five competing cards
The GTX 680, GTX 590, the GTX 690, the HD 7970, and the HD 6990 are the top cards from Nvidia and AMD of this and of the last generation and we will see where they sit in relation to each other. The HD 6990 and the GTX 590 are dual-GPU cards and were considered the fastest production video cards until today. And it is important to see how much performance increase the GTX 690 has brought over the GTX 590.
Since we do not want any chance of our CPU “bottlenecking” our graphics cards, we are testing all of them using our brand new Intel Core i7-3770K at 4.60GHz, 4 GB of Kingston DDR3 and a Gigabyte Z77 motherboard. This motherboard provides 8x + 8x PCIe 3.0 lanes for CrossFire/SLI, but a single card gets the full 16x PCIe 3.0 bandwidth. A Core i7-3770K at 4.6GHz is sufficient to differentiate high end video cards at high resolutions and high detail settings.
And we found something very surprising – our CPU needed a +200MHz boost to 4.8GHz to differentiate some games at the lower 1920×1200 resolution when we overclocked our GTX 690! However, before we do performance testing, let’s take a look at the GTX 690 and quickly recap its new Kepler DX11.1 architecture and features.
Architecture and Features
We have covered Kepler’s GK104 architecture in a lot of detail previously; you can read our GTX 680 introductory article and its follow-up. The new Kepler architecture builds on Fermi with some important improvements and refinements that we will briefly cover before we get into performance testing.
SMX architecture
As Nvidia’s slide indicates, the new streaming multiprocessor is called SMX, and it emphasizes 2x the performance per Watt of Fermi. The GK104 GPU is organized into four graphics processing clusters, each containing its own raster engine and two SMX units.
Each SMX cluster includes a PolyMorph 2.0 engine with its own tessellation unit, 192 CUDA cores, 16 texture units and its share of cache. Across the whole chip, that adds up to 128 texture units and 32 ROPs.
To sum it all up, a GTX 680 GPU consists of 8 SMX units with 192 CUDA cores each, for 1536 CUDA cores in total; these numbers are doubled for the GTX 690 because two full GTX 680 GPUs are used.
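The shader tally above is simple multiplication, and a quick back-of-the-envelope sketch (our own illustration, using the per-SMX figures Nvidia quotes) makes the doubling explicit:

```python
# Back-of-the-envelope tally of Kepler (GK104) shader resources,
# using the per-SMX figures discussed above.
CUDA_CORES_PER_SMX = 192
SMX_PER_GK104 = 8           # one GTX 680 GPU has 8 SMX units

gtx680_cores = CUDA_CORES_PER_SMX * SMX_PER_GK104   # 1536 cores per GPU
gtx690_cores = 2 * gtx680_cores                     # two full GK104 GPUs

print(gtx680_cores, gtx690_cores)
```

Running this confirms the 1536 cores per GTX 680 GPU, doubled on the GTX 690.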
Nvidia has really improved their memory controller over last generation as there is a 256-bit wide GDDR5 memory interface at 6Gbps declared throughput for each of the two GPUs. An onboard PLX bridge chip provides independent PCI Express 3.0 x16 access to both GPUs for maximum multi-GPU throughput.
The memory subsystem of the GeForce GTX 690 is similar to the GTX 680, consisting of four 64-bit memory controllers (256-bit) with 2GB of GDDR5 memory per GPU (4GB total).
The base clock speed of the GeForce GTX 690 is 915MHz. The typical Boost Clock speed is 1019MHz. The Boost Clock speed is based on the average GeForce GTX 690 card running a wide variety of games and applications. Note that the actual Boost clock will vary from game-to-game depending on actual system conditions. GeForce GTX 690’s memory speed is 6008MHz data rate.
The GeForce GTX 690 reference board measures 11″ in length. Display outputs include three dual-link DVIs, and one mini-DisplayPort connector. Two 8-pin PCIe power connectors are required for its operation.
This is a very brief overview of Kepler architecture as presented to the press at Kepler Editor’s Day in San Francisco a few weeks ago. When we attend Nvidia’s upcoming GPU Technology Conference (GTC) in less than two weeks time, you can expect a lot more details about the architecture.
GPU Boost
GPU Boost was invented by Nvidia to improve efficiency and to raise the GTX 690’s clocks automatically in response to dynamically changing power requirements. Up until now, Nvidia engineers had to select clock speeds based on a specific “worst case” power target, often a benchmark.
Unfortunately, not all apps are equal in their power requirements, and some applications are far more power-hungry than others. That means that games with lower power requirements cannot run at higher core frequencies because clocks are limited by a global power target. With GPU Boost, clocks are adjusted dynamically in real time, with polling every millisecond, so they can be ramped up to meet the power target of each application rather than held back by the most stressful application, which is usually a benchmark, not a game.
As we found with the GTX 680, GPU Boost goes hand-in-hand with overclocking and it delivers additional frequency in addition to the clocks set by the end user. GPU Boost continues to work while overclocking to the maximum allowed by the ever-changing power envelope.
Moving the voltage higher also moves the frequency and boost higher. In practice, if you monitor the frequencies, they constantly change up and down.
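The behavior described above can be modeled as a simple control loop. The following sketch is our own illustration, not Nvidia’s firmware; the power target, boost ceiling and 13MHz step size are assumptions for the example, but it captures the idea of polling power and nudging the clock toward the target:

```python
# Illustrative model of the GPU Boost idea: poll board power at a fixed
# interval and step the core clock up or down so the card runs as close
# to its power target as the current workload allows.
BASE_CLOCK_MHZ = 915        # GTX 690 base clock
MAX_BOOST_MHZ = 1110        # hypothetical boost ceiling for this sketch
POWER_TARGET_W = 300        # hypothetical board power target
STEP_MHZ = 13               # hypothetical clock bin size

def boost_step(current_mhz, measured_power_w):
    """Return the next core clock given the latest power sample."""
    if measured_power_w < POWER_TARGET_W and current_mhz < MAX_BOOST_MHZ:
        return min(current_mhz + STEP_MHZ, MAX_BOOST_MHZ)
    if measured_power_w > POWER_TARGET_W and current_mhz > BASE_CLOCK_MHZ:
        return max(current_mhz - STEP_MHZ, BASE_CLOCK_MHZ)
    return current_mhz

# A light game sits under the power target, so the clock ratchets upward
# one bin per poll until it hits the ceiling.
clock = BASE_CLOCK_MHZ
for _ in range(20):
    clock = boost_step(clock, measured_power_w=250)
print(clock)
```

A heavier workload that pushes power over the target would drive the same loop back down toward the base clock, which is exactly the up-and-down frequency behavior we observe while monitoring.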
Adaptive VSync
Traditional VSync is great for eliminating tearing until the frame rate drops below the target; if the renderer cannot hold exactly 60 fps, there is a severe drop, usually from 60 fps straight down to 30 fps. When that happens, there is a noticeable stutter.
Nvidia’s solution is to adjust VSync dynamically, turning it on and off instantaneously. VSync continues to prevent tearing, but when the frame rate drops below 60 fps, the driver shuts VSync off to reduce stuttering instead of drastically dropping frame rates from 60 to 30 fps or even lower. When the target is met again, VSync kicks back in. In gaming, you never notice Adaptive VSync working; you just notice less stutter (especially in demanding games).
Adaptive VSync is a good solution that works well in practice. We spent some time playing games with Adaptive VSync enabled and found it very helpful, although we never use it when benching.
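The decision the driver makes each frame is easy to express. This is a minimal model of our own (not Nvidia’s driver code, and the 60Hz target is the refresh rate of our displays): sync is held while the renderer can keep up with the refresh rate, and released when it slips, instead of halving straight to 30 fps.

```python
# Minimal model of the Adaptive VSync decision: VSync stays engaged only
# while the instantaneous frame rate meets the display's refresh rate.
REFRESH_HZ = 60

def vsync_enabled(instantaneous_fps):
    """True while the renderer can hold the refresh target."""
    return instantaneous_fps >= REFRESH_HZ

# A demanding scene dips below 60 fps, so sync is released mid-run:
for fps in (75, 61, 58, 42, 60):
    print(fps, "on" if vsync_enabled(fps) else "off")
```

With classic VSync, the 58 fps and 42 fps samples would have been forced down to 30 fps; here they simply render unsynchronized until the target is met again.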
FXAA & TXAA
TXAA
There is a need for new kinds of anti-aliasing, as many modern engines use deferred lighting, which suffers a heavy performance penalty when traditional MSAA is applied. The alternative, jaggies, is unacceptable. TXAA (Temporal Anti-Aliasing) is a mix of hardware multi-sampling with a custom high-quality AA resolve that uses a temporal component: samples gathered over time are compared to give a better AA solution.
TXAA 1 exacts a performance cost similar to 2xMSAA while, under ideal circumstances, giving results similar to 8xMSAA. From what little time we have spent with it, it appears not quite as consistent as MSAA, but it works well in areas of high contrast. TXAA 2 is supposed to have a performance penalty similar to 4xMSAA but with quality higher than 8xMSAA.
TXAA will be the subject of an IQ analysis in a forthcoming article and we are told that we shall see games that support it natively this year. For now, it appears to be a great option for the situations where MSAA doesn’t work efficiently.
FXAA
Nvidia has already implemented FXAA (Fast Approximate Anti-Aliasing). In practice, it works well in some games (Duke Nukem Forever), while in other games text may look a bit blurry to some. FXAA is a great option to have when MSAA kills performance. We plan to devote an entire evaluation to comparing IQ between the HD 7000 series and the GTX 600 series, as well as comparisons with older series video cards.
Specifications
Here are the specifications for the GTX 680:
Here are the specifications of the GTX 690; it is a near-doubling of the GTX 680’s specifications, and 8+8-pin PCIe power connectors are used.
We see everything is just about doubled compared to the GTX 680; the only differences are that the memory clocks are set a bit lower while the GPU Boost clock is set higher to nearly compensate. The GeForce GTX 690 was designed from the ground up to deliver exceptional tessellation performance, which Nvidia claims is about 8 times that of the HD 7970. Tessellation is the key component of Microsoft’s DirectX 11 development platform for PC games.
Tessellation allows game developers to take advantage of both of the GTX 690’s GPUs to increase the geometric complexity of models and characters, delivering far more realistic and visually rich gaming environments. Needless to say, the new GTX 690 brings a lot of features to the table that Nvidia’s customers will appreciate: improved CUDA and PhysX; 2D and 3D Surround, with the ability to drive 3 LCDs plus a 4th accessory display from a single GTX 690 with no adapters required for the most popular dual-link DVI displays; superb tessellation capabilities; and a really fast and power-efficient card compared with the previous dual-GPU flagship, the GTX 590.
Surround plus an Accessory display from a single card
One of the criticisms of Fermi that Kepler addresses is that two video cards in SLI were required to run 3-panel Surround or 3D Vision Surround. The GTX 680 and the GTX 690 can now run three displays plus an accessory display, and Nvidia has moved the taskbar from the left screen to the center screen. We didn’t find much difference with the taskbar in the center; it might be more convenient for some users.
One thing that we did notice: Surround and 3D Vision Surround are now just as easy to configure as AMD’s Eyefinity. And AMD has no real answer to 3D Vision or 3D Vision Surround; HD3D lacks basic support in comparison.
One new option with the GTX 680/690 is in the bezel corrections. In the past, in-game menus would get occluded by the bezels, which was annoying if you used the correction. Now, with Bezel Peek, you can use hotkeys to instantly see the menus hidden by the bezel. However, this editor never uses bezel correction in gaming.
Nvidia claims a faster experience with the custom resolutions because of a faster center display acceleration. Of course, we tested Surround’s 5760×1080 resolution and even 3D Vision Surround. Check out the results in the Performance Summary chart and the 3D Vision/Surround section.
A look at the GTX 690
Nvidia has redesigned their GEFORCE logo and the GTX 690 is itself described as “Exotic Industrial Design”. It is the first “designer” card from either Nvidia or AMD and its “looks” are part of the design for efficiency and cooling.
The frame of the GTX 690 cover is made of cast aluminum, and it is protected with trivalent chromium plating. Trivalent chromium gives the GTX 690 its look and is very durable. The fan housing is made from injection-molded magnesium alloy, which is used for its light weight, heat dissipation, and acoustic dampening properties.
A 10-phase power supply with a 10-layer 2oz-copper PCB provides high-efficiency power delivery with less resistance, lower power loss and less heat generation. Lower power and less heat also enhance the board’s longevity, while the added PCB layers provide maximum signal integrity.
To create the intricate geometries required for the fan housing, Nvidia used a form of injection molding called thixomolding, in which liquid magnesium alloy is injected into a mold. Each Kepler GPU has its own distinct cooling unit. Clear polycarbonate windows allow you to see each of these coolers that play such a critical role in cooling the GPUs.
Finally, the GeForce GTX logo on the edge of the board is LED backlit. The lettering is laser-etched, ensuring precise design.
You can see the connectors on the backplate.
We can see that there are three dual-link DVI ports and a mini-DisplayPort. To run 3-panel Surround off of a single GTX 680, we used its two DVI connectors and a passive DisplayPort-to-DVI adapter. We were not able to test 3D Vision Surround on the GTX 680, as we would need the more expensive active DisplayPort to dual-link DVI adapter, but we were able to test the GTX 690 without the need for any adapters.
Quiet Gaming
Each GPU has its own dedicated cooling unit; each self-contained cooler consists of a copper vapor chamber and a dual-slot heatsink while an aluminum baseplate provides additional cooling for the PCB and board components.
Nvidia uses a center-mounted axial fan and has optimized the fin pitch and the angle at which air from the fan hits the fin stack; the smoother the airflow, the lower the noise output. The section of the baseplate directly underneath the fan is carved with low-profile channels to encourage smooth airflow, and all components under the fan are low-profile to minimize turbulence.
The gamer using a GTX 690 is treated to far less perceptible noise. Listening to the fan alone, even at 80 percent it is clean and smooth, especially when contrasted with the very noisy HD 6990 and, to a lesser extent, the GTX 590. At 95% fan speed it is noticeable but muted compared with most other high end cards running at full blast. With the regular fan profile we never reached 60% even under maximum gaming load; we simply could not hear the GTX 690’s fan over our very quiet Noctua fans and PSU. Amazing!
Here you can see the GTX 690 with its cover removed.
And here is the bare PCB.
And now the other side.
It really adds up to a very unique and pleasing design.
SLI, Tri- and Quad-SLI
The GTX 690 is set up for Quad-SLI by using two GTX 690s.
Unfortunately, Nvidia has not yet chosen to enable Tri-SLI by allowing a GTX 690 to work paired with a GTX 680. It would be relatively easy for their driver team to enable, and it would have definite advantages over using three GTX 680s in Tri-SLI, since an “ordinary” high-quality motherboard could be used. Very few motherboards can supply 3 PCIe slots with enough bandwidth, whereas the very popular Ivy Bridge Z77 motherboards can supply two PCIe 3.0 slots with 8x+8x bandwidth, which is probably sufficient for a GTX 690 plus a GTX 680.
The specifications look extraordinary with solid improvements over the Fermi-based GTX 590. Let’s check out performance after we look at our test configuration on the next page.
Test Configuration – Hardware
- Intel Core i7-3770K reference 3.50 GHz/Turbo to 3.9GHz, overclocked to 4.6 GHz and 4.8GHz; Turbo is off and HyperThreading is on.
- Gigabyte Z77MX-D3H (latest BIOS, USB/PCIe 3.0 specification; CrossFire/SLI 8x+8x).
- 4 GB Kingston DDR3-1800 RAM (2×2 GB, dual-channel; supplied by Kingston)
- GeForce GTX 680, 2 GB (base clocks of 1006/3000MHz and also overclocked, +150/+575MHz), supplied by Nvidia
- GeForce GTX 590, 3 GB reference clocks (607/1707 MHz), supplied by Nvidia.
- GeForce GTX 690, 4 GB reference design and clocks (base clock is 915/3004MHz and also overclocked +150/+550MHz), supplied by Nvidia
- PowerColor Radeon HD 7970, 3 GB with custom cooling at stock clocks (925/1375MHz)
- AMD Radeon HD 6990, 4GB reference design and stock clocked (850/1250MHz), supplied by AMD
- Onboard Realtek Audio
- 2 x 240 GB Kingston HyperX SSDs; one for AMD and one for Nvidia
- Thermaltake ToughPowerXT 775W power supply unit supplied by Thermaltake
- Cooler Master Elite mid-Tower case, supplied by Cooler Master
- Noctua NH-D14 CPU cooler, supplied by Noctua
- Philips DVD SATA writer
- HP LP3065 2560×1600 30-inch LCD.
- Three ASUS VG236 23-inch 1920×1080 120Hz LCDs supplied by ASUS/Nvidia and used for Surround/Eyefinity 5760×1080 resolution.
- Asus VG278 27″- 120Hz 1080p display and 3D Vision 2 Glasses supplied by Nvidia/ASUS.
Test Configuration – Software
- Nvidia GeForce 301.24 beta drivers for the GTX 590 and GTX 680; 301.33 release drivers for the GTX 690; texture filtering set to High Quality
- AMD 12.4 WHQL Catalyst drivers; High Quality – optimizations off; use application settings
- Windows 7 64-bit; very latest updates
- Latest DirectX
- All games are patched to their latest versions.
- VSync is off in the control panel.
- AA enabled as noted in games; all in-game settings are specified with 16xAF always applied; 16xAF forced in control panel for Crysis.
- All results show average, minimum and maximum frame rates except as noted.
- Highest quality sound (stereo) used in all games.
- Windows 7 64, all DX10 titles were run under DX10 render paths; DX11 titles under DX11 render paths.
The Benchmarks
Synthetic
- Vantage
- 3DMark 11
- Heaven 3.0
- Left 4 Dead 2
- Serious Sam 3 BFE
- Crysis
- Far Cry 2
- Just Cause 2
- World-in-Conflict
- BattleForge
- Alien vs. Predator
- STALKER, Call of Pripyat
- Metro 2033
- F1 2010
- H.A.W.X. 2
- Lost Planet 2
- Civilization V
- Crysis 2
- Dirt 3
- Deus Ex: Human Revolution
- Batman: Arkham City
Before we get to the performance charts, let’s look at overclocking, power draw and temperatures.
Overclocking, Power Draw and Temperatures
Overclocking the GTX 690 is just as easy as overclocking the GTX 680. What is surprising is that we matched the GTX 680 overclock on the core and memory exactly – +150MHz on the core and +550MHz on the memory. We did not adjust the GTX 690’s voltage. Temperatures were never an issue and the fan profile remained the same – extraordinarily quiet at maximum load!
We decided to compare our GTX 690’s power draw with that of the HD 7970, both fully overclocked and in the same identical system, the only difference being the video cards. Both systems idled below 100W, testifying to these cards’ excellent power management at idle and Ivy Bridge’s extreme efficiency compared to our older Bloomfield i7-920 system.
Here is the power draw of the entire system with a GTX 690 overclocked as far as we could push it and under full load:
450W is very respectable for a dual-GPU card of this incredibly high performance level! Now let’s look at an overclocked-to-the-max HD 7970 in the same system under the exact same conditions. And remember that it takes an overclocked HD 7970 to match the stock performance of a GTX 680.
Radical. The single-GPU flagship HD 7970 when overclocked and overvolted to match a stock GTX 680 uses 14 more Watts at peak load than the dual-GPU GTX 690 flagship! As you can see, it will take some real engineering wizardry to match the performance of the GTX 690 with a dual-GPU HD 7990 and there is no doubt that it will require much more power to do so.
Let’s head to the performance charts and graphs to see how the GTX 690 compares with the last generation dual-GPU flagships – the AMD HD 6990 and the Nvidia GTX 590 – as well as the top video cards of this generation, the HD 7970 and the GTX 680.
Performance summary charts & graphs
Here are the summary charts of 18 games and 3 synthetic tests. The highest settings are always chosen and it is DX11 when there is a choice; DX10 is picked above DX9, and the settings are ultra or maxed. Specific settings are listed on the Main Performance chart at the end of this page. The benches are run at 1920×1200 and 2560×1600 with separate charts devoted to overclocking, PhysX, 3D Vision, 5760×1080 Surround (including 3D Vision Surround), as well as dividing games up into easy to read charts by their DX pathway and by resolution.
All results, except for Vantage and 3DMark11, show average framerates and higher is always better. In-game settings are fully maxed out and they are identically high or ultra across all platforms. As usual, we begin with the synthetics.
Futuremark & Heaven synthetic tests
3DMark11 is Futuremark’s latest DX11-only benchmark and Vantage is DX10. The scores mean little presented in isolation, but they do offer supporting data to accompany our game benches. Here is the chart with Vantage and 3DMark11:
The GTX 690 simply pulls way ahead of any competing card and overclocked +150/+550MHz, it is a beast! Heaven 3.0 is a very demanding benchmark and here it is expressed in a chart.
Again, synthetic tests are interesting but they are not necessarily indicative of real world gaming performance. In all three cases, the GTX 690 “wins” over everything else by a large margin. Next up, let’s look at DX9 games.
DX9 Games
We test the popular Source engine, represented by Left 4 Dead 2, and also a demanding DX9 game, Serious Sam 3: BFE, both at completely maxed-out settings. First up is 2560×1600:
Now we see 1920×1200:
Both Left 4 Dead 2 and Serious Sam 3: BFE are faster on the GTX 690 than on any of the previously fastest cards in the world. Let’s check out the DX10 games.
DX10 Games
We test four DX10 games – Just Cause 2, Far Cry 2, Crysis and World in Conflict: Soviet Assault. Here is 2560×1600 resolution:
Now at 1920×1200:
Out of these four DX10 games, the GTX 690 simply stands out from the rest.
DX11 Games
Most of our testing emphasizes DX11 games and we bench 12. Since the charts get too long, we break them up into charts of 6 games each.
First up are the older DX11 games at 2560×1600
Now those same games at 1920×1200:
Now the newer DX11 games at 2560×1600:
[Typo Alert! – The GTX 590 results are wrong for Civ V. Instead of 43.6 at 2560×1600 and 69.2 at 1920×1200, it should read 81.4 at 2560×1600 and 95.5 at 1920×1200]
Now the same DX11 games at 1920×1200:
There is simply no contest – the GTX 690 is the fastest video card anywhere.
Super-Widescreen 5760 x1080, Surround, 3D Vision Surround, and PhysX
Here is the main chart that gives the details for the tests:
Let’s look at 3D Vision at 120Hz versus the same settings with 3D Vision at the popular 1920×1080 resolution:
Only Metro 2033 with completely maxed out details gives the GTX 690 a workout. Now the GTX 690 benchmarked in 3-panel Surround with slightly lesser settings (see the main chart above).
A couple of games would need to have their settings reduced. Just remember that you are playing across three screens and rendering each scene twice for 3D Vision!!
One thing that we found really strange and it may be a driver bug with Kepler – the frame rates are no longer locked to 60Hz in the 3D Vision drivers as we previously tested. Previously, frame rates would be capped at 60 fps if possible. However, now it is very convenient to see exactly what performance penalty 3D Vision takes – sometimes it is more than 50 percent; other times less.
Next up, let’s look at PhysX.
PhysX
We test PhysX in two games. Batman: Arkham City makes great use of PhysX, and it is a shame to play the game without it. In both cases, turning on PhysX affects the frame rate, but our GTX 690 remains fast enough to play the games with fully maxed-out details and AA.
Let’s check out Overclocking:
Overclocking
We overclocked our GTX 690 +150MHz on the core and +550MHz on the memory. This is a fantastic overclock on stock voltage and stock fan profile and it falls only a tiny bit short of the overclock on our GTX 680 of +175/+575MHz !!
Here are all of our games compared at 2560×1600 – stock versus overclocked:
Here is 1920×1200
As you can see, the GTX 690 scales extremely well at 2560×1600 but not as well at 1920×1200. Our CPU at 4.6GHz simply needs to be clocked higher, as our CPU scaling chart shows.
CPU Scaling. Is Ivy Bridge Core-i7 3770K at 4.6GHz fast enough for the GTX 690? Not really (!)
Looking at the above charts, we notice that the GTX 690 scales really well with an overclock at 2560×1600 but not as well at 1920×1200. That brings us to CPU scaling: is Ivy Bridge fast enough at 4.60GHz for a GTX 690 at 1920×1200? The surprising answer is ‘No’, not for every game. Two games that do not scale well at 1920×1200 are Civilization V and the original Crysis. Let’s check out a stock-clocked and overclocked GTX 690; the only difference in the third position on this chart is that we overclocked our Core i7-3770K another +200MHz, from 4.6GHz to 4.8GHz.
That small +200MHz overclock on the CPU lets the GTX 690 perform noticeably better instead of waiting on a lower-clocked CPU. It appears that 5.0GHz might be a very good overclock for the GTX 690 to really show its strengths in gaming at lower resolutions.
Main Overall Summary chart
In the first three columns of the main performance summary chart, the GTX 690 is tested at stock and overclocked; next is a single GTX 680 and then the GTX 590 is sandwiched between the 3GB HD 7970 and the HD 6990. This is the master chart and it has not been made into a graph as there is too much information to put onto a single graph.
[Typo Alert! – The GTX 590 results are wrong for Civ V. Instead of 43.6/69.2, it should read 81.4/95.5]
No matter how you add it up, the GTX 690 is generally faster than any other video card. It also overclocks very well with the stock voltage and fan profile. AMD would have to significantly increase the clocks of the HD 7990 to catch the GTX 690 and we expect that it would use a lot more power and be relatively noisy and difficult to cool. We are eager to see what AMD actually brings with their own upcoming dual-GPU flagship card.
Let’s head for our conclusion.
Conclusion
This has been quite an enjoyable, if far too short, 3-day exploration in evaluating our new GTX 690 since it arrived on Monday of this week. It did extraordinarily well performance-wise compared with the GTX 590, and it brings higher performance than any other video card in the world to date. We are totally impressed with this pair of cool-running Kepler chips and their outstanding overclockability. The GTX 690 slots way above the HD 6990 and GTX 590, and it leaves the GTX 680 and HD 7970 in the dust.
It also looks awesome inside just about any case. Here it is crammed into a Cooler Master Elite 430 mid-tower; we plan to move our system shortly into a full-tower Thermaltake Overseer RX-I case. Nvidia indicated that the LED logo is programmable and that their partners may make their own light sequences available to end users.
We see good overclockability and extreme quietness at stock voltage and fan profile, even when the GTX 690 is highly overclocked. The only issue might be the price; we estimate it gives about 95% of the performance of true GTX 680 SLI ($500 each) in one strikingly presented package. $999 is expensive, yet we are quite certain it will appeal to the gamer who demands the very best, without any compromises.
Pros
- The GTX 690 is the most powerful video card in the world!
- TDP and power draw are excellent. Performance per watt is better than the competitor’s last generation 40nm flagship HD 6990 (and even rivals recent single-GPU solutions), and it is nearly dead silent in comparison to any other dual-GPU card.
- Overclockability is excellent – GPU Boost works as advertised.
- The reference design cooling is quiet and efficient; the card and well-ventilated case stay cool even well-overclocked.
- It is possible to use two of these cards for extreme Quad-SLI performance without needing a massive PSU
- 3D Vision 2 and PhysX enhance gaming immersion and both are improved using the GTX 690 compared to the last generation.
- Surround and 3D Vision Surround plus an accessory display can now be driven off of a single GTX 690 without requiring any adapters;
- The new AA modes allow for high performance without jaggies in deferred-lighting engines
- Adaptive VSync reduces stuttering while retaining the advantages of minimizing tearing.
- The GTX 690 is the fastest video card – period! And if looks are counted, it is very impressive as an “Exotic Industrial Design”
Cons
- The price. The first thousand-dollar production video card takes your breath away – at first. Yet it compares favorably in every way to GTX 680 SLI, which is priced the same.
- No Tri-SLI available to pair a GTX 690 with a GTX 680.
The Verdict:
- If you are buying a flagship video card right now and looking for the highest performance, the GTX 690 is the only choice. We feel it deserves ABT’s highest award, the “Kick Ass” award, because it is completely unique in design and performance. We feel it also deserves ABT’s “Innovation” award as the first designer card that incorporates important features to cater to the very highest end gamer without compromise.
We do not know what the future will bring, but the GTX 690 brings an excellent top-performer to the GeForce family. With great features like PhysX and the second generation of 3D Vision, you can be assured of immersive gaming by picking this card for 2560×1600 or even higher resolutions including for Surround and/or for 3D Vision Surround.
If you currently game on an older generation video card, you will do yourself a big favor by upgrading. The move to a GTX 690 will give you better visuals on the DX11 pathway, and if you want the ultimate in gaming performance you are no doubt thinking of Quad-SLI. We expect that many enthusiasts will, like us, upgrade to Intel’s Ivy Bridge, and this is the perfect video card to complement its fastest processors.
The competition is hot and AMD offers their own set of features including Eyefinity and HD3D but they simply cannot touch the raw power of the GTX 690. However, we expect that they will introduce their own flagship dual-GPU card, the HD 7990 and we look forward to it.
Stay tuned, there is a lot coming from us at ABT. Next up is an evaluation of the flagship Noctua NH-D14 CPU cooler, which allowed us to increase our Core i7-3770K to 4.6GHz just for this review, and we find we can even clock it higher to 4.8GHz. You can expect more great reviews from our Mobile Tech guys for our new section, and expect a Genius product review this week as well! And don’t forget to check our forums; our tech discussions are becoming among the best to be found anywhere!
Mark Poppin
ABT Senior Editor
Please join us in our Forums
Become a Fan on Facebook
Follow us on Twitter
For the latest updates from ABT, please join our RSS News Feed
Join our Distributed Computing teams
- Folding@Home – Team AlienBabelTech – 164304
- SETI@Home – Team AlienBabelTech – 138705
- World Community Grid – Team AlienBabelTech