Nvidia’s Titan arrives to take the performance crown – 36 Performance Benchmarks

Overclocking, Acoustics and Temperatures


There are some new or reworked Kepler features in Titan compared to the GTX 600 series. The first is GPU Boost 2.0.

GPU Boost 2.0

The original GPU Boost was designed to reach the highest possible clock speed while remaining within a predefined power target; the GPU would boost to the maximum clock it could achieve while staying under a certain power level. In the case of the GTX 680, that level was 170 watts. Nvidia noted that the power target was unnecessarily limiting performance when GPU temperatures were low, so for Boost 2.0 they switched from boosting against a GPU power target to boosting against a GPU temperature target. The new temperature target is 80 degrees Celsius.

As a result of this change, the GeForce GTX Titan will automatically boost to the highest clock frequency it can achieve as long as the GPU temperature remains at or below 80C. The GPU constantly monitors its temperature, adjusting the clock and voltage on the fly to maintain it. GPU Boost 2.0 can also deliver quieter operation, since temperature is held in a tighter range around the user's target; this in turn keeps the fan speed stable and reduces overall system noise.
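The temperature-target behavior described above can be sketched as a simple control loop. This is purely an illustrative model, not Nvidia's firmware: the function name, the ~13MHz boost-bin step, and the back-off logic are our assumptions, with the base clock and peak boost taken from Titan's published and observed figures.

```python
# Hypothetical sketch of GPU Boost 2.0's temperature-target logic:
# step the clock up while the GPU is at or below the target, and
# step it back down when the target is exceeded. Step size and
# function names are illustrative assumptions, not Nvidia's firmware.

BASE_CLOCK_MHZ = 837   # GTX Titan base clock
MAX_BOOST_MHZ = 1188   # highest boost clock we observed
STEP_MHZ = 13          # Kepler boost bins are roughly 13 MHz

def next_clock(current_mhz, gpu_temp_c, temp_target_c=80):
    """Return the next clock given the latest temperature reading."""
    if gpu_temp_c <= temp_target_c:
        # Headroom available: boost one bin, capped at the max boost clock.
        return min(current_mhz + STEP_MHZ, MAX_BOOST_MHZ)
    # Over the target: throttle one bin, never below the base clock.
    return max(current_mhz - STEP_MHZ, BASE_CLOCK_MHZ)
```

Raising the temperature target in this model (say, from 80C to 85C) simply means the "headroom available" branch keeps firing at temperatures that would previously have triggered throttling, which matches the behavior we saw in testing.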

In addition to switching from a power-based boost target to a temperature-based target, Nvidia also gave end users more controls for tweaking GPU Boost behavior. Using software tools provided by add-in card partners, Titan users can adjust the GPU temperature target. By adjusting the temperature target higher, the GPU will then boost to higher clock speeds until it reaches the new temperature target.

Due to the change in the way GPU Boost 2.0 functions, the power target setting no longer sets the typical board power; instead it sets the board's maximum power. At the default power target setting of 100%, the maximum power is 250W. At the maximum slider setting of 106%, maximum board power is 265W. Note that typical board power will vary with ambient temperature. GPU Boost 2.0 is designed to ensure maximum performance from water-cooled solutions, since temperature now essentially determines the clocks.
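The slider arithmetic is straightforward: the percentage simply scales the 250W ceiling. A minimal check, assuming a linear scaling (the function name is ours):

```python
# The power slider scales Titan's 250 W maximum board power linearly:
# 100% -> 250 W, 106% -> 265 W. Linear scaling is our assumption.
def max_board_power(slider_pct, base_watts=250):
    return base_watts * slider_pct / 100
```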

GPU Boost 2.0 – Overvolting

Because Titan's boost clock and voltage are now tied to GPU temperature, Nvidia allows the GPU voltage to go higher than it did with the Kepler 600 series cards. Just as with the 600 series, voltages on Titan are limited to a range fully qualified by Nvidia; this range is designed to protect the silicon from long-term damage. However, for those who want to push their GPUs to the limit by raising the maximum voltage further, GPU Boost 2.0 enables extra "overvoltaging" capability. Unlocking it requires users to acknowledge the risk to their GPU's warranty by clicking through a warning, as overvoltaging is disabled by default. Each individual Titan manufacturer may limit the degree of overvoltaging its cards support; support for overvoltaging is optional and can be completely disabled in the VBIOS if the manufacturer chooses.

Overclocking Titan

Overclocking the GTX Titan is just as easy as overclocking the GTX 680 or GTX 690. What is surprising is that we nearly matched the GTX 690's overclock of +150/+550MHz on the core and memory: Titan managed a completely stable +130MHz on the core and +550MHz on the memory. We did not adjust the GTX Titan's fan profile, thermal limit, or voltage for our stock benchmark runs.

For testing overclocking stability, we first used FireStrike from 3DMark 2013. The first column (below) shows Titan out of the box at stock. The next column shows the performance improvement from simply raising the temperature limit from 80C to 85C; at that setting, Titan no longer throttled back Boost during gaming as it did at 80C, and our performance became less variable.

Moving the power slider up to 106% and the temperature up to the maximum 94C using EVGA's Precision showed very little performance gain over simply setting the temperature limiter to 85C. Next (in the 4th column), we overclocked the memory +550MHz and got a decent boost, though not as much as overclocking the core +130MHz (5th column), which was our highest overclock on the stock fan profile and voltage. Finally, we overclocked the memory and the core together in the second-to-last column, while the last column shows Titan at the upper limit of stability with the voltage unlocked and maxed out.

We would suggest that for absolute stability, +155MHz (not +165MHz, which was sufficient for a FireStrike run) is about the maximum we can run, roughly +25MHz more with the unlocked voltage than at stock. We saw 1188MHz as our highest boost at 1.200V, the highest overvoltage Titan would allow. It's probably not what extreme overclockers are looking for, since Nvidia has still locked down the voltage and the TDP pretty tightly.

Temperatures at stock settings were an issue for Boost: the fan profile remained extraordinarily quiet at maximum load, but it allowed the temperature to reach 80C, which throttled back Boost and made some of our results variable. We found that setting the upper limit to 85C no longer limited Boost; our other option was to set the fan profile higher so that it would reach 60%. For us, the VGA fan became noticeable above 60% and much more so at 75%.

Let’s head to the performance charts and graphs to see how the GTX Titan compares with the GTX 690 as well as with the last generation dual-GPU flagships – the AMD HD 6990 and the Nvidia GTX 590 – as well as with the top video cards of this generation, the HD 7970 and the GTX 680.




Founder and Senior Editor of ABT.

11 Responses

  1. Bo_Fox says:

    Wow, so you’re actually able to use the 7970 for these games at 5760, at
    PLAYABLE frame rate, with PhysX @ HIGH, running on the CPU?

    Wow! GTX 680 isn’t any faster than 7970GE by much at all…. whoa! Batman: AC and Borderlands 2????

    So, GTX 680 basically loses its PhysX advantage right there! HMMMMM?????

    You rule! Since no other sites are showing this!

  2. Bo_Fox says:

    Also, your site is basically the only site to still include BOTH GTX 590
    and HD 6990 with the latest drivers, compared against each other more
    than 1.5 year after release!

    Another golden one for ABT!

    89 BENCHMARK RESULTS FOR EACH CARD (no minimum or maximum crap, or
    low-resolution crap, or anything else – just 1080p, 25×16, 3x1080p, and
    PhysX)! (While also wisely selecting the maximum playable settings for
    each game for the article!)

    Isn’t that the new record or what? For any review site out there on the internet!!!!!!!!!!!!!!!!!!!

  3. DOOM$ says:

    so dusty lol

  4. Bo_Fox says:

    Oh, Apoppin explained the PhysX results – that the average does not represent the minimums very well, especially when PhysX comes into play. So, never mind my first comment!

  5. nod says:

    all other sites usually show titan,690,680 and 7970….590 and 6990 maybe older gen…but they are also still high end cards and should be mentioned more in the benchmarks….so a big thank you for a comprehensive comparison
