NVIDIA’s GTX 480 Performance Testing

12 Responses

  1. jonny says:

    From reading all the reviews online I don’t see a reason to buy either of the two new NVIDIA cards. Most of the reviews show that at the higher resolutions the ATI cards take back the lead or get really close, and high resolutions are one of the primary reasons to buy a high-end card. The only reason to go NVIDIA at this point is really PhysX. The audio through HDMI isn’t as good as what ATI is offering either. I have seen multiple sites say that they had these cards running at 99 degrees, and in SLI they are expected to draw close to 800 watts. Most sites are reporting that they draw more power than the card’s spec lists as the maximum. The price just isn’t worth it either.

    This launch was 6 months late and still isn’t actually happening until two weeks into April. I fully expect a refresh from ATI for current cards, and possibly price drops or adjustments that give them a clear advantage. Also remember that because this launch is so delayed, ATI is 6 months ahead for the next generation of graphics cards. AMD and ATI are finally getting all their ducks in a row after their merger, and their GPUs are getting die shrinks faster than they were before, leading to cooler-running chips. Not only will NVIDIA’s Fermi cards eat up your wallet now, they will continue to do so with their power draw, the extra cooling for your PC, and possibly extra air-conditioning costs this upcoming summer.

  2. Jon Worrel says:

    Thanks for the review Mark! Although my opinions remain moderately negative and disheartened against Nvidia’s GF100 first-generation chips and the entire 40nm fabrication process in general, I am somewhat enamored by the marketing team’s ability to turn this generation of enthusiast GeForce products into a highly-demanded revenue stream with only two major marketing points.

    When I was over at Nvidia’s headquarters in late October and asked several engineers what their initial impressions were of the GF100 architecture, the consensus simply responded with, “oh..Fermi got the flu.” At first, I wasn’t quite sure how to comprehend the response, as it was answered in a very general, casual fashion. What I soon learned from Fudo, Theo Valich and several technical marketing managers at Nvidia was that the chip’s original November launch plans had been completely removed from the 2009 timeframe.

    Over the next few months, it was very disheartening to read several commentaries on the state of the fabrication process regarding the rampant production issues of high power walls, leaky transistors (resulting in the shader drop from 512 cores to 480 cores) and variable chip temperatures (due to bad bumps).

    What I am surprisingly thankful for out of all of this mess, however, was Nvidia’s ability to produce a wonderful tessellation architecture with unparalleled SLI scalability across additional GPU cores. I’m quite sure that many journalists, analysts and consumers did not see it coming either, but it was a very impressive comeback and a much needed marketing point for the green team.

    As much as I am looking forward to the *properly designed* second-generation GF100 chips between Q4 2010 and Q1 2011 with great anticipation, I still have a level of respect for the company’s ability to take a product aboard a sinking ship and rescue it with incredible multi-GPU scalability and 3D Vision Surround!

  3. Joe Smith says:

    Dude, your review is broken. On the Left 4 Dead page you said the GTX 480 stumbles badly vs. the 5870…

    But it doesn’t… at least not any worse than anywhere else. It’s just that your chart is broken!!! Instead of the far left being “0fps”, it’s “74fps”, effectively zooming in on the gap, making it look wayyyy bigger than it actually is.

  4. Joe Smith says:

    I see this problem actually applies to a lot of the charts. I suggest you just fix them in Excel to avoid confusing or misleading people. There are also some cases where, because of this, the delta looks much bigger than it actually is in NVIDIA’s favor.

    In Excel you can right-click the X-axis, go into its properties, and set the minimum value to 0. This will force the graph to always show the full chart, instead of the zoomed-in version.
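A quick back-of-the-envelope sketch of the effect Joe is describing (plain Python; the FPS numbers and the `visual_gap_ratio` helper are hypothetical, not taken from the review's charts):

```python
# Why a truncated axis exaggerates a small FPS gap.
# Hypothetical numbers: two cards at 100 and 110 fps, with the
# chart's left edge moved from 0 fps up to 74 fps.

def visual_gap_ratio(a, b, baseline):
    """Length ratio of the longer drawn bar to the shorter one
    when the axis starts at `baseline` instead of 0."""
    return (max(a, b) - baseline) / (min(a, b) - baseline)

print(visual_gap_ratio(100, 110, 0))   # bars look ~10% apart
print(visual_gap_ratio(100, 110, 74))  # same data, bars now look ~38% apart
```

The data has not changed between the two calls; only the baseline has, which is exactly why setting the axis minimum back to 0 restores an honest picture.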

  5. apoppin says:

    Thanks for the suggestion, Joe.

    I stand by what I said about the GTX 480 “stumbling badly” compared to HD 5870. Let’s look at the context:

    The GTX 480 is generally leading the HD 5870 until we get to Left 4 Dead. Then it *stumbles*.

    In my opinion, a deficit of 4-1/2 frames is significant when you consider that the GTX leads the Radeon in the other benches. That is why I suggested that it could be immature drivers holding back the GTX in this Source Engine benchmark.

    As to my charts being misleading, I pointed this out in the Unigine Heaven 1.0 benchmark:

    “Wow! If you look at the graph it looks like a lot, but there is only a 0.1 FPS difference!! Beware of just looking at the graph. Our HD 5870 can keep up with the GTX 480 at the maximum resolution in the Heaven benchmark.”

    I want to make absolutely sure that my readers look carefully at the numbers, not just at the charts and at the “gap”. Marketing and PR often use a chart to further their own agenda.

    Also, if one is comparing just two or maybe three video cards, it is good to emphasize the difference between them, and I believe that it is not necessary to start at “0” on a graph; especially if you are dealing with a reasonably high FPS.

    At any rate, always look at the frame rate. The numbers are not misleading in any way and the charts are representative of the close performance between these two most excellent video cards – HD 5870 and GTX 480; each in its own price range and each with a compelling set of features.

  • apoppin says:

    To Jon Worrel,

    Many thanks for your kind words. And sorry for the delay in my response. There have been issues with our editorial staff being able to post comments. :O

    I feel positively about the GTX 480. Perhaps you were expecting a bit more and were disappointed.

    I look at this launch differently. It is as though AMD had brought an HD 2900-XTX to the market and it had beaten the 8800-GTX performance-wise instead of only matching the 8800-GTS. I believe that NVIDIA pulled off quite a feat. I remember Jensen’s words from GTC, “We want a General Purpose Processor that can do amazing graphics”.

    IMO, they delivered – in spades. And I am not referring to any proprietary “features” the new GF100 brings to the market. Besides, I am mostly saving CUDA, 3D and Surround for Part Two of this review – O/C’d GTX 480 vs. PowerColor HD 5870 PCS+ (which I am led to believe is a Super-Overclocker).

    I believe that this new GF100 chip is the beginning. I believe that NVIDIA could have brought it to market 6 months from now with tamed thermals and all SPs enabled. But that did not seem to make sense from a business standpoint. If you do not like the TDP of the GTX 480, the GTX 470 is 35W lower. And of course it will be respun. It is only logical that they would aim to improve yield, performance and likely bring out a 512 SP-enabled “ultra” – perhaps even with the current stepping.

    I also do not doubt that we will see some kind of “GX2” if their ultra cannot catch the HD 5970. It is their nature to be so competitive – which is wonderful for us as consumers and enthusiasts.

    NVIDIA has recently brought hotter and hungrier GPUs to market for the enthusiasts – that is just the way they do things with their philosophy of monolithic dies. AMD achieves a similar level of performance with a completely different philosophy. No one can say who is right or wrong – just who they prefer.

    I really like both companies. I am impressed with the HD 5870 at about $400 with a CoD MW2 game tossed in. But then I am also impressed by the raw performance of the GTX 480 for $500. I love them both and I look forward to pitting CrossFired HD 5870s against SLI’d GTX 480s.

    And you can look at it another way. I would say that GTX 480 TriSLI would match the performance of your HD 5870 QuadFire with a very similar thermal output and even be price competitive. I also realize that it would not be such a great “upgrade” for you.

  • turingpest says:


    “I believe that this new GF100 chip is the beginning. I believe that NVIDIA could have brought it to market 6 months from now with tamed thermals and all SPs enabled.
    … I also do not doubt that we will see some kind of “GX2” if their ultra cannot catch the HD 5970. It is their nature to be so competitive – which is wonderful for us as consumers and enthusiasts.”

    all undoubtedly true. as jonny has already pointed out however, the problem for nvidia is that they are a good 6 months behind ati, who not only have a lot of manoeuvrability with pricing and will probably produce an hd 5890 to do even more damage to nvidia in price/performance terms, but will also be introducing a completely new architecture in 6 months’ time. and if this doubles evergreen’s compute power with a similar die size/tdp, then fermi as the leading directx11/directcompute architecture will be superseded.
    of course it remains to be seen what the middle- and bottom-tier fermi cards will be like – which is of course where the money is made – but nvidia are up against it both time- and tech-wise. fundamentally, fermi has left nvidia unable to turn tightly. lucky for them they have a legion of loyal customers and a large marketing budget. and then, as you rightly point out, all of this is very good for us consumers.

  • Marcin3891 says:

    Hello fellas

    The main question here is how long it will take NVIDIA to release a PROPER Fermi-based GPU with all 512 cores enabled (keep in mind, the 480 and 470 are only half-baked chips that most likely didn’t pass the full 512-core testing (imo, same story as usual), and their prime stock of 512-core chips goes to 3D studio/design/engineering rendering cards), how long it will take them to release a properly polished, shiny and squeaky-clean set of drivers – and how much more time NVIDIA still has until ATI slaps them in the face with their (I’m assuming the name) 6xxx line of cards.

    I remember when I bought my SLI combo of 9800GTs. I had it for a while on some earlier drivers, and then suddenly a newer revision came out and brought a good performance gain across the board; some titles benefited greatly, but all the games I played showed an increase in smoothness.

    At this very moment, any arguments about the 480 this and the 5870 that are steam let out too soon. I have a sneaking suspicion that the delay of the proper release of the cards until the 12th of April isn’t only hardware-related; somewhere there is an entire floor of hungry programmers who aren’t getting any food until they compile proper drivers ;)!

    For the next 2 weeks we can only sit and wait. Most likely new drivers will show up within 7~10 days (before the 12th), and then we’ll see again what the numbers are telling us.


  • jonny says:

    My main issue with the late release date and the not-overwhelming numbers is that they are compared to 6+ month old ATI cards. I believe someone from ATI said there will be a refresh at some point for ATI cards, so that should speed them up. Then of course ATI has a 6-month head start on the next gen of cards, and with the issues that Fermi has brought up I don’t see NVIDIA being able to close that gap with a product that can compete and stay cool at a lower power draw. And if you already have a high-end ATI card I don’t see much of a reason to switch to NVIDIA.

    As far as drivers go, I have been super impressed with ATI so far this year. They are bringing new features and bumping up performance for games as well. NVIDIA is pulling drivers because of issues only to re-release them later. I haven’t been impressed with NVIDIA drivers for a long time.

  • apoppin says:

    Thanks, Aaron .. already fixed.

  • I’m glad I had some more time on my hands and could give you a much closer look. Very impressive work and an impressive site. Added to my bookmarks. :)

  • Bo_Fox says:

    Great review.. I suspect the GTX 480 is just like the HD 2900 XT was back then on 80nm. At least it’s not as much of a “failure” as the HD 2900 XT, but it’s way more power-hungry. True, the HD 2900 XT was considerably slower than the 8800 Ultra, but its power consumption and heat output were pretty much the same. Heck, its consumption was actually equal to that of the less hungry 8800 GTX according to some reviews.

    Anyway, the point here is that NVIDIA made sure they came out on top with the fastest single-GPU card, but at some cost. I mean, the real point here is (LOL) that the Fermi architecture appears to be designed with the future in mind, in the same way that the R600 was designed. With a reduced number of TMUs (albeit more efficient ones), the ratio seems better suited to the number of shaders for better scaling in the future. On a smaller 28nm shrink, we’d be seeing a far better design with at least double the specifications for a more than 100% increase in performance (similar to RV770 versus the original R600) while consuming even less power.
