Introducing AMD’s HD 6790
Resident Evil 5
Resident Evil 5 is a survival-horror third-person shooter developed and published by Capcom that has become the best-selling single title in the series. The seventh installment in the Resident Evil series, it was released for Windows in September 2009. Resident Evil 5 revolves around two investigators pulled into a bio-terrorist threat in a fictional region of Africa.
Resident Evil 5 features online co-op play over the internet and also takes advantage of Nvidia’s 3D Vision technology. The PC version comes with exclusive content that the consoles do not have. The developer’s emphasis is on sustaining high frame rates, but HDR, tone mapping, depth of field, and motion blur have all been implemented in the game.
Resident Evil 5’s custom game engine, ‘MT Framework’, already supports DX10, benefiting from lower memory usage and faster loading. Resident Evil 5 gives you the choice of DX10 or DX9, and we naturally ran the DX10 pathway. There are two benchmarks built into Resident Evil 5; we chose the variable benchmark, as it is best suited to testing video cards. Here it is at 2560×1600 resolution with maxed-out in-game settings plus 8xAA:
Here are the results at 1920×1200 resolution:
Finally, we test at 1680×1050:
All of our video cards turn in respectable performances, and their overall playability at 1920×1200 is similar except for the GTS 450. This time the GTX 550 Ti leads the HD 5770 at all clocks at its target resolution of 1680×1050, but the HD 6790 is faster still at all tested resolutions.
I still think the 6790 is the 5830 of its generation. Too many cuts lead to a crippled chip that exists only because AMD marketing wanted to sell you a chip that would otherwise be thrown in the trash bin because it had too many defects to pass as a 68xx/69xx. That might be good marketing, but it’s not a good deal for the buyer.
Beating the GTX 550 is an accomplishment, sure, but not much of one, since the 550 is such a garbage card to begin with.
If you’ve got $150 to spend on a video card, just save up and buy a 6950 for $250. That extra $100 has a great deal of marginal value. As opposed to, say, the $150 delta between a GTX 570 and a 580, which is just throwing money away.
This generation of GPUs at 40 nm has been rather underwhelming on the whole. No true spiritual successor to the 8800 GT from either the red or green team. And with DX11 adoption at a virtual trickle, thanks to the negative effects of consolization, it would appear that progress will be slow until the next generation of consoles appears.
Bring on 28 nm.
On the bright side, another great review by ABT.
100% agreed with above comment!
Well, I’d say that GTX 460 1GB is almost like the 8800GT of its time, but only if you could find one for $150 with rebates.
Both companies are desperately trying to keep prices up. Now a $500 GTX 580 is starting to look a bit “mediocre” in some recent games like Metro 2033, Mafia 2, etc. The price to pay for eye candy on the PC is rather high, and many games are ports from consoles that are “several” years old, or a few PC generations behind.
I find it really misleading when AMD claims that the 6790 has 256-bit memory, when the sawed-off ROPs limit access to only half of the available bandwidth; the card behaves exactly as if it had a 128-bit bus. For more on this, if you want to discuss it on the forums here, I started a thread: http://alienbabeltech.com/abt/viewtopic.php?f=6&t=22733
I also believe that all Barts GPUs are VLIW4-based like the rest of Northern Islands. It’s something else that appears to be in a dimly lit area… when one shines a candle in that area, something just doesn’t look right.