Introducing AMD’s HD 6790
Lost Planet
Lost Planet: Extreme Condition is Capcom's PC port of an Xbox 360 game. It takes place on the icy planet of E.D.N. III, which is filled with monsters, pirates, big guns, and huge bosses. This frozen world showcases high dynamic range (HDR) lighting: the snow-white environment reflects blinding sunlight while DX10 particle systems toss snow and ice all around.
The game looks great in both DirectX 9 and 10, and there isn't much difference between the two versions except perhaps the shadows. Unfortunately, the DX10 version doesn't look noticeably better in actual play, and it still runs slower than the DX9 version.
For our runs we use the in-game performance test from the retail copy of Lost Planet, updated to the latest version through Steam. The test isn't completely scripted, as the creatures act a little differently each time, so multiple runs are required. The performance test runs Lost Planet's Snow and Cave demos continuously, blending one into the other.
Here are our results with the more demanding of the two tests, Snow. All in-game settings are fully maxed out, including 2x or 4xAA and 16xAF. Let's start at 1920×1200 with 2xAA.
This resolution is set too high for our target cards. Next we test at 1680×1050 with 4xAA:
This time, the stock HD 6790 comes pretty close to GTX 460 performance. The HD 6870 is edged by the GTX 570, while the HD 5770 convincingly beats all flavors of GTX 550 Ti. In fact, the HD 6870 is faster than our GTX 560 Ti.
I still think the 6790 is the 5830 of its generation. Too many cuts lead to a crippled chip that exists only because AMD marketing wanted to sell you a die that would otherwise have been thrown in the trash because it had too many defects to pass as a 68xx/69xx. That might be good marketing, but it's not a good deal for the buyer.
Beating the GTX 550 Ti is an accomplishment, sure, but not much of one, since the 550 is such a garbage card to begin with.
If you’ve got $150 to spend on a video card, just save up and buy a 6950 for $250. That extra $100 has a great deal of marginal value. As opposed to, say, the $150 delta between a GTX 570 and a 580, which is just throwing money away.
This generation of 40 nm GPUs has been rather underwhelming on the whole. Neither the red nor the green team produced a true spiritual successor to the 8800 GT. And with DX11 adoption at a virtual trickle, thanks to the negative effects of consolization, it would appear that progress will be slow until the next generation of consoles appears.
Bring on 28 nm.
On the bright side, another great review by ABT.
100% agreed with above comment!
Well, I’d say that the GTX 460 1GB is almost like the 8800 GT of its time, but only if you could find one for $150 with rebates.
Both companies are desperately trying to keep prices up. Now a $500 GTX 580 is starting to look a bit “mediocre” in some recent games like Metro 2033 and Mafia 2. The price to pay for eye candy on the PC is rather high, and many games are ports from consoles that are “several” years old, or a few PC generations behind.
I find it really misleading when AMD claims that the 6790 has a 256-bit memory bus when the sawed-off ROPs limit access to only half of the available bandwidth, so the card behaves exactly as if it had a 128-bit bus. For more on this, if you want to discuss it on the forums here, I started a thread: http://alienbabeltech.com/abt/viewtopic.php?f=6&t=22733
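To put some rough numbers on that point, here is a quick back-of-the-envelope sketch of theoretical GDDR5 bandwidth. The 4.2 Gbps effective data rate is the HD 6790's commonly quoted spec, used here as an assumption for illustration; actual throughput depends on far more than bus width.

```python
# Sketch: theoretical GDDR5 bandwidth = bus width (in bytes) * effective data rate.
# The 4.2 Gbps figure is the HD 6790's commonly quoted memory spec (an assumption
# for illustration, not a measurement).

def gddr5_bandwidth(bus_width_bits, effective_rate_gbps):
    """Theoretical memory bandwidth in GB/s."""
    return (bus_width_bits / 8) * effective_rate_gbps

full_bus = gddr5_bandwidth(256, 4.2)  # what the spec sheet advertises
half_bus = gddr5_bandwidth(128, 4.2)  # what halved ROP access would imply

print(f"256-bit on paper:    {full_bus:.1f} GB/s")
print(f"128-bit equivalent:  {half_bus:.1f} GB/s")
```

So the spec-sheet number works out to roughly double what the card can actually use if the ROPs only reach half the bus, which is the heart of the complaint above.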
I also believe that all Barts GPUs are VLIW4-based like the rest of Northern Islands. It’s another dimly-lit area: when you shine a candle on it, something just doesn’t look right.