Introducing AMD’s HD 6790
Call of Juarez
Call of Juarez is one of the earliest DX10 games. It is loosely based on the Spaghetti Westerns that became popular in the 1960s and early 1970s. The game uses Techland's Chrome Engine with Shader Model 4.0 under DirectX 10. Our benchmark is built into Call of Juarez: it runs a simple flyby of a level created to showcase the game's DX10 effects. It offers good repeatability and serves as a good stress test for DX10 features in graphics cards, although it is not quite the same as actual gameplay, since the game logic and AI are stripped out of the demo.
Running the Call of Juarez benchmark is easy. You are presented with a simple menu to choose resolution, anti-aliasing, and two shadow quality options. We set shadow quality to "high" and the shadow map resolution to the maximum, 2048×2048. At the end of the run, the demo reports the minimum, maximum, and average frame rates, along with the option to quit or run the benchmark again. We always ran the benchmark at least a second time and recorded that generally higher score.
Here are Call of Juarez DX10 benchmark results, first at 1920×1200 (there is no 2560×1600 option available in the benchmark):
Now we test at 1680×1050:
Here the HD 6790 beats the GTX 550 Ti, which in turn leads the HD 5770. The GTS 450 brings up the rear, while the GTX 460 is solidly faster than either of the new cards.
I still think the 6790 is the 5830 of its generation. Too many cuts lead to a crippled chip that exists only because AMD marketing wanted to sell you a chip that would otherwise have been thrown in the trash bin because it had too many defects to pass as a 68xx/69xx. That might be good marketing, but it's not a good deal for the buyer.
Beating the GTX 550 is an accomplishment, sure, but not much of one, since the 550 is such a garbage card to begin with.
If you’ve got $150 to spend on a video card, just save up and buy a 6950 for $250. That extra $100 has a great deal of marginal value. As opposed to, say, the $150 delta between a GTX 570 and a 580, which is just like throwing money away.
This generation of GPUs at 40 nm has been rather underwhelming on the whole. No true spiritual successor to the 8800 GT from either the red or green team. And with DX11 adoption at a virtual trickle, thanks to the negative effects of consolization, it would appear that progress will be slow until the next generation of consoles appears.
Bring on 28 nm.
On the bright side, another great review by ABT.
100% agreed with above comment!
Well, I’d say the GTX 460 1GB is almost like the 8800 GT of its time, but only if you can find one for $150 with rebates.
Both companies are desperately trying to keep prices up. Now a $500 GTX 580 is starting to look a bit “mediocre” in some recent games like Metro 2033, Mafia 2, etc. The price to pay for eye candy on the PC is rather high, and many games are ports from consoles that are “several” years old, or a few PC generations behind.
I find it really misleading when AMD claims that the 6790 has 256-bit memory when the sawed-off ROPs limit access to only half of the available bandwidth; the card behaves exactly as if it had a 128-bit bus. For more on this, if you want to discuss it on the forums here, I started a thread: http://alienbabeltech.com/abt/viewtopic.php?f=6&t=22733
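To put numbers on that claim, here is a minimal sketch of the standard theoretical-bandwidth arithmetic (bus width divided by 8, times the effective per-pin data rate). The 4.2 Gbps GDDR5 data rate is the HD 6790's published memory spec, assumed here rather than taken from this article; the point is only how much the halved figure differs from the quoted one.

```python
def theoretical_bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bytes) x per-pin data rate."""
    return bus_width_bits / 8 * gbps_per_pin

# The 256-bit figure AMD quotes vs. the 128-bit behavior described above,
# both at an assumed 4.2 Gbps effective GDDR5 data rate (1050 MHz base).
full_bus = theoretical_bandwidth_gbs(256, 4.2)
half_bus = theoretical_bandwidth_gbs(128, 4.2)

print(f"256-bit: {full_bus:.1f} GB/s")  # 134.4 GB/s on paper
print(f"128-bit: {half_bus:.1f} GB/s")  # 67.2 GB/s if only half is usable
```

If the ROP cutback really does gate access to half the bus, the card's usable bandwidth lands in HD 5770 territory rather than at the marketed 256-bit number.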
I also believe that all Barts GPUs are VLIW4-based like the rest of Northern Islands. It’s something else that appears to be in a dimly-lit area; when one shines a candle on that area, something just doesn’t look right.