Intel E6850 Bottleneck Investigation

24 Responses

  1. Leon Hyman says:

    Masterful review, BFG!!

    I tip my hat to you for finally debunking that CPU myth and proving to the naysayers that the GPU is always going to be the single most important component for gaming.

  2. Bassplayer says:

    So you want to say a quad core isn’t necessary for gaming, but didn’t compare it to a quad core?!

    Of course the graphics cards are going to affect it more, but clocking the chip up and down isn’t going to simulate more cores.

    You’ve proven that using an E6850 over a lower-clocked Core 2 Duo isn’t going to make much of a difference. This has nothing to do with quad cores.

    I agree with your assessment, but not your methodology.

  3. BFG10K says:

    Leon and Bassplayer, thanks for your comments.

    Bassplayer, I’ve shown that even an E6850 @ 2 GHz almost completely saturates a GTX285. If the GTX285 is the bottleneck then it doesn’t matter how many extra cores are thrown into the system, because the GPU still holds back performance in the same way.

    A GPU bottleneck isn’t somehow removed if you use more CPU cores. That’s why I don’t even need to test a quad-core, because I can already see the bottleneck with a dual-core.

    So whether I have ten cores or two cores, the GTX285 will still be the primary bottleneck. That’s the crux of the article.
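
    [Ed. note: BFG10K’s reasoning here amounts to a max() model of the render pipeline: each frame costs some CPU time and some GPU time, and the slower stage gates the framerate. A minimal Python sketch, using purely hypothetical per-frame costs rather than measured ones, illustrates why extra cores don’t help once the GPU term dominates:

        # Toy bottleneck model: the slower of the CPU and GPU stages
        # sets the frame time. All per-frame costs are hypothetical.
        def fps(cpu_ms, gpu_ms):
            return 1000.0 / max(cpu_ms, gpu_ms)

        GPU_MS = 25.0  # assume the GPU needs 25 ms per frame at these settings

        for label, cpu_ms in [("E6850 @ 3 GHz", 10.0),
                              ("E6850 @ 2 GHz", 15.0),  # 33% underclock = 1.5x CPU time
                              ("quad-core, perfect scaling", 5.0)]:
            print(f"{label}: {fps(cpu_ms, GPU_MS):.1f} fps")

    All three configurations print 40.0 fps: as long as the GPU’s 25 ms per frame dominates, a faster or wider CPU cannot raise the framerate.]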

  4. apoppin says:

    Bassplayer, please look at my recent review:

    http://alienbabeltech.com/main/?p=13034

    I tested the i7 920 vs. Q9550S vs. Phenom II 720 X3 – from their stock clocks up to 4.0 GHz, with 15 games at 16×10/19×12, maxed out plus 4xAA/16xAF – and my results support BFG10K’s conclusions.

    My current testing with HD 4870 X3 TriFire, adding the Phenom II 550 X2 and 955 X4, is showing no difference so far either. Not to mention that an earlier review I did comparing Phenom with Penryn showed the same trend.

    The only time a quad-core is useful in gaming is when the game is optimized for it. And so far, only a tiny handful of mostly RTS games really benefit from a quad.

  5. kiss4luna says:

    Hey BFG10K, thanks for the article.

    Could you test DCS: Black Shark, if it’s convenient?

    you can get it here:
    http://www.digitalcombatsimulator.com

  6. BFG10K says:

    kiss4luna, all of the games I test are purchased by me, which means I actually play them too. I don’t generally buy games just to benchmark them. Sorry.

  7. k00giking says:

    I agree that the CPU bottlenecks people talk about so much are very, very overblown.

    But they still do matter in gaming. Of course not as much as a video card, but they are still very important.
    The games you tested are older games. They don’t make decent use of more than two cores.

    Just about every recent release, however, is coded to take advantage of four or more cores. And there are some very LARGE performance differences even when playing at good resolutions and high settings. Recent Source and Capcom games will make use of more cores because their engines have been updated for multi-core support.

    I’m going to show a couple of examples.

    Resident Evil 5
    http://www.pcgameshardware.com/aid,690488/Processor-benchmarks-with-Resident-Evil-5-Core-i7-reigns-Phenom-strong-Update-Lynnfield-results/Practice/

    A 2.4 GHz Q6600 beats a 3.0 GHz E8400.

    The 3.1 GHz Phenom II X2 550 gets 54.2 FPS and the 3.0 GHz Phenom II X4 945 gets 81.5.

    The settings were 1680×1050 with everything maxed.

    DiRT 2

    http://www.pcgameshardware.com/aid,700780/Dirt-2-CPU-benchmarks-with-DirectX-9-and-DirectX-11-Phenom-doing-well-quad-cores-rule/Practice/

    Max details @ 1680×1050

    62% increase for quad cores over dual cores

    E8400 @ 3.0 GHz gets 37 FPS

    Q9650 @ 3.0 GHz gets 63 FPS

    Dragon Age Origins

    http://www.pcgameshardware.com/aid,698761/Dragon-Age-Origins-CPU-benchmarks-75-percent-boost-for-quad-cores/Practice/

    E6600 @ 2.4 GHz gets 28 FPS vs. the Q6600 @ 2.4 GHz, which gets almost double that at 49 FPS.

    The infamous Grand Theft Auto 4

    http://www.pcgameshardware.com/aid,669595/GTA-4-PC-CPU-benchmark-review-with-13-processors/Reviews/?page=2

    Prototype

    http://www.pcgameshardware.com/aid,688240/Prototype-CPU-Benchmarks-System-Requirements-and-Screenshots/Practice/

    Left 4 Dead 2

    http://www.pcgameshardware.com/aid,699110/Left-4-Dead-2-CPU-benchmarks-Phenom-II-very-strong/Practice/

    ARMA 2

    http://www.pcgameshardware.com/aid,687620/ArmA-2-tested-Benchmarks-with-18-CPUs/Practice/

    Batman Arkham Asylum

    http://www.tomshardware.com/reviews/batman-arkham-asylum,2465-9.html

    All of these games have been tested with high settings at very common gamer resolutions like 1680×1050.

    There are lots more, but I don’t feel like posting any more links; I think I’m able to make my point with these games. Just about every recent release comes with great quad-core support. I’d be more surprised if a recent game came out without it.

  8. BFG10K says:

    k00giking, I’ve checked all of your links and they can be categorized thusly:

    GTA 4: 1280×1024 with no AA or AF.
    ArmA 2: 1280×1024 with no AA.

    A quad-core i7 gamer isn’t going to be gaming at 1280×1024 with no AA. This is a completely unrealistic setting for any mid-range or high-end system.

    RE 5: 1680×1050 with no AA or AF.
    Dirt 2: 1680×1050 with no AA or AF.
    L4D 2: 1680×1050 with no AA or AF.

    Slightly better than above, but someone with a high-end quad-core i7 still isn’t going to be gaming at 1680×1050 with no AA or AF.

    Dragon Age: 1680×1050 with 4xAA.
    Prototype: 1680×1050 with 4xAA.

    Good AA, but I’d again question why at least 1920×1200 wasn’t being used. If their monitor tops out at such a low resolution then they should invest in a bigger monitor.

    Batman: 1920×1200 with no AA.

    A more realistic resolution for a high-end quad-core, but no AA is used. It’s realistic to expect someone using an i7 to be using at least 2xAA, if not 4xAA.

    The fact is, when games are configured to use the highest playable settings, the GPU usually becomes the primary bottleneck by far, even in the titles you listed.

    Thanks for your comments; I appreciate them.

  9. apoppin says:

    Oops, I accidentally deleted this comment by Marco Elsy:
    -ABT EiC apoppin

    “Unfair test. Leaving the GPU at stock and lowering the CPU sequentially would have given you quantitative results. All this proves is that underclocking your GPU by 33% gives you, on average, 33% less performance.

    Whereas sequentially lowering the CPU would have shown you what effect the clock speeds have on any given game.

    Nice try, shame the test was flawed.”

  10. BFG10K says:

    Marco, I fail to see why underclocking incrementally will paint a different picture.

    If a 33% CPU underclock shows little to no performance drop, what purpose would an incremental underclock of less than 33% serve?
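
    [Ed. note: in the same spirit as the toy model above, a quick sweep of hypothetical CPU clocks makes this concrete. Assuming CPU work per frame scales inversely with clock speed while the GPU cost stays fixed:

        # Sweep hypothetical CPU clocks against a fixed GPU cost.
        GPU_MS = 25.0          # hypothetical GPU cost per frame
        CPU_MS_AT_3GHZ = 10.0  # hypothetical CPU cost per frame at 3.0 GHz

        for ghz in (3.0, 2.8, 2.6, 2.4, 2.2, 2.0):
            cpu_ms = CPU_MS_AT_3GHZ * (3.0 / ghz)
            print(f"{ghz:.1f} GHz: {1000.0 / max(cpu_ms, GPU_MS):.1f} fps")

    Every step prints the same 40.0 fps: if the full 33% underclock shows no drop, no smaller intermediate step can show one either, because the CPU cost never overtakes the fixed GPU cost.]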

  11. shiggz says:

    Great article. It sure was nice to see some hard numbers put to these old questions. I recently read an X-bit article about swapping back and forth between a quad core and a dual core: framerates were equal, but the quad gave an overall smoother experience with less stuttering or hitching. So I finally sprang for a Q9400 and it has helped smooth some hitching for me. People like to point to quad cores re-encoding video or burning discs in the background, but I found that even something simpler, like a Usenet client unraring or a torrent client running in the background, makes for a smoother experience on a quad core.

    In short, quad cores are not needed for framerates in modern games, but a reasonably priced quad can be a nice improvement in smoothness if you have even moderate background tasks running.

  12. Geraldo Abbehusen says:

    Congratulations on this review; finally I found someone who proves the truth. Few people use 1280×800 or lower resolutions, and only in that case should you go for a Core i7 at 4.0 GHz.

    I am happy with my two-year-old Q6600 and will hold onto it for another two years :)

  13. BFG10K says:

    I have an i5 750 now (an article is coming soon) and I’m seeing the same thing as I saw in this article: almost no performance change from the CPU because my GTX285 bottlenecks me by almost 100%.

  14. Maurice says:

    This seems to prove to me that in CPU-heavy games, the processor is significantly limiting.

    9% in Crysis? That’s the difference between a 275 and a 285. It’s certainly notable.

    Now this normally only applies to enthusiasts with top-end cards, but as the new ATI 5-series and soon-to-be-released Fermi cards come through, there will be a far greater number of more powerful GPUs in the general population and more demanding games requiring an overclock or upgrade.

    We all know that the GPU is far more limiting than the CPU; we’ve known it since we first became interested in these things. But you’ve proved that in some games there is a CPU bottleneck with an E6850.

  15. BFG10K says:

    How is it significant when the GPU affected performance by 30% in the same game? 9% vs 30% isn’t even close.

    The 9% performance drop comes from lowering the E6850 to 2 GHz, which is the same 33% underclock the GPU experienced. In other words, the result is from a dual-core processor running at 2 GHz. That’s classed as a low-end CPU these days.

    So, we have a low-end CPU paired with a high-end single GPU, and the GPU still affects performance by a factor of more than three compared to the CPU.

    Additionally, here’s the same test, same settings, and same benchmark comparing an E6850 to an i5 750:

    http://alienbabeltech.com/main/?p=15920&page=3

    There was absolutely no performance change in Crysis between the two processors, because the GTX285 is the bottleneck by 100%. Meanwhile if I dropped a GTX480 into my system I’d see a massive performance gain in Crysis, even with my E6850 running at 2 GHz.

    The bottom line is this: I can upgrade my processor and see small performance gains in the odd title, or I can upgrade my GPU and see massive performance gains in every tested game. You’ll see this when I eventually benchmark a GTX480.

    It just makes sense to sink as much money into the GPU as possible, and not worry about the CPU too much. The two aren’t even close in terms of overall gaming influence.

    The problem is people don’t look at the full picture, and instead focus on fringe titles at unrealistic settings (e.g. 1280×1024 with 0xAA) just to convince themselves the CPU is a bigger influence than it really is.
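
    [Ed. note: a slightly richer toy model shows how the same 33% underclock can cost roughly 9% on the CPU side but roughly 30% on the GPU side. Assume each frame has a serial CPU portion that cannot overlap the GPU; the 6.2 ms / 25 ms split below is fitted to the quoted percentages, not measured:

        # Hypothetical per-frame costs: serial CPU work plus GPU work.
        def fps(serial_cpu_ms, gpu_ms):
            return 1000.0 / (serial_cpu_ms + gpu_ms)

        SERIAL_CPU_MS = 6.2
        GPU_MS = 25.0

        stock   = fps(SERIAL_CPU_MS, GPU_MS)
        cpu_cut = fps(SERIAL_CPU_MS * 1.5, GPU_MS)  # 33% CPU underclock
        gpu_cut = fps(SERIAL_CPU_MS, GPU_MS * 1.5)  # 33% GPU underclock

        print(f"CPU -33%: {(1 - cpu_cut / stock) * 100:.0f}% slower")  # ~9%
        print(f"GPU -33%: {(1 - gpu_cut / stock) * 100:.0f}% slower")  # ~29%

    With about 6 ms of serial CPU work against 25 ms of GPU work per frame, the numbers land where the article’s Crysis result does: the GPU underclock hurts more than three times as much as the CPU underclock.]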

  16. Fred says:

    I found the article incredibly interesting. I wonder if someone could advise?

    If I have a Q6600 gaming at its stock 2.4 GHz with an 8800GT at 1680×1050, would my CPU mimic the above findings if I purchased an upgrade in the form of a GTX 470?

    I’ve read so many forum posts telling me that I’m wasting my time buying a GTX 470 if my Q6600 is only running at 2.4 GHz.

    That doesn’t make sense to me. Either the GTX 470 will give me a substantial upgrade over the 8800GT or it won’t?

    Yes, it’s a question :)

  17. BFG10K says:

    Like I said in your forum topic (http://alienbabeltech.com/abt/viewtopic.php?f=6&t=20745), a GTX470 will still provide you with a substantial performance gain over your 8800 GT, even with your CPU at stock.

  18. taltamir says:

    Try this test with Supreme Commander, Flight Simulator X, and Mass Effect 2.

  19. tyranous says:

    I have an HD 5870 1GB + E6600 clocked to 2.88 GHz from 2.33 GHz, and I saw a very big improvement in games. My resolution is 1680×1024 (can’t go higher) and I was testing at 8xAA and 16xAF!

  20. Jaap Olsthoorn says:

    Hi there, great article, and one of the reasons I prefer investing in a video card over a processor.

    However, I recently purchased a GTX 460 1GB to replace my excellent 8800GT. My processor is an overclocking beast, an E2160 @ 3.2 GHz which, in most tests, performs about as well as an E6750.
    I have noticed no gain in FPS after the upgrade. There is absolutely no difference in many games (DiRT, Far Cry, Dawn of War 2) and only a small gain in Crysis.

    I believe I have a CPU bottleneck now. It’s either that or my card needs an RMA 😀 What do you think?

  21. BFG10K says:

    Jaap, thanks for dropping by. What settings do you play those games at? And how are you determining there’s no performance gain?

  22. Jaap Olsthoorn says:

    DiRT 2 @ 1680×1050 (max for my monitor) with 2xAA and everything on High.

    Crysis: 2xAA, same resolution, GPU bench.

    Dawn of War: most stuff on High, no AA since that will absolutely kill the framerate. (DoW 2 seems veeery CPU-dependent.)

    Mafia 2: everything on High; 0–4xAA doesn’t make much of an impact. DX9 had a huge impact though 😛

    Personally I don’t care much about AA so I usually leave it off. I did notice an impact from turning AA on in Assassin’s Creed, but it was quite minimal (1–2 fps on a 30 fps average).

    Anyway, I’ll be able to test it soon; I got a second-hand Q9550, and now I just hope I didn’t get it for nothing 😛

  23. BFG10K says:

    Jaap, 1680×1050 is a little on the low side to tax a GPU so it’s quite likely you’re CPU limited, especially if you don’t use AA.

    But if AA kills the framerate in DoW, you should find things are better on the GTX460.

  24. Jaap Olsthoorn says:

    It wasn’t. But DoW seems to be an incredibly unreliable benchmark for me: I got averages from 23 to 34 on the same machine without driver changes.

    Anyway, I’ve got the Q9550 @ 3.4 GHz now, and everything is much better: double the framerate in DiRT, 1.6× in AC2, almost double in Far Cry 2, and about 1.5× in Mafia 2. So yeah, at this resolution and these settings, I was CPU limited.

    Although I think it might also have been the amount of L2 cache; 1 MB isn’t much to write home about.
