Intel i5 750 Performance Test: 2 cores vs 4 cores
Introduction
The topic of how much CPU power you really need frequently surfaces in tech forums all over the Internet. In past articles I’ve demonstrated that the CPU makes little to no difference to gaming performance, provided you always use the highest playable settings. In these situations the GPU is far more likely to bottleneck the system than the CPU: raising image quality settings generally leaves the CPU load unchanged while increasing the GPU load.
I demonstrated this by underclocking my E6850 to 2 GHz and then comparing the effects to underclocking my GPU by the same amount (33%). In most cases I witnessed a linear performance drop from the GPU with little to no performance drop from the CPU, indicating the GPU was the primary bottleneck. I then upgraded my E6850 to an i5 750 and again observed little to no performance changes in a large selection of games at the settings I play them at.
What I’m going to do today is compare two cores versus four cores on a quad-core i5 750 running at stock speed (2.66 GHz). Since we’re comparing core count and nothing else, Turbo Boost will be disabled. This is necessary because the two-core configuration is more likely to turbo than the four-core configuration, which would introduce an unknown variable into the testing.
As usual, the settings used will be the actual settings I play the games at, and in most cases the games have been played from start to finish using them. I have a high-end single GPU (GTX470) and as such I generally run 1920×1200 with 2xAA as a minimum, and I’ll gladly reduce other game settings to attain it if needed.
Hardware
- Intel Core i5 750 (2.66 GHz, two and four cores enabled, Turbo Boost disabled).
- 4 GB DDR3-1333 RAM (2×2 GB, dual-channel).
- Gigabyte GA-P55-UD3 (Intel P55 chipset, F6 BIOS).
- nVidia GeForce GTX470 (1.28 GB, reference clocks).
- Creative X-Fi XtremeMusic.
- 30” HP LP3065.
Software
- Windows 7 (64 bit).
- nVidia driver 197.41, high quality filtering, all optimizations off, LOD clamp enabled.
- DirectX June 2010.
- All games patched to their latest versions.
Settings
- 16xAF forced in the driver, vsync forced off in the driver.
- AA forced either through the driver or enabled in-game, whichever works better.
- Highest quality sound (stereo) used in all games.
- All results show an average framerate.
1920×1200
In this set of games we see absolutely no performance difference between two cores and four cores, and I’m only using 2xAA. Clearly the GTX470 completely bottlenecks performance here.
2560×1600, Part 1
Yet again the two extra cores make absolutely no difference to performance except in Far Cry 2, which is known to benefit from more than two cores. In this case there’s a 6.39% performance gain.
Alternatively, I could swap my GTX470 for a GTX480 and gain roughly 30% (according to online benchmarks). Thus a “dual-core” i5 750 + GTX480 would yield far better gaming performance than a full quad-core i5 750 + GTX470. Feel free to substitute any other reasonably clocked dual-core (e.g. an E8600) for my “dual-core”.
The 30% performance gain from the GPU would likely allow moving from 2xAA to 4xAA while still keeping a higher framerate than we started with. Meanwhile, going from 45.04 FPS to 47.92 FPS (as the chart shows) has a negligible effect on gameplay.
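The percentage figures quoted throughout this article are simple ratio arithmetic. Here’s a quick sketch in Python, using the Far Cry 2 numbers from the chart above (the ~30% GTX480 estimate is from online benchmarks, not my own testing):

```python
# Sketch of the percentage arithmetic used in this article.
# FPS values are the Far Cry 2 results from the chart above.

def percent_gain(baseline_fps, new_fps):
    """Return the percentage speedup of new_fps over baseline_fps."""
    return (new_fps / baseline_fps - 1) * 100

# Two cores vs four cores (Far Cry 2):
cpu_gain = percent_gain(45.04, 47.92)
print(f"Four-core gain: {cpu_gain:.2f}%")  # ~6.39%

# Hypothetical GTX470 -> GTX480 swap, assuming a ~30% GPU speedup:
gpu_fps = 45.04 * 1.30
print(f"Estimated GTX480 framerate: {gpu_fps:.2f} FPS")
```

The point the numbers make is that the estimated GPU upgrade lifts the framerate by roughly five times as much as the two extra cores do.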
Once again, the fringe case where the CPU does make a difference is vastly eclipsed by the difference the GPU makes. So ask yourself: do those extra two cores really matter here?
2560×1600, Part 2
Here we see the same trend of two cores being just as fast as four cores, except in Unreal Tournament 3, which shows a 6.99% performance gain. This is similar to what we saw in Far Cry 2, but with the GTX480 being about 25% faster than the GTX470 in UT 3 (according to online benchmarks), exactly the same reasoning applies. The CPU’s extra performance in this case simply doesn’t matter in the context of what a graphics card upgrade can give you.
And yes, we’ve all seen Left 4 Dead benchmarks showing 140 FPS vs 110 FPS at 1680×1050 with no AA or AF on a quad versus a dual, but honestly, running the game at such settings on a GTX470 is foolish. Nobody should be doing that. 2560×1600 with 16xAF and 4xTrSS is vastly more immersive, and I played the game from start to finish at such settings.
Conclusion
Once again we see that the CPU makes little to no difference in a wide range of situations if you use the highest playable settings. Out of 21 tested games, even the two that showed slight performance gains (Unreal Tournament 3 and Far Cry 2) would’ve seen far bigger differences from swapping a GTX470 for a GTX480, in terms of performance and/or image quality.
It’s reasonable to expect anyone with a single high-end GPU like a GTX470 to be running something like 1920×1200 with 2xAA, while CF/SLI owners will obviously go higher, like 2560×1600 with 4xAA. After all, that’s the whole point of buying high-end graphics cards. Nobody buys high-end graphics cards to run games at 1280×1024 with no AF or AA so they can show four cores getting 200 FPS while two cores “only” get 150 FPS.
If you’re currently on a decent dual-core platform with a low- or mid-range video card, absolutely do not be afraid of upgrading to a high-end graphics card if you have a 1920 (or better) monitor and/or you like using AA. When games are configured to use the highest playable settings, in the vast majority of cases the graphics card will influence gaming performance the most, often to the point of completely bottlenecking the system.
Of course if you’re building a brand new rig from scratch, get yourself a mid-range quad-core (e.g. an i5 750) so you can future-proof yourself for 2-3 years. In addition, it goes without saying that you should buy the fastest graphics card you can afford.
It should be noted that these findings do not apply to BFBC2. You need a quad core for that one. Everyone knows that.
fausto: yes, it’s true, BFBC2 can use more than four cores. But the question is, does it matter? It didn’t matter in the case of my UT 3 and Far Cry 2 results, and both games can use more than two cores too.
Here are some 5870 Crossfire BFBC2 benchmarks:
http://www.techspot.com/article/255-battlefield-bad-company2-performance/page7.html
At 2560×1600 with 2xAA (bottom graph), there’s a complete flatline because the GPU bottlenecks the system at ~73 FPS, even with an i7 920 underclocked to just 2.22 GHz. This is on 5870 CF, which is a very powerful GPU system; a single GPU will flatline much earlier, perhaps even at 1920×1200.
Now here are some CPU scaling tests in the same game:
http://www.legionhardware.com/articles_pages/battlefield_bad_company_2_tuning_guide,7.html
Remember the 73 FPS flatline on the 5870 CF system above? All it takes is a Core i3 540 to reach it. They’re admittedly two different benchmarks, but my point still stands.
Provided you run your games at the highest playable settings, any decent dual-core CPU is capable of saturating your graphics system into being the primary bottleneck in the vast majority of cases.
I hope people understand this is the fundamental point I’m trying to get across in all of these CPU articles.
Anyway, thanks for your comments.
Would have liked to see more 1920×1200 and 1920×1080 results. Most people fall into these resolutions and few are using a 30″ monitor at 2560×1600. 30-inch monitors are still out of reach for many because they’re pretty darn expensive.
I’m doubting there would be more than a 5% difference in most games at 1920×1080, but maybe a few would be 10% or so.
Methinks GTA IV should be included here. My E6750 is at 3.5 GHz, up from 2.66. I was messing around inside my PC, turning it on and off, and the BIOS thought there were issues so it reset to defaults. I didn’t notice until I launched GTA IV and performance was shit. After I re-applied my OC it performed like I remember. GPU was a GTX 275 at 1920×1080, 16xAF with optimal settings.
I suspect that if you included a few RTS games, like Supreme Commander Forged Alliance, where CPU matters more, you’d find that it does matter in some cases, although whether or not it justifies the costs (rather than spending the difference on a better GPU) is open for debate. It all depends on what you need your CPU for.