Skylake-X And Kaby Lake-X Thread
#1
https://www.techpowerup.com/232272/intel...f-schedule
Quote:The rumor mill turns, and Ages come and pass, leaving memories that become legend. However, some of those rumors really do turn into reality, like recent accounts of an AMD Polaris 20 chip surfacing in the latest RX 500 series. This time, Intel is in the crosshairs, with the company's high-performance Skylake-X and Kaby Lake-X desktop components being pegged for release between June 19th and July 9th. That would place the announcement of the new chipset and CPUs around Computex 2017, which kicks off on May 30 and runs through June 3 in Taipei.

Skylake-X and Kaby Lake-X parts are supposed to use the same LGA 2066 socket, with Skylake-X said to include anywhere from six to 10 cores, support quad-channel DDR4 memory, and have a metric ton of PCIe 3.0 lanes. Kaby Lake-X parts, meanwhile, are reportedly limited to just four cores, dual-channel memory and just 16 PCIe lanes from the CPU - which gives the impression of simple Kaby Lake desktop CPUs being repackaged for the new socket.
Valve hater, Nintendo hater, Microsoft defender, AMD hater, Google Fiber hater, 4K lover, net neutrality lover.
#2
https://www.techpowerup.com/232709/intel...ks-surface
Quote:Leaks came from the SiSoft Sandra benchmark, again, which shows the Intel Core i7-7740K running on a Gigabyte X299 Gaming 3 motherboard at 4.2GHz base and 4.5GHz Turbo clocks. The X-series family of processors is expected to span a wide range of core configurations - 4, 6, 8, 10, and 12 cores - on the new X299 platform. At the same time, Intel will also have a Skylake-X part expected to top the line at 12 cores/24 threads with 44 lanes of PCIe Gen 3.
#3
Noctua confirms that Skylake-X and Kaby Lake-X will use Socket LGA 2066: http://techreport.com/news/31807/noctua-...aby-lake-x
#4
MSI teases what looks like an X299 motherboard: https://www.eteknix.com/msi-teases-upcom...set-based/
#5
http://techreport.com/news/31898/rumor-c...66-in-june
Quote:There's a lot to talk about here. Most profound is further fuel for the notion that Intel will once again be increasing the core count of its top-shelf Core-series processor, this time to 12 cores. The idea of having to buy a ten-core processor to run a pair of graphics cards with a full 16 lanes of PCIe 3.0 for each remains as annoying as ever.

A few discrepancies popped up between the information posted by Sweepr and the information in the image posted by dooon. Sweepr claims that the Skylake-X-based Core i9 processors will have TDP specs ranging up to 160W, but all of the chips in the image appear to say 140W. Sweepr also claims that the Skylake-X processors will have 1 MB of L2 cache per core. That would be four times as much as today's Skylake and Kaby Lake processors, and might help to explain the relatively paltry amount of L3 cache in the Skylake-X CPUs. The Core i7-6950X ten-core CPU has 25MB of L3 cache, for comparison's sake.

The information didn't come with any pricing details, so we'll have to wait and see what the bottom line looks like for these chips—assuming this information is even accurate. If so, that means we'll be seeing these parts next month save for the 12-core Core i9-7920X. That part is supposedly launching in August.
#6
https://www.techpowerup.com/233366/intel...ks-surface
Quote:These should be two of the top performing processors in Intel's line-up, and the i9-7900X (10-core) and 7920X (12-core) have been tested on integer and floating point calculations. The 10-core i9-7900X (3.1 GHz base frequency, no Turbo listed) scores 107 points in single-core benchmarks, and 1467 points in the multi-core test. The 12-core, 2.9 GHz base frequency 7920X, however, scores a head-scratchingly-higher 130 points in the same single-threaded benchmark, despite carrying the same architecture at... hmm... lower clocks. Maybe this processor's Turbo is working as expected, up to 3.25 GHz (average), and that's the factor behind the higher single-core performance?

The fact that these scores were sourced from userbenchmark.com means that there isn't much reason to compare them between processors - it's an environment lacking the controls usually found in reviews - so that may also be the reason for the discrepancy. Multi-threaded performance is more in line with what we'd expect to see, though, coming in at 1760 points. You should also note that the Core i9-7900X shows a base clock of 3.1 GHz, 0.2 GHz short of its official 3.3 GHz specification, which means we're probably not looking at final silicon. As always, and this should go without saying, take leaked benchmarks with some amount of salt (a pinch or a truckload, your mileage may vary).
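For what it's worth, a quick clock-normalization of the leaked scores backs up that skepticism. This is a rough sketch using only the figures in the quote (the 3.25 GHz average Turbo number is the article's guess, and points-per-GHz is a crude metric):

```python
# Per-GHz normalization of the leaked single-core scores.
# All figures come from the quoted leak; treat them as unverified.
scores = {
    "i9-7900X": (107, 3.1),   # (single-core score, reported clock in GHz)
    "i9-7920X": (130, 3.25),  # using the quoted 3.25 GHz average Turbo
}

for name, (score, ghz) in scores.items():
    print(f"{name}: {score / ghz:.1f} points/GHz")

# The 7920X still comes out well ahead per clock (~40 vs ~34.5
# points/GHz), so Turbo behavior alone can't explain the gap - which
# supports the article's point about the uncontrolled test environment.
```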
#7
https://www.techpowerup.com/233828/the-s...processors
Quote:A new, leaked slide on Intel's X-series processors shows 18, 16, 14, and 12-core configurations as being available on the upcoming X299 platform, leveraging Intel's Turbo Boost Max Technology 3.0. That feature is apparently only available on Intel's Core i9-7820X, 7900X, 7920X (which we know to be a 12-core part), 7940X (probably the 14-core), 7960X (16-core), and the punchline 7980XE 18-core processor, which should see a price as eye-watering as that name is a mouthful. There is also mention of a "Rebalanced Intel Smart Cache hierarchy".
#8
https://techreport.com/review/31986/inte...m-revealed
They range from 4C/4T to 18C/36T, and from $242 to $1999.
#9
https://www.techpowerup.com/233865/intel...e-soldered
Quote:If you had your eyes on those new Intel HEDT processors, which were posted just today with some... interesting... price-points, you'll be a little miffed to know that Intel has gone and done it again. The few cents per unit that soldering the CPU would add to the manufacturing costs of Intel's HEDT processors (starting at $999, tray-friendly prices) could definitely bring the blue giant into the red. As such, the company has decided to do away with solder even on its HEDT line of high-performance, eye-wateringly-expensive CPUs in favor of their dreaded TIM.
#10
http://www.tomshardware.com/news/asrock-...34586.html
I'm a little late to mention this, but this is another impressive feat by ASRock.
#11
http://www.anandtech.com/show/11463/inte...chitecture
Quote:Secondly, Kaby Lake-X uses a thermal interface paste, rather than the indium-tin solder that we have seen on HEDT processors in the past. This raises a number of points, such as that extreme overclockers will have to delid to get the best out of the processor, but it also sends a mixed message from Intel. Intel is consistently saying that they’re for enthusiasts, and want to provide the best solutions available, especially for the cutting edge of overclocking. Using a cheaper thermal paste over an indium-tin solder is one way to save 1 cent on a $350 CPU. Intel might argue that delidding a CPU is a common thing for extreme overclockers to do, and it makes little difference to the majority of users who will buy the processors. That may be true, but it doesn’t show commitment. Does Intel need to show commitment? It depends on how strong they want their marketing message to be at the end of the day.
#12
Linus is super unhappy about the X299 launch. Watch the whole thing:


#13
More discussion of X299:




#14
http://techreport.com/news/32081/intel-c...in-june-19
This is straight from Intel.
#15
4, 6, 8, and 10-core Skylake-X and Kaby Lake-X CPUs are shipping June 26: http://www.anandtech.com/show/11542/inte...ailability
#16
http://hexus.net/tech/reviews/cpu/107017...x/?page=10
Quote:Wouldn't it be nice if the world's leading chip giants launched new processor series that worked perfectly and made implicit sense?

That's the hope, but having witnessed AMD's Ryzen arrive with teething issues ranging from memory support to hesitant in-game performance, it is frustrating to find that Intel's Core X-Series isn't immune to certain missteps, either.

X299 motherboards don't appear to be quite ready, there are question marks surrounding the Skylake-X processors due later this year, and at the lower end of the Core X spectrum, Kaby Lake-X is nothing short of puzzling.

Plenty of reason for discord among enthusiast users, yet look past the confusion and you may see cause for optimism. The world's first Core i9 processor, the 7900X, is ultimately a 10-core powerhouse offering excellent IPC performance and outstanding multi-core prowess in a single $999 chip armed with plenty of overclocking headroom. Said ingredients make it an automatic choice for power users seeking the ultimate PC experience, and the chip's benchmarking potential makes for a fitting debut of the Core i9 brand.

Bottom line: the price tag remains a stumbling block and software optimisations are needed, but anyone willing to splash a thousand bucks on a new CPU need look no further than the Intel Core i9-7900X. Over to you, Threadripper.
#17
http://techreport.com/review/32111/intel...part-one/8
http://techreport.com/review/32111/intel...part-one/9
Quote:At idle, the X299 platform paired with the Core i9-7900X sips only a bit more power than the Core i7-7700K. Under load, however, the 7900X system consumes the most power at peak load by a wide margin. Of course, peak power draw only tells part of the efficiency story.

To really get a sense of how efficient the Core i9-7900X is, we need to take the task energy consumed over the course of our Blender benchmark into account. Not only does the Core i9-7900X cut 94 seconds off the Ryzen 7 1800X's bmw27 render time, it does it while expending only just a bit more power to do so. That's impressive performance per watt.
...
Our value scatter tells the entire story of the Core i9-7900X: if your workload scales to many threads, this chip is generally the one to run it on. The server version of Skylake delivers an unusually large performance boost for a modern Intel CPU revision in many tasks. Core for core and thread for thread, the already-beastly Core i7-6950X can sometimes lag the 7900X in the range of 10% to 20%. All that oomph comes for a jaw-dropping $724 less than the 6950X's initial suggested price, too. Competition is a wonderful thing.

In a milestone for Intel's high-end desktop platform, the Core i9-7900X mostly ends the tradeoff between single-threaded swiftness and multi-threaded grunt typical of some older Intel high-end desktop chips. For lightly-threaded workloads, the i9-7900X's improved Turbo Boost Max 3.0 behavior lets it trail our single-thread-favorite Core i7-7700K by only a few percentage points at most. In typical desktop use, then, the i9-7900X and its TBM 3.0-enabled brethren should feel about as snappy as their mainstream desktop cousins. I need to get the i9-7900X paired up with a GeForce GTX 1080 Ti soon to see whether that single-threaded performance translates to similar gameplay smoothness.
...
We've always been loath to recommend the top-end CPU in Intel's high-end desktop family (and yes, that is this chip for the moment). Despite the Ryzen-inspired price reshuffling that's coming with Core X, the i9-7900X still isn't a great value. The star of the Core X lineup may actually be the Core i7-7820X, whose eight cores and 16 threads have clocks similar to those of the i9-7900X. You may lose a couple of cores in the bargain, but even so, the i7-7820X should perform better than a Ryzen 7 1800X for not that much more money. We hope to play with one of these more attainable Skylake-X CPUs soon.

Of course, the performance of the Core i9-7900X is beyond question: it's the fastest single-socket CPU we've ever tested. The X299 platform may need a little polish yet to let Core X chips really shine, but the performance bar the i9-7900X is already setting promises an exciting standoff this summer as AMD prepares its Ryzen Threadripper CPUs for launch. If you need as many cores and threads as possible from your desktop, times have never been more exciting. Stay tuned as we see whether the i9-7900X has got game.
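TR's "task energy" metric above is just average system power multiplied by time-to-completion, which is why a chip that draws more at the wall can still be the more efficient renderer. A minimal sketch of that arithmetic (the wattage and render-time figures below are hypothetical placeholders, not TR's measurements):

```python
# Illustration of the "task energy" efficiency metric: a faster chip
# that draws more power can still consume less total energy per job.
def task_energy_wh(avg_power_w: float, seconds: float) -> float:
    """Energy consumed over one run, in watt-hours."""
    return avg_power_w * seconds / 3600

# Hypothetical example numbers, chosen only to show the tradeoff:
fast_hot = task_energy_wh(avg_power_w=250, seconds=300)   # ~20.8 Wh
slow_cool = task_energy_wh(avg_power_w=180, seconds=500)  # 25.0 Wh

# Despite a 70W higher draw, the faster chip finishes the task on
# less total energy - the shape of TR's Blender result.
print(f"fast/hot: {fast_hot:.1f} Wh, slow/cool: {slow_cool:.1f} Wh")
```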

http://www.tomshardware.com/reviews/inte...92-11.html
http://www.tomshardware.com/reviews/inte...92-12.html
Quote:As mentioned, we had to use Alphacool's Eiszeit Chiller 2000 to achieve usable overclocking results. More conventional thermal solutions just wouldn't cut it. All-in-ones like Corsair's H100i and Enermax's LiqTech 240 hit their limits at stock frequencies under Prime95. The custom loop threw in the towel at 4.6 GHz.

Why can't those liquid coolers keep up with a CPU like the -7900X? Back in the day, a normal all-in-one was good enough to keep the Core i7-5960X running cool, even overclocked to 4.8 GHz. We measured power consumption numbers of up to 250W back then. So, why did we have to force a constant 20°C in the loop to even start experimenting?

The reason that Skylake-X is so much harder to cool traces back to the thermal paste Intel chose to use instead of solder between the processor die and heat spreader. Although paste is cheaper, it's also less than ideal for cooling performance.

Intel’s digital temperature sensors report reliable results from 35 to 40°C and up, prompting us to only include values above that range in our analysis. The difference between the water cooling block's temperature, which is held at a constant 20°C, and the CPU temperature reported by Intel's sensors shows just how bad of a choice thermal paste really was.

We measured the CPU heat spreader’s temperature the same way we did when AMD launched Ryzen 7 1800X, by using a thin copper plate. The resulting curve shows very clearly that waste heat can't be dissipated quickly enough. A solution good enough for a thermal lightweight like Intel's Core i7-7700K just doesn’t work for Core i9-7900X.


This curve represents the temperature delta, which is to say the thermal difference between Core i9-7900X’s cores and the top of its heat spreader. The outcome is shocking:

In the end, the delta between the cores and top of the heat spreader reaches 71°C, and that's using one of the best cooling setups money can buy. Naturally, lesser thermal solutions start running into trouble at stock frequencies when you run a stress test.

To illustrate our point, we plotted the temperature for all of the Core i9-7900X’s cores at stock settings running Prime95 or LuxRender. A good custom water-cooling loop does fairly well, which shouldn't come as a surprise. However, no other thermal solution will be able to keep up. Even the motherboard manufacturers we spoke to agree, telling us about their all-in-one liquid coolers running out of headroom as soon as they ran Prime95 without limiting AVX.

A Tcore of up to 65°C and a heat spreader temperature of approximately 24°C make for a difference of more than 40°C. That's at 230W. Once the 300W line is crossed, even the Alphacool Eiszeit Chiller 2000 taps out. This isn’t even difficult to do: with a Core i9-7900X running at 4.6 or 4.7 GHz, using the voltages needed to get there, even simple rendering applications trigger those levels. The highest power consumption numbers we saw were just north of 300W, which had the CPU hitting its 100°C thermal limit consistently. An emergency shutdown followed soon after.

Next, we measured power consumption under a constant load using different coolers. For a temperature increase of approximately 40°C, power consumption increases by five percent. This isn’t just an acceptable result, but a really good one. The values above 100°C are not as reliable due to throttling. Consequently, we made an exception and used a low-pass filter that smoothed out the brief decreases.

Everything could have been great, if it wasn't for the thermal paste between Intel's die and heat spreader. Admittedly, most workstation or semi-pro users won't overclock, cutting down on the number of customers affected by this problem. But we all know that affluent enthusiasts attracted to Skylake-X's balance between high frequencies and core counts will have to face a significant cooling challenge. Your choices come down to high-end all-in-one packages or a custom water-cooling loop. Air cooling is completely out of the question if you expect the -7900X to run comfortably under full load.

Intel’s market dominance burdens the company with certain expectations when it launches new hardware. Naturally, we expect more performance. And although we're quick to deride incremental updates, forward progress is what we want to see. At no point is a step backward alright in our books, and we saw a handful of those in today's tests.
...
As it stands, aggressive Turbo Boost frequencies and a re-balanced cache hierarchy go a long way to improving on Broadwell-E's minor weaknesses. When the Core i9-7900X does well, it really shines. Often, the chip beats every competitor we throw up against it, including Core i7-6950X. In other workloads, latency imposed by its mesh topology causes Core i9 to stumble. That isn’t to say performance falls off completely. But we do see anomalies unfitting of a $1000 CPU. If you're strictly a gamer, Core i9-7900X won't make you want to buy a new CPU, motherboard, and memory kit.

Enthusiasts also want to see robust overclocking capabilities, and Skylake-X does offer a higher frequency ceiling than Core i7-6950X. You're going to cope with a lot of heat in the process, though. Given Intel’s insistence on using thermal paste between its die and heat spreader for longer-term reliability, the processor can’t dissipate heat as effectively, so thermal performance becomes a limiting factor. Plan on investing in a beefy open loop if you want to push the Core i9-7900X much further than its stock frequencies.

Core i9-7900X performs well in our productivity, workstation, and HPC tests. The mesh-imposed disparities aren't as pronounced in those benchmarks. But we also have re-run Ryzen 7 1800X benchmarks to think about. Pressure to size up has pushed AMD's flagship down to $460, less than half of what a Core i9-7900X would cost. While Intel may capture the top 1% of high-end enthusiasts with Skylake-X, everyone else has to consider whether Ryzen may be the smarter buy.

Moreover, AMD's upcoming Threadripper CPU has to have Intel worried. How do we know? The X299 motherboards we used needed firmware updates to address very serious performance issues right up until launch. Intel didn't seem nearly as ready for Skylake-X's introduction as we'd expect. A number of Core i9s with even more cores won't be ready until later this year. However, it looks like Intel couldn't get the four-, six-, eight-, and 10-core models out fast enough. They'll ship later this month.

Unfortunately, this story won't be ready to wrap up until we have Threadripper to compare against. Given Core i9-7900X’s high price and performance caveats, enthusiasts should probably hold off on a purchase until we know more about the competition, even if Skylake-X looks like a bigger step forward than Intel's past HEDT designs.
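Tom's numbers imply a rough effective thermal resistance through the TIM. Treating the quoted ~40°C core-to-heatspreader delta at ~230W as ΔT/P gives about 0.17 °C/W, and extrapolating that to the 300W mark shows why even the chiller gave up (a back-of-the-envelope sketch from the quoted figures only; real dies aren't a single lumped resistance):

```python
# Back-of-the-envelope: effective die-to-heatspreader thermal
# resistance from the quoted ~40 deg C delta at ~230W package power.
def thermal_resistance(delta_c: float, power_w: float) -> float:
    """Lumped thermal resistance in deg C per watt."""
    return delta_c / power_w

r = thermal_resistance(40, 230)   # ~0.174 deg C/W through the TIM
delta_at_300w = r * 300           # ~52 deg C projected delta at 300W

print(f"R = {r:.3f} degC/W, projected delta at 300W = {delta_at_300w:.0f} degC")
# With ~50 deg C lost between die and heat spreader, even a 20 deg C
# coolant loop leaves the cores far above ambient - consistent with
# the article's 100 deg C throttling at just north of 300W.
```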
#18
http://www.gamersnexus.net/hwreviews/296...ere/page-6
Quote:That aside, we can look at the i9-7900X from the perspective of our benchmarks. Intel claimed that the CPU would excel in multi-stream output scenarios, and it appears that the company was accurate in this statement. Our game stream testing shows fewer than 1% dropped frames with reasonably high quality streaming to two services, which is more than can be said for most other CPUs on the market. Intel’s done a good job here; however, this does come with its caveats. One of them is that, as is always the case with encoding on the gaming system, frametime variability goes up and potentially impacts the playback fluidity. This would mostly be noticeable for competitive streamers who are working with games like CSGO, DOTA2, or Overwatch. In these instances of demanding consistent frametimes, we’d still recommend a secondary capture machine to remove all doubt. In instances where the capture and gaming must be done on the same system, the i9-7900X makes it possible to multi-stream while still carrying gameplay at a reasonable clip. It’s just a matter of that frametime consistency.

For a single stream output, Ryzen still does quite well. The R7 1700 would overclock to readily match an 1800X in our tests and didn’t drop frames in our Twitch-only testing. The 7900X is a bit superfluous for single-game streaming, but does make multi-streaming possible in a way which hasn’t been as affordable before.

Intel gets praise here, as it is deserved. The lowered 10C price makes that performance all the more reasonable for streamers working with a single box. If that frametime consistency isn't a concern, or if working with lower framerate games than what we've tested so far (again: DOTA, CSGO), then Intel has managed to build and price a CPU that would allow a cheaper streaming+gaming build than two individual machines. There's just still a reason for a capture machine, depending.

But there are plenty of use cases where the 7900X makes no sense. VR gaming is one of those, despite Intel’s marketing. The i9-7900X is a monumental waste of money for VR gaming, given its performance is not only imperceptibly different from both the i7-7700K ($330) and R7 1700, but also technically worse than the $330 7700K. “Technically,” of course, because that objective lead by the 7700K isn’t really relevant -- no user will ever see or experience that difference, as all three CPUs output 90Hz to the HMD and do so within runtime limitations. That said, there’s no reason to buy a worse product for more money. We made that exact point with the 1800X review, and we’re making that point again now. The i9-7900X has no place in a machine built only for VR gaming. Buy any $300 CPU instead. Maybe there’s something to be said for “VR content creation,” but then you’re really entering into the realm of a productivity/office machine – there are other benefits to that, like reduced encode/rendering times, that are proven to be beneficial.

And speaking of those productivity tasks, we did see a substantial uptick generationally for Intel’s *900 series HEDT CPUs. The 7900X completed our Blender render in approximately 26% less time than the stock i7-6900K, and in approximately 23% less time than an overclocked R7 1700 CPU. Overclocking the i9-7900X to 4.5GHz (1.275 Vcore) furthered that, improving us another ~12% over the 7900X stock CPU. That time is valuable to the right audience, like a production studio or environment where money is made on time regained. For a lot of folks, though, like enthusiast artists, part-time freelancers, and folks not pulling a full salary on this kind of work, the R7 1700 CPU isn’t too terribly far behind and runs significantly cheaper on both the CPU and platform. It’s also a lot more memory and platform-limited, naturally, but that’s where you’ve got to use your own judgment. If the $700 is worth the additional memory support and speed, it probably means you’re making enough money on this work to write it off as an important expense. If that’s not the case, save the money and go with an R7 1700 CPU, spend 15 minutes on a trivial overclock, and be happy with competitive performance at one-third the price.

Both CPUs have their use cases and audiences, is the point.
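One detail worth spelling out from GN's Blender numbers: the ~26% (vs. stock 6900K) and ~12% (overclock vs. stock 7900X) render-time reductions compound multiplicatively, not additively. A quick sketch using only the quoted percentages:

```python
# The two quoted render-time reductions compound multiplicatively:
# ~26% less time than a stock i7-6900K, then another ~12% from
# overclocking the i9-7900X to 4.5GHz.
time_vs_6900k = 1 - 0.26       # stock 7900X: 74% of the 6900K's time
time_oc_vs_stock = 1 - 0.12    # OC 7900X: 88% of the stock 7900X's time

combined = time_vs_6900k * time_oc_vs_stock   # 0.74 * 0.88 = 0.6512

print(f"OC 7900X needs ~{combined:.0%} of the 6900K's render time "
      f"(~{1 - combined:.0%} less), not 26% + 12% = 38% less.")
```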
#20
http://techreport.com/news/32120/msi-mak...orsair-h75
Quote:MSI is offering a little relief to both builders' wallets and Core-X CPU temperatures with its latest promotion, offering a free Corsair Hydro Series H75 AIO liquid cooler with the purchase of select MSI X299 motherboards at Newegg.
#21
https://www.extremetech.com/gaming/25122...gaming-cpu
Quote:That’s the conclusion PC Gamer came to during their work testing the new Skylake-X, and it’s a conclusion we’d generally agree with based on our tests of previous Intel CPUs. It’s not that you can’t game on HEDT chips — in fact, we’ve seen great results from both Ryzen 7 and the Core i7-6900K so far this year. But these cores are fundamentally designed to be workstation products and aren’t solely focused on gaming.

Here’s the basic issue: Jump back 10 years, and you’ll find plenty of multi-threaded software. But most of it was confined to professional markets rather than focusing on the consumer space. Consumer software that was multithreaded, meanwhile, tended to be dual-threaded or quad-threaded at most. If professional applications like Maya, 3ds Max, HPC software, and video rendering workloads have tended to be at the forefront of adopting multi-core support, games have lagged behind by a significant margin.

Many titles now support quad cores readily enough, but scaling above that point has been anemic, for several reasons. First, game developers tend to optimize for the most common usage scenarios, and most laptops are still dual-core systems. In fact, the split on Steam shows that while quad-core chips have a small advantage overall, a huge chunk of the market is still on CPUs with two physical cores (some of these will be Hyper-Threading-capable Core i3s, but not all of them).

It’s long been known that game developers optimize software for a wide range of GPUs, to ensure that games can run properly on the largest amount of hardware possible. But optimizing for CPUs is important, too. With AMD now offering six-core / 12-thread chips far below Intel’s price points and quad-core / eight-thread chips for ~$169, we should start to see more support for these additional threads, but it’s going to be a slow ramp.
...
But that said, we’d still recommend striking a balance between higher clock speeds and thread counts if you want to game, as opposed to leaping for the most expensive HCC processors on the market. Don’t assume that more cores always equals faster performance — and given the relatively slow rate at which games have added additional core support, don’t count on this suddenly changing in the next few years. DirectX 12 does offer some unique options for taking advantage of additional CPU resources, but it’s still going to be a few years before we see games targeting that API (and its capabilities) as a primary focus. DX11 and DX9 support have a large market presence and that’ll take time to replace.
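The "more cores doesn't always equal faster" point falls out of Amdahl's law: if only part of a frame's work parallelizes, extra cores hit a hard ceiling. A short illustration (the 60% parallel fraction is an illustrative assumption, not a measurement from the article):

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the
# parallelizable fraction of the work and n is the core count.
def speedup(parallel_fraction: float, cores: int) -> float:
    return 1 / ((1 - parallel_fraction) + parallel_fraction / cores)

p = 0.60  # hypothetical: 60% of a game's frame work parallelizes
for n in (2, 4, 10, 18):
    print(f"{n:2d} cores: {speedup(p, n):.2f}x")

# Prints 1.43x, 1.82x, 2.17x, 2.31x - going from 4 to 18 cores buys
# very little when a large serial fraction dominates the frame time,
# which is why HEDT core counts are wasted on most current games.
```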
#22
https://lanoc.org/review/cpus/7566-intel...l=&start=3
Quote:For temperature testing, I retested all of the CPUs in this review using AIDA64's stability tester with just an FPU load to heat things up the most. Now remember, these numbers aren't perfect: you have to rely on unreliable software/in-CPU results, combined with things like AMD adjusting the numbers by 20 degrees as well. That said, I included them all just for reference. The i9-7900X didn't do too badly with a peak of 58 degrees using the Noctua cooler, but the i7-7740X, like the i7-7700K, didn't do very well. If I were just going by these numbers I would guess that the 7740X was using TIM under the heatspreader and the 7900X was soldered, but everything (pre-launch) is saying that both will have TIM, so I'm not sure why it would register so much lower beyond untrustworthy software-based testing. So take these numbers with a grain of salt right now.
#23
http://www.gamersnexus.net/guides/2963-i...benchmarks

Check out Elric's reason number 5:


#24
Speaking of X299 motherboards: http://www.anandtech.com/show/11550/the-...-tested/17
Quote:The gaming story is unfortunately not quite as rosy. We had last minute BIOS updates to a number of our boards because some of the gaming tests were super underperforming on the new Skylake-X parts. We are told that these early BIOSes are having power issues to do with turboing, as well as Intel’s Speed Shift technology when the GPU is active.

While these newer BIOSes have improved things, there are still some remaining performance issues to be resolved. Our GTX1080 seems to be hit the hardest out of our four GPUs, as well as Civilization 6, the second Rise of the Tomb Raider test, and Rocket League on all GPUs. As a result, we only posted a minor selection of results, most of which show good parity at 4K. The good news is that most of the issues seem to happen at 1080p, when the CPU is more at fault. The bad news is that when the CPU is pushed into a corner, the current BIOS situation is handicapping Skylake-SP in gaming.

I'm going to hold off on making a final recommendation for gaming for the moment, as right now there are clear platform problems. I have no doubt Intel and the motherboard vendors can fix them – this isn't the first time that we've seen a new platform struggle at launch (nor will it be the last). But with pre-orders opening up today, if you're a gamer you should probably wait for the platform to mature a bit more and for the remaining gaming issues to be fixed before ordering anything.
#25
http://www.gamersnexus.net/hwreviews/296...-it/page-4
Quote:There’s no reason to buy this CPU. It’s a 7700K – which is a perfectly fine piece of silicon – but on a more expensive platform. Coupling the 7740X with an X299 results in a hamstrung, crippled motherboard the likes of which further complicates an already complex landscape of HSIO. X299 and the HEDT CPUs, like Skylake X, do have a place on the market. We can’t find a place for the 7740X, and anyone thinking that they might buy one and upgrade later should instead consider just going Z-series + 7700K. It’ll be cheaper, it won’t offer features that will go unused (those aren’t free – you pay for those features), and will ultimately offer the same performance. If going for X299, go full HEDT.

Taking a half-step will only stand to waste more money, as upgrading from a $330-$350 CPU doesn’t make much sense. An upgrade from a G4560 to a 7700K might make sense, but that’s a sub-$100 part.

We’re also not clear on why KBL-X must exist in its LGA 2066 form factor; Intel was not able to supply an adequate answer for this. If the 7740X and 7640X were simple refreshes on the existing 200-series motherboards, that’d be a completely different story – there’s nothing wrong with a slight bump and a refresh, particularly at the same price. There is something wrong, though, with a refresh of the same hardware unnecessarily relocated to a new platform (which must be purchased), while also stripping features out of the original product (the IGP). Sure, the IGP doesn’t really get used – but there’s no reason to remove it and then charge the same price. Doubly so if Intel’s argument is that one can upgrade from KBL-X to SKY-X; if that’s the argument, let buyers live on the IGP in the meantime, too.

Hard pass. If this CPU interests you, we’d suggest the 7700K instead. That’s still a good processor, it’s on a more mature chipset and platform, and it makes far more sense than these. If X299 interests you, go HEDT or consider Ryzen for production and CPU rendering workloads.
#26
And X299 motherboards are still having issues, though not as bad as Ryzen's initial issues: http://www.pcgamer.com/the-ongoing-testi...-i9-7900x/
#27
https://www.techpowerup.com/234744/intel...r-der8auer
Quote:It would seem Intel's X299 platform is already having some teething issues, with user "der8auer" of overclocking fame claiming the platform is essentially a complete "VRM disaster." In the video in which these claims are made, he splits the blame between Intel and the motherboard manufacturers "50/50." For Intel's part, he blames the rushed launch schedule, pulled in from August to June, giving the motherboard manufacturers, in der8auer's words, "almost zero time for developing proper products."

In the video, der8auer describes a complete lack of consistency in the quality of VRMs and their heatsinks across manufacturers. In his first test, he takes a CPU that is known to do 5.0 GHz and, on a Gigabyte Aorus-branded mainboard, finds himself unable to even hit 4.6 GHz, with dangerously high VRM temperatures. He blames the heatsinks on the VRMs, going so far as to call the Gigabyte solution more of a "heat insulation" device than a cooler, as a small fan blowing over the bare VRM array did far better than a standard install with the stock VRM cooler attached.

der8auer also went on to criticize the power input on many boards, which offer only a "single 8-pin connector" that he claims is not nearly enough. He measured cable temperatures of nearly 65 degrees Celsius on the 8-pin cable, which is obviously disconcerting, though TechPowerUp has been in discussions with renowned PSU tester Jon Gerow (Jonnyguru), who feels the "all-in-one" cable design of the Super Flower PSU shown in the video may be partly to blame for that heat level at that current draw. It's hard to tell which part is more at fault for that temperature, and we will update this as we know more.
...
One thing is for certain: the VRM situation is far from consistent at this point in time, and overclocking results on one board may not carry over to another. Heatsinks may be inadequate, and as far as overclocking is concerned, it may get interesting, folks, and not in a good way. In the end, der8auer concluded he couldn't give a solid recommendation for any of the launch boards that crossed his desk, all of them having one VRM heat issue or another at some point.
#28
https://www.techpowerup.com/234922/updat...m-disaster
Quote:We have some updated information on the X299 platform's VRM issues from the same overclocker who initially discovered them, renowned overclocker der8auer. In an updated YouTube video, der8auer first updates viewers on his testing techniques, and concludes that all issues initially detected (throttling included) persist even after extensive testing; in some instances it is difficult to detect not only whether you are throttling, but even specifics such as what precisely is throttling. He goes into extensive detail, but a brief summary of the video's main points can be found below for your consumption.
#29
http://www.tomshardware.com/reviews/-int...117-4.html
Quote:Ultimately, we’re looking at power consumption numbers similar to some high-end graphics cards when we start messing with Skylake-X. AMD’s FX-9590 doesn’t even come close to these results, if that means anything to you. This means motherboard manufacturers need to start spending money on better components and cooling solutions to take care of those components. Otherwise, long-term reliability will be hard to guarantee. Ultra-durable and military-class components don't have to be exclusive to top-end products; they can bolster mid-range platforms, too.

Motherboard manufacturers could have and should have known that Intel's Skylake-X CPUs would consume power indiscriminately, in spite of the company's laughably low TDP specifications. Everyone has access to Intel's datasheets, not just us.

Different motherboards will be affected to different degrees by our findings, and the one we tested isn't a flagship model by any means. You shouldn't generalize our results to mean there's an impending VRM-related disaster hanging over the heads of all Skylake-X owners. Boards from every manufacturer across a number of price points need to be tested before such a claim can be made. What we do know is that, although the problem originates under Intel's heat spreader, fault is also found with motherboard manufacturers as well.

We already have a high-end motherboard on its way from the same manufacturer, and we'll report our findings after running the same battery of tests.
...
So, what’s the bottom line? Intel is pushing the envelope once again with a factory-overclocked Xeon processor doing double-duty as a high-end desktop masterpiece. We're getting the sense, though, that the revered Core architecture can't be pushed much further. Everything works well enough this time around, at least. And if Intel hadn't chickened out and put thermal paste between its die and heat spreader, there might have been a happier ending for everyone involved in this story.

As it stands, even a custom water-cooling loop has to throw in the towel at 250W, long before most motherboard voltage converters hit their limits. Under normal operating conditions, the CPU, and not the motherboard, always throttles first.

Nevertheless, motherboard manufacturers aren’t blameless when it comes to the issues we encountered at launch and continue battling today. Using more thermodynamic expertise and less flashy plastic pieces would have paved the way for brawnier motherboards at the same price points. This would have ended the speculation before it even started. Anything designed to be just good enough always leaves you with a bad aftertaste, particularly since you never know when you might need a little extra headroom.
#30
https://www.techspot.com/review/1442-int...page4.html
Quote:Going into this review, we suspected Kaby Lake-X wouldn't be anything too impressive, mostly because these quad-core CPUs have no place on a high-end desktop platform that they can't fully drive. The fact that they provide weaker performance and greater power consumption than the original Kaby Lake processors only adds insult to injury.

These performance issues could be blamed on the X299 board we used and motherboard manufacturers sure appear to be somewhat of a scapegoat for this sloppy release. That said, the Asrock X299 Taichi appeared flawless for me when testing the higher-end 7800X, 7820X and 7900X CPUs.

After recording such odd results I looked around to see what other reviewers found during their testing. Mixed results are the norm with Kaby Lake-X being slower for the most part across a range of hardware. However, the confusing power consumption figures I recorded haven't been seen by everyone.
...
Here we are five weeks later and I'm confident in suggesting that you shouldn't buy a Kaby Lake-X CPU. Glad to have that off my chest.
#31
http://techreport.com/news/32245/rog-ram...-x299-vrms
Quote:Now Hartung is back in a new video discussing how some simple changes on Asus' ROG Rampage VI Apex motherboard avoid these temperature and power limits. While his comments reflect his feelings about Asus' overclocking-focused board specifically, the broader takeaway is that relatively simple improvements could be incorporated into X299 boards from all manufacturers to improve VRM temperatures and performance headroom.
...
The professional overclocker presents test results that show his i9-7900X-plus-Rampage-VI-Apex test system stable with a 4.9 GHz overclock. The CPU was pulling a hair-raising 340W from the VRMs in this setup, but the system wasn't throttling due to the temperatures of that circuitry. The processor was limited instead by the thermal dissipation capacity of Hartung's liquid cooler. Attaching a small fan to the VRM heatsink reduced VRM temperatures from a hot-but-within-component-specification 103° C to a much-less-alarming 87° C under these conditions.
#32
ASUS is revising the VRM for the ROG Rampage VI Apex: https://www.techpowerup.com/235255/the-v...ge-vi-apex
#33
https://www.techpowerup.com/235267/bench...for-gaming
Quote:However, what really paints Intel's i7-7800X in a bad light is that its performance continues to lag even when it has a frequency advantage over the 7700K. As you can see in the performance metrics, a Core i7-7800X overclocked to 4.7 GHz (a 500 MHz advantage over the 7700K's stock clock and 200 MHz over its Boost clock) still performs slower than the 7700K. The stock 7700K delivers 5% higher minimum and maximum framerates than the 7800X, despite being clocked lower, having a far smaller L2 cache, and having about the same total L3 cache (which works out to about 30% less L3 cache available per core). And the 7800X's lower frame rates come with 41% higher idle power consumption and 23% higher gaming power consumption. Check the source link for detailed benchmarks. Given all this, it seems that while Intel likes to take digs at AMD for its "glued-together" desktop dies repurposed for servers, Intel's 7800X, which has its cache hierarchy and core interconnect re-architected for servers, may be little more than a repurposed server CPU for the desktop crowd...
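For anyone wondering where the ~30% figure comes from, it follows from the per-core arithmetic (cache sizes per Intel's published specs: the i7-7700K has 8 MB of shared L3 across 4 cores, the i7-7800X has 8.25 MB across 6 cores). A quick sketch to check it:

```python
# Per-core L3 comparison between the i7-7700K and i7-7800X.
l3_per_core_7700k = 8.0 / 4    # 8 MB shared across 4 cores -> 2.0 MB/core
l3_per_core_7800x = 8.25 / 6   # 8.25 MB shared across 6 cores -> 1.375 MB/core

# Fractional deficit of the 7800X relative to the 7700K.
deficit = 1 - l3_per_core_7800x / l3_per_core_7700k
print(f"{deficit:.0%}")  # -> 31%, i.e. roughly the "about 30%" quoted above
```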
#34
http://www.tomshardware.com/news/intel-c...35038.html
Quote:The Threadripper 1920X also retails for only $799, which is quite the savings compared to the Intel twelve-core model, which weighs in at $1,189. It's notable that the Intel price list represents a whopping $10 price reduction compared to Intel's previously announced Core i9-7920X pricing.
...
On another note, the new price list also indicates that Intel isn't reducing prices in the face of the Ryzen onslaught. Unfortunately, the "% Decrease" column is barren. Intel did lower per-core pricing of its new Core i9 models compared to the previous generation, and many predict it will follow the same tactic as it releases newer models. At least it's something.

The renewed CPU wars may not be lighting a pricing fire, but it's apparent that Intel is attempting to stave off the Threadripper onslaught. We've reached out to Intel for an official release date and will update as more information comes to light.
#35
At time index 00:44, "a 2nd generation DDR2 memory controller" ROFLMAO