AMD Making iGPU For Intel Mobile 8th Gen CPU
#1
http://www.tomshardware.com/news/amd-int...35852.html
Interestingly, this was Intel's idea.
Quote:The dawn of the chiplet marks a tremendous shift in the semiconductor industry. The industry is somewhat skeptical of the chiplet concept, largely because it requires competitors to arm their competition, but the Intel and AMD collaboration proves that it can work with two of the biggest heavyweights in the computing industry. Not to mention bitter rivals. Industry watchers have also largely been in agreement that EMIB would not filter down to the consumer market for several years, but the announcement clearly proves the technology is ready for prime time.

DARPA initially brought the chiplet revolution to the forefront with its CHIPS (Common Heterogeneous Integration and Intellectual Property (IP) Reuse Strategies) initiative, which aims to circumvent the limitations of the waning Moore's Law.

Intel plans to bring the new devices to market early in 2018 through several major OEMs. Neither Intel nor AMD have released any detailed information, such as graphics or compute capabilities, TDP ratings, or HBM2 capacity, but we expect those details to come to light early next year.
#2
https://www.extremetech.com/gaming/25883...canyon-nuc
Three Hades Canyon units are planned, with TDPs up to 96W. Also, one of the commenters linked to this leaked graph from PCPer:
[Image: core-radeon-leak.png]
#3
First sighting of AMD's iGPU, called Vega M: https://www.techpowerup.com/239923/intel...systeminfo
#4
https://techreport.com/news/33014/furthe...union-leak
Quote:Intel's leak also confirmed that the i7-8809G boasts Radeon RX Vega graphics power alongside its Intel CPU. Intel didn't reveal any information about the processing resources or graphics memory available from the Radeon RX Vega M GH processor, but the package power figure and our back-of-the-napkin divvying-up of that figure suggest that this could be a Radeon RX 550 or Radeon RX 560-class GPU.

We already know that RX Vega parts tend to scale well down the voltage-and-frequency curve from our experience with Vega desktop cards' power profiles, and we've already seen how well eight Vega compute units perform in a 25 W package aboard the Ryzen 5 2500U APU, so it's possible that drawing conclusions about this GPU's weight class from Polaris chips is pessimistic. We'll really need to wait for further details to peg the position of this chip in the GPU hierarchy.
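Quick back-of-the-napkin math on that split, for anyone curious. This is just my own rough sketch: the 100 W package figure matches the "target package TDP" quoted further down, but the CPU share is a guess on my part, not anything Intel has stated.

Code:
# Hypothetical split of the i7-8809G's package power budget
package_tdp_w = 100             # "target package TDP" for the whole MCM
assumed_cpu_share_w = (35, 45)  # guessed range for the quad-core CPU portion

gpu_budget_w = tuple(package_tdp_w - cpu for cpu in assumed_cpu_share_w)
print(gpu_budget_w)  # -> (65, 55) W left for the Vega M GPU plus HBM2,
                     # which lines up with the RX 550/560-class estimate above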

https://www.techpowerup.com/240133/intel...e-dual-igp
Quote:Things get interesting with the way Intel describes its integrated graphics solution. It mentions both the star-attraction, the AMD Radeon RX Vega M GH, and the Intel HD Graphics 630 located on the "Kaby Lake" CPU die. This indicates that Intel could deploy a mixed multi-GPU solution that's transparent to software, balancing graphics loads between the HD 630 and RX Vega M GH, depending on the load and thermal conditions. Speaking of which, Intel has rated the TDP of the MCM at 100W, with a rider stating "target package TDP," since there's no scientifically-correct way of measuring TDP on a multi-chip module. Intel could build performance-segment NUCs with this chip, in addition to selling them to mini-PC manufacturers.

Specifications of the RX Vega M GH continue to elude us. All we know is that it has its own 4 GB HBM2 memory stack over a 1024-bit wide memory interface, ticking at 800 MHz (204.8 GB/s memory bandwidth), and a GPU engine clock of 1.19 GHz. Even if this chip offers performance in the neighborhood of the discrete Radeon RX 570 4 GB, it should make for a killer entry-level gaming solution. Motherboards based on it could quickly capture the gaming iCafe, entry-gaming PC, and performance AIO markets.
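If anyone's wondering where the 204.8 GB/s figure comes from, it falls straight out of the bus width and memory clock. Quick sketch below; the factor of 2 assumes HBM2's double data rate.

Code:
# HBM2 bandwidth arithmetic for the Vega M GH's single 4 GB stack
bus_width_bits = 1024   # 1024-bit interface
clock_hz = 800e6        # 800 MHz memory clock
ddr_factor = 2          # HBM2 transfers data on both clock edges

bandwidth_gb_s = bus_width_bits * clock_hz * ddr_factor / 8 / 1e9
print(f"{bandwidth_gb_s:.1f} GB/s")  # -> 204.8 GB/s, matching the quoted spec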
#5
Intel NUC with AMD iGPU launching in March, codenamed Hades Canyon: https://techreport.com/review/33042/inte...revealed/4
#6
LTT has a good five-minute video summary of Hades Canyon; the chip reportedly hit 5 GHz in the lab.


#7
Intel won't be sticking with AMD forever: https://www.techpowerup.com/240625/intel...ctic-sound

Nvidia responds: https://www.techpowerup.com/240629/nvidi...intel-emib
#8
https://hothardware.com/news/intel-core-...umd-mx-150
Quote:We've got some results from multiple Rise of the Tomb Raider benchmark runs to give you all a taste of what the Radeon RX Vega M GL GPU can do. The XPS 15 2-In-1 we tested here was configured with a Core i7-8705G processor (3.1GHz base, 4.1GHz boost), the aforementioned Radeon RX Vega M GL graphics engine, 16GB of DDR4 memory, and an ultra-fast NVMe SSD.

Running at 1920x1080 resolution, the XPS 15 2-in-1 was able to maintain an average frame rate of nearly 35 frames per second with High image quality settings dialed in (29.69 on Very High in the video above). Not bad, for a roughly 4.5 pound machine that measures only 16 mm thick. Compared to a similar 8th Gen Core system with Intel's own integrated UHD 620 graphics, it was no contest. Even on Medium quality settings, the Intel UHD 620 was only able to manage about 8 frames per second. In fact, Intel's own 8th Gen IGP can't even run the game on High IQ because it runs out of frame buffer memory.

In our video above, we're fairly certain that the 29.69 frames per second run on the XPS 15 2-in-1 was accomplished using the Very High IQ setting (re-confirming with Dell). For further comparison, an 8th Gen Core processor paired with an NVIDIA MX 150 GPU was able to put up an average frame rate of around 23 frames per second in the same benchmark but at High IQ settings.
#9
https://techreport.com/news/33317/report...uc-surface
Quote:Of course, what you're here to see is the performance. Playwares put the Hades Canyon NUC through a variety of both synthetic and real-world game benchmarks at stock clocks and also with a bit of GPU overclocking. Most of the tests use a resolution of 1920x1080 with "ultra" or "very high" in-game settings. At those settings, the performance of the Radeon RX Vega M GH is a bit of a mixed bag. Playwares describes the chip's performance as falling between desktop GeForce GTX 1050 Ti and GTX 1060 cards, or around where an RX 570 sits.

It's important to remember that the system in question is an engineering sample and may not have fully-tuned firmware or software. It's also worth remembering that this is a NUC. Equipped with two SO-DIMMs and a Samsung 960 EVO SSD, Playwares' sample of the NUC8i7HVK drew just 140 W from the wall on average with a peak of 155.3 W. Even overclocked, the power draw only peaked at 186 W. The NUC comes with a 230 W power adapter that should be solid for the lifetime of the machine. Curiously, Playwares reports that the CPU did not throttle even as it approached 110° C.

Overall, Playwares seems to have come away from Hades Canyon impressed with the little machine. The site remarks that the NUC gets very noisy when overclocked, but also says that since the RX Vega M GH's HBM2 runs at a relatively low clock rate, overclocking it is "essential." Playwares ultimately says the system's gaming performance is adequate in general and impressive given its size. The Google translation of the article is fairly comprehensible, so even if you don't read Korean you still might enjoy heading over to Playwares and checking out the review. Thanks to TechPowerUp for the tip-off.
#10
https://hothardware.com/news/intel-hades...s-salivate
Quote:All of this sounds good, but we really need to see pricing to determine how viable a solution Hades Canyon will be for enthusiasts and gamers. Given that cryptocurrency miners have pretty much raided supplies of current generation GPUs from the low-end to the high-end, it will be interesting to see how Intel prices Hades Canyon (and Kaby Lake-G in general). With a relatively potent GPU under the hood, it could be an attractive proposition for gamers and crypto-miners alike who otherwise might not be able to secure a GPU to further their mining obsessions, if the price is right.
#11
Wow, it's as fast as a GTX 1060 in Tomb Raider. That is super impressive. These chips are going to sell very well, and it's a big blow to Nvidia.
#12
http://www.tomshardware.com/reviews/inte...536-9.html
Quote:Intel aims its NUC8i7HVK at enthusiasts with unlocked CPU, GPU, and HBM2 ratio multipliers. We observed solid stock performance from the svelte little system, and even managed to overclock effectively. But thermal constraints kept us from truly tapping into the maximum potential of every component. For gaming, the best results came from tuning the GPU and HBM2, while productivity-oriented apps responded best to higher CPU frequencies.

It's fair to say that the NUC8i7HVK is fast enough to deliver smooth frame rates at 1080p using high-quality settings in most games. If you want to run at a higher resolution or know your favorite games are more demanding than the ones we tested, then you may have to hold out for a return to normalcy with graphics card pricing. Otherwise, the NUC 8 VR's AMD Vega-based GPU comes close to an Nvidia GeForce GTX 1060 in some of the platform-bound situations we discovered (and after overclocking).

Support may prove to be an interesting challenge for Intel, and we'll have to keep an eye on how the company handles driver updates. We're told that day-zero game drivers will become a thing, but they'll naturally need to originate from AMD. Whether Intel spends time validating that software before pushing it live remains to be seen. Enthusiasts will expect nothing less than timely optimizations for new titles, along with stability on par with Radeon RX Vega add-in cards.

Intel and AMD's competition with Nvidia in their respective fields may have been the impetus for a truly surprising cooperative effort. But the end result is an incredibly powerful solution packed into a very small form factor. While we wouldn't suggest that enthusiasts try replacing their gaming rigs altogether with Intel's NUC8i7HVK, this compact platform makes for an interesting alternative to mainstream machines with mid-range GPUs currently selling at inflated prices.
#13
http://www.tomshardware.com/news/kaby-la...36844.html
Quote:In any case, this adds to a growing body of evidence that Kaby Lake-G's Radeon graphics may be more Polaris than Vega. Truthfully, it's all a bunch of branding buzzword bingo, anyway. And underlying architectures may not mean much to gamers beyond their correlation to performance. That said, we do want to see companies stay consistent with their messaging so consumers can be sure of what they're getting. Transparency helps as well, and while we're not quite pointing fingers here, both AMD and Intel could have done a better job explaining just what is (and isn't) included in these intriguing new chips.
#14
https://www.gamersnexus.net/hwreviews/32...-nuc8i7hvk
Quote:Let’s get the non-political stuff out of the way first: We found Intel’s (and AMD’s) Hades Canyon NUC to possess incredible processing power for its form factor, particularly for gaming. Performance of the high-end unit puts it between GTX 1050 Ti and RX 470 performance levels, which is hands-down impressive for the Vega M GH unit that’s included. On a CPU level, Intel’s contribution to the unit places it between i5-8400 and i5-8600K levels of performance in production (Blender) workloads, which is also admirable for a unit with the thermal and power limitations of a NUC.
...
On the political side, Hades Canyon has a lot of interesting implications for the industry. We recently commented that nVidia’s partnership with ARM (announced at GTC 2018) seemed like a stab at AMD and Intel. Intel’s partnership with AMD allows the companies to couple what each does best – Intel’s full-solution engineering and CPUs with AMD’s significantly higher-powered graphics solution. Together, the two fiercely compete with an industry largely owned by nVidia. NVidia’s SFF box partners, like Zotac with the Zbox, suddenly find themselves embroiled in a battle with two silicon makers who are out for blood. This also has implications for Kaby Lake G and Vega M in portables, like laptops, where nVidia presently holds the vast, overwhelming majority of the gaming market. Kaby Lake G and Vega M work well together. If Intel and AMD can keep the daggers behind their backs for long enough to stake some of the laptop market, nVidia will undoubtedly feel something it hasn’t felt in a long time: competition.
#15
https://techreport.com/review/33506/inte...eviewed/16
Quote:Finally, Hades Canyon is quiet at stock speeds. Our sound meter registered between 36 dBA and 39 dBA from the NUC at 18" (0.5 m) from the system. It's totally possible to enjoy headphone-free gaming with Hades Canyon if you like, and it won't drive other people crazy under load if they're sharing a space with you. Overclocking this NUC makes a racket, though it's a wonder that a system this small can be overclocked at all.

Most surprisingly, this fully-unlocked NUC doesn't even sell for that much of a premium over similarly-specced Mini-ITX systems. By our reckoning, a complete Hades Canyon NUC is just about 11% more expensive than a Mini-ITX PC with similar features, an overclocking-friendly motherboard and processor, and comparable graphics performance. What's really incredible is that even with a diminutive Cooler Master Elite 110 housing that Mini-ITX system, the NUC is just 1/13 the volume (power supply excluded). That's amazing performance-per-liter, even if you do have to find somewhere to hide the NUC's power brick.

All told, Hades Canyon nails everything we want in a tiny gaming PC and then some. Now that Intel has competent gaming hardware to play with, the company will need to back it up with the day-one driver support gamers expect from AMD and Nvidia. Given that AMD's driver team is providing support to Intel behind the scenes, the blue team shouldn't have a major challenge keeping up, but we won't know for certain until some time passes. For the moment, I'm not aware of anything else quite as small or fast or quiet as the NUC8i7HVK, and Intel's fine execution on all of those points make this system an easy TR Editor's Choice.
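For a sense of how TR's volume comparison shakes out, here's a rough sketch using chassis dimensions I'm recalling from memory, so it lands around 1/12 with these numbers rather than TR's measured 1/13; close enough to show the scale.

Code:
# Approximate external volumes, treating both chassis as simple boxes (mm)
nuc_hades_canyon = (221, 142, 39)   # Hades Canyon NUC, external power brick not counted
elite_110 = (260, 280, 208)         # Cooler Master Elite 110 Mini-ITX case

def litres(dims_mm):
    w, d, h = dims_mm
    return w * d * h / 1e6

ratio = litres(elite_110) / litres(nuc_hades_canyon)
print(f"NUC: {litres(nuc_hades_canyon):.2f} L, "
      f"Elite 110: {litres(elite_110):.1f} L, ratio ~1/{round(ratio)}")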