Vega Thread
#81
https://www.techpowerup.com/236632/amd-r...erformance
I'd be skeptical, just like TPU is.
#82
Koduri admits that Vega is not optimized for gaming: https://www.eteknix.com/amd-vega-not-optimised-gaming/
TPU adds its own observations: https://www.techpowerup.com/236697/on-am...tweetstorm
#83
http://www.gamersnexus.net/guides/3040-a...wer/page-2
Quote:The primary takeaway is that CUs have far less impact on Vega's performance than raw clocks do. In some tests, we didn't even have to bypass the power limit in order to surpass stock V64 – but doing so helps keep up as V64 becomes overclocked, something we'll look into more later. A 50% offset and a modest 9% / 980MHz OC gets us to V64 performance levels. In some games, the extra 50% power (going to 100% offset, increasing current to 30-33A at 12.3V) pushes us toward and into double-digit percentage gains over the V56 OC, while other games give us ~7-8%. It just depends on the game, it turns out, but results are promising in some instances.

One thing we've learned for sure is that V64 is hardly worth considering. Fifteen to twenty minutes of overclocking even a reference V56 – let alone a partner model – gets us to V64 stock performance. The power mods just make it that much better. They're probably not worth it in 95% of use cases, but the fact that AMD provided an insanely over-built VRM really does invite the play. Might as well make use of it.
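For anyone checking GN's numbers, here's the back-of-envelope arithmetic on that current figure (a minimal sketch, assuming the 30-33 A is current drawn from the 12 V input rail at the quoted 12.3 V, i.e. power before VRM losses):

Code:
# Rough input-power estimate for the V56 power-mod figures quoted above.
# Assumption: 30-33 A is current into the core VRM at 12.3 V.
def input_power_watts(current_a, voltage_v=12.3):
    # P = I * V on the 12 V rail
    return current_a * voltage_v

for amps in (30, 33):
    print(f"{amps} A @ 12.3 V -> {input_power_watts(amps):.0f} W")
# 30 A @ 12.3 V -> 369 W
# 33 A @ 12.3 V -> 406 W

Roughly 370-400 W into the core alone, which is why GN keeps stressing that over-built VRM.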
#84
https://www.techpowerup.com/236748/rx-ve...eum-mining
Quote:Now granted, Vega's strength in mining tasks - Ethereum in particular - stems mainly from the card's use of HBM2 memory, as well as a wide architecture with its 4096 stream processors. By setting the core clock to 1000 MHz, the HBM2 memory clock to 1100 MHz, and the power target at -24%, Reddit user S1L3N7_D3A7H was able to leverage Vega's strengths in Ethereum's PoW (Proof of Work) algorithm, achieving 43 MH/s with just 130 W of power (104 W of that for the core alone). For comparison, tweaked RX 580 graphics cards usually deliver around 30 MH/s with 75 W core power, which amounts to around 115 W power draw per card. So Vega is achieving 43% more hash rate with a meager 13% increase in power consumption - a worthy trade-off if miners have ever seen one. This means that Vega 64 beats RX 580 cards in single-node hash-rate density: miners can pack more of these cards into a single system for a denser configuration with much higher performance than a similarly specced RX 580-based mining station. This was achieved even without AMD's special-purpose beta mining driver, which has seen reports of graphical corruption and instability - the scenario could improve for miners even more with a stable release.

Moreover, S1L3N7_D3A7H said he could probably achieve the same mining efficiency on a Vega 56, which isn't all that unbelievable - memory throughput is king in Ethereum mining, so HBM2 could still be leveraged in that graphics card. It seems that at least some of that initial Vega 64 stock went into miners' hands, as expected. And with this news, I think we'd be forgiven for holding on to our hats in the expectation of increased Vega stock (at the original $499 Vega 64 and $399 Vega 56 MSRP) come October. Should the user's claims about RX Vega 56 efficiency be verified, and ceteris paribus in the mining-algorithm landscape for the foreseeable future, we could well be waiting until Navi enters the scene for respectable inventory.
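The hash-per-watt comparison, using only the figures quoted above:

Code:
# Efficiency comparison from the numbers in the TPU article.
cards = {
    "RX Vega 64 (tuned)": (43.0, 130.0),  # (MH/s, total board watts)
    "RX 580 (tweaked)":   (30.0, 115.0),
}
for name, (mhs, watts) in cards.items():
    print(f"{name}: {mhs / watts:.3f} MH/s per W")
# RX Vega 64 (tuned): 0.331 MH/s per W
# RX 580 (tweaked):   0.261 MH/s per W

v64, r580 = cards["RX Vega 64 (tuned)"], cards["RX 580 (tweaked)"]
print(f"hash rate: +{(v64[0] / r580[0] - 1) * 100:.0f}%")  # +43%
print(f"power:     +{(v64[1] / r580[1] - 1) * 100:.0f}%")  # +13%

That's where the 43%-more-hash-for-13%-more-power claim comes from, and the arithmetic checks out.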
#85
https://www.techpowerup.com/236831/psa-f...ck-shaders
Quote:When TechPowerUp released GPU-Z v2.3.0 earlier this week, AMD Radeon RX Vega 56 users who had flashed their graphics cards with the video BIOS of the higher-end RX Vega 64 discovered that their reported stream processor count had shot up from 3,584 toward 4,096. Some of these users felt this more or less explained the performance jump experienced after the BIOS flash. Some users even saw incorrect stream processor counts on their untouched reference-design RX Vega 56 cards. TechPowerUp GPU-Z v2.3.0 incorrectly reports the stream processor count of flashed RX Vega 56 graphics cards, and of some RX Vega 56 graphics cards out of the box, due to an under-the-hood bug in the way it reads the registers of AMD's new GPUs. We are working on an update to GPU-Z that will fix this bug.
#86
https://www.techpowerup.com/236826/do-in...l-sanction
Quote:Over the past couple of months, inflation in AMD Radeon GPU prices, fueled in part by silicon shortages and in part by non-gamers (read: crypto-currency miners) buying up graphics cards, has hurt the AMD Radeon brand in the eyes of its target audience - PC gamers and graphics professionals. It was initially believed that market forces were driving the inflation and that AMD had little to do with it. We then uncovered a clue that not just end-users but even retailers were being sold AMD Radeon graphics cards at prices way above AMD's launch SEP. A Tweet by an official AMD Twitter handle now shows that inflated AMD Radeon graphics card prices have the company's official sanction.

"@AMDGaming," a verified Twitter handle held by AMD, which promotes the company's products targeted at gamers, such as AMD Radeon graphics cards, and Ryzen processors; posted a promotion in which an XFX branded Radeon RX 570 graphics card, which is being sold at USD $279, including a free coupon for a "Quake Champions" pack free, was made to appear as if at its price, it's a great deal. The RX 570 was launched at USD $169 for the 4 GB variant, and $199 for the 8 GB variant. The XFX Radeon RX 570 4 GB RS (the card being marketed in the Tweet) was launched at $179. The Tweet was met with angry reactions for how blatantly AMD was marketing price-inflated Radeon graphics cards, without actually doing something about taming the prices.
#87
https://www.techpowerup.com/236874/amd-t...or-vega-11
Quote:Due to these factors, it seems that AMD is looking to change manufacturers to address both its chip yield issues and its packaging yield problems. ASE, which has seen a 10% revenue increase for the month of August (not coincidentally, the month of AMD's RX Vega release), is reportedly being put in charge of a much smaller number of packaging orders, with Siliconware Precision Industries (SPIL), which has already taken on some Vega 10 packaging orders of its own, receiving the bulk of Vega 11 orders. Vega 11 is expected to be the mainstream version of the Vega architecture, replacing Polaris' RX 500 series. Reports peg Vega 11 as also including HBM2 memory in its design instead of GDDR5 memory. Considering AMD's HBM memory history with both the original Fury and now RX Vega, as well as the much increased cost of an HBM2 implementation versus a more conventional GDDR memory subsystem, this editor reserves the right to be extremely skeptical that this is true. If it is, and Vega 11 does indeed introduce HBM2 memory to the mainstream GPU market, then... We'll talk when (if) we get there.

As to its die yield issues, AMD is reported to be changing its main supplier for its 7 nm AI-geared Vega 20 from GlobalFoundries to Taiwan Semiconductor Manufacturing Company (TSMC), which has already secured orders for AI chips from NVIDIA and Google. TSMC's 7 nm and CoWoS (chip-on-wafer-on-substrate) capabilities have apparently proven themselves enough for AMD to change manufacturers. How this will affect AMD and GlobalFoundries' Wafer Supply Agreement remains to be seen, but we expect AMD will be sending some additional payments GlobalFoundries' way.
#88
http://www.gamersnexus.net/industry/3050...ce-gouging
Quote:The undertone of content pertaining to AMD GPU prices in particular, with much focus on Vega, has driven a paraphrased dialogue of “we’re trying to do what we can to stop miners from getting these cards.” Early press briefings with the company, including those conducted at the Vega press event, explicitly included phrases like “we can’t hold a gun to [retailers’] heads” to get prices lower, or discussion about getting cards into the hands of gamers, not miners. Hands clean, it’s on the retailers. Statements issued to press have indicated that AMD “has no control” over the pricing situation which, although largely true, does seem mismatched with new initiatives.

It would seem odd, for instance, that AMD’s official social media accounts should suddenly embrace these higher prices, posting tweets such as:

“Need a new GPU? Grab this @XFX_Playhard RX 570 from @BestBuy for only $279 and get the Quake Champions Pack free! http://bit.ly/2gNkI8x

“Good news! @BestBuy has XFX Radeon RX 570s back in stock and at a great price of $279.99: https://www.bestbuy.com/site/xfx-amd-radeon
...
AMD has now become the proverbial accessory to a crime against consumers: Best Buy might be robbing the bank, but AMD is holding the bag open as the retailer furiously stuffs it with cash. AMD’s tweets are openly encouraging – just a few months after condemnation to a room of press – these inflated prices, using phrasing that boldly includes an adverb such as “only.” The card is “only” $100 over what it was just months ago, just like it’s “only” 10% off of a price hike that’s “only” 67% higher than what it should be. It’s two-faced. Up until these tweets, the company could garner consumer support through trying to help with the demand, trying to help with the prices, and taking the consumer’s side. Now, though, it’s clear that there’s limited action behind that veneer.

We have to grant that AMD is in a tough spot: There are likely agreements in place that require cross-promotion or social exposure to either retailers or board partners (Best Buy & XFX, in this case). AMD can’t very well tell consumers to avoid Best Buy, Newegg, Amazon, or any other retailer, and can’t tell those retailers to sod off. Retailers are a critical part of the ecosystem, and AMD needs to keep ties strong with those retailers and with consumers. It all hangs in the balance, but the balance gets tilted when the company suggests one thing to consumers, then another on social media. That’s the two-facedness of it: The retailers are to blame; oh, by the way, here’s a great deal on an RX 570 for only $280.

We’d encourage our audience not to become a part of this attempt at establishing a new pricing norm. Intentional or not, AMD’s endorsement of these inflated prices helps retailers establish that new norm, rewriting history such that consumers think prices have always been this way, and that $280 is genuinely a good deal on a once-sub-$200 product. You can always wait: wait for supply to pick up or wait for mining to die down. There are also alternatives, like ripping a card out of an older system (in the interim), using an IGP, buying used, or buying competing products – like the $200-$220 GTX 1060 3GB or $280 GTX 1060 6GB cards, which hold price equivalence with an RX 570. The 570 was never really meant to compete with 1060s – that’s the job of the 580, but those range up to $350 for 8GB models. The 1060 6GB cards are still running about $30 higher than they should, but that’s a far cry from a $100+ jump that’s simultaneously condemned and promoted by the maker of the product. Again, waiting is always an option, as 1060 prices are also higher than they should be.

It’s disappointing, really: We recommended the X70 series at launch, starting with the 470 and leading to the 570. The card made far more sense than a GTX 1050 Ti at $145-$165, and wasn’t that much more money. It was also reasonably close in performance to an RX 580, but $30-$60 cheaper at launch. The 1050 Ti has remained stagnant in price, with nearly all higher-tier cards climbing at least somewhat from the spiked demand. That’s nVidia and AMD alike. Some of this disappointment is just a lack of supply mixed with retailer, distributor, or board partner decisions that have driven up prices. The rest stems from AMD’s new endorsement of those prices, which certainly puts ideas in the heads of retail giants: “They’re OK with this. They’ll even promote it.”

Mining provides a cover for some of the pricing scenario, but not this. These tweets, however short the combined <280 characters may be, betray the consumer and perpetuate a new norm of video card prices. We’d suggest that consumers don’t play into this trap. Wait, buy used, buy something else, or repurpose parts. One thing's for sure: A $100 up-charge is not a sale, and it's doublethink to imply as much. That much is on the retailers.

Best Buy did not reply to a request for comment.
#89
https://www.techpowerup.com/236943/micro...r-3-orders
Quote:This is a good way of limiting access to GPUs for mining conglomerates or particularly affluent individual miners, which would otherwise - as has been the case - buy up the entire inventory. It also marks a particularly strong position from MicroCenter, since usually, for retailers and e-tailers as well as for AMD, a sale is a sale, regardless of the use case or the buyer. The company is likely missing out on some additional orders from miners by going this route, and the fact that it is willing to do so really speaks to how strongly it feels about how the market should behave. It likely isn't that difficult to circumvent this restriction - but the simple fact that it exists is of note. And while this isn't a new approach (we've seen some retailers do the same around RX Vega 64's launch), it might make it more likely for other retailers to follow suit.
#90
The TPU article on MicroCenter has been updated.
#91
From the TPU article:
Quote:Update: The story initially mentioned that the $10,000 per card from three cards and up was an actual store policy, and it has been updated to reflect its nature as a deterrent instead.
#92
https://www.gamersnexus.net/guides/3053-...ifferences
Quote:If Vega 56 is able to stick near the 1070’s price, it’s AMD’s strongest argument from the Vega line. The biggest downside is the increased power consumption, but if that’s not a concern to you, Vega 56 is a good buy, assuming similar pricing between the two. Prices are so volatile right now that we’ll refrain from hard numbers, and just suggest checking that the cards are relatively close. We’d strongly encourage solving for thermals with an aftermarket cooler or a board partner card, then overclocking. Vega 56 can outmatch or equal Vega 64 with the right mods, including powerplay tables and BIOS mods. For these gaming workloads, the only reason Vega 56 would underperform versus Vega 64 is AMD’s power limit, which is higher on V64. You can fix that with a BIOS flash or registry mod.

As for the shaders, it looks like there’s not a big difference for the games we tested. There’s probably an application out there that likes the extra shaders, but for gamers, we’d say hard pass on Vega 64 and strongly consider Vega 56 as a highly modifiable counter.
#93
Looks like aftermarket Vegas are coming in mid-October: https://www.techpowerup.com/237156/custo...id-october
#94
http://www.tomshardware.com/news/amd-veg...35514.html
Quote:XFX and Sapphire confirmed that they both have custom boards in the works, but they could not say when they might be ready. PowerColor said that it will have its own custom cards, with mass production scheduled for the beginning of November, but it hasn't yet received the DRAM it needs. (VisionTek didn’t immediately reply to our queries about their future offerings.)

AMD also has partnerships with Asus, Gigabyte, and MSI to build Radeon graphics cards, but these three companies don’t have exclusive deals with AMD. As such, they aren’t driven by necessity and have the luxury of choosing which components to support. We spoke with all three companies, and their responses indicated that their support for the Vega architecture is less definitive than that of AMD’s exclusive partners.

We already knew that Asus is on board with Vega. The company announced in August that it would be releasing a pair of ROG Strix Vega cards with Asus’ custom cooling solution. Asus confirmed that those cards are still coming, although the release date has been pushed back from September to early October.

Although a Gigabyte rep said it’s likely that the company would be producing a custom Vega card, they would not or could not confirm with 100% certainty that it will. If it does, we likely won’t see it until the end of the year, or later.

MSI’s response surprised us. The company traditionally offers re-engineered graphics cards with custom PCB designs for all high-end GPU platforms, but it appears to be skipping the Vega lineup. A company representative told us that MSI “won’t be making a custom card anytime soon,” but could offer no additional information.

So what gives? Sources tell us that there is too much variance in the quality of the chips AMD is providing. AIB partners are unable to settle on a stable overclocked GPU frequency that works for all cards, and therefore cannot provide any sort of warranty on factory-tuned cards. Further, there continue to be discrepancies between the temperatures the GPU reports and what AIB partners are finding in actual measurements. This is true of both the GPU itself and the capacitors beneath it. We have some follow-up testing that will reveal more about these issues.

Finally, as we reported last month, there have been issues due to the different packages for Vega, making it difficult to efficiently mass-produce custom Vega cards. We were seeing Vega with molded and unmolded packages, which we noted impacted package height. We were even seeing a third package, which we assume uses SK hynix HBM. As we wrote then:
...
Generally speaking, AIB partners seem optimistic about shipping Vega cards in 2017, and some pointed out that custom Polaris cards came a couple months after the reference card launch. By that timing, we should be seeing some custom Vega cards at the end of September, or at least in October. We’re not getting a strong feeling that will be the case, however.

We’ve reached out to AMD for comment, but the company didn’t immediately reply.
#95
https://www.techpowerup.com/237226/amd-c...ed-rx-vega
Quote:Yes, this is the third post today about AMD introducing multi-GPU support for RX Vega with their Crimson ReLive 17.9.2 beta drivers, but it had to be made. First up, the caveats: we were only able to test the driver on a CrossFire setup involving one RX Vega 64 and one RX Vega 56 GPU, so results with two of the same card may differ. Secondly, these are beta drivers, so there is a level of lenience here I am willing to afford AMD. That said, a driver that came with its own announcement and internal results had to show something good, and that means good scaling across multiple games.
...
Enabling CrossFire with GPUs based on the same die and family but with different shader counts is something AMD has done for a few generations now, and we continue to appreciate this remaining an option. However, that is no excuse for the performance we are seeing here, be it a driver issue or otherwise. As can be seen from the chart above, a lot of games not only show poor scaling relative to a single RX Vega 64 but actually show negative scaling. The negative scaling could be a result of the Vega 56 slowing down the Vega 64 card, but then we are back to zero-to-minimal scaling again. Of the four games that do scale, three were in AMD's results chart that went around earlier today as well. This is really disappointing performance, and to make it worse, we also faced visual artifacts and display corruption in a few games, including Witcher 3 (pictured), Prey, and Dishonored 2. We also tried making the RX Vega 56 the primary card to see if that changed anything; the results were within error margins more often than not, with a few actually being higher (5-10%) than with the RX Vega 64 as the primary card. There was also a blue screen that greeted us during the switch, so do not attempt the swap often.

There is another interesting train of thought departing this station of results. AMD has had CrossFire profiles for the previous Polaris (and older) architecture GPUs for most of the games tested here, and they showed consistent, if not great, scaling with those older cards. Based on the inconsistent scaling across game titles here, though, it appears that AMD may have to create new CrossFire profiles for all games for its RX Vega GPUs. This merits more analysis and testing, but we are definitely curious whether this is indeed the case and, if so, what the reason behind it is, given that Vega is an evolution of GCN and not a complete departure from it.
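For reference, the scaling metric reviews like this use is simple; here's a quick illustration (the fps numbers below are made-up placeholders, not figures from the article):

Code:
# Multi-GPU scaling: (CrossFire fps / single-card fps) - 1.
# Negative values mean the second card actively hurt performance.
def scaling_pct(single_fps, crossfire_fps):
    return (crossfire_fps / single_fps - 1) * 100

examples = {
    "good scaling":     (60.0, 105.0),
    "minimal scaling":  (60.0, 63.0),
    "negative scaling": (60.0, 52.0),  # the case TPU keeps hitting
}
for label, (single, cf) in examples.items():
    print(f"{label}: {scaling_pct(single, cf):+.0f}%")
# good scaling:     +75%
# minimal scaling:  +5%
# negative scaling: -13%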
#96
https://www.extremetech.com/computing/25...cal-issues
Quote:I’m not going to pretend to have a secret source inside AMD on this one, but it’s hard to look at Vega and not wonder if HBM was a fundamentally bad call. The company’s molding issues are directly related to the height of the HBM2 memory. Meanwhile, it took AMD over two years to launch Vega. That’s the longest gap AMD or ATI has ever gone between high-end GPU refreshes, which previously took 12 – 16 months at most. HBM2’s rollout has been slower than anticipated in general, particularly the highest speed memory.

We don’t know for certain that HBM is the culprit here, but it certainly seems like the most obvious place to look. Because there’s always a considerable lag between when a GPU design project kicks off and when the final product tapes out, it’s entirely possible that AMD was too far along in the design process to start over when HBM2’s growing pains became clear.

The big signal here will be whether AMD’s future GPUs are based on GDDR6. If HBM2 was the culprit for the problems Vega seems to have, AMD won’t keep using it. AIBs are still expected to bring custom Vega cards to market in general, but those samples may not be in-market until closer to the holiday season.
#97
Summary:
  • Retail sample aftermarket Vega 64 from ASUS
  • Theoretically only 2–7% faster than reference
  • In practice, at stock voltage the reviewer's sample OCed worse than his reference card, even on 17.8.2 drivers, which aren't affected by a bug that harms OCing
  • Undervolting helps with noise reduction and OCing
  • The issue with OCing is the power limit
  • Manual OCing doesn't work very well; automatic OCing is the way to go
#98
The plot is thickening: https://www.techpowerup.com/237379/gigab...rx-vega-64

Edit: Source is an old link to Tom's that I already posted.
#99
https://www.gamersnexus.net/guides/3072-...-2-revisit
Quote:There’s some scaling in 3DMark Firestrike GT1, which is poly and tessellation intensive, and scaling in Unigine synthetic benchmarks. Even when there is scaling at the more realistic upper-end of performance, though, it’s not much – we’re talking 1-3% for an extra $100-$150. Not at all worth it, and often not replicable in gaming scenarios. Again, there are likely compute applications and some very specifically-made games that could benefit from the CU increase, but 97% of the performance comes down to clocks – if not more.

Given how easy it is to flash V56 and overclock, we’d recommend just going that route. Save the money, OC Vega 56, and walk away with more money and functionally equivalent performance.
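Quick value check on that "1-3% for an extra $100-$150" point (the prices and baseline score here are illustrative assumptions, not GN data):

Code:
# Perf-per-dollar: flashed/overclocked Vega 56 vs. Vega 64.
v56_price, v64_price = 400.0, 520.0   # assumed street prices, USD
v56_oc_perf = 100.0                   # normalized perf of a tuned V56
v64_perf = v56_oc_perf * 1.02         # midpoint of GN's 1-3% gap

print(f"V56 OC perf/$: {v56_oc_perf / v56_price:.3f}")  # 0.250
print(f"V64 perf/$:    {v64_perf / v64_price:.3f}")     # 0.196
print(f"cost per extra 1% perf: ${(v64_price - v56_price) / 2:.0f}")  # $60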
#100
This is from Videocardz, so some of you may be skeptical: https://www.techpowerup.com/237493/gigab...d-pictured
Quote:Apparently, Gigabyte has received a new batch of AMD RX Vega GPUs just in time for the expected mid-October release window of RX Vega custom cards, and is already at work on a Gaming OC custom version of the graphics card with a dual-fan configuration (likely a revised WindForce 2X cooler). The card is expected to feature an output configuration of 3x HDMI + 3x DisplayPort and, in Videocardz's image comparison, occupies slightly more space in our usual three dimensions than the AMD reference design (as expected).
#101
https://www.techpowerup.com/237491/amd-r...rt-7-dx-12
I wonder why this is. Is DX12 really the cause of this? I have my doubts.
Quote:8x MSAA was used in all configurations, since "the game isn't all that demanding". Demanding or not, the fact is that AMD's solutions are one-upping their NVIDIA counterparts in almost every price bracket at the 1920 x 1080 and 2560 x 1440 resolutions, and not only in average framerates but in minimum framerates as well. This really does seem to be a scenario where AMD's DX 12 dominance over NVIDIA comes into play - where, in CPU-limited scenarios, AMD's implementation of DX 12 allows their graphics cards to improve substantially. So much so, in fact, that even AMD's RX 580 graphics card delivers higher minimum frame rates than NVIDIA's almighty GTX 1080 Ti. AMD's lead over NVIDIA declines somewhat at 2560 x 1440, and even further at 4K (3840 x 2160). At 4K, however, we still see AMD's RX Vega 56 equaling NVIDIA's GTX 1080. Computerbase.de contacted NVIDIA, who told them they were seeing correct performance for the green team's graphics cards, so this doesn't seem to be just an unoptimized fluke. However, these results are tremendously different from typical gaming workloads on these graphics cards, as you can see from the TPU graph below, taken from our Vega 64 review.
#102
Patch for Forza Motorsport 7 boosts Nvidia performance, fixes stability issues: http://www.tomshardware.com/news/forza-7...35609.html

