Vega Thread
#1
https://www.techpowerup.com/234573/radeo...artner-rep
Quote:An MSI company representative posting on Dutch tech-forums confirmed our worst fears, that the RX Vega will have a very high power draw. "Specs van Vega RX gezien. Tering wat power heeft die nodig. Wij zijn er aan bezig, dat is een start dus launch komt dichterbij," said the representative who goes by "The Source" on Dutch tech forums Tweakers.net. As a gentleman scholar in Google Translate, and citing VideoCardz which cited a native Dutch speaker, the MSI rep's statement translates as "I've seen the specs of Vega RX. It needs a damn lot of power. We're working on it, which is a start so launch is coming closer."
#2
https://www.techpowerup.com/234662/vega-...ce-preview
Quote:It's PC World's comments on the Vega card's gaming performance that might pique your interest. In its report, the publication comments that the Radeon Pro Vega Frontier Edition offers gaming performance that is faster than NVIDIA's GeForce GTX 1080, but slightly slower than its GTX 1080 Ti graphics card. To back its statement, PC World claims to have run the Vega Frontier Edition and TITAN Xp in "Doom" with Vulkan API, "Prey" with DirectX 11, and "Sniper Elite 4" with DirectX 12.

https://www.techpowerup.com/234657/falco...n-pictured
Quote:Gaming PC builder Falcon Northwest teased a picture of its upcoming Tiki compact high-performance desktop built on the AMD Radeon theme. The silver-bodied beast shows off a Radeon Pro Vega Frontier Edition graphics card through an acrylic cutout on its side, and will be one of the first pre-built desktops you can buy with the $1,000-ish air-cooled Radeon Pro Vega Frontier Edition.
#3
And Vega Frontier Edition is released: http://techreport.com/news/32163/radeon-...999-and-up
#4
https://www.techpowerup.com/234737/amd-r...enchmarked
Quote:Update 1: #define has made an update with a screenshot of the card's score in 3DMark's FireStrike graphics test. The user reported that the Pro drivers' score "didn't make sense", which we assume means they are uncooperative with actual gaming workloads. On the Game Mode driver side, though, #define reports GPU frequencies that are "all over the place". This is probably a result of AMD's announced typical/base clock of 1382 MHz and an up to 1600 MHz peak/boost clock. It is as yet unknown whether these frequencies scale as much with GPU temperature and power constraints as NVIDIA's Pascal architecture does, but the fact that #define is using a small case along with the Frontier Edition's blower-style cooler could mean the graphics card is heavily throttling. That would also go some way towards explaining the actual 3DMark score of AMD's latest (non-gaming geared, I must stress) graphics card: a 17,313 point score isn't especially convincing. Other test runs resulted in comparable scores of 21,202; 21,421; and 22,986. However, do keep in mind these are the launch drivers we're talking about, on a graphics card that isn't officially meant for gaming (at least, not in the sense we are all used to). It is also unclear whether there are some configuration hoops that #define failed to go through.
#5
http://www.gamersnexus.net/news-pc/2972-...e-and-more
Quote:That’s it. The baseplate has some thermal pads for the FETs & inductors, alongside some fins near the DisplayPort & I/O, but is otherwise plain. The cooler is a large, aluminum block with standard 90-degree fin pitch and standard fin density, using flat-top fins to force air through the cooler block more efficiently. This exhausts out the back of the case, as expected. Conduction is handled by a copper protrusion in the copper baseplate, which covers both the GPU & HBM2 (two stacks). This is a vapor chamber cooler.

As for measurements, we took a pair of calipers to get some rough measurements. These may be off by 0.25-0.5mm. Here’s what we came up with:
  • 30mm x 30mm total size of interposer + GPU (does not include substrate)
  • 20.25mm x ~26mm GPU die size
  • 10mm x ~12mm HBM2 size (x2)
  • ~4mm package height (this is the one that’s least accurate, but gives a pretty good ballpark)
  • 64mm x 64mm mounting hole spacing (square, center-to-center)
  • PCB ~1mm

Our VRM & component-level analysis will come next, conducted by GN resident overclocker Buildzoid. Power, thermals, and noise will be around the same time for publication.
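To put those caliper figures in perspective, here is a minimal sketch (plain Python, using the numbers from GN's list above, which GN itself flags as potentially off by 0.25-0.5 mm per measurement) that turns the rough dimensions into approximate areas:

Code:
# Rough area estimates from GamersNexus' caliper measurements (Python).
# These are approximations only: caliper error of 0.25-0.5 mm per edge shifts
# the results noticeably, and the GPU-die figure here overshoots the 484 mm^2
# that is confirmed later in this thread (post #6).

def area_mm2(width_mm, height_mm):
    """Area of a rectangular footprint in square millimetres."""
    return width_mm * height_mm

package = area_mm2(30.0, 30.0)       # interposer + GPU, substrate excluded
gpu_die = area_mm2(20.25, 26.0)      # the ~26 mm edge is itself an estimate
hbm2_stack = area_mm2(10.0, 12.0)    # per stack; the card carries two

print("Package (interposer + GPU): ~%.0f mm^2" % package)           # ~900 mm^2
print("GPU die (caliper estimate): ~%.0f mm^2" % gpu_die)           # ~527 mm^2
print("HBM2, both stacks:          ~%.0f mm^2" % (2 * hbm2_stack))  # ~240 mm^2

The gap between the ~527 mm² caliper estimate and the 484 mm² figure confirmed later in the thread is consistent with GN's own caveat about measurement accuracy.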
#6
https://www.techpowerup.com/234846/amd-c...graph-2017
Quote:Also, when asked about the Frontier Edition's (lacking) gaming chops, AMD's Jason Evangelho has come out with the warning that we all expected, and that we ourselves conveyed here: "it's premature to worry about a product's gaming performance by judging a different product NOT optimized for gaming."

https://www.techpowerup.com/234848/amd-r...at-484-mm2
Quote:For the math-savvy around here (or even just for those of you who have read the headline), that particular equation should solve towards a perfect 484 mm² die area. Good news for AMD: this isn't the company's biggest die-size in consumer GPUs ever. That dubious honor goes to the company's Fiji XT silicon which powered the company's R9 Fury X, coming in at a staggering 596 mm² in the 28 nm process. For comparison, AMD's current Polaris 20 XTX-based RX 580 chip comes in at slightly less than half the confirmed RX Vega's die-size, at a much more yield-friendly 232 mm². NVIDIA's current top-of-the-line Titan Xp comes in at a slightly smaller 471 mm² die-size.
#7
https://www.techpowerup.com/234924/amd-r...-per-month
Quote:The folks at Videocardz have put together an interesting chart detailing the 687F:C1 RX Vega's score history since benchmarks of it first started appearing, around three months ago. This chart shows an impressive performance improvement over time, with AMD's high-performance GPU contender showing an improvement of roughly 15% since it was first benchmarked. That averages out at around a 5% improvement per month, which bodes well for the graphics card... At least in the long term. We have to keep in mind that this video card brings with it some pretty extensive differences from existing GPU architectures in the market, with the implementation of HBC (High Bandwidth Cache) and HBCC (High Bandwidth Cache Controller). These architectural differences naturally require large amounts of additional driver work to enable them to function to their full potential - full potential that we aren't guaranteed RX Vega GPUs will be able to deliver come launch time.
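For what it's worth, the "roughly 15% over about three months, so around 5% per month" figure in that quote is a simple average; here is a quick sketch (my own arithmetic, not Videocardz's) showing both the simple and the compounded reading:

Code:
# Simple vs. compounded monthly improvement for a ~15% gain over ~3 months.
total_gain = 0.15   # ~15% score improvement reported over the period
months = 3

simple_monthly = total_gain / months                        # 5.0% per month
compounded_monthly = (1 + total_gain) ** (1 / months) - 1   # ~4.8% per month

print("Simple average:  %.1f%% per month" % (simple_monthly * 100))
print("Compounded rate: %.1f%% per month" % (compounded_monthly * 100))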
#8
https://www.techpowerup.com/235067/amd-a...ule-leaked
Quote:Disclaimer things first: take this with a grain of salt, since this hasn't seen the amount of confirmations we'd like.
...
However, it seems that AMD's BIOS is only scheduled to be sent to AIB partners on August 2nd, and AIB partners still have no word on launch dates from AMD, which would hamper their ability to move on to mass production of their designs. This may mean a paper launch from AMD, or perhaps a launch with only AMD reference designs being available for order. Remember that the final BIOS is a particularly important part of partners' design customizations, since it usually includes info on stock AMD-defined power and temperature limits, power states and fan curves, which partners leverage in building their customized cooling solutions.
#9
And the liquid-cooled Vega Frontier Edition is out: https://www.techpowerup.com/235154/liqui...d-1-489-99
#10
https://www.extremetech.com/gaming/25241...ega-launch
Quote:There are a few ways this could play out, based on what we know so far. The 56-CU Vega XL could be positioned between the GTX 1060 and the 1070 with the 64-CU Vega XT (air-cooled) landing between the GTX 1070 and GTX 1080. The air-cooled and water-cooled Vega XTX (OC) would then try to close the gap with the 1080 and approach the 1080 Ti. This makes sense if AMD wants to match the GTX 1080 with its air-cooled XTX, then surpass it by a solid margin with its water-cooled variant.

Alternatively — and this assumes game benchmark scores are low due to driver issues or possibly a need for that water cooler — the Vega XL actually drops in between the 1070 and 1080; the air-cooled Vega XT has a win-some-lose-some battle with the GTX 1080; the air-cooled RX Vega XTX (OC) beats or at least matches the GTX 1080 in most games; and the water-cooled RX Vega XTX (OC) performs well enough to be a slightly slower (but also slightly less expensive) competitor to the GTX 1080 Ti. But based on what we've seen to date from the Vega Frontier Edition, AMD needs to pull two rabbits out of its hat to make that happen. First, it needs to pick up at least 10 percent in average game performance thanks to better drivers. Second, it needs to pick up 8-12 percent in clock speed over the Vega Frontier Edition's 1.6GHz.
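A quick back-of-the-envelope check on what those "two rabbits" would mean in absolute terms (my own arithmetic, using the article's 10 percent driver gain and 8-12 percent clock uplift over 1.6 GHz; treating the two gains as multiplicative is an optimistic assumption, since performance rarely scales 1:1 with clock):

Code:
# What ExtremeTech's two requirements imply numerically (Python).
fe_clock_ghz = 1.6            # Vega Frontier Edition's quoted peak clock
driver_gain = 0.10            # "at least 10 percent" from better drivers
clock_gain_lo, clock_gain_hi = 0.08, 0.12   # "8-12 percent" more clock speed

rx_clock_lo = fe_clock_ghz * (1 + clock_gain_lo)   # ~1.73 GHz
rx_clock_hi = fe_clock_ghz * (1 + clock_gain_hi)   # ~1.79 GHz

# Best case, if both gains stacked multiplicatively:
combined_lo = (1 + driver_gain) * (1 + clock_gain_lo) - 1   # ~18.8%
combined_hi = (1 + driver_gain) * (1 + clock_gain_hi) - 1   # ~23.2%

print("Required RX Vega clock: %.2f-%.2f GHz" % (rx_clock_lo, rx_clock_hi))
print("Best-case combined uplift over FE: %.1f%%-%.1f%%"
      % (combined_lo * 100, combined_hi * 100))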
#11
http://www.gamersnexus.net/guides/2990-v...erformance
Quote:The trouble with this solution is that it is imperfect by nature. First, every chip is not made the same; ours may undervolt better or worse than others out there, and that means there’s no easy “use these numbers” method. You’ll ultimately have to guess and check at stability to find the numbers that work, which means more work is involved in getting this solution to be rock-steady. That’s not to say it’s difficult work, but it’s certainly not as easy as plugging a card in and using it. We found that some games required 1120mv to remain stable, while others were fine at 1090mv. Ideally, you’d make a profile for each application – but that’s a bit annoying, and becomes difficult to maintain. The next option would be to choose the lowest stable voltage for all applications (in our case, that might be 1120mv). You lose some of the efficiency argument when doing this, as the bottom-end is cut off, but still gain overall.

A straight +50% overpower configuration is a huge waste of power down the PCIe cables, which results in running hotter than necessary and thereby louder.

Software is also buggy and frustrating. No, not everyone sees the same issues – that’s the nature of buggy software. It is difficult to precisely pinpoint the issue causing HBM2’s brutal downclocking of -445MHz, but we have seen it happen routinely and on multiple systems with multiple environments. We think that this has to do with manually configuring all 7 DPM states and their corresponding voltage states; when we only configured DPMs 4-7, the downclocking issue did not occur. Fan speed curves are also inaccurate, and report about 200RPM higher than what the user requests. Wattool has the same bugs as WattMan, and Afterburner can’t adjust voltage (yet). The point isn’t to say that it’s impossible to undervolt like we did, it’s just to say that you should really be aware of all the different variables when tweaking. It’s possible to inadvertently hinder performance (in major ways) if HBM2 underclocks without the user’s knowledge. Keep an eye on it.

As for the task at hand, it seems the best possible configuration is to overpower the core (+50%), undervolt the core (roughly -110 to -90mv), and run a fan RPM that keeps temperatures at or below 80C. That’ll depend on your cooling solution, case, and case/room ambient temperatures.

This yields a decent boost to application performance (professional and gaming) without costing the insane +90W draw of a straight +50% overpower configuration.
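The "guess and check" stability search GN describes is essentially a per-chip, per-workload loop. Below is a minimal sketch of that procedure in Python; the helpers apply_core_voltage_mv(), run_stability_test() and hbm2_clock_mhz() are hypothetical placeholders for the manual WattMan/Wattool work GN describes, not a real API:

Code:
# Hypothetical sketch of the guess-and-check undervolting loop (Python).
# apply_core_voltage_mv(), run_stability_test() and hbm2_clock_mhz() stand in
# for manual WattMan/Wattool interaction -- they are NOT a real library API.

GAMES = ["Doom (Vulkan)", "Prey", "Sniper Elite 4"]   # example workloads
STOCK_HBM2_MHZ = 945                                  # Vega FE memory clock

def lowest_stable_voltage(apply_core_voltage_mv, run_stability_test,
                          hbm2_clock_mhz,
                          start_mv=1200, floor_mv=1050, step_mv=10):
    """Walk the core voltage down until a workload fails, then back off.

    start_mv is assumed to be a stock, known-good voltage. Returns the lowest
    voltage (mV) at which every workload passed.
    """
    last_good = start_mv
    for mv in range(start_mv, floor_mv - 1, -step_mv):
        apply_core_voltage_mv(mv)
        # GN's warning: watch HBM2 while tweaking DPM states; a silent
        # -445 MHz memory downclock would invalidate the whole run.
        if hbm2_clock_mhz() < STOCK_HBM2_MHZ - 50:
            break
        if all(run_stability_test(game, minutes=30) for game in GAMES):
            last_good = mv        # stable across every workload so far
        else:
            break                 # first failure ends the search
    return last_good              # e.g. ~1120 mV on GN's sample

Per the article, the stable floor differs per game (1090 mV for some titles, 1120 mV for others on GN's card), so in practice you either keep per-application profiles or settle on the highest of the per-game minima, which is effectively what the loop above does by requiring every workload to pass.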
#12
https://www.pcper.com/reviews/Graphics-C...king-and-C
Quote:Which leads to the real question: what does this mean for the upcoming RX Vega product? While I know that testing the Frontier Edition in ONLY gaming is a bit of a faux pas, much of our interest was in using this product to predict what AMD is going to bring to gamers later this month. It is apparent now that if the clocks are in the 1600 MHz range consistently, rather than dipping into 1400 MHz and below states as we found with the air cooler at stock, Vega in its current state can be competitive with the GeForce GTX 1080. That’s an upgrade over where it stood before – much closer to GTX 1070 performance.

The question is then can AMD provide that type of gaming experience at the $499 price point and will it require liquid cooling to do so? AMD probably isn’t holding back on some magic air cooler performance tricks it might have for RX Vega, so I don’t expect performance and clock consistency to change much. BUT, board partner cards that don’t design only for blower-style integrations will likely be able to add a lot of RX Vega performance and capability. This is clearly an instance where a better cooling solution can make a dramatic impact; in recent years and with current GeForce cards that isn’t the case. I do think we will see a liquid-cooled version of RX Vega but at a higher cost because of the complexity of the design used here. We might also see that HBM2 memory clocked up another 100 MHz or more, boosting things ever so slightly.
...
Despite the added performance, the Frontier Edition pricing scale leaves a lot to be desired. You are going to shell out $500 more for this version of the card, netting you at most 15% added performance for 50% added cost. Yes, the cooling solution design is impressive, and the card maintains an under 65C temperature in our testing at stock settings under full load, but it’s a steep price to pay. If the cooler really DOES cost enough to warrant that change, then it’s bad news for the potential for a water-cooled RX Vega card and its ability to stay within a reasonable price window to compare to GeForce products.

Before doing this testing, I had dismissed the liquid-cooled version of the Vega Frontier Edition completely, but the performance and improvement in clock rate/temperature we saw today gives me hope that RX Vega will be able to turn in a better-than-expected performance when it hits the scene. There are still plenty of ways that AMD could drop the ball, but if they can launch RX Vega at similar performance to the GTX 1080, as we saw here with the card in its 350-watt state, and price it correctly, there will be a line-up of buyers. And that is despite the obvious power efficiency disadvantage that AMD will have to accept.
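PCPer's "at most 15% added performance for 50% added cost" remark works out roughly like this (a quick sketch using the ~$999 and $1,489.99 price points cited earlier in the thread; the exact performance delta varies by game):

Code:
# Rough value comparison, air- vs. liquid-cooled Vega Frontier Edition.
air_price, liquid_price = 999.00, 1489.99
perf_uplift = 0.15                                  # best case per PCPer

cost_increase = liquid_price / air_price - 1        # ~49% more money
perf_per_dollar = (1 + perf_uplift) / (liquid_price / air_price)

print("Extra cost:      %.0f%%" % (cost_increase * 100))
print("Relative perf/$: %.2fx the air-cooled card" % perf_per_dollar)  # ~0.77x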
#13
350W new tech to compete with 180W year-old tech will be the battle cry on the forums.

The response will be:

A. I don't want a card that might have less VRAM than advertised like the 970.
B. I want a card that plays Game X better because I play a lot of Game X.
C. I want new technology like HBM2.
D. I keep my card a long time and AMD drivers just keep getting better.

Lather, rinse, repeat.

It's always the same.

My guess is something went horribly wrong with Vega, and what launches will be a 2nd or 3rd revision of it. AMD just doesn't have the cash to operate at the level of Intel and NVIDIA.

However, I think the level they do operate at is pretty amazing given the loss of staff and cash they've been through.
#14
This has the markings of AMD leaving the segment, they aren't playing the same game anymore.

Vega is 484 mm², the 1080 Ti is 471 mm². Vega is bigger - not by much, but it is bigger.

Power consumption: the 1080 Ti guzzles down a whopping 250 watts, while to get Vega to perform *at its advertised clock rates* it is looking like it needs 350 watts...

It's built using HBM2, adding considerable overhead for what amounts to, at the moment, no real benefit.

If this part were launching alongside Pascal, it would be a run-of-the-mill case of AMD coming up short, but that isn't where we are. We are staring down Volta.

Given AMD's rightly earned reputation for epically failing to fill retail channels for product launches, and given the utter catastrophe that has been Vega making its way to an actual shipping consumer product, they are looking at one ramp-up quarter of possible sales at *passable* margins before Volta destroys their balance sheets.

This part is markedly more expensive to make than the 1080 Ti: it requires a more expensive cooler and uses far more expensive RAM - and on a cost basis I am doing AMD a *huge* favor by comparing this to the 1080 Ti. While nVidia has been printing huge margins and selling out of 1080 Tis for quite a while now, the much lower-margin Vega - even if it were performing at the same level - would be doing so for smaller margins in an insanely narrow time window before they are dealing with Volta.

In this wet dream scenario for Vega, they would have at most a quarter of ramped production to not only recoup their R&D costs but also try to finance closing a gap that is now looking to be close to two generations - two generations behind the company that many see as the one that is going to knock Intel off the computing throne it has held for decades.

AMD is not competition for nVidia at this point; I don't know why there is any real confusion on this point. By volume AMD has somewhere between one half and one third the market share nV does, entirely at the bottom of the segment in all aspects - most importantly margin-wise. Right now nV's competition is themselves and *maybe* Intel or Google on the outside. Data Center/AI/Auto is where nV's growth is coming from, and it will soon be the majority of their income - with better margins and significantly longer return periods per device developed. As a general example: Audi is about to launch the world's first Level 3 autonomous driving vehicle (Tesla could probably argue their newest cars are, but Audi is actually claiming it, making them liable). This system is based on the Tegra K1 - you know, the chip that debuted in January of 2014.

*This* is the major element that some people seem not to be grasping: nVidia is now entering mass production with parts that should be generating hundreds of millions of dollars off of *KEPLER*-based parts. Just entering.

I bring this up in the Vega thread to try and impress an accurate assessment of how this architecture actually stacks up. Validation for these segments, particularly automotive, takes years - actual returns on the investments are also years out. AMD, with this architecture, has shown that given an extra year, a larger die, a more expensive platform and a borderline shocking amount of additional power, they can soundly get their ass kicked - without nVidia even needing to get Volta out the door.

We are past the hypothetical phase on these future nV revenue streams. They are going into mass production, and the pipeline is already full for the next four (and, barring Intel/Mobileye popping up something insane real quick, five) years. And Vega... Vega was the last chance AMD had to really try and become remotely competitive with nVidia in at least one segment.

Now, the datacenter is dominated by Intel with nV taking a big slice and AMD being almost non-existent (and nigh completely absent in the GPU space). Auto is being dominated by nVidia with Intel playing on the fringes and AMD not even being contemplated, and, oh, there are these desktop GPU things they have going too.

The regular refrain from the AMD camp was that they didn't have enough R&D to compete with nVidia (heh, Larrabee, but I digress) because of Ryzen. Well, now you are competing with revenue streams whose margins obliterate desktop GPUs - and that pay dividends many years after you are into straight black (Kepler products entering mass production now, 2H 2017).

This isn't AMD having a run-of-the-mill bad generation. All of the above would apply if Vega were going toe to toe with the 1080 Ti - and they aren't even close to that stupidly low bar (given an extra year, a bigger die, additional platform costs and almost 50% higher power budget). They aren't even fucking close.

This has the markings of AMD leaving the segment, they aren't playing the same game anymore.

I don't say that lightly or without extensive reasoning.
#15
I will just post here what I posted on ArsTechnica.  None of this surprises me.  I predicted this sort of thing would happen the day I learned Eric Demers had abandoned ship for Qualcomm.

gstanford Wrote:This is what happens when the Chief GPU architect leaves a company. 5 years since Eric Demers left AMD after giving them GCN. 5 years of twiddling with what he left them, desperately rearranging the deck chairs on the Titanic multiple times.

and quite frankly the only reason Eric Demers worked as hard as he did on GCN before leaving is because AMD was being paid by Sony to put the features Sony wanted on the chip.
#16
(07-19-2017, 01:43 PM)BenSkywalker Wrote: A whole bunch of stuff that makes pretty good sense.

Ben, I think you need to copy/paste your post to the AnandTech Video Cards forum.

There are a bunch of guys over there like AtenRa, SlowSpyder, Nurtured Hate, etc., ad infinitum, who want to talk to you about this.

Big Grin

Actually, you should do it just to make them earn their free Vegas; the sight of it will launch a dozen PM meetings between them and their mothership.

I disagree on one thing-

I don't see AMD leaving the segment.

A. It's how they make a living. They're not going to just say, "The jig is up, we're beat. Time to fish." They've lingered around in the CPU market for decades, rarely competing.
B. They have the console market that NVIDIA does not seem to want. They need to keep advancing their GPU tech.
#17
https://www.techpowerup.com/235297/amds-...t-gtx-1080
Quote:All in all, I have to say, this tour doesn't inspire me with confidence. This isn't the kind of "in your face" comparison we're used to seeing from companies who know they have a winning product; should the comparison be largely in favor of AMD, I posit the company would be taking every advantage of that by showcasing their performance leadership. There did seem to be an inordinate amount of smoke and mirrors here, though, with AMD going out of its way to prevent attendees from being able to discern between its and its competitors' offerings.

AMD reportedly told attendees that the AMD and NVIDIA systems had a $300 difference in AMD's favor. All other hardware being equal, and accounting for AMD's stance that a FreeSync monitor tends to cost around $200 less than a comparable NVIDIA G-Sync enabled one, that leaves around $100 savings solely towards the RX Vega part of the equation. This means the RX Vega could sell around the $459-$500 bracket, if current pricing of the GTX 1080 is what AMD considered.
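TechPowerUp's inference can be written out explicitly (a sketch of the arithmetic only; the GTX 1080 street prices here are my guess at what the article assumed to land on its $459-$500 bracket, not confirmed figures):

Code:
# TechPowerUp's reading of AMD's "$300 cheaper" bundle claim.
bundle_delta = 300            # AMD's stated whole-system price difference
freesync_vs_gsync = 200       # AMD's claimed typical monitor saving

gpu_saving = bundle_delta - freesync_vs_gsync   # ~$100 attributable to the GPU

for gtx_1080_price in (559, 599):   # assumed GTX 1080 street prices
    print("GTX 1080 at $%d -> implied RX Vega around $%d"
          % (gtx_1080_price, gtx_1080_price - gpu_saving))   # ~$459-$499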
#18
Ben!
Post on ATVF!
The readers there haven't seen the Raja quote where he says his driver team wishes Vega were the same as Fiji enough times yet.

Only when that has been posted 500X will people believe writing drivers is hard and Vega will slay! Smile
#19
Here, I'll write my request BoFox style!

"Strain turgid against the BVD boundaries of ABT and explode all over the AT forum in a frenzy of fappening!"

(heh- that guy was fun to read, I'm boring compared to him!)
#20
LOL this thread is pretty great. I don't agree with BenSkywalker at all politically, but he was really spot on with his post about Vega. Very well thought out and very accurate IMO.
#21
(07-19-2017, 11:54 PM)SickBeast Wrote: LOL this thread is pretty great.  I don't agree with BenSkywalker at all politically, but he was really spot on with his post about Vega.  Very well thought out and very accurate IMO.
Fully agreed.
#22
An interesting tidbit from here:

http://www.pcgamer.com/amd-takes-radeon-...-gtx-1080/

Quote:There's a more concerning aspect in the "costs $300 less" argument, since the FreeSync monitor retails for $800 compared to $1300 for the G-Sync display. That puts the G-Sync display plus GTX 1080 at $1850 (MSRP), compared to $1550 for the FreeSync display plus RX Vega (MSRP), but the real price difference is the $500 delta between the monitors. This suggests AMD is looking at around $700 (give or take) for the RX Vega, which would put it into direct competition with the GTX 1080 Ti. If the demonstration really used a GTX 1080 and not a 1080 Ti, performance would appear to be in Nvidia's favor.
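The PC Gamer math works directly from the quoted bundle and monitor prices; a short sketch to show how much the conclusion hinges on which monitor-price delta you accept:

Code:
# Working backwards from PC Gamer's quoted bundle and monitor prices.
gsync_bundle, freesync_bundle = 1850, 1550   # GTX 1080 + G-Sync vs. RX Vega + FreeSync
gsync_monitor, freesync_monitor = 1300, 800

gtx_1080 = gsync_bundle - gsync_monitor             # $550 implied for the GTX 1080
implied_vega = freesync_bundle - freesync_monitor   # $750 by straight subtraction;
                                                    # PC Gamer calls it "around $700"

# Contrast with the earlier TechPowerUp reading: a $200 monitor delta instead of
# $500 would put RX Vega roughly $100 *below* the GTX 1080's price, not above it.
print("Implied GTX 1080 price: $%d" % gtx_1080)
print("Implied RX Vega price:  $%d" % implied_vega)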

$700 USD for GTX 1080 performance. No thanks. I'm not quite sure what AMD is thinking.