[EDIT] Hmmm... what's going on at AMD? (and history of 3Dfx Rampage)
Seems pretty clear that AMD has really, really shot themselves in the foot with Vega...

Well... that's impossible... they blew off both their feet long ago. This is bound to be pretty damaging.

Here is another thing to consider. AMD's quarterly report had some interesting news in it... increased inventory. It's actually concerning. Some may be quick to write it off as just Ryzen, but it is creeping up there. AMD has shown a tendency to misjudge demand and overstuff channels, and they have been stuck writing off inventory in massive losses time and time again. I don't think it should be easily ignored.
https://www.techpowerup.com/233185/nvidi...ed-in-2017
Jensen is confident against Vega.
Dual-GPU liquid cooled Vega card is likely: https://www.techpowerup.com/233208/linux...u-solution
AMD has a press conference on May 31: https://www.techpowerup.com/233214/amd-c...lmost-here
https://www.techpowerup.com/233325/entir...n-june-5th
Quote:Reports are doing the rounds regarding alleged AMD insiders having "blown the whistle", so to speak, on the company's upcoming Vega graphics cards. This leak also points towards retail availability of Vega cards on the 5th of June, which lines up nicely with AMD's May 31st Computex press conference. An announcement there, followed by market availability at the beginning of the following week, does sound like something that would happen in a new product launch.

On to the meat and bones of this story, three different SKUs have been leaked, of which no details are currently known, apart from their naming and pricing. AMD's Vega line-up starts off with the RX Vega Core graphics card, which is reportedly going to retail for $399. This graphics card is going to sell at a higher price than NVIDIA's GTX 1070, which should mean higher performance. Higher pricing with competitive performance really wouldn't stir any pot of excitement, so, higher performance is the most logical guess. The $399 pricing sits nicely in regards to AMD's RX 580, though it does mean there is space for another SKU to be thrown into the mix at a later date, perhaps at $329, though I'm just speculating on AMD's apparent pricing gap at this point.

Next up is the RX Vega Eclipse, which will reportedly retail for $499, going head to head with NVIDIA's GTX 1080 (in fact, slightly cheaper than the majority of AIB versions of the card). The line between the Core and the Eclipse is a little blurry here, since we know that the GTX 1070 can easily be overclocked to reach stock GTX 1080 performance - their performance delta isn't that great. If the RX Vega Core does bring with it higher performance than the GTX 1070 (justified by its higher pricing), then that would place it close to GTX 1080 (stock) performance. Since AMD would be trying to keep its RX Vega Core from eclipsing (eh) its RX Vega Eclipse graphics card in the price/performance department, one can expect - with reservations - that the performance delta between the Core and the Eclipse is higher than their respective pricing indicates. So I would expect the RX Vega Eclipse to offer performance that's greater than the GTX 1080's.

Finally, we have the crème de la crème of Vega, the RX Vega Nova. This graphics card is reported to retail for $599, a full $100 cheaper than NVIDIA's GTX 1080 Ti, while looking to directly compete with it. Considering this pricing, and admitting that the leak pans out correctly, this would mean we won't be seeing a Vega card that's capable of competing with NVIDIA's Titan Xp graphics card (at least, not a single-GPU solution...) AMD simply would not sell their top of the line Vega for $599 if it was competitive with that NVIDIA titan of a graphics card. Based on AMD's previous pricing strategy, I would expect the company to deliver roughly the same performance as the GTX 1080 Ti, looking to use its Nova not as a pure performance product, but as a price/performance contender. What do you make of this leak?
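The article's argument is really just a performance-per-dollar comparison. Here's a minimal sketch of that arithmetic; the prices are the leaked ones, but the performance indices are hypothetical numbers chosen purely to illustrate the reasoning, not anything from the leak.
Code:
# Hedged sketch of the price/performance argument above. Only the prices come
# from the leak; the performance indices (GTX 1070 = 100) are made up for
# illustration.
leaked_prices = {          # USD, from the rumored SKU list
    "RX Vega Core":    399,
    "RX Vega Eclipse": 499,
    "RX Vega Nova":    599,
}
assumed_performance = {    # hypothetical indices, GTX 1070 = 100
    "RX Vega Core":    115,   # "somewhat above a GTX 1070"
    "RX Vega Eclipse": 145,   # must beat the Core's perf/$ to make sense
    "RX Vega Nova":    175,   # "roughly GTX 1080 Ti class"
}
for sku, price in leaked_prices.items():
    perf = assumed_performance[sku]
    print(f"{sku:16s} ${price}  perf {perf}  perf/$ {perf / price:.3f}")
# For the Eclipse not to be "eclipsed" by the Core, its performance gain over
# the Core has to exceed the 499/399 ~ 1.25x price gap -- that is the article's
# whole argument, made explicit.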
https://www.techpowerup.com/233329/amd-v...compubench
This may just be an accelerator card.
Quote:We can at least assume this is an RX Vega graphics card - and if other leaks are true, this could be the top of the line RX Vega Nova graphics card. The GPU as identified by CompuBench carries 64 CUs (Compute Units), which, paired with AMD's typical (and confirmed for Vega) design of 64 processing cores per CU, yields the expected 4096 stream processors from the top-end Vega. And it would seem that Vega having been built from the ground up with higher frequencies in mind is true, with this GPU in particular carrying a 1600 MHz frequency, a far cry from AMD's RX 580's 1430 MHz.
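For what it's worth, those numbers are easy to sanity-check. A quick back-of-the-envelope sketch; the two FLOPs per clock per stream processor is the usual fused-multiply-add accounting for GCN-style shaders, which is my assumption rather than something stated in the leak.
Code:
# Back-of-the-envelope check of the CompuBench figures quoted above.
compute_units   = 64       # reported by CompuBench
sp_per_cu       = 64       # confirmed Vega design per the quote
clock_mhz       = 1600     # reported by CompuBench
flops_per_clock = 2        # assumed: one fused multiply-add per SP per clock

stream_processors = compute_units * sp_per_cu                        # 4096
fp32_tflops = stream_processors * clock_mhz * 1e6 * flops_per_clock / 1e12
print(stream_processors, "stream processors")                        # 4096
print(f"~{fp32_tflops:.1f} TFLOPS FP32 peak")                        # ~13.1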
https://www.techpowerup.com/233352/amd-e...days-event
So Vega is getting discussed at AMD's meeting today.
AMD announces Vega Frontier Edition, with 16 GB HBM2 VRAM: http://www.tomshardware.com/news/amd-veg...34427.html
https://www.techpowerup.com/233388/amd-a...for-gamers
Hmm.
Quote:The final configuration of Vega was finalized some two years ago, and AMD's vision for it was to have a GPU that could plow through 4K resolutions at over 60 frames per second. And Vega has achieved it: Sniper Elite 4 at over 60 FPS at 4K. Afterwards, Raja talked about AMD's High Bandwidth Cache Controller, running Rise of the Tomb Raider with the system given only 2 GB of system memory, with the HBCC-enabled system delivering more than 3x the minimum frame-rates of the non-HBCC-enabled system, something we've seen in the past, though on Deus Ex: Mankind Divided. So now we know that wasn't just a single-shot trick.
There will also be a liquid cooled version of Vega Frontier Edition: https://www.techpowerup.com/233405/amd-t...ition-gpus
(05-17-2017, 10:46 AM)SteelCrysis Wrote: https://www.techpowerup.com/233388/amd-a...for-gamers
Hmm.
Quote:The final configuration of Vega was finalized some two years ago, and AMD's vision for it was to have a GPU that could plow through 4K resolutions at over 60 frames per second. And Vega has achieved it: Sniper Elite 4 at over 60 FPS at 4K. Afterwards, Raja talked about AMD's High Bandwidth Cache Controller, running Rise of the Tomb Raider with the system given only 2 GB of system memory, with the HBCC-enabled system delivering more than 3x the minimum frame-rates of the non-HBCC-enabled system, something we've seen in the past, though on Deus Ex: Mankind Divided. So now we know that wasn't just a single-shot trick.

From that article:
Quote:Vega Frontier Edition is the Vega GPU we've been seeing in leaks in the last few weeks, packing 16 GB of HBM2 memory, which, as we posited, didn't really make much sense for typical gaming workloads. But we have to say that if AMD's Vega truly does deliver only a 1.5x improvement in FP32 performance (the metric that's most critical for gaming at the moment), this probably paints AMD's Vega as fighting an uphill battle against NVIDIA's Pascal architecture (probably ending up somewhere between the GTX 1070 and GTX 1080). If these reports are correct, this could mean a dual-GPU Vega is indeed in the works, so as to allow AMD to reclaim the performance crown from NVIDIA, albeit with a dual-GPU configuration against NVIDIA's current single-chip performance king, the Titan Xp. Also worth noting is that the AMD Radeon Vega Frontier Edition uses two PCI-Express 8-pin power connectors, which suggests a power draw north of 300 Watts.

The first part in bold: all I have to say is, WHAT? This is an official slide; what is this guy talking about? "If AMD's Vega truly does deliver only a 1.5x improvement"? Surely he knows about AMD marketing hype. There are mountains of evidence, over and over, that AMD's current GPU division takes best-case scenarios to the next level, exaggerations to the max, all the way to obscene. Polaris pre-launch was completely full of outrageous figures that proved to be wildly off once the cards actually launched. So expecting the 1.5x to be an underestimate seems irrational to me. It's more a question of whether Vega will actually meet that figure (a rough sanity check is sketched at the end of this post).

But you know, this reminds me... many, many months ago my Vega projection was that AMD would find it hard to catch the 1080 with Vega. I went further and projected Vega performance to land somewhere between the 1070 and the 1080 (non-Ti).

One thing I didn't expect was how far out the Vega launch would be. That is the real surprise to me.
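For what it's worth, the bolded 1.5x can be cross-checked against the clock and shader numbers from the earlier CompuBench leak. A rough sketch, assuming the baseline is Fiji (the Fury X at roughly 8.6 TFLOPS FP32), which the slide doesn't actually state:
Code:
# Rough cross-check of the "1.5x FP32" slide figure. Assumption (not in the
# slide): the baseline is Fiji, i.e. the Fury X at ~8.6 TFLOPS FP32.
fury_x_fp32_tflops = 8.6      # 4096 SPs @ 1050 MHz, public spec
claimed_uplift     = 1.5      # from the Frontier Edition slide

implied_vega_tflops = fury_x_fp32_tflops * claimed_uplift
print(f"Implied Vega FP32: ~{implied_vega_tflops:.1f} TFLOPS")   # ~12.9
# That lines up with the ~13.1 TFLOPS you get from 4096 SPs at ~1600 MHz, so
# the 1.5x looks like a straight clock bump on the same shader count rather
# than hype in either direction -- which is exactly why FP32 throughput alone
# doesn't settle where it lands against Pascal in games.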
https://www.techpowerup.com/233443/raja-...hould-wait
Quote:In a blog post detailing AMD's Vega Frontier Edition graphics card, which we covered in-depth at the time of its announcement in AMD's Financial Analyst Day 2017, AMD's Radeon Technologies Group head Raja Koduri clarified that current machine learning poster child, the Vega Frontier Edition GPU, can also be used for gaming (who's to say some researchers, or pioneers, as AMD is so fond of calling them, won't be visiting Talos 1 themselves between coffee breaks?)

However, it is Raja Koduri's opinion that you should wait for Vega's gaming GPUs, since the Frontier Edition is "optimized for professional use cases (and priced accordingly)", and that if you want to game on AMD hardware, you should wait "just a little while longer for the lower-priced, gaming-optimized Radeon RX Vega graphics card." He then threw in a free "You'll be glad you did," as if Vega hasn't been a long, long time coming already.
https://www.techpowerup.com/233478/amd-r...-amds-labs
Quote:What we should be paying more attention to, though, is the partial graphics card that stands to the Frontier Edition's right side. It's only a partial view, granted, but the black and red color scheme is reminiscent of... well... AMD's gaming Radeon graphics cards. Could this actually be meant as a tease for one of the gaming-oriented RX Vega graphics cards?
https://www.extremetech.com/gaming/24961...later-year
Quote:For the past few months, I’ve been wondering if HBM2 has run into more problems than initially expected. When AMD introduced HBM, it seemed to be taking a leadership position on memory technology, much as it did with its adoption of GDDR4 and GDDR5 (Nvidia skipped GDDR4 and didn’t adopt GDDR5 until after AMD had done so). At the time, it seemed that AMD would be the only company to field HBM, but that Nvidia and AMD would both use HBM2 for their next generation of 14nm graphics cards. Instead, we saw Nvidia opt for GDDR5X, while AMD had no high-end solution at all.

Fast forward to 2017, and we've got AMD still stuck on GDDR5, GDDR6 waiting in the wings, and no sign of HBM2 on anyone's consumer graphics cards. Nvidia has rolled HBM2 out on its highest-end Quadro line, but those GPUs still cost thousands of dollars more than your typical consumer parts. Nobody is pushing HBM2 to consumer hardware yet, and Raja's willingness to specifically call out HBM2, the tech that's been difficult to get into consumer parts, as "not easy" seems to highlight what the hold-up is.

Several years ago, it looked as if HBM and HBM2 would become the de facto replacement for high-end graphics cards. Lower-end cards would still use GDDR5, but even this would eventually become less common, as RAM loadouts rose above 4GB for lower-end cards. That forecast looks much less certain now. If AMD and Nvidia can’t get HBM2’s costs under control, it’ll remain a fringe technology, limited to the highest-end SKUs. Once GDDR6 is widely available, both firms might move away from it altogether.

Of course, it's still possible that these are growing pains and that HBM2 will sort itself out, but the longer it takes to get the hardware into the market, the less likely that looks. AMD is already working on follow-up designs to Vega and Ryzen, and acknowledged during its FAD this week that it's simultaneously building the follow-up parts to those follow-up parts. HBM2 simply hasn't emerged as the affordable next-generation technology we were hoping it would be, and I suspect — though again, this is a suspicion, not a claim — that HBM2 is the reason why Vega's consumer launch is running late.*

* — AMD would undoubtedly quibble with whether Vega is late, since the chip will ship in a professional card within the first half of 2017, but let’s be realistic here. Consumer Vega hardware was expected in-market by Q2 2017, and AMD still isn’t saying when the consumer cards will launch.
http://techreport.com/news/31948/amd-say...-of-months
Quote:That statement touched off a firestorm of speculation on Reddit today after a commenter misconstrued Su's vague statement as a definite confirmation that the Radeon RX Vega would arrive "a couple months" after the Vega Frontier Edition. In reality, Su simply established a launch window for the range of Vega-powered products, which so far comprises the Radeon Vega Frontier Edition, the Radeon Instinct MI25 accelerator, and at least one Radeon RX Vega consumer card.

Given the counsel-approved vagueness of Su's statement, any Vega product could launch at any time over the next couple months, and that's all we know. More definite information, if there is any to be shared, will presumably need to wait for Computex next week.
https://www.techpowerup.com/233720/rosen...-for-intel
Quote:On the back of impressive performance, yield, and cost metrics for AMD's market-warping Ryzen and server-shaking EPYC processors, securities firm Rosenblatt Securities' Hans Mosesmann has affirmed a "Buy" rating for AMD's stock, while saddling Intel with a seldom-seen "Sell". All in all, there have been a number of changes in Intel's market ratings; there seems to be a downgrade trend towards either "Hold" or "Sell" scenarios compared to the usual "Buy" ratings given by hedge funds and financial analysts - ratings which are undoubtedly affected (at least in part) by AMD's Ryzen and EPYC execution.
...
Additional news (well, more like rumors at this point, but analysts may have more information than we do) on Ryzen's yields beating expectations, at over 80% for fully-functional 8-core dies, also served to shake this recommendation. This speaks to AMD's current momentum in the high-performance x86 market. Hands and hats down to AMD, Jim Keller and his team, as well as to Lisa Su's leadership, for this momentous fight-back, clawing their way to relevance again.
https://www.techpowerup.com/233887/amd-r...in-a-month
Quote:AMD at its Computex 2017 event announced that you may have to wait a lot longer for the consumer graphics variant of its "Vega" architecture. The Radeon RX Vega, the consumer graphics product based on the architecture, will launch at SIGGRAPH 2017, that's 30th July thru 3rd August. The Radeon Vega Frontier Edition, on the other hand, will launch by late-June, 2017. This card has a full-featured "Vega 10" silicon, and will be overpriced. We're not exactly sure who its target audience is, but it could mostly be enthusiasts wanting to try out "Vega" or for software/game-developers to begin optimizing their games for "Vega."
They showed off a dual Vega setup running Prey at 4K, you know, the game that a 1080 Ti handles solo just fine...

*IF* Vega came out and embarrassed the 1080 Ti, NV still wouldn't look all that bad with Volta right around the corner. Failing to go toe to toe with the 1080 Ti is a big failure; slotting in between the 1080 and the 1070 is catastrophic. Volta really is right around the corner, and given how absurdly sluggish AMD is at getting *anything* out, they need to be on par with the Volta parts. Launching a brand-new, hyped-for-years part that finishes fourth in a two-horse race is a fucking comical level of failure.
http://www.tomshardware.com/news/apple-v...34667.html
Quote:Vega's inclusion is the most surprising aspect of the iMac Pro. Apple previously restricted most of its iMacs to integrated graphics. This made them ill-suited for gaming--concerns about many games not supporting macOS aside--simply because the devices weren't powerful enough. We don't know what kind of performance Vega will offer at 5K resolutions, but it's bound to be better than whatever Intel's integrated graphics could provide. (To be clear: We assume Apple would've picked a processor with onboard graphics if it weren't going the Vega route.)

But we suspect Apple's sudden fondness for dedicated graphics results from its push to join the VR revolution. You're not going to get a worthwhile VR experience on the graphics found in the current iMac, Mac Mini, or even Mac Pro. Combine the graphical horsepower required to power VR with the need to push a bunch of pixels to a 5K display, and you have a pretty good reason to use modern graphics. Given its existing relationship with AMD, then, it's not surprising that Apple picked the soon-to-debut Vega GPU.

The iMac Pro is set to launch in December with a starting price of $4,999.
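The "push a bunch of pixels to a 5K display" point is easy to quantify. A quick arithmetic sketch using standard panel resolutions; nothing here comes from Apple or AMD:
Code:
# Pixel counts per frame for common resolutions versus the iMac Pro's 5K panel.
resolutions = {
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "5K":    (5120, 2880),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, count in pixels.items():
    print(f"{name}: {count:>10,} pixels  ({count / pixels['4K']:.2f}x 4K)")
# 5K is ~14.7 million pixels per frame, about 1.78x the load of 4K -- hence
# the argument that integrated graphics won't cut it for 5K plus VR.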
https://www.techpowerup.com/234403/amd-r...g-revealed
Quote:Radeon Pro Vega Frontier Edition, being a somewhat "enterprise-segment" product, was expected to have slightly lower TDP than its consumer-graphics sibling, since enterprise-segment implementations of popular GPUs tend to have slightly restrained clock speeds. Apparently, AMD either didn't clock the Radeon Pro Vega Frontier Edition low, or the chip has extremely high TDP.

According to specifications put out by EXXACT, a retailer which deals with enterprise hardware, the air-cooled variant of the Radeon Pro Vega Frontier Edition has a TDP rated at 300W, while its liquid-cooled variant has its TDP rated as high as 375W. To put this in perspective, the consumer-segment TITAN Xp by NVIDIA has its TDP rated at 275W. EXXACT is claiming big performance advantages in certain enterprise benchmarks such as SPECVIEWPERF and Cinebench. In other news, the air-cooled Radeon Pro Vega Frontier Edition is reportedly priced at USD $1,199; while the liquid-cooled variant is priced at $1,799. Based on the 14 nm "Vega 10" silicon, the Pro Vega Frontier Edition features 4,096 stream processors and 16 GB of HBM2 memory across a 2048-bit memory interface.
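The 2048-bit figure also lets you ballpark memory bandwidth. A minimal sketch; the per-pin data rate is my assumption (typical first-generation HBM2 ran around 1.6-1.9 Gbps per pin), not something from the article:
Code:
# Rough bandwidth estimate from the 2048-bit HBM2 interface mentioned above.
bus_width_bits = 2048      # from the article
pin_rate_gbps  = 1.89      # assumed per-pin rate, not stated in the article

bandwidth_gb_s = bus_width_bits * pin_rate_gbps / 8
print(f"~{bandwidth_gb_s:.0f} GB/s")     # ~484 GB/s at 1.89 Gbps per pin
# For comparison, the TITAN Xp's 384-bit GDDR5X at 11.4 Gbps per pin works out
# to about 547 GB/s, so the wide-but-slower HBM2 bus lands in the same ballpark.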