Navi Discussion Thread
#81
https://www.extremetech.com/computing/31...etely-fake
Quote:There’s a new set of slides being passed around that supposedly showcase AMD’s upcoming Radeon 6900 XT. They’re completely fake. Here’s how you can tell:

In the first slide, the branding has been updated at the upper right, but the branding on the actual GPU hasn’t been. Also, that’s a Radeon 5700 cooler with a Vega water-cooler next to it, and there’s a clear flaw in the image where the radiator attaches to the card.

The specs themselves are pretty reasonable. I’m not saying how accurate I think they are, but the specs are the only part of the slide that isn’t instantly recognizable as fake. At the very least, I’d have to get out a calculator and run some numbers first.

The fact that there’s a price on the card is another way you know this slide is fake. Price is always the last thing a company decides on.
Reply
#82
https://www.tomshardware.com/news/amds-r...nd-crashes
Quote:AMD recently rolled out the Radeon Software Adrenalin 2020 Edition driver version 20.20.01.05 for members of Microsoft's Windows Insider Program who work with the Windows Subsystem for Linux (WSL). As spotted by a Redditor, the latest software package incorporates a new feature where users fill out a simple form to report bugs and crashes.

Prior to the AMD Bug Report Tool, the option for reporting bugs consisted of a link that opened an internal web browser and sent Radeon users to AMD's website, which seemed like a primitive way of doing things. As simple as it may sound, the updated bug-reporting integration now opens a tool where users can send crash reports directly to AMD.

More importantly, the tool now grabs all your system's specifications automatically, information that you previously had to input manually every time you wanted to create a bug report.
Reply
#83
https://www.extremetech.com/gaming/31300...-navi-gpus
Quote:There are rumors going around that Big Navi might be dramatically faster than expected, with performance estimated at 1.95x – 2.25x higher than the 5700 XT. This would be an astonishing feat, to put it mildly. The slideshow below shows our test results from the 5700 XT and 5700. The 5700 XT matched the RTX 2070 (and sometimes the 2080) well, while the 5700 was modestly faster than the RTX 2060 for a slightly higher price. A 1.95x – 2.25x speed improvement would catapult Big Navi into playable frame rates even on the most demanding settings we test; 18 fps in Metro Exodus at Extreme detail and 4K becomes 35-41 fps depending on which multiplier you choose. I have no idea how Big Navi would compare against Ampere at that point, but it would handily blow past the RTX 2080 Ti.
...
I haven’t addressed the question of IPC at all, but I want to touch on it here. When Nvidia launched Turing, it paid a significant penalty in die size and power consumption relative to a GPU with an equivalent number of cores, TMUs, and ROPs but without the tensor cores and RT cores. What does that mean for AMD? I don’t know.

The Nvidia and AMD / ATI GPUs of any given generation almost always prove to respond differently to certain types of workloads in at least a few significant ways. In 2007, I wrote an article for Ars Technica that mentioned how the 3DMark pixel shader test could cause Nvidia power consumption to surge.

I later found a different 3DMark test (I can’t recall which one, and it may have been in a different version of the application) that caused AMD’s power consumption to similarly surge far past Nvidia’s.

Sometimes, AMD and Nvidia implement more-or-less the same solution to a problem. Sometimes they build GPUs with fundamental capabilities (like asynchronous compute or ray tracing) that their competitor doesn’t support yet. It’s possible that AMD’s implementation of ray tracing in RDNA2 will look similar to Nvidia’s in terms of complexity and power consumption penalty. It’s also possible that it’ll more closely resemble whatever Nvidia debuts with Ampere, or be AMD’s unique take on how to approach the ray tracing efficiency problem.

The point is, we don’t know. It’s possible that RDNA2’s improvements over RDNA1 consist of much better power efficiency, higher clocks, more CUs, and ray tracing as opposed to any further IPC gains. It’s also possible AMD has another IPC jump in store.

The tea leaves and indirect rumors from sources suggest, at minimum, that RDNA2 should sweep past the RTX 2000 family in terms of both power efficiency and performance. I don’t want to speculate on exactly what those gains or efficiencies will be or where they’ll come from, but current scuttlebutt is that it’ll be a competitive high-end battle between AMD and Nvidia this time around. I hope so, if only because we haven’t seen the two companies truly go toe-to-toe at the highest end of the market since ~2013.
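The multiplier math in that first paragraph is easy to sanity-check; a quick sketch (the 18 fps baseline is the article's own Metro Exodus number, the multipliers are the rumor):

```python
# Quick check of the rumored Big Navi scaling against the article's numbers:
# 18 fps on the 5700 XT in Metro Exodus at Extreme detail, 4K.
baseline_fps = 18
for multiplier in (1.95, 2.25):
    print(f"{multiplier}x -> {baseline_fps * multiplier:.1f} fps")
# 1.95x -> 35.1 fps, 2.25x -> 40.5 fps: the article's 35-41 fps range, rounded.
```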
Reply
#84
https://www.tomshardware.com/news/big-na...eon-rx6000
Quote:AMD dropped a surprise today with the first image of the Radeon RX 6000 alias Big Navi, which will give Nvidia's Ampere a run for its money for the best gaming graphics card. The chipmaker recently announced a keynote for Big Navi on October 28 at 10 a.m. PT and what better way to build up hype than to drop a small teaser before the event.
...
On an aesthetic level, the Radeon RX 6000, which is probably the reference edition, is definitely easy on the eyes. It seems AMD opted for a black and silver theme with some red highlights. The graphics card's profile reminds us a bit of Nvidia's 10-series Founders Edition shroud, but with more fans. Alternatively, it's a less boxy take on AMD's own Radeon VII. It includes dual 8-pin PCIe connectors and the Radeon logo, which likely has red (maybe RGB?) LED backlighting.
...
The graphics card draws power through two 8-pin PCIe power connectors. The PCIe slot delivers up to 75W and each 8-pin PCIe power connector can supply up to 150W, meaning the Radeon RX 6000 could theoretically pull up to 375W. It doesn't mean that it has to use that much power, of course. Big Navi probably isn't a power hog; the juice is there if the graphics card requires it.

Jarred created the above video flyby of the RX 6000 for us, with a short discussion of what it shows. Apparently, AMD has equipped the Radeon RX 6000 with one HDMI port, two DisplayPort outputs and the previously rumored USB-C port that's said to be present on certain Navi 21-based models. Given the lack of the lightning bolt symbol, it's safe to assume that it's a normal USB-C port and not a Thunderbolt 3 interface. The first thought that comes to mind is that the port is there to accommodate USB-C monitors that are becoming ever so popular these days.
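The 375 W ceiling quoted above is just spec arithmetic: the slot plus each connector's rated limit. A minimal sketch of that calculation:

```python
# Upper bound on board power = PCIe slot + auxiliary power connectors.
# Per-spec limits: x16 slot 75 W, 6-pin 75 W, 8-pin 150 W.
PCIE_SLOT_W = 75
CONNECTOR_W = {"6-pin": 75, "8-pin": 150}

def max_board_power(connectors):
    """Theoretical ceiling for a card with the given aux connectors."""
    return PCIE_SLOT_W + sum(CONNECTOR_W[c] for c in connectors)

print(max_board_power(["8-pin", "8-pin"]))  # 375 (the RX 6000 teaser card's config)
```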
Reply
#85
https://www.tomshardware.com/news/amd-di...avi-23-gpu
Quote:Continuing with AMD's tendency for fishy codenames, the chipmaker (via @Komachi Ensaka) has added support for a Dimgrey Cavefish graphics card to Mesa 20.3-devel. Much like Sienna Cichlid and Navy Flounder, Dimgrey Cavefish is presumed to be an RDNA 2 graphics card that'll surely unsettle the gaming graphics card hierarchy as we know it.

AMD has already committed to lifting the curtain on the Radeon RX 6000 series, which has been popularly baptized as Big Navi, on October 28. Therefore, it's not too surprising that the chipmaker's trio of next-generation graphics cards are doing the rounds in the wild. We don't have any factual information on AMD's RDNA 2 product stack, so it's wise to treat the specifications that are going around the hardware world with a truckload of salt.

Assuming that each Compute Unit (CU) in AMD's RDNA 2 architecture still equates to 64 Stream Processors (SPs), we can piece together some of the rumored specifications for AMD's Radeon RX 6000-series graphics cards.
...
The Dimgrey Cavefish is the latest RDNA 2 codename to pop up. Common wisdom tells us that Dimgrey Cavefish must be Navi 23, the last piece of the puzzle. The only logical assumption is that Navi 23 will be featured in either the Radeon RX 6600 or RX 6500, depending on AMD's intentions.

It's uncertain which graphics card AMD will announce on October 28. The chipmaker vaguely used the Radeon RX 6000 moniker. If we look back at RDNA 1, AMD started with the Radeon RX 5700 (XT) and eventually worked its way down the stack. Being optimistic, we would love for AMD to reveal Big Navi because the current graphics card market needs some competition in the higher tiers. Nvidia's recent GeForce RTX 3080 has proven to be a tough cookie, and Big Navi will likely be the most worthy competitor.
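For what it's worth, the CU-to-SP conversion the article applies is straightforward. A small sketch, assuming the article's 64 SPs per CU; the CU counts here are the rumored figures circulating at the time, not confirmed specs:

```python
# Rumored RDNA 2 configurations under the article's working assumption of
# 64 Stream Processors per Compute Unit. CU counts are rumors, not specs.
SP_PER_CU = 64
rumored_cus = {
    "Sienna Cichlid (Navi 21)": 80,
    "Navy Flounder (Navi 22)": 40,
    "Dimgrey Cavefish (Navi 23?)": 32,  # rumored figure; placement speculative
}
for codename, cus in rumored_cus.items():
    print(f"{codename}: {cus} CUs -> {cus * SP_PER_CU} SPs")
```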
Reply
#86
https://www.tomshardware.com/news/amds-i...-bandwidth
Quote:AMD (via @momomo_us) has trademarked the term "AMD Infinity Cache." The filing, which is on the Justia Trademarks website, applies to both the chipmaker's processor and graphics cards. In fact, the description of the trademark is so broad that it encompasses just about every type of silicon that AMD manufactures.

But the common consensus is that the trademark correlates with AMD's pending Big Navi launch. Memory bandwidth, among other aspects, is one of the major talking points about Nvidia's Ampere. The GeForce RTX 3090 flaunts an impressive memory bandwidth of up to 936.2 GBps. The GeForce RTX 3080 and GeForce RTX 3070 aren't too shabby either, with theoretical values that peak at 760.3 GBps and 448 GBps, respectively.
...
Other than the folks at AMD, we doubt anyone has any idea of what the Infinity Cache is truly all about. It might be a new feature, or it could just be a fancy term for an existing concept. For example, AMD branded the L3 cache on its Zen 2 processors as GameCache. It sounds great for marketing, but at the end of the day, it's still just the L3 cache that we've all come to know from most modern CPUs.
...
It remains a mystery whether the Infinity Cache actually refers to the L2 cache or a new L3 cache, or something else entirely. Graphics cards commonly come with L1 and L2 caches because the bigger caches are slower and induce higher latency.

There's a possibility that the Infinity Cache may be related to a patent that AMD filed last year on Adaptive Cache Reconfiguration Via Clustering. Subsequently, the authors published a paper on the topic. It talks about the possibility of sharing the L1 caches between GPU cores.

Traditionally, GPU cores have their own individual L1 cache, while the L2 cache is shared among all the cores. The suggested model proposes that each GPU core is allowed to access the other cores' L1 caches. The objective is to optimize cache use by eliminating the replicated data in each slice of the cache. The results are pretty amazing. Across a suite of 28 GPGPU applications, the new model improved performance by 22% (up to 52%) and energy efficiency by 49%.
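Side note: the headline bandwidth numbers quoted above fall straight out of bus width times per-pin data rate. A quick sketch (the article's 936.2/760.3 figures round the effective data rates slightly differently):

```python
# Theoretical memory bandwidth: bus width (bits) x data rate (Gbps) / 8.
def bandwidth_gbps(bus_width_bits, data_rate_gbps):
    return bus_width_bits * data_rate_gbps / 8

print(bandwidth_gbps(384, 19.5))  # RTX 3090, GDDR6X: 936.0 GBps
print(bandwidth_gbps(320, 19.0))  # RTX 3080, GDDR6X: 760.0 GBps
print(bandwidth_gbps(256, 14.0))  # RTX 3070, GDDR6:  448.0 GBps
```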

https://www.techpowerup.com/272946/amd-b...ures-536mm
Quote:Coreteks, in a video presentation on Sunday, released what is possibly the very first picture of the AMD "Big Navi" GPU silicon, which could power the company's next-generation Radeon RX 6000 series flagship graphics card. The grainy, blurry-cam picture reveals a mostly square package with a large, rectangular die at its center, which Coreteks estimates to be 536 mm² in die area, with 29 mm x 18.5 mm (LxW) dimensions. The channel used an unusual method for measuring the die size. The chip is rumored to feature around 80 compute units based on the RDNA2 graphics architecture, which includes fixed-function hardware for real-time raytracing, as RDNA2 is designed to meet DirectX 12 Ultimate logo requirements. We'll know more about the chip in the run-up to its October 28 unveiling.
Reply
#87
https://www.techpowerup.com/272868/amd-r...nufactured
Quote:A report originating from Cowcotland paints AMD as having ceased production of the Navi 10-powered RX 5700 XT and RX 5700. No reference or custom designs are currently being manufactured for either of these GPUs. AMD having ceased production of these cards makes sense, considering the upcoming announcement of the RX 6000 series scheduled for October 28th. This serves as a way for the supply channel to keep draining its stock of RX 5700 cards ahead of the upcoming RDNA 2 solutions. Their being discontinued means that AMD is looking to replace them - at least price-wise - in its product stack.

Interestingly, it appears that the RX 5600 XT is still being manufactured - it's likely AMD reduced manufacturing of Navi 10 so as to feed only this GPU, which should, as such, remain on the market for a little while until AMD launches an RDNA 2 equivalent - if those are the company's plans. TSMC capacity is freed up for additional wafers for other AMD product requirements - which, with Zen 3, next-gen consoles, and RDNA 2 all launching within the same time frame, should tend towards infinity.
Reply
#88
https://www.tomshardware.com/news/digite...5-rx6900xt
Quote:Not surprisingly, it looks like AMD's gaming flagship RX 6900 XT will be low in volume and hard to get once the cards launch tomorrow, which is the same story regarding all of AMD's RDNA2 cards at the moment -- and recent Nvidia offerings as well. Swiss retailer Digitec Galaxus has said that they only received 35 RX 6900 XT reference designs from MSI and Asus for distribution.

Although this is one retailer, Digitec can give us a good glimpse into what stock will be like worldwide.

To combat the issue, Digitec won't be selling these cards the usual way. Instead, the company has created a raffle system that anyone can participate in. As an added bonus, the retailer has thrown 25 RX 6800s and 17 RX 6800 XTs into the raffle as well. You can choose which model(s) you want to try and win.

Hopefully, more retailers implement this system, as it's almost scalper-proof: a raffle prevents bots from cheating their way into getting the cards first. But this system won't help the fact that 6900 XTs are going to be super rare, much rarer than Nvidia's RTX 3090, at least for now.

https://www.techpowerup.com/review/sapph...us/39.html
Quote:With these performance numbers, the Radeon RX 6800 XT is the perfect choice for 4K gaming at 60 FPS. It achieved that mark in nearly all titles in our test suite. Things are different once you turn on raytracing. Just like on NVIDIA, there's a hefty performance hit when running with the DirectX Raytracing API. We only tested two games so far, but it seems the loss in performance is bigger than on NVIDIA, who improved in that area with Ampere. Remember, this is AMD's first-generation raytracing implementation. Performance is still very respectable, reaching roughly RTX 2080 Ti levels. Now that RT hardware is available for both AMD and NVIDIA, and game developers are making console games on AMD's new RDNA2 architecture, it'll be interesting to see how raytracing performance evolves in the coming months.

In our review, AMD's RX 6800 XT reference cooler impressed us with good temperatures and even better noise levels. It's finally a large triple-slot design with three fans. This definitely sets the bar high for AMD's partners and their own cooler designs. We've seen excellent heatsinks from Sapphire before, and the Nitro+ is no exception. In our apples-to-apples cooler testing, we found out that Sapphire's cooler is definitely better than the AMD reference heatsink, sitting roughly between that and the massive NVIDIA RTX 3090 cooler. We've measured gaming temperatures of 75°C, 2°C lower than the AMD reference. Noise levels are pretty much identical, too, so it's safe to say that Sapphire's Nitro+ cooler will give you an experience comparable to the AMD RX 6800 XT reference. Sapphire does have an ace up its sleeve, and that's the dual BIOS. Once you toggle to the silent BIOS, noise levels go down a bit, by 1 dBA. With 30 dBA, the card is almost whisper quiet in heavy gaming—4K is no problem, very impressive. Just like on the AMD reference design, idle fan stop is included on the card to provide the perfect noise-free experience during desktop work, Internet browsing, media playback, and light gaming.

AMD surprised us with the power efficiency of their new Navi 21 RDNA 2 graphics processor, beating even NVIDIA's Ampere lineup. Despite the large factory overclock, Sapphire did not go overboard with power consumption. It's 15 W higher for 3% performance gained, a very reasonable tradeoff. The maximum power limit has been increased, too, so AMD's Boost algorithm can boost higher, for longer.

Back in my original review of the reference design, I had to increase the power limit on the AMD RX 6800 XT to see any meaningful performance gains from overclocking. This is not the case on the Sapphire Nitro+ because of the increase in the board power limit I just mentioned. The maximum manual overclock ended up slightly higher than both the PowerColor Red Devil, which we also tested today, and the AMD reference. These differences are small, though. I'm currently reviewing the ASUS RX 6800 XT STRIX Liquid Cooling; it'll be interesting to see the results for that card later today or tomorrow.
...
At $770, the Sapphire Nitro+ also goes up against custom-design RTX 3080 cards like the EVGA FTW3 and MSI Gaming X. Now, none of those graphics cards are in stock, of course, and people are paying insane prices to jump on the RDNA2 or Ampere train, so I'm sure Sapphire will sell everything they have even at that price point. No doubt, the RX 6800 XT and RTX 3080 are fantastic cards that will give you an amazing gaming experience, but there's only so much a graphics card can be worth. I heard from several board partners that their margins are really thin because AMD is charging so much money for the new GPU; I guess while stock is low, we're going to have to pay for that.
Reply
#89
https://www.tomshardware.com/news/amd-ra...-xt-review
Quote:After the more recent Nvidia GeForce RTX 3070 and GeForce RTX 3060 Ti cards, jumping up to a $1000 graphics card feels ludicrous. Sure, it's fast and can sometimes even beat Nvidia's top-shelf RTX 3090. Overall, however, the RX 6900 XT fails to impress relative to the RX 6800 XT. It's such an incremental bump in performance that it hardly seems worth the trouble. That's even assuming that there will be enough cards to meet the demand, which if recent history has taught us anything, there won't be.

By the numbers, the RX 6900 XT is only 4 to 7 percent faster than the RX 6800 XT, but it costs over 50 percent more. Okay, sure, you can't find the 6800 XT in stock for $649 right now, but at some point in 2021, that will no longer be the case. If you want the best high-end AMD graphics card, our pick still goes to the RX 6800 XT. But if you're open to other options, AMD has a tougher time of things.

Toss out ray tracing performance, and the RX 6900 XT looks very competitive, chalking up several wins against the RTX 3090. But if you're willing to spend over a grand on a new graphics card for gaming purposes, we simply can't overlook the ray tracing performance and current lack of a DLSS alternative. Yes, Super Resolution is coming, possibly by the time most of these GPUs are actually available for purchase, but DLSS 2.0 is here already and works great in quite a few games. However, even without DLSS, the RTX 3080 already leads the 6900 XT by an average of 25 percent at 1440p in ray tracing games.
...
As a professional card, the RX 6900 XT again has some potential. There are certain applications where AMD is more generous than Nvidia when it comes to optimized drivers. If you happen to use one of those apps, this could be the best overall value, but again the 6800 XT has the exact same features and specs, only with a few fewer shader cores.

That's the real difficulty with the top of the pecking order. You often get radically diminishing returns going from the second- or third-tier GPU to the fastest card. The RTX 3090 has the same problem, and we don't recommend it as a general gaming solution for the same reasons. However, there are still rumblings of an RTX 3080 20GB card (possibly called RTX 3080 Ti), which could offer both a memory and performance advantage over the 6900 XT when/if it comes out. If Nvidia releases that card in the next few months and prices it at $849, that could be worth waiting for. Which is fine, since anyone wanting a new GPU is likely going to be waiting regardless.
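Putting the review's 4-7 percent / 50-plus percent point into performance-per-dollar terms, at MSRP:

```python
# The review's value argument in numbers: MSRPs and the measured 4-7% gain.
price_6800xt, price_6900xt = 649, 999
price_ratio = price_6900xt / price_6800xt
print(f"price ratio: {price_ratio:.2f}x")          # ~1.54x (over 50% more)
for perf_gain in (0.04, 0.07):
    value = (1 + perf_gain) / price_ratio
    print(f"+{perf_gain:.0%} perf -> {value:.2f}x perf per dollar")
# ~0.68-0.70x: roughly 30% less performance per dollar than the RX 6800 XT.
```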

https://www.extremetech.com/gaming/31807...-xt-at-999
Quote:My own take (I’ve reviewed the 6800 XT but not the 6900 XT) is that the 6900 XT is AMD’s way of signaling it intends to compete in the high end of the graphics market once more, but that the company is still playing catch-up in some regards. This is not automatically a bad thing. If we look back to 2015, we see AMD nearly match the GTX 980 Ti, only to fall short of the mark with the Vega 64 in 2017. From 2016 – 2019, AMD’s most competitive positioning was between $100 – $300. In mid-2019, Navi debuted at higher prices with the 5700 and 5700 XT, and demonstrated that AMD was still capable of competing with Turing. With Big Navi in 2020, AMD has demonstrated that it can compete with Nvidia in the upper market once again — but Biggest Navi is still a bit of a reach.

Part of the reason for this, it should be said, is because AMD chose to emphasize high VRAM loadouts and relatively high clocks for its lower-end cards. AMD chose to weaken the RX 6900 XT’s positioning by improving the RX 6800 and RX 6800 XT, and while that makes their top-end solution a little bit of an underwhelming step up, it looks this way for the best possible reason.

Most Radeon gamers will, I suspect, be best-served by either the 6800 or the 6800 XT. Nevertheless, the 6900 XT sends a message to investors and enthusiasts that AMD intends to compete robustly in GPUs as well.

When you’ll actually be able to buy one of these cards is anyone’s guess. A recent PR from Swiss retailer Digitec revealed that the company had received just 35 cards for launch, implying that this GPU is going to be extremely difficult to find. In that sense, the entire discussion is academic, since you won’t really be able to buy a card until 2021 unless you want to pay 1.5x – 2.5x over list price. There are RTX 3090s going on eBay for $2,000 to $2,500, and some that list for even more, so the chances you can buy a new RDNA2 GPU before Christmas are small, no matter what.

https://www.techpowerup.com/review/amd-r...xt/41.html
Quote:The Radeon RX 6900 XT is AMD's return to the fight for the high-end graphics performance throne. The new Radeon flagship is surprisingly similar to the RX 6800 XT, with the biggest difference being the eight additional compute units, which bring the total shader count to 5,120 as opposed to 4,608 on the RX 6800 XT. Theoretically, this would suggest an 11% performance increase. Increasing the CU count also increases TMUs and ray accelerators because these are part of every CU. All the other specifications are identical: same ROPs, clocks, memory, L3 cache, TDP, and cooler. We confirmed with AMD that the L3 cache is running at the same frequency as on the RX 6800 XT. The only other change we noticed is that the GPU voltage circuitry has an additional power phase. Probably the most important facet that has remained the same is the 300 W TDP, which is important for power efficiency, heat, and noise.

Averaged over our test suite at 4K resolution, we find the RX 6900 XT 7% faster than the RX 6800 XT. This feels like a bit less than what AMD made us expect, and there are several important points to make here. Our test suite is 23 games strong, a large mix of games and engines. Obviously, not all of these use modern APIs like DX12 and Vulkan. Ten games are based on DX11, ten use DX12 and three are built around the Vulkan API—a realistic mix, I would claim. AMD has traditionally had more overhead in DirectX 11 games than NVIDIA, which is the main reason their scores are getting dragged down a bit. Since the overhead is per-frame and the RX 6900 XT delivers more frames than any other Radeon card, this effect is more noticeable than before. Titles to check out for this are Project Cars 3 and Divinity Original Sin 2. Do look through our games and exclude those not relevant for your buying decision—that's why we present all this data. It is likely that a lot of upcoming titles will use DX12, but I'm sure there will be other important games in the future that still run on DirectX 11. I'll use the Christmas holidays to take a closer look at our test suite to drop old games and add new ones, like Cyberpunk 2077.
...
We were impressed by the Radeon RX 6800 XT cooling solution, and the RX 6900 XT is no different. AMD was wise not to cheap out on this important component. The heatsink uses a large vapor-chamber base that sucks up heat from the GPU quickly, after which it is dissipated by the three slow-running fans. AMD has once again found the perfect fan settings for their card. At just 30 dBA, the card is nearly whisper quiet while pumping out over 60 FPS at 4K. The Radeon RX 6900 XT is also significantly quieter than NVIDIA's RTX 3090 Founders Edition—oh, how the tables have turned. Noise levels on the AMD reference card are better than on every single RTX 3090 custom design we've tested, and we tested all the important ones. If you want low noise, you have to go AMD. As we have seen in our RX 6800 XT custom design reviews, AMD's reference card is almost too good. The super-low noise levels are very hard for board partners to match, which makes finding a convincing selling point to justify a more expensive cooler difficult. Just like NVIDIA's GeForce RTX 30-series lineup, the Radeon RX 6000 series includes idle fan stop, too; the fans shut down completely when the card is idle at the desktop, running productivity applications, or browsing the Internet.

The secret sauce behind the low noise levels is power efficiency. NVIDIA learned this after the GeForce GTX 480, and AMD has finally cashed in on it now. With just 300 W in typical gaming, the RX 6900 XT uses only 20 W more than the RX 6800 XT, which leaves both cards exactly matched in efficiency. Compared to NVIDIA's lineup, the new AMD cards are more energy efficient; the RX 6900 XT is 12% more efficient than NVIDIA's RTX 3090, which runs at 366 W but offers a bit more performance. This 66 W difference is what allowed AMD to make their cooler whisper-quiet, even though it isn't as good as the one on the RTX 3090 Founders Edition.
...
AMD has set an MSRP of $999 for the RX 6900 XT, which is a lot of money and new territory for AMD. The RX 5700 XT launched at $400, and the Radeon VII at $700. No doubt, AMD looked at NVIDIA's pricing and thought that with the RX 6900 XT faster than the RTX 3080, it is definitely worth more than $700; and with the RTX 3090, which can't be caught, at $1500, they made it $999, a nice three-figure number. If the market were normal, I'd be unsure about how feasible this price point would be. The RTX 3080 is very similar in rasterization performance, better in raytracing, and runs a bit louder due to higher power draw, all for $300 less—most people would probably go for it. The RTX 3090, on the other hand, offers only minimal gains over RTX 3080, but screams "over the top" due to how NVIDIA positions it as the TITAN replacement—the RX 6900 XT is different. When comparing specification sheets, it looks like a small upgrade because most specifications are identical. I'm also not sure if I would be willing to spend an additional $350 over the RX 6800 XT for roughly 10% higher performance. In all fairness, I absolutely wouldn't spend +$800 for the RTX 3090 over the RTX 3080, either.

But, as we all know, the market isn't like that, as there is pretty much no stock of the new graphics cards, which has people willing to pay crazy prices just to jump on the Ampere and RDNA 2 train. Scalpers are making a killing and are probably launching their bots as we speak to gobble up what little stock exists. Merchants are jacking up prices because they know they will sell everything. Looking at recent launches from both AMD and NVIDIA, it seems MSRP prices are a fantasy, true only for the first batch and there to impress potential customers, with actual retail pricing ending up much higher. Let's hope that stock levels will improve in the coming weeks and months so people can actually get their hands on these new cards.

Obviously, the "Recommended" award in this context is not for the average gamer. Rather, it means you should consider this card if you have this much money to spend and are looking for an RX 6900 XT or RTX 3090.
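The per-frame overhead argument in there is worth making concrete: a fixed CPU/driver cost per frame eats a larger share of the frame budget the faster the GPU renders. A toy model with made-up numbers, not measurements:

```python
# Toy model of fixed per-frame driver overhead:
# frame_time = gpu_time / speedup + overhead.
# All numbers here are illustrative, not measured.
OVERHEAD_MS = 2.0      # hypothetical per-frame DX11 driver cost
BASE_GPU_MS = 20.0     # hypothetical GPU render time on a baseline card

for speedup in (1.0, 1.5, 2.0):
    ideal_fps = 1000 / (BASE_GPU_MS / speedup)
    real_fps = 1000 / (BASE_GPU_MS / speedup + OVERHEAD_MS)
    print(f"{speedup:.1f}x GPU: ideal {ideal_fps:.0f} fps, with overhead {real_fps:.0f} fps")
# The 2.0x card loses ~17 fps of its ideal 100 to the same 2 ms that costs
# the 1.0x card only ~5 fps, which is the effect the review describes.
```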
Reply
#90
https://www.tomshardware.com/news/amd-to...definitely
Quote:AMD has officially changed its mind and is promising to keep selling its reference design RX 6000 series cards indefinitely, according to a new tweet from AMD graphics business unit Vice President Scott Herkelman.
Reply
#91
https://www.tomshardware.com/news/amd-gp...let-patent
Quote:AMD jumped over to using a chiplet-based CPU design with the introduction of Zen 2 in the Ryzen 3000 series CPUs, enabling the chipmaker to cram more cores into a single CPU. Now, a new patent appears to reveal that AMD wants to do the same thing with GPUs (via ComputerBase).
...
As great as it all sounds, we don't want to get your hopes up for such a product to come out anytime soon. SLI and Crossfire died because getting multiple GPUs to work together across different cards is a pain, and even with AMD's proposed solution of bringing the GPU chiplets closer together with a high-bandwidth interconnect, there is still a lot of work to be done.

Chances are that if this manifests into a real-world product, it will first happen on a research-level scale, being aimed at supercomputers or scientific-purpose GPUs for users with high GPU power needs in single workstations. Such a ludicrous amount of GPU horsepower will likely need to be coupled to HBM memory just to keep up, so it's likely that you can rule out consumer products for some time to come.

All that being said, it's also very possible that this will never become a thing. Tech companies file a lot of patents, and most of them never end up being used.

https://www.techpowerup.com/276746/amd-p...adeon-gpus
Quote:AMD reports that the use of multi-GPU configurations is inefficient due to limited software support, which is the reason why GPUs were kept monolithic for years. However, it seems the company has found a way past the limitations and toward a workable solution. AMD believes that by using its new high-bandwidth passive crosslinks, it can achieve ideal chiplet-to-chiplet communication, where each GPU in the chiplet array would be coupled to the first GPU in the array. All the communication would go through an active interposer containing many layers of wires that serve as high-bandwidth passive crosslinks. The company envisions that the first GPU in the array would be communicably coupled to the CPU, meaning the CPU would possibly have to act as a communication bridge for the GPU arrays. Such a design would incur a big latency hit, so it is questionable what this really means.

The patent also suggests that each GPU chiplet uses its own Last Level Cache (LLC), with each of the LLCs communicably coupled so that the cache remains coherent across all chiplets rather than fragmenting into separate per-GPU caches. Rumors suggest that we are going to see the first chiplet-based architecture from AMD as a successor to the RDNA 3 generation, so it will happen in the coming years. AMD already has experience with chiplets from its processors, with Ryzen processors being the prime example. We just need to wait and see how it will look once it arrives for GPUs.

https://www.tomshardware.com/news/amd-us...shirapoint
Quote:AMD might have brought its RX 6800, RX 6800 XT, and RX 6900 XT graphics cards to market by now, but that doesn't mean the chef is done cooking up new recipes. In that light, three new entries (1, 2, 3) have surfaced in the USB-IF, as spotted by Komachi, pointing to a handful of new GPUs. There isn't much information to go on right now, so take the news with a pinch of salt until more details emerge.
...
This would leave the XTXH and XLE GPUs in the open, assuming these entries are for the Navi 21 family. If they're for the Navi 22 family, then it's very possibly a pointer to the RX 6700 and RX 6700 XT, among others.

The same principle applies to the Nashira Point codename -- at this time we simply don't have enough details to say which product (family) it refers to.

As a result, all we can do with this information is tell you that new products are on the horizon. The RX 6700 XT's BIOS already leaked late last year, and it pointed to some serious overclocking potential. With AMD's historic tendency to announce products at CES, it wouldn't come as a big surprise to see some official news next week.

https://www.techpowerup.com/276747/amd-r...d-incoming
Quote:Currently, it is unknown what the additional "H" means. It could indicate an upgraded version with more CUs, or perhaps a slightly cut-down configuration. It is unclear where such a GPU would fit in the lineup, or whether it is just an engineering sample that will never make it to market. It could represent a potential response from AMD to NVIDIA's upcoming GeForce RTX 3080 Ti graphics card; however, that is just speculation. Other options suggest that such a GPU could be part of a mainstream notebook lineup, just like Renoir comes in an "H" variant. We have to wait and see what AMD does to find out more.
Reply
#92
https://www.techpowerup.com/276813/amds-...s-in-march
Quote:A report coming from Cowcotland now points towards a 1Q2021 release for AMD's high-performance RX 6700 series, which was initially poised to see the light of day in the current month of January. The RX 6700 series will ship with AMD's Navi 22 chip, which is estimated to be half of the full Navi 21 chip (which puts it at a top configuration of 2560 Stream Processors over 40 CUs). These cards are expected to ship with 12 GB of GDDR6 memory over a 192-bit memory bus. However, it seems that AMD may have delayed the launch of these graphics cards. One can imagine that this move happens so as to not further dilute the limited TSMC wafers coming out of the factory between yet another chip, one which will undoubtedly have lower margins than the company's Zen 3 CPUs, EPYC CPUs, RX 6800 and RX 6900, and that doesn't have the same level of impact on its business relations as console-bound SoCs. Besides, it likely serves AMD best to put out enough of its currently-launched products to sate demand (RX 6000 series, Ryzen 5000, cough cough) than to launch yet another product with likely too-limited availability in relation to the existing demand.
Reply
#93
https://www.tomshardware.com/news/sonnet...-puck-navi
Quote:Sonnet has introduced its new family of external graphics solutions aimed primarily at Apple's Mac systems featuring an Intel CPU and a Thunderbolt 3 port. The latest eGPU Breakaway Puck boxes are powered by AMD's Navi graphics processors and enable you to attach a 6K display to an older Mac or just boost its graphics performance. As an added bonus, they can act as simple docking stations.
Reply
#94
https://www.techpowerup.com/277168/two-n...igns-spied
Quote:In her 2021 International CES keynote address, AMD CEO Lisa Su revealed a slide with two upcoming reference board designs. The slide, which points to what AMD has in store for 2021, illustrates two unannounced graphics cards and a notebook. The first of these cards is a dual-fan sibling of the RX 6000 series that's been doing the rounds for quite some time now, which is very likely the RX 6700 XT. The one next to it is interesting—a card with just one fan, which is likely the RDNA2 successor to the RX 5500 XT. The gaming notebook next to them brandishes both the Ryzen and Radeon logos, which means the company will not only launch the Ryzen 5000 mobile series based on "Zen 3," but also mobile variants of its Radeon RX 6000 RDNA2 series. The best part: all of these launch within the first half of 2021.
Reply
#95
https://www.tomshardware.com/news/amd-ra...p-review/9
Quote:The winner of the current GPU battle will be whichever company can produce the most GPUs first and ship them at reasonable prices, with features, performance, and all the other aspects being secondary concerns. If we take the RX 5700 XT and the RTX 2060 Super as $400 graphics cards for our baseline, the RX 6800 XT is around 75 percent faster than the 5700 XT and 90 percent faster than the 2060 Super. That means we could reasonably accept prices of $700-$800. Anything more than that and we recommend waiting and searching for a better deal.

We know for certain that, just as the 2017 GPU shortages eventually came to an end, the current shortages will also pass into history at some point. Hopefully, that happens sooner rather than later.
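Their price band follows directly from scaling the $400 baseline by the measured speedups:

```python
# Tom's Hardware's pricing logic: scale the $400 baseline by measured speedup.
BASELINE_PRICE = 400  # RX 5700 XT / RTX 2060 Super class
for baseline, speedup in (("RX 5700 XT", 1.75), ("RTX 2060 Super", 1.90)):
    print(f"vs {baseline}: ${BASELINE_PRICE * speedup:.0f}")
# vs RX 5700 XT: $700; vs RTX 2060 Super: $760, hence the $700-$800 band.
```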
Reply
#96
https://www.tomshardware.com/news/amd-rd...le-scaling
Quote:Modern CPU and GPU architectures support numerous features to temporarily boost performance or reduce power consumption. Some of these capabilities are advertised, others are not. AMD's RDNA2 GPUs appear to support so-called Duty Cycle Scaling (DCS), according to a Linux patch discovered by Phoronix. DCS can momentarily turn off the graphics core when it is under high load and then turn it back on, in a bid to reduce power consumption and meet strict TDP requirements.
...
To work properly, DCS has to be supported by the GPU, its firmware, driver, and operating system. At present it is unknown whether it is something that actually works and is enabled for at least some graphics subsystems.
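The duty-cycling idea reduces average power by gating the core for a slice of each cycle. A toy model with hypothetical figures, since the patch itself gives no numbers:

```python
# Toy duty-cycle scaling model: average power when the core is periodically
# gated off under load. All figures are hypothetical.
def avg_power_w(p_on_w, p_gated_w, duty_on):
    """duty_on is the fraction of each cycle the core actually runs."""
    return duty_on * p_on_w + (1 - duty_on) * p_gated_w

# A 300 W core gated 10% of the time at ~30 W while off:
print(avg_power_w(300, 30, 0.90))  # 273.0 W: fits a 275 W cap at ~90% residency
```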
Reply
#97
https://www.tomshardware.com/news/amd-ra...dr6-memory
Quote:AMD made it clear during its CES 2021 presentation that new mainstream RDNA 2 graphics cards will arrive in the first half of this year. The Radeon RX 6700 XT may be one of the very first models.

Gigabyte (via Komachi_Ensaka) today registered at least six custom Radeon RX 6700 XT graphics cards with the Eurasian Economic Commission (EEC), implying that we may be nearing the Radeon RX 6700 XT's release. While the Radeon RX 6700 XT's specifications are left to speculation, Gigabyte's submission at least confirms that the graphics card will come equipped with 12GB of GDDR6 memory.

There have been whispers that the Radeon RX 6700 XT would employ AMD's Navi 22 (codename Navy Flounder) silicon. Size-wise, the die should be smaller than Navi 21, which dwells inside the Radeon RX 6900 XT, RX 6800 XT (one of the best graphics cards) and RX 6800. Navi 22 may well end up with 40 Compute Units (CUs), adding up to a total of 2,560 Stream Processors (SPs).
...
While the Radeon RX 6900 XT and Radeon RX 6800 XT are rated for 300W, the Radeon RX 6700 XT should feature a more modest TDP in the range of 200W. If so, a single 8-pin PCIe power connector would suffice. If AMD is using the same display output formula for the Radeon RX 6700 XT as it did with the Radeon RX 6900 XT and RX 6800 XT, the Radeon RX 6700 XT will likely have an HDMI 2.1 port, two DisplayPort 1.4a outputs and possibly the USB-C port as well.

Current rumors suggest we could see RX 6700 XT (and possibly RX 6700 as well) cards launch by the end of March.
Reply
#98
https://www.techpowerup.com/277598/amd-i...let-design
Quote:AMD is about to enter the world of chiplets with its upcoming GPUs, just like it has been doing with its Zen generations of processors. Having launched a Radeon RX 6000 series lineup based on Navi 21 and Navi 22, the company is seemingly not stopping there. To remain competitive, it needs to be in a constant process of innovation and development, which is reportedly the case once again. According to current rumors, AMD is working on an RDNA 3 GPU design based on chiplets. The chiplet design is supposed to feature two 80 Compute Unit (CU) dies, just like the ones found inside the Radeon RX 6900 XT graphics card.

Having two 80 CU dies would bring the total core count to exactly 10,240 cores (two times the 5,120 cores of the Navi 21 die). Combined with the RDNA 3 architecture, which brings better performance per watt compared to the last-generation uArch, the Navi 31 GPU is going to be a compute monster. It isn't exactly clear when we are supposed to get this graphics card; however, it may be coming at the end of this year or the beginning of 2022.
Reply
#99
https://www.tomshardware.com/news/rx6700...n-rx6600xt
Quote:A potential leak from the EEC (Eurasian Economic Commission) shows that ASRock has filed information pertaining to the currently unannounced Radeon RX 6600 XT and an RX 6700 (vanilla). The info suggests ASRock's RX 6600 XT might feature 12GB of VRAM, with 8GB of VRAM for the RX 6700 variant.

Beware that EEC filings can be VERY misleading. We've seen false information pertaining to the RTX 30 series show up in the EEC database, so take this data with a grain of salt.

If this info about the RX 6600 XT and RX 6700 is at all accurate, it would seem AMD is duplicating Nvidia's shenanigans with video memory in the mid-range graphics card market. We're specifically talking about Nvidia's RTX 3060 featuring 12GB of VRAM, while its higher-tiered siblings — the RTX 3060 Ti and RTX 3070 — only include 8GB of VRAM.
...
We'll have to wait and see what AMD does. Again, take this data with a grain of salt, as EEC filings are not always 100 percent accurate. The Radeon RX 6600 XT and RX 6700 haven't been officially announced, and we don't have clear knowledge of the other specs either. Our best guess is that Navi 22 or Navi 23 will be used to build the 6600 XT and/or 6700, and that the cards will most likely ship in the March or April timeframe.
Reply
#100
https://www.tomshardware.com/news/amd-ra...nt-march-3
Quote:AMD will reveal a new Radeon RX 6000 graphics card during Episode 3 of its "Where Gaming Begins" event on March 3 at 11 AM US Eastern. Although the chipmaker didn't specify which model, it's likely going to be the much-awaited Radeon RX 6700 XT. Following AMD's Big Navi release pattern, the Radeon RX 6700 XT is the next SKU in line, after all.
...
The Radeon RX 6700 XT emerges with a shorter cooler sporting only two cooling fans. A quick glimpse at the front of the graphics card reveals three DisplayPort 1.4a outputs and a single HDMI 2.1 port. It would seem that AMD has removed the USB Type-C connector on the Radeon RX 6700 XT. While the USB Type-C port has its uses, it never really took off, so it may please consumers to know that it has been replaced with an extra DisplayPort 1.4a output.

The Radeon RX 6700 XT will be gunning for Nvidia's mid-range Ampere-based graphics cards, such as the GeForce RTX 3060 that launches tomorrow. The specifications for the new Big Navi (I guess this is really Medium Navi) graphics card are still blurry, but we expect to see a full Navi 22 (codename Navy Flounder) die, which houses 40 Compute Units (CUs). As AMD has done in the past, it's reasonable to think that the chipmaker would also put out a Radeon RX 6700, which would probably leverage a cut-down version of the Navi 22 silicon.

The rumors are painting the Radeon RX 6700 XT and RX 6700 with 2,560 and 2,304 Stream Processors (SPs), respectively. Assuming that the SP count is accurate, the XT variant will have 40 ray accelerators at its disposal, while the non-XT variant should be equipped with 36 of them.

On the memory side, Gigabyte has registered multiple custom Radeon RX 6700 XT graphics cards with the EEC (Eurasian Economic Commission) with 12GB of GDDR6 memory. Similarly, ASRock has submitted a couple of Radeon RX 6700 SKUs with 6GB of GDDR6 memory.

Pricing and performance are important, but availability has ultimately taken up a bigger role nowadays given the graphics card shortages, crypto-mining boom and scalpers. AMD has made it clear that it'll announce a Radeon RX 6000 graphics card on March 3. However, it'll be interesting to see if it will be available for purchase sooner rather than later.
Reply
#101
https://www.extremetech.com/extreme/3205...ced-at-479
Quote:AMD has announced its upcoming 6700 XT. As the name implies, it’s intended as the lower-end sibling of the 6800 and 6800 XT series and as the generational, drop-in replacement for the 5700 XT.

The 6700 XT will hit store shelves on March 18 at a price of $479. This represents a price increase relative to the previous card, which debuted at $400. Of course, given current GPU prices, anyone able to score a new card at anything adjacent to MSRP will likely feel as if they’ve won the lottery.
...
These gains should reduce the sting of higher prices somewhat — a 1.19x price increase ought to be “paid” for, in this instance, with a 1.3x – 1.4x performance improvement, which means AMD is objectively delivering a better value at that price point than it did 18 months ago. This is all for the good.

These clock boosts will also help offset the difference between the 6700 XT and its larger cousins. While the 6800 has a full 3840 cores, the base clock is just 1.8GHz. The 6700 XT has just 67 percent of the cores of the 6800 and 55 percent of the 6800 XT, but its base clock is 1.35x higher than the 6800 and 1.2x faster than the 6800 XT. This will help to close the performance gaps in compute-bound workloads.

Partner cards will be available at the same time as reference cards in an attempt to boost overall channel availability. AMD has not announced any plans to limit mining and it’s not clear how much cryptocurrency mining is creating problems right now. The GPU will focus on the 1440p segment, probably with some drops to 1080p for gamers who want smooth frame rates and added effects like ray tracing and are willing to drop resolution to get there.

Availability is likely to be minimal, despite AMD’s efforts. This is not a knock on AMD. Nvidia has had no success keeping GPUs on store shelves, either.
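The cores-times-clock argument from the quote, worked through as rough relative FP32 throughput (core counts and clock ratios are the article's; real boost behavior will differ):

```python
# Rough FP32 throughput scales with cores x clock. Per the article's ratios,
# the 6700 XT has 67% of the 6800's cores but a 1.35x higher base clock.
cards = {
    "RX 6800":    {"cores": 3840, "clock_ghz": 1.80},
    "RX 6700 XT": {"cores": 2560, "clock_ghz": 1.80 * 1.35},  # ~2.43 GHz
}
throughput = {name: c["cores"] * c["clock_ghz"] for name, c in cards.items()}
ratio = throughput["RX 6700 XT"] / throughput["RX 6800"]
print(f"6700 XT / 6800 relative throughput: {ratio:.2f}")  # ~0.90
```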

https://www.tomshardware.com/news/amd-fi...evelopment
Quote:AMD has kept its new DLSS competitor, FidelityFX Super Resolution, under wraps for some time now. That makes us wonder what's going on with the technology and when it will actually see the light of day. Fortunately, LinusTechTips received some insider knowledge from AMD as to why the supersampling tech is taking so long to develop.

Apparently, AMD wants FidelityFX Super Resolution to have some polish to it before release. AMD also wants it to be fully operational on all its graphics cards and RDNA-based consoles like the PS5 and Xbox Series X, at launch. The alternative would be slowly rolling out the technology, one platform at a time (assuming each platform proves capable).
...
AMD also wants to make Super Resolution GPU-agnostic, just like other FidelityFX libraries. That potentially means supporting many generations of GPUs, including Nvidia and possibly even Intel options. If AMD limited support for Super Resolution to RDNA2-based products, without getting it to work on the first-gen RX 5000 series, it would certainly draw flak. Getting it to work well on Vega integrated graphics and older GCN products like the RX 400 and 500 series GPUs, meanwhile, would make for a more compelling option for game developers.

Not only does it need to work on a variety of architectures, but it needs to look good and perform well. Simple resolution upscaling is easy, but it also causes a loss of visual quality. Doing all of this requires a lot of time, naturally. We do know AMD is actively working on FidelityFX Super Resolution. Hopefully, the DLSS alternative will work well and come out sooner rather than later.
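The "simple resolution upscaling is easy" remark is literally true; the baseline FSR has to beat is a one-liner. A minimal sketch using Pillow's bilinear filter (this is emphatically not AMD's algorithm, and the file names are hypothetical placeholders):

```python
# Naive spatial upscaling: the easy baseline the article contrasts FSR with.
# Plain bilinear filtering via Pillow; file names are placeholders.
from PIL import Image

frame = Image.open("frame_1080p.png")       # a hypothetical rendered frame
w, h = frame.size
upscaled = frame.resize((w * 2, h * 2), resample=Image.BILINEAR)
upscaled.save("frame_4k_naive.png")
# Cheap and universal, but it blurs fine detail: the visual-quality loss the
# quote mentions, and the gap reconstruction techniques like DLSS/FSR target.
```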
Reply

