Turing Discussion Thread
#1
https://www.techpowerup.com/245088/nvidi...rce-series
Quote:The BoM also specifies a tentative timeline for each of the main stages of product development leading up to mass-production. It stipulates 11-12 weeks (2-3 months) to mass-production and shipping, which could put the product launch some time in August (assuming the BoM was released some time in May-June). A separate table also provides fascinating insight into the various stages of development of a custom-design NVIDIA graphics card.
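A quick sanity check of that timeline in Python, taking the article's May-June assumption for the BoM date at face value (the window dates are illustrative, not leaked):

```python
from datetime import date, timedelta

# Illustrative BoM release window, per the article's May-June assumption.
bom_earliest = date(2018, 5, 1)
bom_latest = date(2018, 6, 30)

# The BoM stipulates 11-12 weeks from here to mass-production and shipping.
print("Earliest:", bom_earliest + timedelta(weeks=11))  # 2018-07-17
print("Latest:  ", bom_latest + timedelta(weeks=12))    # 2018-09-22
```

An August launch sits comfortably inside that window.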
#2
https://www.techpowerup.com/245318/nvidi...-indicates
Quote:It's more likely, though, that we're looking at a product launch and announcement that precedes the Hot Chips presentation. This breadcrumb trail could be not much more than wishful thinking, though: NVIDIA CEO Jensen Huang himself said at COMPUTEX 2018 that we might have to wait for a long time before new GeForce hardware is actually launched.

This is both expected and unexpected for a variety of reasons. Personally, I believe NVIDIA would only reap benefits by introducing its new 1100- or 2000-series GeForce graphics cards before AMD has its act together for its next-generation Radeon products. NVIDIA has enjoyed an earlier time to market for some time now, which means it tends to entrench itself in the market with its new solutions first, addressing users' urge to grab the next shiny piece of graphics hardware they can. At the same time, it gives the company the opportunity to launch products at raised upfront prices (if mumblings of increased base pricing of GeForce products to capitalize on expected cryptocurrency demand are anything to go by). This means the company could begin filling up its war chest for price cuts should AMD pull a rabbit out of its proverbial hat with an extremely competitive lineup of products, as it has done in the past.
#3
https://www.techpowerup.com/245399/nvidi...-inventory
Quote:Write-offs in inventory are costly (just ask Microsoft), and apparently, NVIDIA has miscalculated: it overestimated gamers' and miners' demand for its graphics cards. When it comes to gamers, NVIDIA's Pascal graphics cards have been available in the market for two years now; it's relatively safe to say that the majority of gamers who needed higher-performance graphics cards have already taken the plunge. As for miners, the cryptocurrency market contraction (and other factors) has led to a taper-off of graphics card demand for this particular workload. The result? NVIDIA's demand overestimation has led, according to Seeking Alpha, to a "top three" Taiwan OEM returning 300,000 GPUs to NVIDIA, and to "aggressively" increased GDDR5 buying orders from the company, suggesting an excess stock of GPUs that needs to be made into boards.

With no competition on the horizon from AMD, it makes sense that NVIDIA would give the market time to absorb its excess graphics cards. A good solution for excess inventory would be price cuts, but the absence of competition removes that pressure: NVIDIA's solutions are selling well against current AMD products in the market, so there is no need to artificially increase demand and lower ASPs in the process. Should some sort of pressure be applied, NVIDIA can lower MSRPs at a snap of its proverbial fingers.
#4
New GTX 1180 sighting, and this time it appears as Turing, uses GDDR6, and carries a September 28 release date: https://www.techpowerup.com/245657/nvidi...ese-stores
#5
Lenovo has reportedly leaked the existence of the GTX 1160: https://www.techpowerup.com/245719/lenov...e-gtx-1160
#6
Leaked email indicates that the GeForce 11 series is coming, starting on August 30, with the GTX 1180+, GTX 1180, GTX 1170, and GTX 1160: https://www.techpowerup.com/246207/nvidi...tes-leaked
#7
https://www.tomshardware.com/news/geforc...37498.html
Quote:The above information should be taken with a grain of salt. Gamer Meld didn’t give any clue as to which OEM the information came from, so we can’t begin to verify the details within.
#8
https://www.tomshardware.com/news/nvidia...37530.html
Quote:Nvidia's announcement says the "event will be loaded with new, exclusive, hands-on demos of the hottest upcoming games, stage presentations from the world’s biggest game developers, and some spectacular surprises." Naturally, speculation is running high that Nvidia will announce its latest GPUs, which are rumored to come to market in late August, at the event.

Nvidia is holding the event on August 20, which is the same date as Nvidia's now-canceled "Next Generation Mainstream GPU" presentation at Hot Chips 2018. That presentation was largely viewed as the first introduction of the finer details of the new GPU architecture, but the conference removed the listing from the schedule after extensive press coverage.

The event is open to the public. The Eventbrite registration says the venue has limited capacity and entry will be first-come, first-served. The company is also livestreaming the event, so you won't be entirely left out if you aren't in Cologne, Germany for the gaming celebration. Nvidia hasn't provided a link to view the livestreamed event, but we expect that information will come to light when it announces the location of the secretive event.
#9
https://www.tomshardware.com/news/nvidia...37391.html
Quote:The latest and perhaps most substantive development is a story from VideoCardz with an image of what may (or may not) be a leaked Nvidia-made GTX 1180 PCB. The image, which comes from a Baidu user, seems to show a reference board with both a six- and eight-pin power connector, a non-standard SLI connector (perhaps as part of an NVLink implementation), and a fairly small pinout area for the GPU itself. VideoCardz also points out that there is no DVI connector on this board. Perhaps Nvidia has nixed the aging port in favor of the reported VR-centric VirtualLink connector based on USB-C. There will certainly be gamers with older monitors who will be affected by this if that's the case--even if they just have to buy a new cable or an adapter.

As with any such leaks, it's tough to say anything for sure about this image. It could be legit, it could be doctored, or it could be a PCB for a future card other than the GTX 1180/GTX 2080. So you shouldn't take any of this as fact. But with a rumored launch at Gamescom in August, we may not have long to wait before we know more for certain.
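If the pictured 6-pin plus 8-pin layout is real, it bounds the board's power draw. A back-of-the-envelope check: the connector layout is the leak's claim, but the per-connector limits below are PCIe spec facts:

```python
# Maximum in-spec power for a board fed by the x16 slot plus 6-pin + 8-pin.
PCIE_SLOT_W = 75    # PCIe x16 slot limit
SIX_PIN_W = 75      # 6-pin auxiliary connector limit
EIGHT_PIN_W = 150   # 8-pin auxiliary connector limit

print(PCIE_SLOT_W + SIX_PIN_W + EIGHT_PIN_W, "W")  # 300 W
```

That is the same 300 W ceiling the GTX 1080 Ti reference board works within, so the connector choice alone doesn't point to a dramatically hungrier card.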
#10
The latest AIDA64 beta release lists the GTX 1180: https://www.techpowerup.com/246565/lates...e-gtx-1180
#11
https://www.neowin.net/news/the-founders...oling-fans
Quote:It's not uncommon for third-party graphics card makers to build dual-fan GPUs based on reference designs from Nvidia, but this would be the first time the company has used such a design on its own variant of the card. This could indicate that the card will have significantly more power inside, and thus the need for additional cooling.

This is backed up by comments from GPU manufacturer Galax, which says that performance will see "breakthrough growth" with Nvidia's upcoming line of hardware. The company also says to expect information about the cards sometime in September, which could be the closest thing we have to an official announcement date, though this should still be taken with a grain of salt. Earlier this year, the cards were expected to debut in June, an expectation that never materialized.

https://www.techpowerup.com/246656/nvidi...-dual-fans
Quote:The PCB pictures revealed preparation for an unusually strong VRM design, given that this is an NVIDIA reference board. It draws power from a combination of 6-pin and 8-pin PCIe power connectors, and features a 10+2 phase setup, with up to 10 vGPU and 2 vMem phases. The size of the pads for the ASIC and no more than 8 memory chips confirmed that the board is meant for the GTX 1080-successor. Adding to the theory of this board being unusually hot is an article by Chinese publication Benchlife.info, which mentions that the reference design (Founders Edition) cooling solution does away with a single lateral blower, and features a strong aluminium fin-stack heatsink ventilated by two top-flow fans (like most custom-design cards). Given that NVIDIA avoided such a design for even big-chip cards such as the GTX 1080 Ti FE or the TITAN V, the GTX 1080-successor is proving to be an interesting card to look forward to. But then what if this is the fabled GTX 1180+ / GTX 2080+, slated for late-September?
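For a sense of why a 10+2 phase layout reads as "unusually strong," here is a rough per-phase load estimate. The core power and voltage figures are placeholders of my own choosing (roughly GTX 1080-class), not leaked specs:

```python
# Illustrative per-phase current for the rumored 10-phase vGPU VRM.
core_power_w = 180.0    # assumed core power, ~GTX 1080 class (placeholder)
core_voltage_v = 1.0    # assumed core voltage (placeholder)
vgpu_phases = 10

print(core_power_w / core_voltage_v / vgpu_phases, "A per phase")  # 18.0 A
```

Modern power stages are commonly rated for 40 A or more, so ten phases at ~18 A each would leave generous headroom, or room for a much hungrier chip.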
#12
Turing officially announced; Quadro cards are coming first, with no word yet on Turing GeForce cards: https://www.tomshardware.com/news/nvidia...37599.html
#13
Turing Quadro cards are using GDDR6: https://www.techpowerup.com/246759/samsu...s-solution
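The bandwidth math behind the GDDR6 excitement, with an assumed 14 Gb/s per-pin rate and 256-bit bus (common figures in the GTX 1080-successor rumors, not confirmed specs):

```python
# Peak memory bandwidth = per-pin data rate x bus width / 8 bits per byte.
def bandwidth_gb_s(gbps_per_pin, bus_bits):
    return gbps_per_pin * bus_bits / 8

print(bandwidth_gb_s(14, 256))  # 448.0 GB/s (assumed GDDR6 config)
print(bandwidth_gb_s(10, 256))  # 320.0 GB/s (GTX 1080's GDDR5X, for scale)
```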
#14
https://www.tomshardware.com/news/ray-tr...37600.html
Quote:Nvidia has been pushing ray-tracing technology for at least a decade. In 2008, it acquired a ray-tracing company called RayScale, and two years later at Siggraph 2010, it showed the first interactive ray-tracing demo running on Fermi-based Quadro cards. After witnessing the demo first-hand, we surmised that we would see real-time ray-tracing capability “in a few GPU generations.”

A few generations turned into six generations, but Nvidia finally achieved real-time ray tracing with the new Quadro RTX lineup. When the company releases gaming-class GPUs that support real-time ray tracing, which could happen as soon as next week, we should see a big improvement in graphics fidelity in future video games. Real-time ray tracing is a foundational step towards game graphics that are indistinguishable from the real world around us.
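For anyone wondering what a ray tracer fundamentally computes: it fires a ray per pixel and asks which surface that ray hits first. A toy sketch of the core ray-sphere intersection test (nothing like how RTX hardware implements it, just the underlying math):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest hit, or None.

    This is the operation a ray tracer repeats billions of times per frame.
    """
    # Solve |origin + t*direction - center|^2 = radius^2 for t.
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                       # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)  # nearest of the two roots
    return t if t > 0 else None

# A ray from the origin pointing down -z toward a unit sphere at z = -5:
print(ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # 4.0
```

Doing this against millions of triangles per ray, with bounces for reflections and shadows, is why dedicated hardware was needed to make it real-time.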

https://www.extremetech.com/computing/27...w-possible
Quote:Jen-Hsun Huang is claiming that this new GPU represents a fundamental shift in capabilities and could drive the entire industry towards a new mode of graphics. It’s possible he’s right — Nvidia is in a far more dominant position to shift the graphics industry than most companies. But I’m also reminded of another company that thought it could revamp the entire graphics industry with a new GPU architecture that would be a radical departure from everything anyone had done before, with a specific goal of enabling real-time ray tracing (RTRT). The company was Intel, the GPU was Larrabee, and the end result was not much in particular. After a brief flurry of interest, Intel killed the card and the industry went along its path.

Obviously, that’s not going to happen here, given that Nvidia is shipping silicon, but the major question will be whether the very different techniques associated with real-time ray tracing can catch on with developers and drive a major change in how consumer graphics are created and consumed. The odds of a global market transformation in favor of real-time ray tracing will increase substantially if companies like AMD and eventually Intel throw their own weight behind it.
#15
https://www.tomshardware.com/news/ashes-...37605.html
Quote:Which brings us back to Ashes of the Singularity: Escalation. We've long used the game to benchmark new graphics cards and processors--it offers a variety of quality settings, relies on both the GPU and CPU and is a generally accepted measure of how well a system handles real-time strategy games. Now someone with the handle "nvgtltest007" has used the game on Crazy settings at 4K resolution to benchmark the "Nvidia Graphics Device" paired with an Intel Core i7-5960X clocked at 3GHz (and yes, we suspect that gobbledygook of a username is supposed to refer to secret testing of Nvidia cards).

The "Nvidia Graphics Device" appeared to score well enough. It managed to squeeze out 75.1, 60.6 and 57.4 frames per second in the game's normal, medium and heavy batch tests, respectively. The recorded CPU frame rates were 138.6, 118.4 and 91.9, respectively. That averages out to a frame rate of 63.5 and CPU frame rate of 113 on Crazy settings. For reference, we've gotten between 40 and 59.5 frames per second out of the GeForce GTX 1080 Ti 11GB with Crazy settings at a 3,840 x 2,160 resolution. Therefore, this mysterious Nvidia device bottoms out near the top of what the GTX 1080 Ti can achieve.

Those numbers should be taken with a pound of salt, however, because we didn't run the tests on the "Nvidia Graphics Device" ourselves. We don't know all the differences between the setups and methodologies we use and what "nvgtltest007" uses. But if the benchmark is at least close to accurate, there's another reason to be excited for whatever Nvidia plans to announce sometime soon. We're looking forward to getting this mystery device into our own test systems so we can benchmark it ourselves. And, you know, finally play some Crysis.
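Quick arithmetic on those leaked figures (the batch results are from the quote above; the straight mean is mine):

```python
gpu_fps = [75.1, 60.6, 57.4]    # normal, medium, heavy batch tests
cpu_fps = [138.6, 118.4, 91.9]

print(f"GPU mean: {sum(gpu_fps) / 3:.1f}")  # 64.4 vs. the reported 63.5
print(f"CPU mean: {sum(cpu_fps) / 3:.1f}")  # 116.3 vs. the reported 113
```

The straight means land slightly above the reported averages, which suggests the benchmark weights the heavier batches more (they render more frames); either way, the result sits at the top of the quoted GTX 1080 Ti range.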

