AMD de-thrones nVidia with DX12!
#81
(09-14-2015, 03:14 PM)BenSkywalker Wrote: This talk about nVidia hardware lacking support for async compute is truly disturbing. It must be the absolute truth, or nVidia would have done something to refute it by now. At this point, the only thing I can think of that would give any credibility to their stance would be for them to travel back in time, several years, and build something into their GPUs that used async compute in a manner that would truly stand out to end users. Given the API limitations of the time this would be tough, but maybe they could do... I don't know... how about physics? They could come up with some dumb marketing name - we'll call it PhysX for the sake of argument - and use async compute to do things like generate large amounts of particles and handle things like cloth and fog... stuff like that. They could do this on older DX11/10/9 games, adding physics-based calculations that ran concurrently with the graphics engine on the same GPU, to prove that their parts had been built from the ground up, for many years, with async compute in mind.

Of course, something like that is way outside of the realm of possibility.

I remain undisturbed.
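
For what it's worth, the "runs concurrently with the graphics engine" part isn't magic; it's just work submitted on a separate queue or stream so the GPU is allowed to overlap it with everything else. Here's a toy CUDA sketch of that overlap - mine, not anything out of PhysX, and the kernel names and numbers are made up:

Code:
// Toy example: two independent kernels launched into two non-default streams.
// With free SMs available, the hardware may execute them concurrently,
// which is the same mechanism "compute alongside rendering" relies on.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void updateParticles(float* pos, float* vel, int n, float dt) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        vel[i] += -9.81f * dt;   // toy gravity
        pos[i] += vel[i] * dt;
    }
}

__global__ void busyWork(float* buf, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) buf[i] = buf[i] * 0.5f + 1.0f;   // stand-in for "the other" GPU workload
}

int main() {
    const int n = 1 << 20;
    float *pos, *vel, *buf;
    cudaMalloc(&pos, n * sizeof(float));
    cudaMalloc(&vel, n * sizeof(float));
    cudaMalloc(&buf, n * sizeof(float));
    cudaMemset(pos, 0, n * sizeof(float));
    cudaMemset(vel, 0, n * sizeof(float));
    cudaMemset(buf, 0, n * sizeof(float));

    cudaStream_t physics, other;
    cudaStreamCreate(&physics);
    cudaStreamCreate(&other);

    dim3 block(256), grid((n + 255) / 256);
    // Both launches return immediately; the streams are independent,
    // so the scheduler is free to run the kernels at the same time.
    updateParticles<<<grid, block, 0, physics>>>(pos, vel, n, 1.0f / 60.0f);
    busyWork<<<grid, block, 0, other>>>(buf, n);

    cudaDeviceSynchronize();
    printf("done: %s\n", cudaGetErrorString(cudaGetLastError()));

    cudaStreamDestroy(physics);
    cudaStreamDestroy(other);
    cudaFree(pos); cudaFree(vel); cudaFree(buf);
    return 0;
}

Whether the two actually overlap depends on how much of the GPU the first launch leaves free, but that is the entire mechanism.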

Last I checked, my 980Ti is a faster card than a Fury X pretty much across the board. When DX12 games matter, I'll have a Pascal chip that will likely be faster than Fury X2.

Back in the day people yelled a lot about how DX9 was faster on the ATi cards, which was true.

I think at the time the 6800 series launched and beat ATi at DX9, there were around three games that used DX9, and they weren't exactly big sellers. (Dirt car racing and one of the Tomb Raiders?)

AMD is doing what AMD always does: trying to create fear that future games will favor their hardware. (See the 8m Advocate threads saying "When future games are coded properly, you will need moar corez and a Faildozer!")

Rolleyes

The one hope they have this time is that MS (a company that matters) is probably pushing DX12 hard for the XBone. With their 18% discrete market share, AMD graphics can only influence devs who work in their basements.
#82
(09-26-2015, 06:00 PM)RolloTheGreat Wrote:
(09-14-2015, 03:14 PM)BenSkywalker Wrote: This talk about nVidia hardware lacking support for async compute is truly disturbing. [...]

I think at the time the 6800 series launched and beat ATi at DX9, there were around three games that used DX9, and they weren't exactly big sellers. (Dirt car racing and one of the Tomb Raiders?) [...]
Yo wassup - glad you're still kicking nuts around here!

Yeah, and what was all of this nonsense talk about NV having their Fermi GPUs ready for DX12?  By then, I'd be giving away Fermi cards for free! 

I expect AMD to beat NV to the punch with 16nm FinFET (as usual; they were first to 90nm, 80nm, 55nm, 40nm, and 28nm). Hopefully it won't be like Intel's Skylake, which is completely unimpressive, with hardly any power savings over Haswell at the same 4.4GHz clock (14nm vs 22nm) - but at least 16nm is a full-node shrink from 28nm.

BTW, it was DX9.0c that gave NV an edge with the 6800 Ultra, when AMD was still limited to DX9.0b. Games like BioShock, which came out in 2007, required DX9.0c as a minimum and were rather unplayable on the X800XT. Still, I absolutely loved my X800XT AIW!!!! It was great with Far Cry, one of the first DX9 games ever - released in early 2004 (and perhaps the most impressive one until Crysis came out). What really helped NV with its 6800GT/Ultra cards was the SLI feature introduced with PCI-E, with Doom 3 as the killer app, since a single 6800 Ultra blew the X800XT out of the water there. Then with 7800GTX SLI I started playing everything in 3D, and from that point on NV was the leader for me in gaming experience, with superior 3D driver support. Good days. Now, boring days until 16nm.
#83
I think you give AMD way too much credit these days. They're just stretched too thin, too broke.

Look at the last two launches:

The 290X: they slap some poor, dead-broke dust buster of a cooler on it and reincarnate the FX5800 Ultra. You think they didn't KNOW that fan was as shitty as it gets?

You can just imagine the meeting:

Engineers: Boss, the 290X is a REALLY strong chip! If we put a good fan on it, the press will treat us like the second coming! NVIDIA high-end sales will CEASE!

Mgmt: We have a TON of 7970 fans left. Use those, and make a "dual mode" so the schmucks think we're doing it to help them overclock!

Fast forward two years and we get a tweaked Tonga with a new kind of memory and a cheap pump.

Poverty = death in the chip design industry.
#84
Quote:The one hope they have this time is that MS (a company that matters) is probably pushing DX12 hard for the XBone.

Amusingly, this isn't going to help them nearly as much as they like to think. The XBone has 3 ACEs; Sony made AMD add additional units for their hardware because the CPUs were so stupidly weak they couldn't come close to matching the PS3's CPU. Games that push the ACEs hard on the XBone are going to be using less than half the available resources on PC hardware.

Quote:I remain undisturbed.

Was I too subtle in my facetiousness? Big Grin

Fermi supports 16 async threads-

http://on-demand.gputechconf.com/gtc-exp...ebinar.pdf
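
The slides cover it properly, but as a rough sketch of my own (the names and sizes are arbitrary, not from the PDF): concurrent kernels are exposed through streams, and the runtime will tell you whether a given part supports them at all:

Code:
// Launch several independent kernels into separate streams; on hardware that
// supports concurrent kernels (Fermi and later), they can execute in parallel
// up to the hardware's limit.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void smallKernel(float* data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] = data[i] * 2.0f + float(i % 7);
}

int main() {
    cudaDeviceProp prop;
    cudaGetDeviceProperties(&prop, 0);
    printf("%s: concurrentKernels = %d\n", prop.name, prop.concurrentKernels);

    const int kStreams = 16;   // the figure quoted above for Fermi
    const int n = 1 << 16;
    cudaStream_t streams[kStreams];
    float* buffers[kStreams];

    for (int s = 0; s < kStreams; ++s) {
        cudaStreamCreate(&streams[s]);
        cudaMalloc(&buffers[s], n * sizeof(float));
        cudaMemset(buffers[s], 0, n * sizeof(float));
    }

    // Independent work in independent streams; the scheduler may run these
    // concurrently rather than back to back.
    for (int s = 0; s < kStreams; ++s)
        smallKernel<<<(n + 255) / 256, 256, 0, streams[s]>>>(buffers[s], n);

    cudaDeviceSynchronize();

    for (int s = 0; s < kStreams; ++s) {
        cudaStreamDestroy(streams[s]);
        cudaFree(buffers[s]);
    }
    return 0;
}

Fermi-class parts cap that at 16 kernels in flight; later architectures raise the limit.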

Their focus group is downright painful to read; they have no fucking clue what they're talking about in terms of any of the technology. I swear, they get a memo written by a guy at AMD who also doesn't understand what the fuck he's talking about, and then manage to mangle it even further.
#85
(09-28-2015, 05:48 PM)BenSkywalker Wrote:
Quote:The one hope they have this time is that MS (a company that matters) is probably pushing DX12 hard for the XBone.

Amusingly, this isn't going to help them nearly as much as they like to think. The XBone has 3 ACEs; Sony made AMD add additional units for their hardware because the CPUs were so stupidly weak they couldn't come close to matching the PS3's CPU. Games that push the ACEs hard on the XBone are going to be using less than half the available resources on PC hardware.

Feeble AMD APU is feeble? I'm inclined to agree with you that DX12 won't bring parity with the PS4. Somewhere here I posted that my thought is that what it might do for the XBone is add some games to the 1080p list that wouldn't have been there otherwise, and bring it closer to the PS4 on that metric.


(09-28-2015, 05:48 PM)BenSkywalker Wrote:
Quote:I remain undisturbed.

Was I too subtle in my facetiousness? Big Grin

Fermi supports 16 async threads-

http://on-demand.gputechconf.com/gtc-exp...ebinar.pdf

Their focus group is downright painful to read; they have no fucking clue what they're talking about in terms of any of the technology. I swear, they get a memo written by a guy at AMD who also doesn't understand what the fuck he's talking about, and then manage to mangle it even further.

When I was in NVIDIA's focus group they'd ask me for ideas on who to recruit as people came and went; it seemed like they wanted to have 5-7 members. Then we'd chat about candidates and whether they seemed knowledgeable and able to put forward a professional image of the company. (You may remember I tried to get you in back in the day, but you declined.)

The AMD path seems to be: "Let's recruit as many dumbasses as we can with posting contests. We'll award them points for the number of times they post 'Arf arf arf! I love AMD and you should too!' or 'Arf arf arf! NVIDIA is the devil!' and then give them games and hardware based on the number of posts."

With us it was always, "Here's our new stuff. Use it, tell us if you find any bugs or things that aren't user friendly, and post about your experiences online, good or bad." When you're using what is usually the highest-end stuff in existence, you don't have much negative to say, because what you have is better than, or at least as good as, anything else. So they got free advertising and QA out of the deal, because it's pretty easy to get excited when you're building quad SLI rigs and seeing you can do things like ginormous AA.
#86
(09-28-2015, 11:44 PM)RolloTheGreat Wrote:

Feeble AMD APU is feeble? I'm inclined to agree with you that DX12 won't bring parity with the PS4. Somewhere here I posted that my thought is that what it might do for the XBone is add some games to the 1080p list that wouldn't have been there otherwise, and bring it closer to the PS4 on that metric.

Some more games to the 1080p list, when the PS4 already does many games at 900p or even 720p?!?? 

Maybe just 10% faster with DX12 (or 10% more games running at 1080p), YAYAYAYAYA!!!!

The XBone doesn't even have more than 1 good Kinect game (while the Xbox 360 has a few). The PS4 doesn't even have the Move controllers (I absolutely love them for my Sharpshooters - it's my favorite part of the PS3... about 20 great games work with the Move controllers).

This is by far the most boring, unexciting console generation since Sega Genesis vs. Super Nintendo, when everything else was fail.