AMD de-thrones nVidia with DX12!
#1
http://arstechnica.com/gaming/2015/08/di...or-nvidia/

Now, this is only one test but it looks *very* promising for AMD. I have to say that I'm very impressed to see a 290x beat a GTX 980Ti! If you look at the 1080p results, the 290x's framerate doubled!

It's too bad that this is only going to affect new DX12 games and not current games. Still, it does make current AMD cards look much more future proof than we thought before!
#2
(08-21-2015, 03:14 AM)SickBeast Wrote: http://arstechnica.com/gaming/2015/08/di...or-nvidia/

Now, this is only one test but it looks *very* promising for AMD.  I have to say that I'm very impressed to see a 290x beat a GTX 980Ti!  If you look at the 1080p results, the 290x's framerate doubled!

It's too bad that this is only going to affect new DX12 games and not current games.  Still, it does make current AMD cards look much more future proof than we thought before!

Looks like good news for Xbone, as the only console with DX12.
#3
(08-21-2015, 05:46 AM)RolloTheGreat Wrote:
(08-21-2015, 03:14 AM)SickBeast Wrote: http://arstechnica.com/gaming/2015/08/di...or-nvidia/

Now, this is only one test but it looks *very* promising for AMD.  I have to say that I'm very impressed to see a 290x beat a GTX 980Ti!  If you look at the 1080p results, the 290x's framerate doubled!

It's too bad that this is only going to affect new DX12 games and not current games.  Still, it does make current AMD cards look much more future proof than we thought before!

Looks like good news for Xbone, as the only console with DX12.

Sony has identical hardware and they use OpenGL. Now please correct me if I'm wrong, but can they not do the exact same thing as DX12 with OpenGL? Perhaps DX12 will be more optimized and easier to program for, but we really don't know this yet. Also, check this out:

http://www.cinemablend.com/games/Xbox-On...66567.html

Quote:“I think there is a lot of confusion around what and why DX12 will improve. Most games out there can’t go 1080p because the additional load on the shading units would be too much. For all these games DX12 is not going to change anything,”

“They might be able to push more triangles to the GPU but they are not going to be able to shade them, which defeats the purpose. To answer the first question, I think we will see a change in the way graphics programmers will think about their pipelines and this will result in much better systems hopefully.”
#4
DX12 is Windows 10 only. Not much written for Win 10 yet.
#5
(08-21-2015, 06:03 AM)dmcowen674 Wrote: DX12 is Windows 10 only. Not much written for Win 10 yet.

Yes, however Windows 10 is a free upgrade and tons of people are going to have it soon. It looks to me as though what Microsoft is doing with Windows 10 and DX12 is going to be a huge boon to PC gaming.

The Xbox One also has it, or at least will have it. I don't think they have updated it yet.

In any event, I'm hopeful that a lot of the newer games that come out will be DX12.
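On that note, if anyone wants to check whether their own box (Windows 10 plus a capable GPU and driver) actually exposes a DX12 device, a quick check looks something like this. Just a rough sketch of my own - the helper name is made up and error handling is skipped - but calling D3D12CreateDevice with a null output pointer is the documented way to test for support without creating anything:

Code:
#include <d3d12.h>   // link against d3d12.lib

// Hypothetical helper (illustration only): true if the default adapter can
// provide a D3D12 device at feature level 11_0, the minimum DX12 requires.
// Passing nullptr as the last argument only tests support; nothing is created.
bool Dx12Available()
{
    return SUCCEEDED(D3D12CreateDevice(nullptr,                 // default adapter
                                       D3D_FEATURE_LEVEL_11_0,  // minimum feature level
                                       __uuidof(ID3D12Device),
                                       nullptr));               // don't create the device
}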
#6
PS4 has the same hardware as XBONE, only better where the GPU is concerned. Anything M$ can do on XBONE Sony can do on PS4 too.

Vulkan is DX12 for non m$ devices.

As for DX12 on PC's, I wouldn't expect a faster uptake than DX10 or DX11 back in their heyday.
Adam knew he should have bought a PC but Eve fell for the marketing hype.

Homeopathy is what happened when snake oil salesmen discovered that water is cheaper than snake oil.

The reason they call it the American Dream is because you have to be asleep to believe it. -- George Carlin
#7
It doesn't seem that the XBONE would see that much improvement from fully mature DX12 code over its older code. I'd be willing to bet that most of that older code already implements most of the significant improvements DX12 brings over DX11 on Windows, since console code is already written much closer to the metal than ordinary Windows DX11 code.
Ok with science that the big bang theory requires that fundamental scientific laws do not exist for the first few minutes, but not ok for the creator to defy these laws...  Rolleyes
#8
(08-21-2015, 08:25 AM)gstanford Wrote: PS4 has the same hardware as XBONE, only better where the GPU is concerned.  Anything M$ can do on XBONE Sony can do on PS4 too.

Vulkan is DX12 for non m$ devices.

As for DX12 on PC's, I wouldn't expect a faster uptake than DX10 or DX11 back in their heyday.

Same for DX9 - it didn't seem that fast either, taking a couple of years to come into full effect and mature. But with how popular Windows 10 is turning out to be, along with the XBONE using DX12 as the basis for future console ports, DX12 is just as much of a no-brainer as DX9 ever was.
Ok with science that the big bang theory requires that fundamental scientific laws do not exist for the first few minutes, but not ok for the creator to defy these laws...  Rolleyes
#9
(08-21-2015, 03:14 AM)SickBeast Wrote: http://arstechnica.com/gaming/2015/08/di...or-nvidia/

Now, this is only one test but it looks *very* promising for AMD.  I have to say that I'm very impressed to see a 290x beat a GTX 980Ti!  If you look at the 1080p results, the 290x's framerate doubled!

It's too bad that this is only going to affect new DX12 games and not current games.  Still, it does make current AMD cards look much more future proof than we thought before!

WOW!!!!!!!!!

This is amazing.  I'm hoping for the sake of AMD that it's not just this one game that happened to be poorly optimized for NV cards using DX12 code. 

But what gets me real excited from the link you shared is:

Quote:Tied into this is Split-Frame Rendering (SFR). Instead of multiple GPUs rendering an entire frame each, a process known as Alternate Frame Rendering (AFR), each frame is split into tiles for each GPU to render before being transferred to a display. In theory, this should eliminate much of the frame variance that afflicts current multi-GPU CrossFire and SLI setups.

Finally, DX12 will allow for multiple GPUs to pool their memory. If you've got two 4GB graphics cards in your machine, the game will have access to the full 8GB.
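For anyone wondering what that actually looks like for a developer: under DX12's explicit multi-adapter model the driver no longer hides the second GPU behind AFR/SFR profiles; the game itself enumerates every adapter, creates a device on each, and decides how to split the frame (and the memory) between them. Here's a very rough sketch of just the enumeration step, written by me as an illustration rather than taken from any engine:

Code:
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>   // link d3d12.lib and dxgi.lib
#include <vector>

using Microsoft::WRL::ComPtr;

// Create an independent D3D12 device on every hardware adapter in the system.
// How work (e.g. one screen tile per GPU, SFR-style) and memory get divided
// between these devices is then entirely up to the application.
std::vector<ComPtr<ID3D12Device>> CreateDevicesForAllAdapters()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue;  // skip the WARP software rasterizer

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device);
    }
    return devices;
}

As far as I can tell the "pooled 8GB" in the quote isn't automatic, either - it means the game can address each card's 4GB separately and keep different data on each, instead of AFR forcing both cards to hold identical copies of everything.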
Ok with science that the big bang theory requires that fundamental scientific laws do not exist for the first few minutes, but not ok for the creator to defy these laws...  Rolleyes
#10
Yes, but as is always the case with vaporware, the actual products fail to deliver.
#11
[Image: zHOilTgh2G.png]
#12
[Image: 6g8uc9.jpg]
#13
Can you imagine the water cooler talk at intel the day Bulldozer was released?

"Dude! AMD guys made a CPU that is slower than their last one!"

"WTF?! If we tried slacking like that, we'd get fired! Those lucky bastards say it will be faster in future apps and go back to drinking!"

Kidding aside, AMD is making the second best CPUs and GPUs on the planet, and that in itself is an amazing achievement. Problem is no one wants second best.
#14
I thought it was actually 3rd best unless they have somehow managed to outperform their old 6 core Phenom II's.
Adam knew he should have bought a PC but Eve fell for the marketing hype.

Homeopathy is what happened when snake oil salesmen discovered that water is cheaper than snake oil.

The reason they call it the American Dream is because you have to be asleep to believe it. -- George Carlin
#15
(08-21-2015, 09:32 AM)gstanford Wrote: I thought it was actually 3rd best unless they have somehow managed to outperform their old 6 core Phenom II's.

Vishera trumps the 6 cores.

The point is that what AMD does is world-class work; only one company beats them. As such, they're smarter guys than us, and than most people.

But, they're not the best, so they get the "moar corez" jokes.

It's all money. When you have the most cash, you can hire the brightest guys, give them the best tools, most staff. When you don't have the cash you get the guys with the second best ideas, cut corners with computer modelling, and put moar corez on your chip because there has to be some differentiating factor for your product. (even if it doesn't really matter)
#16
They got the "moar corez" joke for going backward in IPC compared to both their older designs and their competitors' designs, while focusing on (relatively) unimportant things like more processor cores per die.

Then you get people on other forums hoping that Zen will be a return to form for AMD. What they don't realize is that Bulldozer etc IS a return to form for AMD. Most AMD x86/x64 processors have sucked, frankly. The only real exceptions to that rule have been 5x86, K7 & K8. Everything else has been subpar (even Phenom II).

Crap processors are AMD's norm with things like K7 & K8 being the exception to the rule. Even there they got lucky. Intel was asleep at the wheel with P4 when K7/K8 arrived.
Adam knew he should have bought a PC but Eve fell for the marketing hype.

Homeopathy is what happened when snake oil salesmen discovered that water is cheaper than snake oil.

The reason they call it the American Dream is because you have to be asleep to believe it. -- George Carlin
#17
SFR was actually around at the beginning of NVIDIA SLI and was discarded.

Good discussion of why here:

https://forums.geforce.com/default/topic...patabilty/
#18
I thought SFR was still around, mainly as part of AFR2. The mode I remember going away/never really being available was the tiled/scissors mode.
Adam knew he should have bought a PC but Eve fell for the marketing hype.

Homeopathy is what happened when snake oil salesmen discovered that water is cheaper than snake oil.

The reason they call it the American Dream is because you have to be asleep to believe it. -- George Carlin
#19
Good luck with getting SFR to work with DX10+ games, Gstan! Just read the thread that Rollo shared (where NV's employee chimed in).

Back when I put up with the headaches of running cards in SLI (like with 7900 GTXes), I forced my online shooter games to run with SFR enabled instead of AFR. AFR was about 20% faster, but the input lag was unbearable. AFR also didn't allow for true triple buffering (unless you count trying to force something like quad buffering), hence the worse input lag and frame rate "fractioning": whenever the frame rate dropped below the vsync refresh rate, it fell to 1/2, 1/3, 1/4, etc. of the refresh rate. That's when I realized AFR didn't allow for true triple buffering even in the single-player games where I didn't mind the input lag.
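To put numbers on that "fractioning": with plain double-buffered vsync, a frame that misses a refresh has to wait for the next vblank, so the displayed rate snaps down to 1/2, 1/3, 1/4 of the refresh rate instead of degrading smoothly. A little back-of-the-envelope sketch of my own (a simplified model that ignores triple buffering, which exists precisely to avoid this stall):

Code:
#include <cmath>
#include <cstdio>

// With double-buffered vsync the GPU waits a whole number of refresh intervals
// per frame, so the displayed rate is refresh / ceil(refresh / renderFps).
double VsyncedFps(double refreshHz, double renderFps)
{
    double refreshesPerFrame = std::ceil(refreshHz / renderFps);
    return refreshHz / refreshesPerFrame;
}

int main()
{
    std::printf("%.1f fps shown\n", VsyncedFps(60.0, 59.0)); // 30.0 - just missing 60 halves it
    std::printf("%.1f fps shown\n", VsyncedFps(60.0, 29.0)); // 20.0 - the next step is 1/3
    std::printf("%.1f fps shown\n", VsyncedFps(60.0, 19.0)); // 15.0 - then 1/4, and so on
    return 0;
}

True triple buffering decouples rendering from scan-out so the rate degrades smoothly instead, which is exactly what AFR made so hard to get.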

Good thing SFR can finally become feasible once again with DX12, in games designed for DX12 from the ground up.
Ok with science that the big bang theory requires that fundamental scientific laws do not exist for the first few minutes, but not ok for the creator to defy these laws...  Rolleyes
#20
From the link in the OP:

Quote:On the other hand, Nvidia's cards are very much designed for DX11. Anandtech found that any pre-Maxwell GPU from the company (that is, pre-980 Ti, 980, 970, and 960) had to either execute in serial or pre-empt to move tasks ahead of each other. That's not a problem under DX11, but it potentially becomes one under DX12.

What was this BS about all NV cards from the 4xx series and up (Fermi and Kepler) being "ready for DX12" that Poppin and Gstan loved to spout all over the forums??? Would these cards even do SFR rendering properly, with decent scaling, in games designed for DX12? We'll see.

Even the GTX 980 Ti (Maxwell 2.0) can't do the job as nicely under DX12 as under DX11, while AMD's cards get about a 2x performance boost over DX11. In effect, if Fury were included in the benchmark, AMD would be mopping the floor with the Titan X at every resolution, let alone in the 99th percentile frame times.
Ok with science that the big bang theory requires that fundamental scientific laws do not exist for the first few minutes, but not ok for the creator to defy these laws...  Rolleyes
#21
(08-22-2015, 09:16 PM)BoFox Wrote: From the link in the OP:

Quote:On the other hand, Nvidia's cards are very much designed for DX11. Anandtech found that any pre-Maxwell GPU from the company (that is, pre-980 Ti, 980, 970, and 960) had to either execute in serial or pre-empt to move tasks ahead of each other. That's not a problem under DX11, but it potentially becomes one under DX12.

What was this BS about all NV cards from the 4xx series and up (Fermi and Kepler) being "ready for DX12" that Poppin and Gstan loved to spout all over the forums??? Would these cards even do SFR rendering properly, with decent scaling, in games designed for DX12? We'll see.

Even the GTX 980 Ti (Maxwell 2.0) can't do the job as nicely under DX12 as under DX11, while AMD's cards get about a 2x performance boost over DX11. In effect, if Fury were included in the benchmark, AMD would be mopping the floor with the Titan X at every resolution, let alone in the 99th percentile frame times.
Someone else was telling me on another forum that the AMD cards are *much* more optimized for DX12 and support more features. I'm not sure if that's why the benchmarks I linked to are so one-sided. The thing about those benchmarks, though, is that if you look at the nVidia scores, they gained nothing at all from DX12 and actually lost some performance in some cases. That makes no sense to me unless that particular game only supports DX12 features that the AMD cards have and the nVidia cards don't. My suspicion, however, is that the game or the drivers are not properly optimized for nVidia yet. It's still just a beta and it's hard to draw firm conclusions from it (yet).
#22
Honestly I'll worry about DX12 when Pascal arrives and DX12 games arrive that don't have developers closely tied to AMD.
Adam knew he should have bought a PC but Eve fell for the marketing hype.

Homeopathy is what happened when snake oil salesmen discovered that water is cheaper than snake oil.

The reason they call it the American Dream is because you have to be asleep to believe it. -- George Carlin
#23
AMD pouring money into drivers to boost DX12 performance
or
AMD bribing a cash-strapped dev into rigging Ashes of the Singularity against Nvidia

Which requires less money for a cash-strapped AMD?
Valve hater, Nintendo hater, Microsoft defender, AMD hater, Google Fiber hater, 4K lover, net neutrality lover.
#24
The AMD money pot for cheating, bribing and shilling appears to be infinite as far as anyone can tell. Sadly the money pot for the rest of AMD's operations isn't.
Adam knew he should have bought a PC but Eve fell for the marketing hype.

Homeopathy is what happened when snake oil salesmen discovered that water is cheaper than snake oil.

The reason they call it the American Dream is because you have to be asleep to believe it. -- George Carlin
#25
(08-22-2015, 10:05 PM)SteelCrysis Wrote: AMD pouring money into drivers to boost DX12 performance
or
AMD bribing a cash-strapped dev into rigging Ashes of the Singularity against Nvidia

Which requires less money for a cash-strapped AMD?

Well, the developer of this game said this:

Quote:"Specifically, that the application has a bug in it which precludes the validity of the test. We assure everyone that is absolutely not the case. Our code has been reviewed by Nvidia, Microsoft, AMD and Intel. It has passed the very thorough D3D12 validation system provided by Microsoft specifically designed to validate against incorrect usages. All IHVs have had access to our source code for over year, and we can confirm that both Nvidia and AMD compile our very latest changes on a daily basis and have been running our application in their labs for months."


Is Nvidia saying anything about this yet?  I'd like to hear an explanation from one of NV's employees. 
 
It seems that AMD was hoping for DX12 to come out by the time the XBONE launched, but then it was delayed. Otherwise, why would AMD have bothered with the ACEs starting 3 years ago (and then quadrupled the ACEs with the 290 cards):

Quote:The company's GCN architecture has long featured asynchronous compute engines (ACE), which up until now haven't really done it any favours when it comes to performance.  Under DX12, those ACEs should finally be put to work, with tasks like physics, lighting, and post-processing being divided into different queues and scheduled independently for processing by the GPU.

[Image: click?format=go&jsonp=vglnk_144026359218..._575px.png]

That was from a few months ago (interesting that BF4 didn't use ACEs for the PC version). http://www.anandtech.com/show/9124/amd-d...us-shading
Good thing that the 2nd-gen Maxwell cards can finally mix compute queues with graphics operations. NV was lucky that the Radeon cards were held back by the lack of full-fledged DX12 capabilities to really push them to the max.
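For what it's worth, here's roughly what "putting the ACEs to work" looks like from the application side under D3D12: the game creates a second command queue of type COMPUTE alongside the normal DIRECT (graphics) queue and pushes physics/lighting/post-processing work there. Whether the two queues actually overlap on the GPU is entirely up to the hardware and driver - which is exactly the GCN-vs-Maxwell question. Just a sketch with a made-up function name, not code from any real engine:

Code:
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Create the usual graphics queue plus a separate async compute queue.
// Cross-queue synchronization is explicit in D3D12: the app signals and waits
// on ID3D12Fence objects itself (queue->Signal / queue->Wait).
void CreateGraphicsAndComputeQueues(ID3D12Device* device,
                                    ComPtr<ID3D12CommandQueue>& graphicsQueue,
                                    ComPtr<ID3D12CommandQueue>& computeQueue)
{
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;      // graphics + compute + copy
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&graphicsQueue));

    D3D12_COMMAND_QUEUE_DESC computeDesc = {};
    computeDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE; // compute + copy only
    device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(&computeQueue));
}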
Ok with science that the big bang theory requires that fundamental scientific laws do not exist for the first few minutes, but not ok for the creator to defy these laws...  Rolleyes
#26
My 980 Ti runs DX12 in an emulator. Sort of like a Glide wrapper.

When NV irons out the kinks of the wrapper it will be every bit as fast at DX12 as the XBone. You guys just aren't patient!
#27
(08-22-2015, 09:39 PM)SickBeast Wrote:
(08-22-2015, 09:16 PM)BoFox Wrote: From the link in the OP:

Quote:On the other hand, Nvidia's cards are very much designed for DX11. Anandtech found that any pre-Maxwell GPU from the company (that is, pre-980 Ti, 980, 970, and 960) had to either execute in serial or pre-empt to move tasks ahead of each other. That's not a problem under DX11, but it potentially becomes one under DX12.

What was this BS about all NV cards from the 4xx series and up (Fermi and Kepler) being "ready for DX12" that Poppin and Gstan loved to spout all over the forums??? Would these cards even do SFR rendering properly, with decent scaling, in games designed for DX12? We'll see.

Even the GTX 980 Ti (Maxwell 2.0) can't do the job as nicely under DX12 as under DX11, while AMD's cards get about a 2x performance boost over DX11. In effect, if Fury were included in the benchmark, AMD would be mopping the floor with the Titan X at every resolution, let alone in the 99th percentile frame times.
Someone else was telling me on another forum that the AMD cards are *much* more optimized for DX12 and support more features. I'm not sure if that's why the benchmarks I linked to are so one-sided. The thing about those benchmarks, though, is that if you look at the nVidia scores, they gained nothing at all from DX12 and actually lost some performance in some cases. That makes no sense to me unless that particular game only supports DX12 features that the AMD cards have and the nVidia cards don't. My suspicion, however, is that the game or the drivers are not properly optimized for nVidia yet. It's still just a beta and it's hard to draw firm conclusions from it (yet).

Clearly the performance going backwards is a good indication that something isn't right. This developer does have close ties to AMD, but I 100% expect nvidia's performance to improve. Have they even released a true dx12 driver yet?

I wouldn't underestimate nvidia's software team; just look at what they accomplished in dx11. We don't even know what real games will be like or how closely this demo will resemble real dx12 game engines. It could be written specifically around GCN. Remember the Star Swarm demo? Nvidia has some talented teams that are capable of truly impressive things.

Regardless, I think most people can agree that nvidia has a lot of work to do in this demo. I don't know how urgent it may be, but I do believe they can surely improve these pathetic results.
#28
(08-22-2015, 09:45 PM)gstanford Wrote: Honestly I'll worry about DX12 when Pascal arrives and DX12 games arrive that don't have developers closely tied to AMD.

I wouldn't be surprised to see DX12 adoption come faster than it did for past DX versions.

ATi/AMD/NVIDIA can't really push an API's adoption; they are small companies compared to MS.

Although the XBone is a tiny piece of MS's revenue, they seem willing to put some money into its success.

Yes, GStan, all XBone has to do is make money to be a "success". In the business world, the only thing that matters is profit, not your fanboy fapping.

http://www.anandtech.com/show/8936/micro...venue-gain

Quote:Computing and Gaming Hardware had an 11% year-over-year drop in revenue, down to $4 billion. Gross margin is much lower for hardware, but it did show a 12% gain over 2014, to $460 million.

Most of that is probably Surface profit, but the software side is where MS is making cash with XBone:

Quote: First-party games had a strong quarter, with revenue up 79%, driven by Minecraft, Halo, and Forza franchises. Xbox Live had revenues up 42%, which is attributed to higher Xbox Live transactions.

Look at all the cash MS is spending on backwards compatibility, acquisition of Minecraft, streaming- it's pretty obvious they are heavily invested in computer and Xbox gaming and willing to put money there.

They spent $2.6b on Minecraft alone. That is over four times Sony's profit for the last fiscal year, on one game.

MS has the power to drive development trends, with cash and market share. (Xbox + PC)
#29
Quote:Yes, GStan, all XBone has to do is make money to be a "success". In the business world, the only thing that matters is profit, not your fanboy fapping.

I bet M$ would love to see the XBOX division in general make a profit and the XBONE in particular. Would make a nice change from it being a ravenous money pit for M$

As for fanboy fapping there is only one person here pushing shit uphill - you with the XBONE!
Adam knew he should have bought a PC but Eve fell for the marketing hype.

Homeopathy is what happened when snake oil salesmen discovered that water is cheaper than snake oil.

The reason they call it the American Dream is because you have to be asleep to believe it. -- George Carlin
#30
(08-23-2015, 08:20 AM)ocre Wrote:
(08-22-2015, 09:39 PM)SickBeast Wrote:
(08-22-2015, 09:16 PM)BoFox Wrote: From the link in the OP:

Quote:On the other hand, Nvidia's cards are very much designed for DX11. Anandtech found that any pre-Maxwell GPU from the company (that is, pre-980 Ti, 980, 970, and 960) had to either execute in serial or pre-empt to move tasks ahead of each other. That's not a problem under DX11, but it potentially becomes one under DX12.

What was this BS about all NV cards from the 4xx series and up (Fermi and Kepler) being "ready for DX12" that Poppin and Gstan loved to spout all over the forums??? Would these cards even do SFR rendering properly, with decent scaling, in games designed for DX12? We'll see.

Even the GTX 980 Ti (Maxwell 2.0) can't do the job as nicely under DX12 as under DX11, while AMD's cards get about a 2x performance boost over DX11. In effect, if Fury were included in the benchmark, AMD would be mopping the floor with the Titan X at every resolution, let alone in the 99th percentile frame times.
Someone else was telling me on another forum that the AMD cards are *much* more optimized for DX12 and support more features. I'm not sure if that's why the benchmarks I linked to are so one-sided. The thing about those benchmarks, though, is that if you look at the nVidia scores, they gained nothing at all from DX12 and actually lost some performance in some cases. That makes no sense to me unless that particular game only supports DX12 features that the AMD cards have and the nVidia cards don't. My suspicion, however, is that the game or the drivers are not properly optimized for nVidia yet. It's still just a beta and it's hard to draw firm conclusions from it (yet).

Clearly the performance going backwards is a good indication that something isn't right. This developer does have close ties to AMD, but I 100% expect nvidia's performance to improve. Have they even released a true dx12 driver yet?

I wouldn't underestimate nvidia's software team; just look at what they accomplished in dx11. We don't even know what real games will be like or how closely this demo will resemble real dx12 game engines. It could be written specifically around GCN. Remember the Star Swarm demo? Nvidia has some talented teams that are capable of truly impressive things.

Regardless, I think most people can agree that nvidia has a lot of work to do in this demo. I don't know how urgent it may be, but I do believe they can surely improve these pathetic results.

Good point, agreed. Must've been something NV hasn't fully optimized for yet.
Ok with science that the big bang theory requires that fundamental scientific laws do not exist for the first few minutes, but not ok for the creator to defy these laws...  Rolleyes
#31
(08-23-2015, 10:29 PM)BoFox Wrote:
(08-23-2015, 08:20 AM)ocre Wrote:
(08-22-2015, 09:39 PM)SickBeast Wrote:
(08-22-2015, 09:16 PM)BoFox Wrote: From the link in the OP:

Quote:On the other hand, Nvidia's cards are very much designed for DX11. Anandtech found that any pre-Maxwell GPU from the company (that is, pre-980 Ti, 980, 970, and 960) had to either execute in serial or pre-empt to move tasks ahead of each other. That's not a problem under DX11, but it potentially becomes one under DX12.

What was this BS about all NV cards from the 4xx series and up (Fermi and Kepler) being "ready for DX12" that Poppin and Gstan loved to spout all over the forums??? Would these cards even do SFR rendering properly, with decent scaling, in games designed for DX12? We'll see.

Even the GTX 980 Ti (Maxwell 2.0) can't do the job as nicely under DX12 as under DX11, while AMD's cards get about a 2x performance boost over DX11. In effect, if Fury were included in the benchmark, AMD would be mopping the floor with the Titan X at every resolution, let alone in the 99th percentile frame times.
Someone else was telling me on another forum that the AMD cards are *much* more optimized for DX12 and support more features. I'm not sure if that's why the benchmarks I linked to are so one-sided. The thing about those benchmarks, though, is that if you look at the nVidia scores, they gained nothing at all from DX12 and actually lost some performance in some cases. That makes no sense to me unless that particular game only supports DX12 features that the AMD cards have and the nVidia cards don't. My suspicion, however, is that the game or the drivers are not properly optimized for nVidia yet. It's still just a beta and it's hard to draw firm conclusions from it (yet).

Clearly the performance going backwards is a good indication that something isn't right. This developer does have close ties to AMD, but I 100% expect nvidia's performance to improve. Have they even released a true dx12 driver yet?

I wouldn't underestimate nvidia's software team; just look at what they accomplished in dx11. We don't even know what real games will be like or how closely this demo will resemble real dx12 game engines. It could be written specifically around GCN. Remember the Star Swarm demo? Nvidia has some talented teams that are capable of truly impressive things.

Regardless, I think most people can agree that nvidia has a lot of work to do in this demo. I don't know how urgent it may be, but I do believe they can surely improve these pathetic results.

Good point, agreed. Must've been something NV hasn't fully optimized for yet.
The GTX 980 Ti may lack the features required for this game in DX12. We just don't know at this point. The final build of that game is going to make a great benchmark though.
#32
(08-23-2015, 06:38 PM)gstanford Wrote:
Quote:Yes, GStan, all XBone has to do is make money to be a "success". In the business world, the only thing that matters is profit, not your fanboy fapping.

I bet M$ would love to see the XBOX division in general make a profit and the XBONE in particular.  Would make a nice change from it being a ravenous money pit for M$

As for fanboy fapping there is only one person here pushing shit uphill - you with the XBONE!

Do you have any proof that XBone and MS game related software (MS games, Xbone Live) lose money?
#33
(08-24-2015, 01:25 AM)RolloTheGreat Wrote:
(08-23-2015, 06:38 PM)gstanford Wrote:
Quote:Yes, GStan, all XBone has to do is make money to be a "success". In the business world, the only thing that matters is profit, not your fanboy fapping.

I bet M$ would love to see the XBOX division in general make a profit and the XBONE in particular.  Would make a nice change from it being a ravenous money pit for M$

As for fanboy fapping there is only one person here pushing shit uphill - you with the XBONE!

Do you have any proof that XBone and MS game related software (MS games, Xbone Live) lose money?

Microsoft hides that information so there is no way for us to know that for sure. I'm sure you already knew that though before asking the question.
#34
(08-24-2015, 03:24 AM)SickBeast Wrote: Microsoft hides that information so there is no way for us to know that for sure.  I'm sure you already knew that though before asking the question.

Oh. So Gstanford was talking out of his ass again, spouting his own theories with no basis in fact?

Business as usual for GStan I guess.
#35
(08-24-2015, 03:59 AM)RolloTheGreat Wrote:
(08-24-2015, 03:24 AM)SickBeast Wrote: Microsoft hides that information so there is no way for us to know that for sure.  I'm sure you already knew that though before asking the question.

Oh. So Gstanford was talking out of his ass again, spouting his own theories with no basis in fact?

Business as usual for GStan I guess.

But you knew this. You were baiting us by asking for proof. You knew all along.
#36
(08-24-2015, 04:55 AM)SickBeast Wrote:
(08-24-2015, 03:59 AM)RolloTheGreat Wrote:
(08-24-2015, 03:24 AM)SickBeast Wrote: Microsoft hides that information so there is no way for us to know that for sure.  I'm sure you already knew that though before asking the question.

Oh. So Gstanford was talking out of his ass again, spouting his own theories with no basis in fact?

Business as usual for GStan I guess.

But you knew this.  You were baiting us by asking for proof.  You knew all along.

No, I didn't. I knew they lump XBone and 360 sales (which have to be mostly XBones these days) but how/why would I know this?

That link I shared about revenues and margins seemed pretty forthcoming with info.

About all you could say is I was amused to find out GStan was talking out of his ass, as usual.
#37
Well perhaps Greg has some information that I could not find. What I read was that MS bundles the XB1 sales with their Surface tablets so it's impossible to know if the XB1 is causing MS to lose money.
#38
Yet gstanford is not only insisting on this claim but celebrating it.
#39
Not sure where the original info is on this but apparently there is a little more to this story.

https://forums.geforce.com/default/topic...nt=4641776

Quote:"
As you all know, NVIDIA released the 355.60 driver specifically for Ashes of the Singularity’s Alpha, which is in itself a rare occurrence for a game still in development. Even so, we registered mixed results in our DX12 performance benchmarks with NVIDIA cards and clearly the company noticed all of this on its own, as they reached out to the press in order to give their side to the story.
We were able to get a detailed statement from NVIDIA’s Brian Burke, Senior PR Manager. Here’s what he had to say on the matter:
This title is in an early Alpha stage according to the creator. It's hard to say what is going on with alpha software. It is still being finished and optimized. It still has bugs, such as the one that Oxide found where there is an issue on their side which negatively affects DX12 performance when MSAA is used. They are hoping to have a fix on their side shortly.
We think the game looks intriguing, but an alpha benchmark has limited usefulness. It will tell you how your system runs a series of preselected scenes from the alpha version of Ashes of Singularity. We do not believe it is a good indicator of overall DirectX 12 gaming performance.
We’ve worked closely with Microsoft for years on DirectX 12 and have powered every major DirectX 12 public demo they have shown. We have the upmost confidence in DX12, our DX12 drivers and our architecture’s ability to perform in DX12.
When accurate DX12 metrics arrive, the story will be the same as it was for DX11.
It should be noted that NVIDIA’s mention of a MSAA performance bug while running on DX12 has been contested by developer Oxide Games, which published a blog post of its own talking about some “misinformation” being spread on the Ashes of the Singularity benchmark. They also dispute the fact that this test is not useful, of course:
It should not be considered that because the game is not yet publicly out, it's not a legitimate test. While there are still optimizations to be had, Ashes of the Singularity in its pre-beta stage is as – or more – optimized as most released games. What's the point of optimizing code 6 months after a title is released, after all? Certainly, things will change a bit until release. But PC games with digital updates are always changing, we certainly won't hold back from making big changes post launch if we feel it makes the game better!
There’s also this cryptic but seemingly ominous tweet by Brad Wardell, CEO of Stardock, which is publishing Ashes of the Singularity.

It really only seems logical that nvidia will not go backwards in performance moving to dx12. Also, a random demo that is still in its alpha stage, from a developer with close ties to AMD and Mantle, shouldn't be used as the pinnacle of dx12. Not saying that AMD won't gain more when dx12 games launch, but nvidia having results this terrible is comical.
#40
(08-24-2015, 06:13 AM)ocre Wrote: Yet gstanford is not only insisting but celebrating this claim

Perhaps because GStan is an internet troll second only to Apoppin.

Apparently there's no way of knowing and we've certainly seen him crow about this more than once.

GStan "logic": "Microsoft is selling half as many XBones as Sony is selling PS4s. They MUST be losing money!"

Of course if MS makes one dollar off every Xbone sold, they've made $14,000,000.00.

http://www.extremetech.com/gaming/211302...ned-report

Not much in the scope of MS profits (they make more every 3 months than Sony has made in the last five or ten years), but profit is profit.

GStan- your number one source for random ranting and fanboy fapping. Just don't expect to see sources cited.

