AMD Fury no 980Ti killer
#81
Plus, everybody knows that the 4K age has dawned upon us, so the console makers need something much more powerful for decent 4K graphics without suffering any IQ losses compared to what is currently shown on the PS4. That makes Radeon and GeForce tech all the more justified for driving 4K graphics. 16nm makes Titan X-level performance feasible at just about 100-130 W (less than half the power of the 28nm Titan X), which would be roughly 5-6 times more powerful than the Xbox One's GPU, all other things being equal.

This would be sorely needed to push 4K, which has exactly 4 times the pixel count of the 1080p content barely being driven by the PS4, let alone the XBONE (mainly 720p/900p).
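
A quick back-of-the-envelope check on those ratios (a rough Python sketch; the TFLOPS figures are my own assumptions for the 28nm Titan X and the Xbox One GPU, not numbers from this post):

Code:
# Pixel-count ratio: 4K really is exactly 4x 1080p.
pixels_1080p = 1920 * 1080
pixels_4k = 3840 * 2160
print(pixels_4k / pixels_1080p)          # 4.0

# Rough FP32 throughput ratio (assumed figures, not from the post):
titan_x_tflops = 6.6                     # ~28nm Titan X at boost clocks (assumption)
xbox_one_tflops = 1.31                   # Xbox One GPU (assumption)
print(titan_x_tflops / xbox_one_tflops)  # ~5.0, in the 5-6x ballpark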

In this light, with AMD being so dirt cheap this year, perhaps their x86 CPU business does not even matter that much anyway as far as competing against Intel goes. Microsoft would only need AMD to produce decent Zen CPUs, probably on the level of Ivy Bridge, consuming about 50 W or so with 8 cores running at 3+ GHz. Then Sony would be forced to explore other alternatives or go the much more expensive route with Intel CPUs, which still wouldn't make their console win out if Nvidia decides to make its own serious 4K Android SHIELD console while still refusing to support other consoles.
#82
(07-29-2015, 09:40 AM)BoFox Wrote: It's all for the better - a mega-corporation needs to swoop in and snatch AMD up for pennies, even if AMD has over a billion dollar debt weighing down the carcass.  

It makes the most sense for Microsoft to buy AMD and start pouring billions upon billions of R&D dollars into hardware.  Sure, there could be legal workarounds so that AMD could still be considered an independent company "merging" with Microsoft, and still produce x86 CPUs/APUs under the old license, while really being completely owned by Microsoft after all.  

This would really threaten Sony with the PS5, especially if Nvidia sticks to their decision to no longer be a bitch for somebody else's consoles.  

I'm thinking that there has to be some secretive agreement or clause somewhere between Sony, AMD, and Microsoft, saying that neither Sony nor Microsoft is allowed to buy AMD out for as long as AMD is supplying chips to either party?  Maybe I'm not making any sense at all?  It's just that Microsoft would do anything to wipe Sony out of the console competition, and snatching AMD up now would be a no-brainer.  The price that Microsoft is paying for all of those APUs in their consoles is probably not too far from the total market value of AMD altogether, which is nearly at a historic bottom within the last 12 years or so.  

Must be anti-monopoly laws that M$ is afraid of, thus steering clear of the temptation to buy 51% of AMD's shares, at least?

MS can take over the gaming market more cheaply than buying AMD and pouring billions and billions into R&D: they could just lower the price of the XBone to $199.99. Sony can't follow suit because the PS4 is one of the only things they sell that makes money.

Or they could re-do the XBone with an Intel chip and an NVIDIA GPU. Think anyone on the planet buys an AMD APU over that?
#83
(07-29-2015, 04:09 AM)BoFox Wrote: Voltage adjustment experiments done on Fury X - abysmal results with scaling vs core clock increases:
http://www.techpowerup.com/reviews/AMD/R...ervoltage/

[Image: scaling.gif]

That's basically half the fps gain for each MHz gain, proportionally-wise!!!  It reminds me of the scaling of overclocking AMD's bullsh*coughs*dozer or pile-o'turd-driver CPU's!



Looks like even stock-clocked HBM1 bandwidth is already none too plentiful for Fury X cards; the default 512 GB/s looks poorly optimized, despite the chip having a much larger L2 cache than Hawaii.  

[Image: power.gif]

Warning:  W1zzard said that +40 mV would be the max for safe continuous operation, even with the water-cooling pump overclocked.  Anything more overheats the VRMs, even though they're already directly cooled by a nearby copper pipe connected to the water cooler.

OMG!!!!

BoFox, what do you mean by half an fps per each MHz?  It looks nowhere near that.  I would say a 50 MHz increase gave about half an fps; that would be 0.01 fps every MHz.
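
Spelling out that arithmetic (a tiny sketch; the 0.5 fps over 50 MHz is an eyeballed read of the scaling graph, not an exact figure from the review):

Code:
# Eyeballed read of the scaling graph: ~0.5 fps gained per 50 MHz of core clock.
fps_gain = 0.5
clock_gain_mhz = 50
print(fps_gain / clock_gain_mhz)   # 0.01 fps per MHz, as stated above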

but the OMG I first posted........
it is in reference to the power consumption.  It's really shooting up.  You need to see the article:

Quote:As you can see, power ramps up very quickly, much faster than maximum clock or performance. From stock to +144 mV, power draw increases by 27%, while overclocking potential only went up by 5% and real-life performance increases by only 3%.
 

That is what it says, but actually he rigged the test.  It's probably even worse.  

Quote:In this graph, I'm showing full-system power draw during testing. This test has clock speeds fixed at 1100 MHz for better comparison.

You see, they locked the speed at 1100 MHz!!!!
That means the power consumption results are not showing the card with a 5% overclock; that 27% figure is not based on the card actually running 5% faster.  He fixed the clock speed at 1100 MHz.  With +144 mV his GPU clocks at about 1220 MHz.  He talks about the extra 5% potential in his power consumption commentary but did not show the actual results of 1220 MHz @ +144 mV.  The results would have been worse.  Possibly way worse.......most likely, way way worse.

and they are already, really really bad

he says, "Looking at the numbers, I'm not sure if a 150W power draw increase for a mere 3 FPS increase is worth it for most gamers. Smaller voltage bumps to get a specific clock frequency 100% stable are alright, though." BUT------> the power consumption results are from his card at 1100mhz. There is no extra 3 fps, it is not accurate. At 1220mhz, his power consumption could would have been worse
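
To put a rough number on why the fixed-1100 MHz power test understates things (a sketch only, using the common frequency-times-voltage-squared approximation for dynamic power; the 1.20 V stock voltage is my assumption, not a figure from the review):

Code:
# Approximate dynamic GPU power as proportional to frequency * voltage^2.
stock_v, stock_mhz = 1.20, 1050        # assumed stock voltage, stock Fury X core clock
oc_v = stock_v + 0.144                 # the review's +144 mV offset
test_mhz, real_mhz = 1100, 1220        # fixed clock used for power testing vs. actual OC clock

rel_power_test = (test_mhz / stock_mhz) * (oc_v / stock_v) ** 2
rel_power_real = (real_mhz / stock_mhz) * (oc_v / stock_v) ** 2
print(f"dynamic power vs stock: {rel_power_test:.2f}x at 1100 MHz, {rel_power_real:.2f}x at 1220 MHz")
# Roughly 1.31x vs 1.46x: running at the clock that +144 mV actually enables
# would pull noticeably more power than the fixed-1100 MHz test showed.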
#84
(07-29-2015, 11:42 PM)ocre Wrote: BoFox, what do you mean by half an fps per each MHz?  It looks nowhere near that.  I would say a 50 MHz increase gave about half an fps; that would be 0.01 fps every MHz.
Oops, my bad - I meant to say half the performance, rather than fps.
#85
(07-29-2015, 11:42 PM)ocre Wrote: That is what it says, but actually he rigged the test.  It's probably even worse.  

Quote:In this graph, I'm showing full-system power draw during testing. This test has clock speeds fixed at 1100 MHz for better comparison.

You see, they locked the speed at 1100 MHz!!!!
That means the power consumption results are not showing the card with a 5% overclock; that 27% figure is not based on the card actually running 5% faster.  He fixed the clock speed at 1100 MHz.  With +144 mV his GPU clocks at about 1220 MHz.  He talks about the extra 5% potential in his power consumption commentary but did not show the actual results of 1220 MHz @ +144 mV.  The results would have been worse.  Possibly way worse.......most likely, way way worse.

and they are already, really really bad

He says, "Looking at the numbers, I'm not sure if a 150W power draw increase for a mere 3 FPS increase is worth it for most gamers. Smaller voltage bumps to get a specific clock frequency 100% stable are alright, though."  BUT------> the power consumption results are from his card at 1100 MHz.  There is no extra 3 fps there, so it is not accurate.  At 1220 MHz, his power consumption would have been even worse.

Hahahaha, good observation! 

What doesn't make sense at all is that the Fury non-X is $120 more than the R9 390X, for only 10% more performance, but has HALF the VRAM. 

Never seen anything like this before.
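
Putting rough numbers on that (a quick sketch; the $429 and $549 launch prices are my own recollection, the posts only state the $120 gap and the ~10% performance difference):

Code:
price_390x, price_fury = 429, 549      # assumed launch prices (the stated $120 gap)
perf_390x, perf_fury = 1.00, 1.10      # 390X as baseline, Fury roughly 10% faster
print(f"$ per unit of performance: 390X {price_390x / perf_390x:.0f}, Fury {price_fury / perf_fury:.0f}")
# ~429 vs ~499, so roughly 16% more per frame for the card
# with half the VRAM (4 GB HBM vs. 8 GB GDDR5).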

