[EDIT]hmmm.....whats going on at AMD? (and history of 3Dfx Rampage)
#41
Everyone has always speculated about Rampage. It's hard to believe that it was so far along in its development but was never released. You would think that the bankruptcy people would have wanted it to come out. I guess it just wasn't far enough along.

Was the GeForce 256 the card that Rampage would have competed against? If so, then Rampage would have been close to 4 times as powerful if the specs are accurate.

http://forums.guru3d.com/showthread.php?t=171176

http://www.gpureview.com/GeForce256-DDR-card-114.html

Look at the fill rate and memory bandwidth.
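If you want to sanity-check those spec-sheet numbers yourself, the arithmetic is simple. A quick sketch (the GeForce 256 DDR figures below are the widely published ones; any Rampage numbers you plug in come from forum speculation, so treat them as assumptions):

```python
# Back-of-the-envelope GPU spec math.
def fill_rate_mpix(core_mhz, pixel_pipes):
    """Theoretical pixel fill rate in Mpixels/s."""
    return core_mhz * pixel_pipes

def mem_bandwidth_gbs(mem_mhz, bus_bits, ddr=True):
    """Theoretical memory bandwidth in GB/s (double-pumped if DDR)."""
    effective_mhz = mem_mhz * (2 if ddr else 1)
    return effective_mhz * (bus_bits / 8) / 1000

# GeForce 256 DDR: 120 MHz core, 4 pixel pipes, 150 MHz DDR on a 128-bit bus
print(fill_rate_mpix(120, 4))       # 480 Mpixels/s
print(mem_bandwidth_gbs(150, 128))  # 4.8 GB/s
```

Plug in whichever rumored Rampage clock and pipe count you believe and compare the ratios for yourself.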
Reply
#42
Can you imagine being an AMD salesman, calling on HP, Dell, Gateway, etc. and trotting out the FXs and the APUs?

"Ummm...SORRY Joe.....I can't meet with you this month, probably next either...they're going to be cleaning the carpets around here in a few months and things are pretty hectic...we'll get together next year and you can show me those Bulldozers again...."
Reply
#43
(10-11-2015, 10:36 AM)SickBeast Wrote: Everyone has always speculated about Rampage.  It's hard to believe that it was so far along in its development but it never was released.  You would think that the bankruptcy people would have wanted it to come out.  I guess it just wasn't far enough along.

Was the GeForce 256 the card that Rampage would have competed against?  If so, then Rampage would have been close to 4 times as powerful if the specs are accurate.

http://forums.guru3d.com/showthread.php?t=171176

http://www.gpureview.com/GeForce256-DDR-card-114.html

Look at the fill rate and memory bandwidth.



Nope, not a chance. Rampage was the architecture that would have followed the VSA-100, which the Voodoo5 5500 was based on. The VSA-100 itself was based on the Voodoo3 chip. The GeForce 256 would have been old by the time Rampage could ever have launched; that was the Voodoo3 era.

3dfx decided to try to combat Nvidia by combining chips on a single board. That's how the VSA-100 came to be. They launched the Voodoo5 5500, but by then Nvidia had the GeForce2.

3dfx never got the planned VSA-100 lineup launched before they went under. As a matter of fact, they only released a fraction of it. Two cards, I believe: the first was the Voodoo5 5500, and a few months later came a cheaper single-chip budget VSA-100 card. This was Oct-Nov 2000, and 3dfx went under not too long after that.

So that's where they were: still in the middle of rolling out their VSA-100 lineup. Rampage was in its very, very early stages. Of course 3dfx was working on a new architecture; as a chip maker you always must be. Chips take years to bring into production, years.
So I don't doubt at all that there were engineers working on a new design. It would be very strange if they were not. My issue is with the statements about it.

The paper specs are one thing, but how it actually would have performed is another. Then the reality: there is no way it would have launched in 2000. Even in 2001, if it was feasible at all, it would have had to be the late-2001 to 2002 era. Best case, it would have been met with the GeForce3, but think about this for a minute: by February 2002 the GeForce4 had launched.

The Rampage paper specs for a single chip aren't all that impressive compared to a GeForce2 Ultra, which was already out by the time the engineers got the earliest Rampage silicon back for testing. The Rampage in the blog BoFox posted was the 2nd revision, which barely got any testing done on it; the first revision was a nightmare. See, there could have been any number of issues: it was fresh silicon of a brand-new architecture, about as fresh as could be.
There is just a lot of fantasy in this story; most of it is.
They had a 2nd-revision Rampage that booted up; that's literally all there is to it. The rest is invented.

The final evidence that should seal this case shut: Nvidia acquired 3dfx. If Rampage was so far advanced, this untouchable architecture that was years ahead of everything else, would Nvidia not have released this already-finished chip themselves?

The reality is, Nvidia was surely working on the GeForce3, if not the GeForce4, in the same time period as Rampage. They probably had very early-stage silicon in their labs as well.
Reply
#44
Well the truth is we will never know.

I owned a couple of 3DFX cards personally. I had a Voodoo 2 and a Voodoo 3. Both were great cards. I particularly enjoyed the Voodoo 3, it had really great performance and I liked that it had the 2D card integrated. Both cards lasted me a really long time and the Voodoo 3 was quite affordable.
Reply
#45
I remember the hype back then; really, really exciting times. 3dfx and their Voodoo line really made a splash in PC history. The impact was huge and the significance can't be overstated.

I acquired a PC with a 3D Voodoo card, but it was a few years late. I picked it up in a trade; I just wanted to mess with it. By the time I had a Voodoo, 3dfx was already bankrupt and Nvidia was well into the GeForce4.

That PC was crazy when I think back. You had a graphics card that had to be plugged into the 3D card, which then went to the monitor. It was pretty worthless by the time I got it; things changed so fast back then. Games were simply unplayable unless you upgraded. If your CPU was 3 years old it was a dinosaur, a joke. That was ages ago.
Reply
#46
Of course, they were just overclocking the memory to test its headroom.  250MHz was the max stable clock, not the expected spec (which was 180-200MHz DDR, or around 400MHz effective).  It didn't mean that the card would be faster than a GeForce2 Ultra.

I do agree with you though, that they didn't test the Sage T&L unit.  

The VSA-100 was much bigger than the Voodoo3 chip because it implemented several new features, like the 32-bit color that everybody had been demanding just a year earlier, giving 3dfx all the flak for lacking it and Nvidia all of the attention.  The Voodoo3 was the powerhouse 2D/3D integration that brought the $500 Voodoo2 SLI plus 2D acceleration into one card costing $200 or $300 (I think it was $200; I bought it on launch day at CompUSA, April 1, 1999, I think, but I can't even remember the price).  That was in response to the Nvidia TNT.  But customers were still not satisfied, since the TNT2 Ultra came so damn close in many games and also offered 32-bit color compatibility plus 2048x2048 texture capability rather than just 256x256.  Then stubborn 3dfx had to tweak their chip for a multi-chip architecture, and doing 2 chips on a single board was a first for them (remember how ATI miserably failed with their Rage MAXX, which was supposed to rock the world?).  The MAXX was actually launched and marketed, even though it was pure bull, lulz........   the embarrassing shame carries on in ATI/AMD's blood as a recurring disease.

It was largely a "buy-me-out", with most of the employees at 3dfx receiving an attractive portion of the lump sum.  IMHO, they could've kept up with the GeForce3 with the dual-chip+T&L card, if it had worked out successfully, by the end of 2001 (beating the Radeon 8500 as well), but the GeForce4 would've come out a couple of months later, been just as fast, and been cheaper to produce.  It would've only made Nvidia more aggressive with the GeForce3 Ti500 (using faster DDR memory and clocking it a bit higher) and the GeForce4 Ti4600 (with higher clocks) as well.  If 3dfx was indeed going to tape out the next gen after Rampage, called Fear, by April 2001,
Quote:Fear- The first part based on 3dfx and Gigapixel technology. Fear actually consisted of two separate parts: Fusion and Sage II. Fusion was derived from combining 3dfx and Gigapixel technology. This was a part targeted at DirectX8-9 (though the specification was nothing near final). Being from Gigapixel, it was a deferred rendering architecture. At the time of 3dfx closing shop, Fusion was considered RTL complete and tape out was expected in March of 2001. Sage II was slightly behind Fusion, but it was making ground.
it would certainly have bashed the GeForce4 to bits, but with the 9700 Pro coming out in mid-2002, there was just no way it could've kept up.  So 3dfx was doomed anyway.  Oh well.  If ATI had bought 3dfx instead, I think Nvidia would've swapped places, being 2nd to ATI ever since.

Nice history debate, hehe.. 

Most mind-blowing launches ever:
1)  8800GTX  (an asteroid that brought DX10 to earth)
2)  9700 Pro  (a comet that brought DX9 to earth, plus up to 6x RGMSAA and 16x AF that enhanced older games)
3)  Geforce 2 GTS (the proverbial reason 3dfx gave up with voodoo5 6000 and went out of business)
4)  Voodoo2 SLI (actually 2 cards, but like 3-3.5x faster than Voodoo1, while so many games other than just Quake were finally 3d-accelerated)
5) 6800 Ultra SLI (although SLI introduced lag and microstuttering (or somewhat, like 30% "artificial" frame rate doubling) in AFR mode in the few games supported at first, unless SFR offered decent scaling not less than 50%, if it was compatible with a game)
6) HD 5970 (although Crossfire was still highly non-synchronized microstuttering mess, and having 2 of these for Quadfire totally sucked)
7) 7800GTX-512 (although extremely short-lived with its crown before X1900XTX came out - it was amazing how much further Nvidia pushed G70 when ATI had nothing since X850XT, and the X1800XT was an utter disappointment. Also mindblowing was that it was the first $750+ card)
8) What do you think? Titan X with 12GB, still on 28nm, or what?
9)  and what do you think it is?  yep, me and you nerds here.
Reply
#47
(10-13-2015, 06:54 AM)ocre Wrote: i remember the hype back then, really really exciting times. 3dfx and their voodoo line really made a splash in the history of PC. The impact was huge and the significance can't be understated.

I had acquired a PC with a 3d voodoo card, but it was a few yrs late. I picked it up in a trade, I just wanted to mess with it. By the time I had a voodoo, 3dfx was already bankrupt. Nvidia was well into the GeForce 4.

That PC was crazy when I think back. You had a graphics card that would have to be plugged into the 3d card which then went to the monitor. It was pretty worthless by the time I got it, things changed so fast back then. Games were simply unplayable unless you upgraded. If your CPU was 3yrs old it was a dinosaur.....a joke. That was ages ago

Was quite a time.

Now both CPUs and GPUs are so powerful.
Reply
#48
^^ I'm just glad I don't have to ditch the entire rig every 3 years, and that GPU's are the only thing that really need ditching every 2-3 years, for the gaming eye-candy lust that we enthusiasts have.

Now, it seems to be all about the small and mobile market, cutting down the power of CPU's to 10W within the next 10 years, with maybe 5% increase in IPC, lol. Then Intel is in trouble and becomes a fabbing company like TSMC or GloFo.

BTW, edited the title of the thread.
Reply
#49
(10-13-2015, 12:18 PM)BoFox Wrote: ^^ I'm just glad I don't have to ditch the entire rig every 3 years, and that GPU's are the only thing that really need ditching every 2-3 years, for the gaming eye-candy lust that we enthusiasts have.

Now, it seems to be all about the small and mobile market, cutting down the power of CPU's to 10W within the next 10 years, with maybe 5% increase in IPC, lol. Then Intel is in trouble and becomes a fabbing company like TSMC or GloFo.

Exactly.

It is a case of the Technology actually hurting itself by getting too powerful.

The same thing is now happening to smartphone sales.

I read something about a 24 core chip now.
Reply
#50
(10-13-2015, 11:12 AM)BoFox Wrote: Of course, they were just overclocking the memory to test the headroom of the memory.  250MHz was the max stable clock, not the expected spec (which was 180-200Mhz DDR, or around 400MHz effective).  It didn't mean that the card would be faster than Geforce2 Ultra.  

I do agree with you though, that they didn't test the Sage T&L unit.  

The VSA-100 was much bigger than the Voodoo3 chip, because it implemented several new features, like 32-bit color that everybody was demanding just a year earlier, and giving 3dfx all the flak for it, giving Nvidia all of the attention.  Voodoo3 was the powerhouse 2d/3d integration that brought the $500 Voodoo2 SLI plus 2D acceleration into one card costing $200 or $300 (I think, it was $200, even though I bought it on launch day at CompUSA, April 1 1999, I think, but I can't even remember the price).  That was in response to Nvidia TNT.  But then customers were still not satisfied, since TNT 2 Ultra came so damn close in many games, and also offered 32-bit color compability plus 2048x2048 texture capability rather than just 256x256.  Then stubborn 3dfx had to tweak their chip for multi-chip architecture, and doing 2 chips on a single board was the first for them (remember how ATI miserably failed with their Rage MAXX which was supposed to rock the world)?  The MAXX was actually launched and marketed, even though it was pure bull, lulz........   the embarrassing shame carries on in ATI/AMD's blood as a recurring disease.  

It was largely a "buy-me-out", with most of the employees at 3dfx receiving an attractive portion of the lump sum.  IMHO, they could've kept up with Geforce3 with the dual-chip+T&L card if it did work out successfully, by the end of 2001 (beating Radeon 8500 as well), but Geforce4 would've come out a couple months later, and been just as fast, and cheaper to produce as well.  It would've only made Nvidia more aggressive with Geforce3 Ti500 (using faster DDR memory, and clocking it a bit higher), and Geforce4 Ti4600 (with higher clocks) as well.  If 3dfx was indeed going to tape out the next gen after Rampage, called Fear, by April 2001,
Quote:Fear- The first part based on 3dfx and Gigapixel technology. Fear actually consisted of two separate parts: Fusion and Sage II. Fusion was derived from combining 3dfx and Gigapixel technology. This was a part targeted at DirectX8-9 (though the specification was nothing near final). Being from Gigapixel, it was a deferred rendering architecture. At the time of 3dfx closing shop, Fusion was considered RTL complete and tape out was expected in March of 2001. Sage II was slightly behind Fusion, but it was making ground.
It would certainly have bashed Geforce4 to bits, but with the 9700Pro coming out in mid 2002, there was just no way it could've kept up.  So 3dfx was doomed anyway.  Oh well.  If ATI bought 3dfx instead, I think Nvidia would've swapped places, being 2nd to ATI ever since. 

Nice history debate, hehe.. 

Most mind-blowing launches ever:
1)  8800GTX  (an asteroid that brought DX10 to earth)
2)  9700 Pro  (a comet that brought DX9 to earth, plus up to 6x RGMSAA and 16x AF that enhanced older games)
3)  Geforce 2 GTS (the proverbial reason 3dfx gave up with voodoo5 6000 and went out of business)
4)  Voodoo2 SLI (actually 2 cards, but like 3-3.5x faster than Voodoo1, while so many games other than just Quake were finally 3d-accelerated)
5)  6800 Ultra SLI (although SLI introduced lag and microstuttering (or somewhat, like 30% "artificial" frame rate doubling) in AFR mode in the few games supported at first, unless SFR offered decent scaling not less than 50%, if it was compatible with a game)
6)  HD 5970 (although Crossfire was still highly non-synchronized microstuttering mess, and having 2 of these for Quadfire totally sucked)
7)  7800GTX-512 (although extremely short-lived with its crown before X1900XTX came out - it was amazing how much further Nvidia pushed G70 when ATI had nothing since X850XT, and the X1800XT was an utter disappointment.  Also mindblowing was that it was the first $750+ card)
8)  What do you think?  Titan X with 12GB, still on 28nm, or what?  
9)  and what do you think it is?  yep, me and you nerds here.

Geez..
That is a great list, but I am unsure about the Titan X. You know, it was decent and a very respectable achievement. But imagine if Maxwell had debuted on 20nm like originally intended. Even if it is not Nvidia's fault that they had to scrap the 20nm plans, the actual performance jump from the Titan Black to the Titan X is not what it could have been.

I imagine things will get slower and slower; that's the reality of PC these days. But we could finally be on the verge of another landmark: Pascal.

We all want to see another 8800GTX; boy, would that be awesome.
Kind of makes me giddy inside, sort of like thinking about Fallout 4.
Reply
#51
(10-14-2015, 09:05 AM)ocre Wrote:
(10-13-2015, 11:12 AM)BoFox Wrote: Of course, they were just overclocking the memory to test the headroom of the memory.  250MHz was the max stable clock, not the expected spec (which was 180-200Mhz DDR, or around 400MHz effective).  It didn't mean that the card would be faster than Geforce2 Ultra.  

I do agree with you though, that they didn't test the Sage T&L unit.  

The VSA-100 was much bigger than the Voodoo3 chip, because it implemented several new features, like 32-bit color that everybody was demanding just a year earlier, and giving 3dfx all the flak for it, giving Nvidia all of the attention.  Voodoo3 was the powerhouse 2d/3d integration that brought the $500 Voodoo2 SLI plus 2D acceleration into one card costing $200 or $300 (I think, it was $200, even though I bought it on launch day at CompUSA, April 1 1999, I think, but I can't even remember the price).  That was in response to Nvidia TNT.  But then customers were still not satisfied, since TNT 2 Ultra came so damn close in many games, and also offered 32-bit color compability plus 2048x2048 texture capability rather than just 256x256.  Then stubborn 3dfx had to tweak their chip for multi-chip architecture, and doing 2 chips on a single board was the first for them (remember how ATI miserably failed with their Rage MAXX which was supposed to rock the world)?  The MAXX was actually launched and marketed, even though it was pure bull, lulz........   the embarrassing shame carries on in ATI/AMD's blood as a recurring disease.  

It was largely a "buy-me-out", with most of the employees at 3dfx receiving an attractive portion of the lump sum.  IMHO, they could've kept up with Geforce3 with the dual-chip+T&L card if it did work out successfully, by the end of 2001 (beating Radeon 8500 as well), but Geforce4 would've come out a couple months later, and been just as fast, and cheaper to produce as well.  It would've only made Nvidia more aggressive with Geforce3 Ti500 (using faster DDR memory, and clocking it a bit higher), and Geforce4 Ti4600 (with higher clocks) as well.  If 3dfx was indeed going to tape out the next gen after Rampage, called Fear, by April 2001,
Quote:Fear- The first part based on 3dfx and Gigapixel technology. Fear actually consisted of two separate parts: Fusion and Sage II. Fusion was derived from combining 3dfx and Gigapixel technology. This was a part targeted at DirectX8-9 (though the specification was nothing near final). Being from Gigapixel, it was a deferred rendering architecture. At the time of 3dfx closing shop, Fusion was considered RTL complete and tape out was expected in March of 2001. Sage II was slightly behind Fusion, but it was making ground.
It would certainly have bashed Geforce4 to bits, but with the 9700Pro coming out in mid 2002, there was just no way it could've kept up.  So 3dfx was doomed anyway.  Oh well.  If ATI bought 3dfx instead, I think Nvidia would've swapped places, being 2nd to ATI ever since. 

Nice history debate, hehe.. 

Most mind-blowing launches ever:
1)  8800GTX  (an asteroid that brought DX10 to earth)
2)  9700 Pro  (a comet that brought DX9 to earth, plus up to 6x RGMSAA and 16x AF that enhanced older games)
3)  Geforce 2 GTS (the proverbial reason 3dfx gave up with voodoo5 6000 and went out of business)
4)  Voodoo2 SLI (actually 2 cards, but like 3-3.5x faster than Voodoo1, while so many games other than just Quake were finally 3d-accelerated)
5)  6800 Ultra SLI (although SLI introduced lag and microstuttering (or somewhat, like 30% "artificial" frame rate doubling) in AFR mode in the few games supported at first, unless SFR offered decent scaling not less than 50%, if it was compatible with a game)
6)  HD 5970 (although Crossfire was still highly non-synchronized microstuttering mess, and having 2 of these for Quadfire totally sucked)
7)  7800GTX-512 (although extremely short-lived with its crown before X1900XTX came out - it was amazing how much further Nvidia pushed G70 when ATI had nothing since X850XT, and the X1800XT was an utter disappointment.  Also mindblowing was that it was the first $750+ card)
8)  What do you think?  Titan X with 12GB, still on 28nm, or what?  
9)  and what do you think it is?  yep, me and you nerds here.

Geez..
That is a great list, but I am unsure about the titanX. You know, it was decent and a very respectable achievement. But imagine.........imagine if maxwell debuted as 20nm like originally intended.  Even if it is not nvidia's fault the had to scratch 20nm plans, the actual performance jump from the Titan black to Titan x is not what it could have been.  

I imagine things will get slower and slower, that's the reality of PC these days.  But we could finally be on the verge of another landmark.  Pascal

We all want to see another 8800gtx, boy would that be awesome......
Kind of makes me giddy inside........sort of like thinking about fallout 4

Well, I was just blown away by the Titan X's ability to deliver like 50% more than the Titan Black, using only 8 billion transistors vs 7 billion, with hardly any more power consumption (largely due to the 12GB of GDDR5), on the same node process. This was on top of how impressive Kepler was over Fermi.

BTW, not sure which was more mind-blowing:
Radeon 9700 Pro, or
8800GTX...

The 9700 Pro was pretty much right up there in the high-end class for 2 years, until next gen came out.
The 8800GTX was up there for only about 1 1/2 years plus a month (19 months or so) before GTX 280 and HD 4870 came out.

Plus the 9700 Pro brought more significant features like efficient RGMSAA (up to 6x) and 16x AF - such revolutionary features that became a must for enthusiast PC gaming ever since.

One drawback was that the 9700 Pro had closer competition from the GeForce FX series than the 8800GTX did; the latter was largely untouched by the HD 2900XT. In both cases, the competition was at least 6 months late anyway.

I should just give the 9700Pro the #1 title anyway. The 8800GTX only brought very slightly better AF quality, up to 16xQ CSAA mode, but pretty much nerfed 3DVision support that became proprietary for a while with Vista, and remained at $500 for over 1 year, while the 9700Pro was $400 and dropped to under $250 a year later. 

ATI wins!  Big Grin
Reply
#52
AMD suffers losses yet again, writes down $65 million on older APUs: http://techreport.com/news/29198/amd-rep...write-down
Valve hater, Nintendo hater, Microsoft defender, AMD hater, Google Fiber hater, 4K lover, net neutrality lover.
Reply
#53
-$200 million

WHAT THE ______!!!!!!!

I can hear the advocates now: "AMD is doing fine", "they can have 37 consecutive quarters that bad and still not go under", "you guys don't understand business! See, their revenue is 1 billion so they can keep posting massive losses.......they can go all the way down to -1 billion a quarter before they go under, duh, that is how business works". Lolololol, these people really do say this stuff.

OMG!!!!!
It's bad.

AMD posted $0.2 billion in losses this quarter. This is a company that is only worth about a billion, on track to post a billion dollars in losses for 2015.

Look guys, this is bad, bad, bad.

Those guys are leaving AMD because AMD is going belly up. I bet you anything the upper management is talking about an exit strategy. If you haven't got used to the idea yet, it is time to face it............AMD is going under.

So much for the consoles saving them.
Reply
#54
(10-17-2015, 05:12 AM)ocre Wrote: look guys, this is bad bad bad.

Those guys are leaving AMD because AMD is going belly up.  I bet you anything the upper management is talking about an exit strategy.  If you havent got used to the idea yet, it is time to face it............AMD is going under

so much for the consoles saving them


Let's see:

Got me banned from most forums on the net, for the "crime" of pointing out deficiencies in their products? Check.

Destroyed the forum world that I used to love? Check.

Lied to the public about upcoming products to stop people from buying competitors' products? Check.

Sold FX-9590 for $1000 when it was slower than a $200 2500K? Check.

Launched the Fury Nano for $650 even though it's far slower than a 980Ti? Check.

Launched a CPU line that was slower than last gen and priced it higher? Check.

Are you sure this is "bad"?
Reply
#55
The people I feel bad for are the guys at AMD betrayed by their management and marketing. If you're a software or chip engineer there, you have honorably done the second-best work in the world.

That's not shabby, especially given the budget they did it on.

All thrown away by mismanagement, and by their decision to unleash the huge viral program on the world.
Reply
#56
Most billion-dollar companies (like Monsanto, Intel, etc..) seem to have it real, real easy when it comes to the overall bottom line. AMD certainly had it hard - the road wasn't smooth for AMD.

The management tried to act like they were Intel during the 65nm fab roll-out.

1) Realizing that their fabbed 90nm Athlon 64s and Opteron X2s couldn't meet demand quickly enough, while gaining so much reputation over Intel's Pentium 4, AMD in their greatest intelligence decided to cut the L2 cache for all 65nm Athlons by HALF, while implementing basically ZERO IPC improvements, just so that more of these chips could be fabbed more cheaply. At the same time, they had the audacity to completely ditch the Socket 939 platform that AMD's immense growth was built on (forcing people to upgrade motherboards as well if they wanted to upgrade). Guess what: Intel came out with the Core architecture, so AMD was fucked (as if AMD couldn't have seen the hints from Intel's mobile Core processors, which had been emerging for quite a while). Even if AMD hadn't spent all of their $$ on ATI, there was no way AMD could've ever recovered from this.

2), 3), 4), etc... all stem from 1) above, perhaps the dumbest move of any multi-billion-dollar company in the decade of the 2000s.
Reply
#57
AMD is pairing up with Fujitsu: http://www.eteknix.com/fujitsu-purchase-...perations/
Quote:AMD have just announced that they are now on a joint venture with electronics giant, Fujitsu. The two companies will be pairing together to provide factories in Penang (Malaysia) and Suzhou (China). Fujitsu will be getting a whopping 85 percent of the joint venture, which is suspected to heavily reduce AMD’s capital expenditure. Approximately 1700 workers at the two factories will become employees of the new venture the companies stated. For AMD, this is all for $371 Million in cash and retain a 15% stake in the new entity.
Reply
#58
That is interesting. I assume they bought 85% of those two factories, not 85% of AMD?

Didn't know AMD had anything left to sell.
Reply
#59
http://www.overclock3d.net/articles/cpu_...ks_found/1
Zen's getting closer.
Reply
#60
I'm really hoping for Zen.

Ocre, do you really think Zen will deliver +40% over Excavator's IPC (that is, if AMD still clocks Zen at, say, 5GHz like they did with their watercooled Piledriver)?
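Back-of-the-envelope, IPC alone doesn't decide it; single-thread performance scales roughly as IPC times clock. A toy sketch (the +40% uplift is AMD's claimed figure; the clock speeds below are my illustrative guesses, not leaks):

```python
# Crude single-thread performance model: perf ~ IPC x clock.
# The +40% IPC uplift is AMD's claimed Zen-over-Excavator number;
# both clock speeds are illustrative assumptions, not real specs.
def relative_perf(ipc, clock_ghz):
    return ipc * clock_ghz

excavator = relative_perf(1.00, 4.7)  # baseline IPC, high-clocked part
zen       = relative_perf(1.40, 3.6)  # +40% IPC at an assumed lower clock

print(zen / excavator)  # only ~1.07x: lower clocks can eat most of the IPC gain
```

In other words, whether Zen impresses depends as much on where the clocks land as on the IPC claim.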
Reply
#61
Bulldozer was bragged about all the way up until it launched, so I will reserve my thoughts until we see Zen in action.
Reply
#62
http://techreport.com/news/29287/glofo-s...pp-process

Oh, and there's a new lawsuit against AMD: http://techreport.com/news/29289/lawsuit...core-count
Reply
#63
Apparently the lawsuit against AMD is garbage: http://www.extremetech.com/extreme/21767...hout-merit
Reply
#64
Another interesting article from VR-Zone by the Theo Valich (lol):

http://vrworld.com/2015/11/19/14nm-amd-g...-xeon-phi/
Reply
#65
AMD ends driver support for legacy GPUs: http://www.tomshardware.com/news/amd-ret...30643.html
Reply
#66
AMD preparing new dual GPU card: http://videocardz.com/57896/amd-radeon-r...n-december
Reply
#67
Ugh!!! Why couldn't AMD have done this 2-3 months ago, at least? Trying to cash in on the Xmas season makes sense, but it's a bit late even for Xmas.

I think at this point most of the serious enthusiasts willing to spend that kind of $ are waiting for the next gen.
Reply
#68
this is pretty funny (from Steel's link)

http://twitter.com/repi/status/669829650...04/photo/1
Reply
#69
Bad news for the Fury X: http://www.gamersnexus.net/industry/2217...orce-sales
Reply
#70
AMD responded

Quote:“We are aware that Asetek has sued Cooler Master. While we defer to Cooler Master regarding the details of the litigation, we understand that the jury in that case did not find that the Cooler Master heat sink currently used with the Radeon Fury X infringed any of Asetek’s patents.”

http://www.gamersnexus.net/industry/2220...-on-fury-x
Reply
#71
Fury X, so what? Watercooling does not even help with the overclocking at all (compared to the Arctic Accelero cooler on the non-X).. such a non-issue after all.
Reply
#72
BTW, wonder how much AMD had to bribe that judge to ignore Asetek's patents... Big Grin
Reply
#73
http://www.techpowerup.com/218223/amd-ac...ients.html
One bit of good news for AMD.
Reply
#74
I wonder how being able to OC Intel's non K Skylake CPU's is going to affect Zen?
Reply
#75
(12-14-2015, 12:32 AM)Mousemonkey Wrote: I wonder how being able to OC Intel's non K Skylake CPU's is going to affect Zen?

I think the larger question is:

Now that the world has pretty much given up on AMD processors, would a Zen chip that is only equal or marginally better than intel lure enough people back?

So many years of fail might have AMD out of the game before they even start playing.

BTW- welcome Mousemonkey, good to see you here!
Reply
#76
(12-14-2015, 01:47 AM)RolloTheGreat Wrote:
(12-14-2015, 12:32 AM)Mousemonkey Wrote: I wonder how being able to OC Intel's non K Skylake CPU's is going to affect Zen?

I think the larger question is:

Now that the world has pretty much given up on AMD processors, would a Zen chip that is only equal or marginally better than intel lure enough people back?

So many years of fail might have AMD out of the game before they even start playing.

BTW- welcome Mousemonkey, good to see you here!

Thanks mate, and it's good to be back; I had to re-register, but that's understandable all things considered. As for Zen, the future looks a bit on the bleak side if you ask me: there are a couple of benchies floating around that show the i3-6100 keeping up with, and sometimes getting ahead of, the FX-8320E.

http://www.techspot.com/review/1108-inte...rclocking/

Of course Intel could shit a brick and put a stop to this as it may impact the sale of their K-series CPUs, but then why would they? And how? It's the motherboard manufacturers that have come up with this little humdinger after all, completely independent of any Intel influence.
Reply
#77
(12-14-2015, 02:10 AM)Mousemonkey Wrote: Thanks mate and it's good to be back, I had to re-register but that's understandable all things considered. As for Zen the future looks a bit on the bleak side if you ask me, there are a couple of benchies floating round that show the i3 6100 CPU keeping up with and sometimes getting ahead of the FX 8320E.

http://www.techspot.com/review/1108-inte...rclocking/

Of course Intel could shit a brick and put a stop to this as it may impact the sale of their K series CPU's but then why would they? And how? It's the motherboard manufactures that have come up with this little humdinger after all, completely independent of any Intel influence.

Interesting!  I wonder if BCLK overclocking is as stable as it used to be with the older CPUs from several years ago (i.e., as stable as changing the multiplier, at a pretty high BCLK like 140MHz or so for the same end result)?
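For anyone who missed the old BCLK days: the core speed is just the base clock times the multiplier, and on non-K Skylake the multiplier is locked, so the boards raise BCLK instead. A trivial sketch (the raised BCLK value is illustrative, not from any specific board):

```python
# Non-K Skylake overclocking raises the base clock (BCLK) because the
# multiplier is locked: core speed = BCLK x multiplier.
# The 127 MHz BCLK below is an illustrative example, not a measured value.
def core_mhz(bclk_mhz, multiplier):
    return bclk_mhz * multiplier

print(core_mhz(100, 37))  # i3-6100 at stock: 100 MHz BCLK x 37 = 3700 MHz
print(core_mhz(127, 37))  # same locked multiplier, raised BCLK = 4699 MHz
```

The catch is that BCLK also feeds other buses on older platforms, which is exactly why stability at high BCLK is the open question.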

The FX-8320E just sucks real badly.  32nm is ancient now.  Look at the 0.1% frame time results in games (stuttering).  Overclocking also makes it eat electricity like mad and run hot.

I'm hoping that AMD can stun the world like ATI did with their 9700 Pro.
Reply
#78
(12-14-2015, 02:10 AM)Mousemonkey Wrote:
(12-14-2015, 01:47 AM)RolloTheGreat Wrote:
(12-14-2015, 12:32 AM)Mousemonkey Wrote: I wonder how being able to OC Intel's non K Skylake CPU's is going to affect Zen?

I think the larger question is:

Now that the world has pretty much given up on AMD processors, would a Zen chip that is only equal or marginally better than intel lure enough people back?

So many years of fail might have AMD out of the game before they even start playing.

BTW- welcome Mousemonkey, good to see you here!

Thanks mate and it's good to be back, I had to re-register but that's understandable all things considered. As for Zen the future looks a bit on the bleak side if you ask me, there are a couple of benchies floating round that show the i3 6100 CPU keeping up with and sometimes getting ahead of the FX 8320E.

http://www.techspot.com/review/1108-inte...rclocking/

Of course Intel could shit a brick and put a stop to this as it may impact the sale of their K series CPU's but then why would they? And how? It's the motherboard manufactures that have come up with this little humdinger after all, completely independent of any Intel influence.
Wow!!

I never thought we would see this day!

Has the BIOS been released to the public yet? I haven't been keeping up; crazy busy these past few months.
Reply
#79
Meanwhile, http://www.tomshardware.com/news/amd-zen...30751.html
Reply
#80
(12-09-2015, 10:51 AM)BoFox Wrote: Fury X, so what?  Watercooling does not even help with the overclocking at all (compared to the Arctic Accelero cooler on the non-X)..    such a non-issue after all.

What are you talking about BoFox?

FuryX is an "overclocker's dream"!

Just remember they didn't say it was an overclocker's good dream and you'll be OK......
Reply

