Fermi GF-100 NDA Ends Tomorrow at 9 PM – Sunday, 1/17/2010
NVIDIA has finally released architectural details of their upcoming 3-billion-plus transistor GeForce DX11 GPU, codenamed Fermi. This editor was at NVIDIA’s GPU Technology Conference last September, where the general and GPU computing Fermi architecture was detailed here, here and here in a three-part series. NVIDIA had hoped the GeForce Fermi would launch last year, but yield issues and the complexity of their new architecture are evidently delaying the launch of this GPU until later in the first quarter of 2010.
NVIDIA knows they are late in relation to their competitor, AMD, whose ATI division launched the 5000 series, the first DX11 GPUs, four months ago, and has even launched DX11 mobile GPUs at CES, as we reported here last week. However, NVIDIA believes the wait will be worth it for gamers. In their first unveiling of the Fermi GF-100 GPU yesterday, NVIDIA gave brand-new details of their DX11 Fermi architecture for gaming, and they are quite proud of it.
ABT is privy to this information about the latest GeForce, and we shall share it with you tomorrow night at 9 PM Pacific Standard Time (PST). Stay tuned; the graphics wars are heating up again, and our readers will know about it first.
Mark Poppin
ABT Senior Editor
Please join us in our Forums
Become a Fan on Facebook
Follow us on Twitter
For the latest updates from ABT, please join our RSS News Feed
Join our Distributed Computing teams
- Folding@Home – Team AlienBabelTech – 164304
- SETI@Home – Team AlienBabelTech – 138705
- World Community Grid – Team AlienBabelTech
i’m sure it will suck, hot n power-hungry
How do you know? Nvidia hasn’t announced the power requirements of GF-100 yet. Are you privy to information none of us have?
The real test of this new architecture will come in gaming and by regular use in the PC. We will report on what we find.
Yes for sure it will suck..hot n power-hungry…
that’s why nvidia had to castrate its product from 512 shaders to 4xx shaders…
and i’m sure whatever they have come up with, it’s not on the graphics card side but on GPGPU…
Then you are in for a big surprise.
Check back in just over 21 hours.
I don’t think we even know the clock speeds yet.
Do you think NVIDIA has even finalized them yet?
It appears that they may still be working with engineering samples, much as we saw five or six months ago before AMD released the 5870.
yeah, maybe i’m wrong. maybe it’s not hot and not power-hungry, but one thing is for sure: it will be expensive. if it’s cheap, nvidia will take a loss on it
Ah, expensive. That depends on what you mean by expensive. The top-performing video cards always come at a premium. Do you not think that paying well over $600 for a hard-to-find 5970, right now, is expensive?
If GF100 is successful, then the other currently expensive video cards from NVIDIA’s competitor will finally drop well below MSRP, after probably six months of high prices with no DX11 competition whatsoever.
As consumers, choice and competition are good. We at ABT welcome GF100 – expensive or not.
I agree. Top of the line cards have always been at a premium, especially when they have just been released.
Keep in mind, the situation in the market should provide a niche for these cards, if of course they sort out the yield issues.
ABT welcomes all new technology – expensive or not.
Top-of-the-line GPUs have always been expensive, but knowing NVIDIA, the higher-end card (the 395) will most likely cost at least 100 bucks more than ATI’s 5970… even after the price drop.
Apoppin: I wouldn’t be at all surprised if the clocks weren’t fully finalized yet.
I just want to see which time zone the site is running on; don’t mind me. It’s 9:14 PM CST currently.
There are 1-1/2 hours to go. The NDA expires at 9 PM PST. Don’t worry, we will post our review on time; it is ready to go.
We are an international group. Some of us actually are PST, others are CST and EST, and for some of us, it is already Monday in Australia.
yeah, the easiest way to compare how expensive a product is would be to calculate performance per price, but i think that isn’t always true. TDP, heat, and other factors like how many slots it takes must be counted. for me, honestly, the price is not the main factor. a product is expensive if what you paid for isn’t what you expected to get
First of all, we don’t know the specs for GF100 yet. It may have an excellent clock-down feature for idle use, which is what a card is doing most of the time, even for a gamer.
Even though I consider myself “green” oriented, the electricity cost of using my PC doesn’t really figure in for me. I don’t have a big plasma television, nor do I drive a big car. Instead, I attempt to compensate in other ways for my gaming habit on a 24″ LCD with HD 4870-X3 Tri-Fire.
You have to realize that my desktop PC(s) are only on when I am gaming, benching or encoding video. The other 90 percent of the time, I use my notebook for all of my writing and Internet tasks.
As to heat and thermal issues: I only find heat an issue in summer. For the other eight or nine months, my desktop PC serves as a welcome mini space heater when I am using it. And I have five case fans, so my hardware should have a long life.
I also think NVIDIA is concerned about power conservation, and we may see that reflected in this chip. We shall see once we get more details and once we get to test it.
I am really impressed by your post and like how you think. And to think that I found this little gem of yours through Yahoo!
Please do write some more! I am very interested in what you have to say.