Fermi GF-100 NDA Ends Tomorrow at 9 PM – Sunday, 1/17/2010

apoppin

Founder and Senior Editor of ABT.

16 Responses

  1. yogi says:

I’m sure it will suck: hot and power-hungry.

  2. apoppin says:

    How do you know? Nvidia hasn’t announced the power requirements of GF-100 yet. Are you privy to information none of us have?

    The real test of this new architecture will come in gaming and by regular use in the PC. We will report on what we find.

  3. dnes says:

Yes, for sure it will suck: hot and power-hungry…

That’s why Nvidia had to castrate its product from 512 shaders down to 4xx shaders…

And I’m sure that whatever they have come up with, it is not on the graphics card side but on GPGPU.

  4. apoppin says:

    Then you are in for a big surprise.

    Check back in just over 21 hours.

  5. BFG10K says:

    I don’t think we even know the clock speeds yet.

  6. apoppin says:

Do you think NVIDIA has even finalized them yet? It appears they may still be working with engineering samples, much as we saw five or six months ago before AMD released the 5870.

  7. yogi says:

Yeah, maybe I’m wrong. Maybe it’s not hot and not power-hungry, but one thing is for sure: it will be expensive. If it’s cheap, Nvidia will take a loss on it.

  8. apoppin says:

Ah, expensive. That depends on what you mean by expensive. The top-performing video cards always come at a premium. Do you not think paying well over $600 for a hard-to-find 5970 is expensive right now?

    If GF100 is successful, then the other currently expensive video cards by NVIDIA’s competitor will finally drop well below MSRP – after probably 6 months of high prices with no DX11 competition whatsoever.

    As consumers, choice and competition are good. We at ABT welcome GF100 – expensive or not.

  9. Leon Hyman says:

    I agree. Top of the line cards have always been at a premium, especially when they have just been released.

Keep in mind, the current market situation should provide a niche for these cards — if, of course, they sort out the yield issues.

    ABT welcomes all new technology – expensive or not.

  10. Matt says:

Top-of-the-line GPUs have always been expensive, but knowing Nvidia, the higher-end cards (395) will most likely be at least $100 more than ATI’s 5970… after the price drop.

  11. BFG10K says:

    Apoppin: I wouldn’t be at all surprised if the clocks weren’t fully finalized yet.

  12. Timezonetest says:

I just want to see what time zone the site is running on — don’t mind me. It is 9:14 PM CST currently.

  13. apoppin says:

We have 1-1/2 hours to go; this NDA expires at 9 PM PST. Don’t worry, we will post our review on time. It is ready to go.

We are an international group. Some of us are actually on PST, others are on CST and EST, and for some of us it is already Monday in Australia.

  14. yogi says:

Yeah, the easiest way to compare how expensive a product is would be to calculate performance per price, but I don’t think that is always true. TDP, heat, and other factors like the number of slots taken must also be counted. For me, honestly, the price is not the main factor. A product is expensive if what you paid does not match what you expected to get.

  15. apoppin says:

First of all, we don’t know the specs for GF100 yet. It may have an excellent clock-down feature, and running clocked down is what a GPU does most of the time — even for a gamer.

Even though I consider myself “green” oriented, the electricity cost of using my PC doesn’t really figure in for me. I don’t have a big plasma television, nor do I drive a big car. Instead, I attempt to compensate in other ways for my gaming habit on a 24″ LCD with HD 4870-X3 Tri-Fire.

    You have to realize that my desktop PC(s) are only on when I am gaming, benching or encoding video. The other 90 percent of the time, I use my notebook for all of my writing and Internet tasks.

As to the heat and thermal issues: I only find heat a problem in summer. For the other eight or nine months, my desktop PC serves as a welcome mini space heater while I am using it. And with five case fans, my hardware should have a long life.

I also think Nvidia is concerned about power conservation, and we may see that reflected in this chip. We shall see once we get more details and once we get to test it.

  16. Reynaldo Parter says:

    I am really impressed by your post and like how you think. And to think that I found this little gem of yours through Yahoo!

    Please do write some more! I am very interested in what you have to say.
