[ABT Exclusive] HTC Glacier is back with GPU performance results

4 Responses

  1. Electrofreak says:

    MrK, it’s good to see you still digging up more info on Glacier. My earlier theory that it used the QSD8650A was debunked when the graphics performance clearly showed it’s using a next-gen Adreno, not just the Adreno 205 found in the QSD8650A.

    I’m still a bit skeptical about it containing the QSD8672. I’ve heard that the QSD8672 is going to find its way into a flagship Windows Phone 7 device first, so I’ll wait until I see it. That said, my previous thoughts about Glacier containing one of the MSM8x60 chips have given way to doubts after a little sleep and a clearer mind, since those chips were only announced to have shipped in June. It hardly seems likely that we’d see them running on a device in the (apparently) final stages of testing only a few months after they began shipping.

    The apparent jump from 1st-gen to 3rd-gen Snapdragon is surprising indeed, but if BGR’s reports of a 1.3 GHz “Droid Pro” in November are true, it could be that Qualcomm is simply doing whatever it takes to remain competitive. I wonder if the Droid Pro is going to be sporting an OMAP 3640 Cortex-A8 or an OMAP 4430/40 Cortex-A9…

    You know what a fan I am of Samsung’s Hummingbird… and the fact that I’m seriously considering sitting tight a couple more months in the hope of a more concrete Cortex-A9 timetable should indicate my excitement about this new generation of ARM architecture. It’ll be interesting to see how Qualcomm strikes back.

    Who knows, maybe they’ve decided it’s finally time to pull the wraps off the big guns; after all, the original Snapdragon QSD8x50s have reached a ripe old age of 9 months now and are beginning to show their age… 😉

  2. anon says:

    Some more food for thought:

    Compare the Motorola Droid, with its 600 MHz ARM Cortex-A8, to the HTC Legend, with an ARM11 at the same clock frequency. Pay attention to how the HTC Legend bests the Droid on both the GPU and CPU skinning tests. Now compare how moving from the 600 MHz ARM11 to the 1 GHz Scorpion in the EVO changes the scores. They drop. Clearly the CPU doesn’t govern glBenchmark scores. Both the 7227 and the 8x50 have an Adreno 200, but clearly not clocked at the same frequency.

    The 7227/Adreno 200 scores 480 and the 8x50/Adreno 200 scores 178 in the GPU skinning test. If different chips with the Adreno 200 can show over a 2x difference in glBenchmark scores, why would you require a jump over the Adreno 205 straight to the Adreno 220 to accept a score that is only twice that of the 7227? :)

    http://www.glbenchmark.com/compare.jsp?benchmark=glpro11&showhide=true&D1=Motorola%20Droid%20(Milestone%20Sholes)&D2=HTC%20Legend&D3=HTC%20Evo%204G
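
    To spell that arithmetic out, here’s a rough Python sketch (the Glacier figure is only assumed to be roughly twice the 7227’s score, as above; it’s not a published number):

        # Rough sketch of the score ratios discussed above.
        # 480 and 178 are the GPU-skinning scores quoted for the
        # 7227 and 8x50 (both Adreno 200); the Glacier estimate is
        # assumed to be roughly 2x the 7227, not a published number.
        score_7227 = 480.0
        score_8x50 = 178.0
        glacier_estimate = 2 * score_7227

        # Spread already seen between two Adreno 200 devices:
        print(round(score_7227 / score_8x50, 1))        # ~2.7x
        # Jump Glacier would need over the faster Adreno 200 device:
        print(round(glacier_estimate / score_7227, 1))  # 2.0x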

    I’m not convinced that one of the first new HTC products would have a 3rd-gen QC chip, since those haven’t even been demoed yet. The 7x30, on the other hand, has been demoed recently and has shown vastly better graphics performance than the 8x50.

    Surely if the 8672 or 8660 were ready for prime time, they would have been showcased at major conferences?

    Since it appears to be trendy to just take one GPU performance metric (like triangle rate) and predict performance from that:

    The Adreno 205 has 4x the shader performance of the Adreno 200. This, coupled with a likely increased clock frequency (and thus higher fill rate and triangle rate), is enough, IMO, to reach the numbers shown on the glBenchmark site.
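
    As a back-of-envelope sketch in Python (the clock bump and the shader-bound fraction below are pure guesses on my part, only to illustrate the scaling argument; the 4x shader figure is the one quoted above):

        # Back-of-envelope scaling estimate for an Adreno 205 score.
        # Only the 4x shader figure comes from the material above;
        # the clock multiplier and shader-bound fraction are assumed
        # values for illustration.
        baseline_7227 = 480.0    # Adreno 200 (7227) GPU-skinning score
        shader_speedup = 4.0     # Adreno 205: ~4x shader performance
        clock_multiplier = 1.3   # assumed modest clock increase
        shader_bound = 0.5       # assumed fraction of the test limited
                                 # by shader throughput

        # Weight the shader gain by how shader-bound the test is,
        # then scale the whole thing by the clock increase.
        estimate = baseline_7227 * clock_multiplier * (
            shader_bound * shader_speedup + (1 - shader_bound))
        print(round(estimate))   # ~1560, comfortably past 2x the 7227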

    FYI, some interesting information:
    http://www.uplinq.com/press/presentations.htm
    http://developer.qualcomm.com/dev/gpu/processors

    The former shows that the Adreno 205 has 4 ALU pipes, and the latter lists the chips it’s being used in.

    My money is on the 7x30 and the rest of the gen-2 chips like the 8x55 and 8x50A. No company is going to skip over a generation.

  3. Electrofreak says:

    I suspect that Android 2.2 was at play there, Anon, but you do bring up some excellent points.

  4. Noobz says:

    Can someone explain to me what triangles actually mean in the context of multimedia platforms like gaming and animation? If more “triangles” = better animation detail, then why don’t people jump ship to the Samsung I9000 and its multiple US iterations, since it has raw graphics processing power even when compared to 2nd-generation ARM-based Snapdragons? Will AMD’s recent partnership with Qualcomm produce better GPUs than the ImgTech PowerVR SGX540 in the SGS I9000? If a 1 GHz Hummingbird keeps up with a dual-core 1.3-1.5 GHz phone, what is Qualcomm doing to stay competitive? I love mobile gaming a lot and I’m a real noob to all of this, so :/ I run to things that have six-axis accelerometers and huge MTS output, like the SGS, just to feel like a serious mobile gamer. And on a side note, even with the Droid X not following suit on SoC chipsets, why does its separate Imagination GPU suck so terribly?
