Galaxy GTS 250 512 MB Review


Evolution is the nature of nature: everything evolves. Man continually works to improve everything he has, and every product is eventually replaced by a more efficient, more feature-rich one. Naturally, this applies to the world of technology, where evolution is very fast; almost every year a product is replaced by a newer, faster model. The graphics card industry is no stranger to this evolution, and the two big players here, Nvidia and ATI, are always looking to outdo each other with ever-faster graphics cards. As they say, "competition drives innovation".

The GTS 250 is a perfect example of evolution. The G92 GPU used in this card has a long history. Such is the history of this chip that it would not be wrong to grant it legendary status and simply call it "The G92". This chip may be forever remembered in video card history for showing the world that a "mainstream video card" need not be a slouch, capable of playing games only at lower resolutions and lower settings compared to its $600 top-of-the-line brethren. For more on the G92 history check this out. The G92, based on a 65nm fabrication process, made its first appearance in the 8800 GT, with a handicap: only 112 of the chip's shaders were enabled. The G92 would make its full-featured appearance in a video card called the 8800 GTS. The name was old, the same as that of the G80-based 8800 GTS, but the underlying structure was different. The G92-based 8800 GTS has 128 shader processors and uses a 256-bit wide memory interface paired with 512 MB, 1 GB, or 2 GB of video memory. The G80-based 8800 GTS had 96 shader processors and used a 320-bit wide memory interface, usually paired with 320 MB or 640 MB of video memory.

The G92 chip would show up again in a video card named the 9800 GTX. The 9800 GTX was basically the same card as the G92-based 8800 GTS with a few tweaks, made not to the GPU itself but to the PCB and other parts of the card. These tweaks allowed the 9800 GTX to run at higher stock frequencies, and it required two 6-pin power connectors. The G92 made another appearance in the 9800 GTX+, this time built on a 55nm fabrication process and codenamed G92b. The change in fabrication process allowed the G92b to be clocked higher than the 65nm G92; other features, such as the shader processors and memory configuration, remained the same, and the card still required two 6-pin power connectors. That was not the last we would see of the G92. The card I am reviewing today, the GTS 250, is based on the same 55nm G92b GPU used in the 9800 GTX+, but it brings with it a shorter PCB and uses just one 6-pin power connector.

To sum up

  • 8800 GTS / G92 / 65nm / one 6-pin power connector
  • 9800 GTX / G92 / 65nm / two 6-pin power connectors / longer PCB / higher clock frequencies than the 8800 GTS
  • 9800 GTX+ / G92b / 55nm / two 6-pin power connectors / same PCB length as the 9800 GTX / higher clock frequencies than the 9800 GTX
  • GTS 250 / G92b / 55nm / one 6-pin power connector / shorter PCB than the 9800 GTX+ / same clock frequencies as the 9800 GTX+
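For readers who prefer their spec sheets machine-readable, the lineage in the list above can be captured in a few lines of Python. This is just a sketch: the card names, fabrication processes, and connector counts come straight from this summary, and clock speeds are deliberately left out because the review only compares them relatively.

```python
# A quick machine-readable sketch of the G92 lineage summarized above.
cards = {
    "8800 GTS":  {"gpu": "G92",  "process_nm": 65, "six_pin_connectors": 1},
    "9800 GTX":  {"gpu": "G92",  "process_nm": 65, "six_pin_connectors": 2},
    "9800 GTX+": {"gpu": "G92b", "process_nm": 55, "six_pin_connectors": 2},
    "GTS 250":   {"gpu": "G92b", "process_nm": 55, "six_pin_connectors": 1},
}

# All four share the same core layout: 128 shader processors on a
# 256-bit memory interface (per the review's G92 description).
for spec in cards.values():
    spec.update(shaders=128, memory_bus_bits=256)

# The GTS 250 uses the same 55nm G92b silicon as the 9800 GTX+...
assert cards["GTS 250"]["gpu"] == cards["9800 GTX+"]["gpu"] == "G92b"
# ...but drops back to a single 6-pin connector, like the 8800 GTS.
assert cards["GTS 250"]["six_pin_connectors"] == cards["8800 GTS"]["six_pin_connectors"] == 1
```

Seen this way, the "new" card differs from its predecessor only in board-level details, which is exactly the rebranding argument discussed below.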

If you ignore the subtle changes that Nvidia has made to these cards, you might just call it rebranding, and that seems to be the general consensus among the enthusiast community.

The G92 chip has spanned three generations of Nvidia’s GeForce lineup of video cards, a feat unmatched by any other GPU in the history of video cards. This is why I said this chip is legendary and why we will call it “The G92”.

Let’s take a look at what I am reviewing today:

[Image: the Galaxy GTS 250 512 MB]

[Image: Galaxy logo]

Galaxy, established in 1994, is an Nvidia add-in-board (AIB) partner that manufactures products ranging from the low-end GeForce 7200 series to the high-end GTX 200 series. They build cards based on Nvidia’s reference design as well as cards of their own design, produced in their own in-house facilities and fitted with high-end coolers from Arctic Cooling and others.

Galaxy has shipped to the US for a long time, having built video cards for many of the tier-1 brands on the market today. They realized they could create a brand for themselves and save the end customer the middleman’s fees. Two years ago they launched the Galaxy brand in the US, and their products are now available at Best Buy, Microcenter, Fry’s, Dell.com, Newegg, TigerDirect, and many other outlets. They offer excellent quality, toll-free tech support, and a two-year transferable, no-registration warranty.



20 Responses

  1. stridhiryu030363 says:

You’re not even hitting the card’s limits in Furmark to correctly gauge its temperature or power consumption under load. At 1440 x 900 with 16x MSAA and post-processing, the card starts to throttle as it reaches temperatures of 105 C.

  2. Karan says:

@stridhiryu030363 – I tried to get the card’s temperature close to the maximum temperature a game would produce. I don’t think any game would be able to heat the card to 105 C, unless of course the ambient temperature is very high.

For power consumption, I tried to max it out. If you use AA, the power consumption is lower than with no AA.

  3. stridhiryu030363 says:

Just putting that out there, as that is the worst-case scenario should a game put that much stress on it in the near future. I have an older 9800 GT that doesn’t put out that much heat on the same settings, and a friend’s GTX 260 maxes out at 85 C. There’s something wrong with Galaxy’s non-reference design, imo.

Did not know AA lowered power consumption. You’d think sharpening textures would put more stress on the card.

  4. BFG10K says:

    Karan, were you testing total system power consumption, or just the card’s?

    Adding AA to a card should generally increase its load consumption because it works harder.

  5. stridhiryu030363 says:

Update: my card just died. Maybe there was something wrong with mine that caused the overheating. No, I didn’t overheat the thing to death; it locked up my system during a gaming session and now refuses to boot.

    Anyone else willing to confirm the temps for me?

  6. Karan says:

    @BFG10K – was measuring total system power consumption.

    IIRC using high AA saturates the GPU bus, which makes the shaders and texturing units idle while they wait for the AA samples to clear the ROPs.

    @stridhiryu030363 – Do you have the same card ?

    No game was able to hit the temps Furmark hit in the review.

  7. stridhiryu030363 says:

Yes, I have the same card. Funny thing is, the game I was playing was hardly graphics-intensive, so my card stayed around an average of 58 C when it bit the dust.

When I said confirm the temps, I meant attempt to run Furmark on the same settings I used, to recreate my result. My card died yesterday under very mild operating conditions, so I was just wondering if a defect in my card was causing the high temperatures.

  8. BFG10K says:

    IIRC using high AA saturates the GPU bus, which makes the shaders and texturing units idle while they wait for the AA samples to clear the ROPs.

I’m not sure what you mean by “GPU bus”. Yes, AA hits the ROPs harder, but it hits the memory too. In general the card’s consumption should increase when AA is enabled because the rendering load is higher.

    @BFG10K – was measuring total system power consumption

    I have a theory then. If the GPU becomes saturated due to AA, the CPU might not be working as hard because it’s waiting for the GPU to catch up, thereby lowering overall system consumption. If you could test just the GPU’s consumption then it should increase when AA is applied.

  9. Karan says:

    @BFG10K – It could be the CPU idling as you said. But I didn’t notice any change in CPU Usage with and without AA.

    Also it could be the shaders and texture units waiting for the AA samples to clear the ROPs.

    Could be a combination of both things.

    I have thought of measuring the GPU only power usage, but haven’t come up with a way to do so yet.

  10. stridhiryu030363 says:

Well, I finally received my card back from RMA, and with the same settings as before in Furmark, I seem to top out at 96 C after 20 minutes.

  11. Karan says:

@stridhiryu030363 – what is your ambient temperature?

  12. stridhiryu030363 says:

Not sure, I don’t have any way of checking. It was around 2 A.M. when I attempted this, so it couldn’t have been very hot.

  13. Karan says:

@stridhiryu030363 – check the weather on the internet or something. What state or country are you in?

  14. stridhiryu030363 says:

    California.

  15. Karan says:

@stridhiryu030363 – that should explain it. It must be hot where you live.

How hot does the card get in games?

  16. stridhiryu030363 says:

Not at 2 a.m.

Right now it’s 84 F according to Google. I’ve been folding units all day with Folding@home and it’s only at 71 C, 51% fan speed, but that’s not a really demanding application. Will stress test again later tonight.

  17. stridhiryu030363 says:

60 F according to Google. Same settings, same results:
tops out at 96 C.

  18. Alex says:

Someone help pls. In Metro 2033, DX9 max settings with 4x, I get 103 degrees C. Jesus, almost the limit. I don’t know what to do.


