Galaxy GTS 250 512 MB Review
Software and Test System Used
Test Configuration
Hardware
- Intel Q9450 @ 3.2GHz
- GIGABYTE EX-38 DS4
- GeIL 4GB (2×2GB) 800MHz RAM 5-4-4-12
- Zerotherm Nirvana 120mm CPU Cooler
- Creative X-Fi Xtreme Music
- 2 x Western Digital 6400AAKS 640GB SATA Hard Drives
- 1 x Western Digital 250AAKS 250GB
- 1 x SAMSUNG 22X DVD±R SATA DVD Burner
- Cooler Master Silent Pro 600M PSU (Kindly Supplied by Cooler Master)
- Cooler Master Sniper Case
Software
- Windows Vista SP1 x64
- Nvidia GeForce 185.66 drivers
- All settings in Nvidia control panel at default
- PhysX on GeForce GPUs was disabled in the Nvidia control panel
- All games were patched to their latest versions
- Game settings are shown on the benchmark results page
- 1280×1024 resolution was used for these tests, as this was the maximum resolution available to me.
Video Cards Used
- Galaxy 9600 GT 512 MB Low Power Low Profile (Review Here) – 600 MHz/1500 MHz/900 MHz (Core/Shader/Memory)
- Galaxy 9600 GT 512 MB Low Power Low Profile (Review Here) Overclocked to 716 MHz/1850 MHz/1107 MHz (Core/Shader/Memory)
- Palit 9800 GT 1 GB – 600 MHz/1500 MHz/900 MHz (Core/Shader/Memory)
- Galaxy GTS 250 512 MB – 738 MHz/1836 MHz/1100 MHz (Core/Shader/Memory)
You’re not even hitting the card’s limits in Furmark to correctly gauge its temperature or power consumption under load. At 1440×900 with 16x MSAA and post-processing enabled, the card starts to throttle as it reaches 105 °C.
@stridhiryu030363 – I tried to get the card close to the maximum temperature a game would reach. I don’t think any game would be able to heat the card to 105 °C, unless of course the ambient temperature is very high.
For power consumption, I tried to max it out. If you use AA, the power consumption is actually lower than with no AA.
Just putting that out there, since that is the worst-case scenario should a game put that much stress on the card in the near future. I have an older 9800 GT that doesn’t put out that much heat on the same settings, and a friend’s GTX 260 maxes out at 85 °C. There’s something wrong with Galaxy’s non-reference design, IMO.
I didn’t know AA lowered power consumption. You’d think sharpening textures would put more stress on the card.
Karan, were you testing total system power consumption, or just the card’s?
Adding AA to a card should generally increase its load consumption because it works harder.
Update: my card just died. Maybe there was something wrong with mine that caused the overheating. No, I didn’t overheat the thing to death; it locked up my system during a gaming session and now refuses to boot.
Anyone else willing to confirm the temps for me?
@BFG10K – I was measuring total system power consumption.
IIRC using high AA saturates the GPU bus, which makes the shaders and texturing units idle while they wait for the AA samples to clear the ROPs.
@stridhiryu030363 – Do you have the same card?
No game was able to hit the temps Furmark hit in the review.
Yes, I have the same card. Funny thing is, the game I was playing was hardly graphics-intensive, so my card was averaging around 58 °C when it bit the dust.
When I said confirm the temps, I meant attempting to run Furmark with the same settings I used, to see if you can recreate my result. My card died yesterday under very mild operating conditions, so I was just wondering whether a defect in my card was causing the high temperatures.
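For anyone who wants to recreate the run, the peak temperature during a Furmark loop can be logged with GPU-Z, or with a small script that polls the driver. The sketch below is only an illustration and assumes a card/driver combination that exposes the temperature sensor through nvidia-smi (the loop length is arbitrary and chosen to match the 20-minute run mentioned above):

```python
# Rough sketch only: polls the driver once a second while Furmark runs in
# another window, so two people can compare peak temperatures like for like.
# Assumes a driver recent enough to ship nvidia-smi; otherwise GPU-Z's
# log-to-file option does the same job.
import subprocess
import time

def read_gpu_temp_c():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"]
    )
    return int(out.decode().strip().splitlines()[0])

peak = 0
for _ in range(20 * 60):              # 20 minutes, matching the run above
    temp = read_gpu_temp_c()
    peak = max(peak, temp)
    print(f"current: {temp} C   peak: {peak} C")
    time.sleep(1)
```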
> IIRC using high AA saturates the GPU bus, which makes the shaders and texturing units idle while they wait for the AA samples to clear the ROPs.
I’m not sure what you mean by “GPU bus”. Yes, AA hits the ROPs harder, but it hits the memory too. In general the card’s consumption should increase when AA is enabled because the rendering load is higher.
> @BFG10K – I was measuring total system power consumption.
I have a theory then. If the GPU becomes saturated due to AA, the CPU might not be working as hard because it’s waiting for the GPU to catch up, thereby lowering overall system consumption. If you could test just the GPU’s consumption then it should increase when AA is applied.
@BFG10K – It could be the CPU idling, as you said, but I didn’t notice any change in CPU usage with and without AA.
Also it could be the shaders and texture units waiting for the AA samples to clear the ROPs.
Could be a combination of both things.
I have thought of measuring GPU-only power usage, but haven’t come up with a way to do so yet.
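For what it’s worth, a software readout of GPU-only draw is only possible where the board and driver expose a power sensor, which this generation of card generally doesn’t; otherwise it takes hardware measurement at the PCIe power connectors and the slot. Purely as a hedged sketch of what the software polling would look like on hardware that does report it (nothing here was used for this review):

```python
# Rough sketch only: works where the board and driver expose a power sensor
# through nvidia-smi (many newer GeForce boards; a GTS 250 of this era almost
# certainly does not). Without such a sensor, GPU-only numbers need hardware
# measurement, e.g. a clamp meter on the PCIe power leads plus an
# instrumented riser for the slot's 75 W.
import subprocess
import time

def read_board_power_w():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=power.draw",
         "--format=csv,noheader,nounits"]
    )
    value = out.decode().strip().splitlines()[0]
    if "supported" in value.lower() or "n/a" in value.lower():
        raise RuntimeError("this GPU/driver does not report board power")
    return float(value)

samples = []
for _ in range(60):                   # one sample per second for a minute
    samples.append(read_board_power_w())
    time.sleep(1)
print(f"average board power: {sum(samples) / len(samples):.1f} W")
```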
Well, I finally received my card back from RMA, and with the same Furmark settings as before, I top out at 96 °C after 20 minutes.
@stridhiryu030363 – what is your ambient temperature?
Not sure, I don’t have any way of checking. It was around 2 a.m. when I attempted this, so it couldn’t have been very hot.
@stridhiryu030363 – check the weather on the internet or something. What state or country are you in?
California.
@stridhiryu030363 – that should explain it. It must be hot where you live.
How hot does the card get in games?
Not at 2 a.m. it isn’t.
Right now it’s 84 °F according to Google. I’ve been folding work units all day with Folding@home and the card is only at 71 °C at 51% fan speed. It’s not a really demanding application. I’ll stress test again later tonight.
60 °F according to Google. Same settings, same results. It tops out at 96 °C.