Galaxy GTS 250 512 MB Review
Temperature and Power Consumption
To measure maximum temperature and power consumption, we turn to Furmark, one of the most intensive tests a GPU can undertake. Temperatures and power consumption measured with Furmark therefore represent a worst-case scenario. No game today loads the GPU as heavily as Furmark does, but should one do so in the future, you will already be armed with the knowledge gained here: how hot your card can get and how much power it can draw.
According to its developer, “FurMark is a very intensive OpenGL benchmark that uses fur rendering algorithms to measure the performance of the graphics card. Fur rendering is especially adapted to overheat the GPU and that’s why FurMark is also a perfect stability and stress test tool (also called GPU burner) for the graphics card. This benchmark requires an OpenGL 2.0 compliant graphics card: NVIDIA GeForce 5/6/7/8 (and higher), AMD/ATI Radeon 9600 (and higher) or a S3 Graphics Chrome 400 series with the latest graphics drivers.”
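If you want to log temperatures yourself while Furmark runs, here is a minimal sketch using Python and the pynvml package. This illustrates the general approach rather than the tooling used for this review, and it assumes an NVIDIA GPU and driver recent enough to expose NVML sensors; cards of the GTS 250 generation are better monitored with a utility such as GPU-Z.

```python
# Minimal GPU temperature logger to run alongside Furmark (illustrative sketch;
# assumes an NVIDIA GPU/driver that exposes NVML sensors and the pynvml package).
import time

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

max_temp = 0
try:
    while True:
        # Core GPU temperature in degrees Celsius
        temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        max_temp = max(max_temp, temp)
        print(f"current: {temp} C, max so far: {max_temp} C")
        time.sleep(1)  # poll once per second
except KeyboardInterrupt:
    # Stop with Ctrl+C once the stress run is over
    print(f"peak temperature during the run: {max_temp} C")
finally:
    pynvml.nvmlShutdown()
```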
Here are the results of the temperature test:

| | Temperature (Max) |
|---|---|
| Idle | 52 °C |
| Load | 79 °C |
And here are the results of the power consumption test:

| | Power Consumption (Average) |
|---|---|
| Idle | 158 W |
| Load | 272 W |
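Note that these are whole-system figures, not the card's alone. As a rough guide, the load-minus-idle delta, 272 W − 158 W = 114 W, approximates the extra power drawn under Furmark load, though it also folds in PSU losses and any change in CPU draw.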
You’re not even hitting the card’s limits in Furmark to correctly gauge its temperature or power consumption under load. At 1440 x 900 with 16x MSAA and post-processing, the card starts to throttle as it reaches temperatures of 105 °C.
@stridhiryu030363 – I tried to get the card temperature close to the maximum a game would produce. I don’t think any game would be able to heat a card to 105 °C, unless of course the ambient temperature is very high.
For power consumption, I tried to max it out. With AA enabled, power consumption is actually lower than without AA.
Just putting that out there, as that is the worst-case scenario should a game put that much stress on the card in the near future. I have an older 9800 GT that doesn’t put out that much heat on the same settings, and a friend’s GTX 260 maxes out at 85 °C. There’s something wrong with Galaxy’s non-reference design, imo.
Did not know AA lowered power consumption. You’d think the extra work would put more stress on the card.
Karan, were you testing total system power consumption, or just the card’s?
Enabling AA should generally increase a card’s load consumption because it has to work harder.
Update: my card just died. Maybe there was something wrong with mine that caused the overheating. No, I didn’t overheat the thing to death; it locked up my system during a gaming session and now refuses to boot.
Anyone else willing to confirm the temps for me?
@BFG10K – I was measuring total system power consumption.
IIRC using high AA saturates the GPU bus, which makes the shaders and texturing units idle while they wait for the AA samples to clear the ROPs.
@stridhiryu030363 – Do you have the same card?
No game was able to hit the temps Furmark hit in the review.
Yes, I have the same card. Funny thing is, the game I was playing was hardly graphics-intensive, so my card stayed around an average of 58 °C when it bit the dust.
When I said confirm the temps, I meant run Furmark with the same settings I used and try to recreate my result. My card died yesterday under very mild operating conditions, so I was wondering if a defect in my card was causing the high temperatures.
“IIRC using high AA saturates the GPU bus, which makes the shaders and texturing units idle while they wait for the AA samples to clear the ROPs.”
I’m not sure what you mean by “GPU bus”. Yes, AA hits the ROPs harder, but it hits the memory too. In general the card’s consumption should increase when AA is enabled because the rendering load is higher.
“@BFG10K – was measuring total system power consumption”
I have a theory then. If the GPU becomes saturated due to AA, the CPU might not be working as hard because it’s waiting for the GPU to catch up, thereby lowering overall system consumption. If you could test just the GPU’s consumption then it should increase when AA is applied.
@BFG10K – It could be the CPU idling, as you said. But I didn’t notice any change in CPU usage with and without AA.
Also it could be the shaders and texture units waiting for the AA samples to clear the ROPs.
Could be a combination of both things.
I have thought of measuring GPU-only power usage, but haven’t come up with a way to do it yet.
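For what it’s worth, newer NVIDIA cards expose board power draw through NVML, so on those you could read it in software; the GTS 250 generation predates that counter, so there you’d need hardware measurement (e.g. a clamp meter on the PCIe power leads). A minimal sketch, assuming a supported GPU and the pynvml package:

```python
# Read GPU board power via NVML (sketch; assumes a GPU/driver that exposes
# the power counter, which the GTS 250 generation does not).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

# nvmlDeviceGetPowerUsage returns the whole board's draw in milliwatts
watts = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0
print(f"GPU board power draw: {watts:.1f} W")

pynvml.nvmlShutdown()
```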
Well, I finally received my card back from RMA, and with the same settings as before in Furmark, I seem to top out at 96 °C after 20 minutes.
@stridhiryu030363 – what is your ambient temperature?
Not sure, I don’t have any way of checking. It was around 2 a.m. when I attempted this, so it couldn’t have been very hot.
@stridhiryu030363 – check the weather on the internet or something. What state or country are you in?
California.
@stridhiryu030363 – that should explain it. It must be hot where you live.
How hot does the card get in games?
Not at 2 a.m., it isn’t.
Right now it’s 84 °F according to Google. I’ve been folding units all day with Folding@home and the card is only at 71 °C with 51% fan speed, but that’s not a really demanding application. Will stress test again later tonight.
60 °F according to Google. Same settings, same result: tops out at 96 °C.