Galaxy GeForce GT 240 GDDR3 Review
Temperature
We fired up our copy of Furmark 1.7.0 and rendered the fur at 640×480 with no AA in “xtreme burning mode”.
Furmark is one of the most intensive tests a GPU can run, so temperatures and power consumption measured with it represent a worst-case scenario. No game today loads the GPU as heavily as Furmark does, but should one do so in the future, testing with Furmark tells you in advance how hot your card can get and how much power it can consume.
According to its developer, “Furmark is a very intensive OpenGL benchmark that uses fur rendering algorithms to measure the performance of the graphics card. Fur rendering is especially adapted to overheat the GPU and that’s why Furmark is also a perfect stability and stress test tool (also called GPU burner) for the graphics card. This benchmark requires an OpenGL 2.0 compliant graphics card: NVIDIA GeForce 5/6/7/8 (and higher), AMD/ATI Radeon 9600 (and higher) or a S3 Graphics Chrome 400 series with the latest graphics drivers.”
Furmark was left running for 10 minutes, after which the final load temperature was recorded. The ambient temperature was 24-25°C.
Overclocking
Furmark's stability test was used to find the maximum stable overclock, with EVGA's Precision Tuner handling the clock adjustments. After the dust had settled, the card topped out at 650MHz/1584MHz/1050MHz (core/shader/memory).
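To put the overclocked frequencies in perspective, a quick calculation can express each domain as a percentage gain. Note that the stock clocks used below (550/1340/900 MHz for core/shader/memory) are assumed reference GT 240 figures for illustration only, not values taken from this review; Galaxy's factory clocks may differ.

```python
# Sketch: percentage overclock per clock domain.
# Stock clocks are ASSUMED reference GT 240 values, not from the review.
stock = {"core": 550, "shader": 1340, "memory": 900}
overclocked = {"core": 650, "shader": 1584, "memory": 1050}

for domain, base in stock.items():
    gain = (overclocked[domain] - base) / base * 100
    print(f"{domain}: {base} -> {overclocked[domain]} MHz (+{gain:.1f}%)")
```

Under those assumed baselines, all three domains land in the 16-19% range, a fairly typical result for a mid-range card of this era.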
Power Usage (Total System Consumption)
Furmark 1.7.0 was again run in extreme burning mode with the fur rendered at 640×480 and no AA. As noted above, measuring power usage with Furmark represents a worst-case scenario; games are rarely able to put this much load on the GPU.
It’s nice to see that my own card (the HD 4770) came out on top here, drawing the least power of the bunch.
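Since these are total-system readings taken at the wall, they include PSU conversion losses and the rest of the platform. One rough way to estimate the card's own contribution is to subtract an idle baseline and scale by an assumed PSU efficiency; a minimal sketch follows, with all numbers (the 82% efficiency and the wattages) being illustrative assumptions rather than measurements from this review.

```python
# Sketch: estimating the graphics card's share of a wall-socket reading.
# All figures here are ASSUMPTIONS for illustration, not review data.
# Wall readings are AC-side and include PSU losses, so the delta is
# scaled by an assumed PSU efficiency to approximate DC-side watts.

PSU_EFFICIENCY = 0.82  # assumed typical efficiency at this load level


def estimated_gpu_draw(idle_wall_w: float, load_wall_w: float) -> float:
    """Rough DC-side power attributable to the GPU under Furmark load."""
    delta_ac = load_wall_w - idle_wall_w  # extra draw measured at the wall
    return delta_ac * PSU_EFFICIENCY      # convert the AC delta to DC watts


# Hypothetical readings: 110 W idle, 185 W under Furmark.
print(estimated_gpu_draw(110.0, 185.0))
```

This is only an approximation: the CPU and fans also ramp up under load, so the true GPU-only figure is somewhat lower than the scaled delta.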