9800GT PhysX card with 580 GTX GPU: The heat isn't worth it!

OldM3ta

I recently built a new system with an Intel i5-2500K processor, 8GB of RAM, an 850W power supply, and a Gigabyte board.

I put it all in a Cooler Master CM 690 II Advanced case, along with my main GeForce GTX 580 video card. I also purchased a single-slot-cooler 9800GT with 1GB of VRAM and popped it into the other PCI-Express slot that takes video cards.

After running the computer for a few days, I realized that the heat coming off the case was really high, so I started looking at my temperatures closely. Mind you, I have about five fans in that case, and the Sandy Bridge processor is cooled by a Corsair H50 heatsink/water cooler.

Temperature monitoring programs showed my processor idling around 35-40°C and loading around 60°C. My GeForce GTX 580 was idling around 49-52°C, loading at who knows what. But the 9800GT was just sitting there idling at 65°C! I reapplied thermal paste and made sure the heatsink was seated properly, and it still stayed the hottest component in my case.
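
(Side note for anyone who wants to log these temps from a script instead of eyeballing a monitoring app: a rough, untested sketch using the pynvml bindings for NVIDIA's NVML library would look something like the below. It assumes the NVIDIA driver and the pynvml package are installed; it's not what I actually ran, just the general idea.)

# Rough GPU temperature logger sketch using the pynvml bindings.
# Assumes the NVIDIA driver and the pynvml package are installed.
import time
import pynvml

pynvml.nvmlInit()
handles = [pynvml.nvmlDeviceGetHandleByIndex(i)
           for i in range(pynvml.nvmlDeviceGetCount())]
try:
    while True:
        for i, h in enumerate(handles):
            name = pynvml.nvmlDeviceGetName(h)
            if isinstance(name, bytes):  # older pynvml versions return bytes
                name = name.decode()
            temp = pynvml.nvmlDeviceGetTemperature(h, pynvml.NVML_TEMPERATURE_GPU)
            print("GPU %d (%s): %d C" % (i, name, temp))
        time.sleep(5)  # poll every few seconds
finally:
    pynvml.nvmlShutdown()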

I tried using MSI Afterburner to lower all of the card's clocks to see if that would bring the temperature down, but alas, it did not. The card was by far the hottest device in my case, and since heat rises and the card sits toward the bottom, I'm sure it was no help to the cooling of my other components (CPU/GPU #1).

I decided it wasn't worth keeping in the case. I do own about five PhysX-enabled games (of the roughly 13 out there). If I start playing them consistently, I'll think about popping the 9800GT, which I paid $50 for on eBay, back in. Otherwise, the card stays out of the case: not powered on, and not raising the temperature of my case and room.

A story I thought I'd share...

TL;DR: The 9800GT is too hot to keep in the case for just a few PhysX games.
 
Yeah... that's hot. Though I don't think you can apply this broadly to all 9800GT cards; the one I had idled at around 44°C.
 
That seems way too high for idle; perhaps the diode is messed up and not reporting correctly. Also, do you notice a difference in temps with the card in your case as opposed to out of it, and if so, by how much? If it's negligible, I'd just chalk it up to a messed-up thermal reading from the card.
 
Offloading PhysX onto a 9800GT, compared to letting your GTX 580 handle both graphics and PhysX, isn't going to do that much anyway. With your primary card being that strong, you would need a faster GPU doing PhysX for it to be worth the trouble.
 
You're saying 49-52°C is way too hot for idle? I just checked my 570 and it was idling at 62°C :mad:

I did some research online, and apparently with two monitors the card will not downclock. As soon as I deactivated my second monitor, the card dropped to 45°C in less than a minute and was still dropping... but this is really a bummer, as I always use the second monitor :confused:

edit * lol, I did some more research and this is quite ironic... apparently Nvidia cards have an issue where they won't drop into idle mode with two monitors running at different resolutions, as you can see here thanks to the Nvidia rep:
http://forums.nvidia.com/index.php?showtopic=187662&st=20

So my solution for lower temps would actually be the exact opposite of yours! I'll probably have to pick up a secondary card like a 9800GT to run the second monitor and allow the cards to idle. That 52°C actually sounds good compared to where I am...
 
My 9800GT 512MB (originally 600/900, overclocked to 650/1000) for PhysX is currently at 49°C while maxed out on the CUDA dnetc client. After pausing the client for a couple minutes, it's dropped down to 44°C already. With the stock cooler/BIOS, it wouldn't try to go any lower than 60°C.

Also keep in mind that the chip's temperature doesn't say that much about how much heat it's putting out. The TDP tells you how much electricity the part is converting to heat; the chip temp just tells you how much heat is still in the chip, completely ignoring how much it has already dumped into the air. As long as it's within the chip's limits, there's no reason to try to get more heat out of the device and into the air. I'm sure the GTX 580 is putting out a lot more heat; it's probably just dumping it into the room rather than into your case (or leaving it in the GPU).
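
A toy way to see it (every number below is a made-up illustration, not a measurement): at steady state, the die temperature is roughly ambient plus power draw times the cooler's junction-to-air thermal resistance. So a low-power card with a weak single-slot cooler can read hotter than a high-power card with a big cooler, even while dumping far less heat:

# Toy steady-state thermal model: T_die ~= T_ambient + P * theta,
# where theta is junction-to-air thermal resistance in degC per watt.
# All values below are illustrative guesses, not measured numbers.

def die_temp_c(ambient_c, power_w, theta_c_per_w):
    return ambient_c + power_w * theta_c_per_w

# A ~60 W card with a weak single-slot cooler (high theta)...
print(die_temp_c(25.0, 60.0, 0.70))   # -> 67.0 degC

# ...reads hotter than a ~240 W card with a big cooler (low theta),
# even though the second card is dumping 4x the heat into the air.
print(die_temp_c(25.0, 240.0, 0.12))  # -> 53.8 degC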

How much did your other temps drop when you took the 9800GT out? I'm guessing probably a couple degrees at most.
 
YamahaAlex37, interesting! Thanks to all the others who have posted. InvisiBill, I haven't checked yet, but I'll look at the difference and get back to you shortly. As for tricky0502, you're right... I am on a single-slot cooler design.

Did the others who posted their 9800GT temps have single-slot cooler designs?
 
My latest temperatures without the extremely hot 9800GT: the CPU is at the same temps, 40°C idle and 60°C load, so removing the card didn't really change the CPU temperature. However, the GPU, which was sitting at about 48-50°C idle, now idles around 40-41°C. The 9800GT is a hot card, but it probably doesn't help that, in order to make it fit, I had to remove the external shroud from the card's heatsink. It may have funneled air through the channels better than with it off. I could put it back on and see what the temperature differences are at some point.
 
Hey! I did the same thing: bought a 9800GT for PhysX about a year ago, then ended up going SLI with two 9800 GTX+ cards. I have to say, the temps you posted really aren't that bad. Also, have you tried a benchmark? Try it with the 9800GT in; it takes a load off your CPU. Run 3DMark Vantage and you'll see your CPU score with the PhysX card (the 9800GT) running is higher than without it. At least mine was: I ran 3DMark without it and the CPU score was low; with it, the score increased dramatically. Since then I've upgraded, got rid of the 9800GT and the 9800 GTX+ cards, and went with DDR3 and two GTX 460s, all water cooled. See ya.
 
Have you tried running the 9800 by itself as the main card (with the 580 disconnected) to see what the temperatures are like?
 
I reached the same conclusion with my 8800GT. It makes the Z68 chipset more appealing for a fourth monitor...
 
Death Princess... I ran that test for you.
With just the 9800GT in the computer and my processor overclocked to 4.0GHz, the 9800GT idles at 63°C and, after about half an hour of GTA IV, reached a max temperature of 80°C.
The CPU sat just fine, as in all cases, idling at 40°C and maxing out at 59°C.

I have put the metal shroud back on the single-slot 9800GT's heatsink, and it helped by a couple of degrees at most.

I think I'll keep the card and plug it in only when I play PhysX games. Is it worth it? Well... is a 6-12 frames-per-second advantage with PhysX worth $50 and adding a component that runs around 70°C to your rig? ---* Source: http://www.evga.com/forums/fb.ashx?m=748830 *----

Shit, maybe not.
 
My old single-slot 9800GT used to run hot: it idled at 60°C and loaded at around 95°C, straight out of the box! During the summer of 2009, the highest I saw it hit with WoW running was around 105-106°C, with a 75°C idle in a 33°C room, and that was freshly reseated and cleaned. When I built my Bloomfield rig and got a GTS 450, I gave the card to my dad, who is still using it without problems. I even reseated the heatsink with new paste again, and it still runs in the high 80s while gaming. I guess some of them are just that way or something. :confused:
 
Why use a dedicated PhysX card? Aren't x86 instructions faster than x87, anyway? Dunno.
 
@qb4ever - It makes me wonder if some cards have different amounts of voltage being fed to them via their BIOSes.
 
@qb4ever - It makes me wonder if some cards have different amounts of voltage being fed to them via their BIOSes.

Mine was a PNY XLR8 9800GT, stock, never OCed. At first I thought maybe the heatsink wasn't flush on the die (a bad mount), but I pulled out a micrometer and both the die and heatsink were perfect, and the paste spread evenly after a few remounts with fresh paste. Earlier tonight before work, I checked the temp on it: it was idling at 71°C with an 18°C ambient room temp o.0 Either these things run hot or the sensor is wrong :confused:
 
I'll tell you what an Nvidia mod once told me in the Nvidia forums: A NVIDIA GPU CAN WITHSTAND TEMPERATURES OF UP TO 600 DEGREES.

Btw, don't ever go to the Nvidia forums asking about temps.
 
I'll tell you what an Nvidia mod once told me in the Nvidia forums: A NVIDIA GPU CAN WITHSTAND TEMPERATURES OF UP TO 600 DEGREES.

Btw, don't ever go to the Nvidia forums asking about temps.

Why? Doesn't everybody use the Rankine scale?
 
I bought one as a companion for my GTX 480. Don't stare at the temps, though. I was looking into flashing it with a BIOS from a model that has separate 2D and 3D clocks; I was surprised mine didn't have them. You know what they say about assumptions.
 
Canada and the US are not everyone.

While we certainly are everybody, I have never heard of anyone using the Rankine scale. Here in the US, we use Fahrenheit (silly us, we should have listened to Thomas Jefferson when he was pushing metric). Rankine is to Fahrenheit as Kelvin is to Celsius (of course, all scientific calculations in the US are done in metric and then converted back to our own silly measurements* afterwards (with the possible exception of Lockheed Martin)).


* I'm quite thankful that the US uses metric electrical units. I'd hate to have to deal with some 14 2/3 Franklins-to-the-Edison conversion or something like that.
 
While we certainly are everybody, I have never heard of anyone using the Rankine scale. Here in the US, we use Fahrenheit (silly us, we should have listened to Thomas Jefferson when he was pushing metric). Rankine is to Fahrenheit as Kelvin is to Celsius (of course, all scientific calculations in the US are done in metric and then converted back to our own silly measurements* afterwards (with the possible exception of Lockheed Martin)).


* I'm quite thankful that the US uses metric electrical units. I'd hate to have to deal with some 14 2/3 Franklins-to-the-Edison conversion or something like that.
Silly us? It's just what everyone was used to; stuff like that would be a big change for everyday people who don't care about things as frivolous as units, and if it really matters, it's not hard to convert: F = C × 9/5 + 32.
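
For instance, a quick sketch in Python (nothing here beyond the standard scale definitions, with the Rankine/Kelvin offsets from above thrown in for good measure):

# Standard temperature scale conversions.
def c_to_f(c):
    return c * 9.0 / 5.0 + 32.0

def c_to_k(c):
    return c + 273.15          # Kelvin is Celsius shifted by 273.15

def f_to_r(f):
    return f + 459.67          # Rankine is to Fahrenheit as Kelvin is to Celsius

print(c_to_f(65))              # 149.0 degF - the OP's 9800GT idle temp
print(c_to_k(65))              # 338.15 K
print(f_to_r(c_to_f(65)))      # 608.67 degR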
 
Try this:

On your desktop, right-click and select NVIDIA Control Panel. Open the PhysX settings and assign PhysX to your CPU.

I got 5fps higher when it was assigned to the CPU.
 
While we certainly are everybody, I have never heard of anyone using the Rankine scale. Here in the US, we use Fahrenheit (silly us, we should have listened to Thomas Jefferson when he was pushing metric). Rankine is to Fahrenheit as Kelvin is to Celsius (of course, all scientific calculations in the US are done in metric and then converted back to our own silly measurements* afterwards (with the possible exception of Lockheed Martin)).


* I'm quite thankful that the US uses metric electrical units. I'd hate to have to deal with some 14 2/3 Franklins-to-the-Edison conversion or something like that.

We use Celsius, say tomato sauce, throw petrol in our cars' tanks, and our rugby players don't wear body armor and helmets.

You're just used to it; that's why. If you were used to the others, then the current ones would seem strange.
 
Try this:

On your desktop, right-click and select NVIDIA Control Panel. Open the PhysX settings and assign PhysX to your CPU.

I got 5fps higher when it was assigned to the CPU.
Do you even have any clue what you're saying? Any decent dedicated GPU can handle PhysX better than the fastest CPU in the world. Heck, my old GTX 260 can handle both graphics and PhysX at the same time better than letting my CPU handle PhysX.
 
I agree: let the 580 do both. You don't really need a dedicated PhysX card, but if you do want one, anything beats a CPU :)
 
A similar situation in this video, where Linus adds an 8600GTS as a dedicated PhysX card in combination with a 580... the 8600GTS holds back the total performance of both... :eek:

http://www.youtube.com/watch?v=cbww3dhzK0M
Yeah, an 8600GTS is too slow to be a dedicated PhysX card if your main card is a GTX 260 or faster. In other words, if you have a GTX 260 or faster, you would do better to have it handle both graphics and PhysX than to try to use an 8600GTS for dedicated PhysX.
 
Silly us? It's just what everyone was used to; stuff like that would be a big change for everyday people who don't care about things as frivolous as units, and if it really matters, it's not hard to convert: F = C × 9/5 + 32.

Sorry to be off-topic, but regardless of what people in general are used to, some ways of expressing numbers simply make more sense than others. The conversion may not be hard, but it's unnecessary. Whatever difficulty people might have adjusting to metric units, they gain back many times over through the efficient conversion between powers of ten. Imperial units don't make any sense at all, and the USA's insistence on using them epitomizes 'because I can', the same 'because I can' that makes other nations the world over laugh at the imbecility that is America in some ways, Imperial units included. It's called getting with the times.
 
Sorry to be off-topic, but regardless of what people in general are used to, some ways of expressing numbers simply make more sense than others. The conversion may not be hard, but it's unnecessary. Whatever difficulty people might have adjusting to metric units, they gain back many times over through the efficient conversion between powers of ten. Imperial units don't make any sense at all, and the USA's insistence on using them epitomizes 'because I can', the same 'because I can' that makes other nations the world over laugh at the imbecility that is America in some ways, Imperial units included. It's called getting with the times.

I agree that the USA should use the metric system, but I don't agree with how you reached that conclusion.
Most likely because I am a thick-headed, moronic American...
GET ERR DUNNNNNNNNN
 
The GTX 580 itself is more than enough for gaming & PhysX :p

Comments like these just drive me insane... how can you say that without being at least a bit more specific?
It depends on what the GTX 580 is doing at the time. If you are pushing all the bells and whistles at higher resolutions, or doing 3D Vision, do you really think the GTX 580 wouldn't be helped by letting a different card deal with the PhysX? A decently fast card, I mean, not an 8600GT or something like that, of course.
It all depends on many other factors, so a blunt statement like that makes no sense.
 
Do you even have any clue what you're saying? Any decent dedicated GPU can handle PhysX better than the fastest CPU in the world. Heck, my old GTX 260 can handle both graphics and PhysX at the same time better than letting my CPU handle PhysX.

Well, in some games I got an increase, and in some I didn't. That's what I said. And I put it out there in case anyone wants to test the difference it makes and whether it's working.
 