Way Too Hot: Goodbye Next Gen

oozish

[H]ard|Gawd
Joined
Jun 27, 2003
Messages
1,465
Sorry guys, I'm just not Hard Enough, I guess, but I just can't do it... I got my 4870 and, after a bunch of install crap, finally got things going; Crysis was nice at 30 fps but still dipped into the low twenties anyway... well, that's beside the point.

The thing is, this card, and I'm sure a 260 or 280 of this gen, runs so freaking hot I could literally feel my already-hard-to-keep-cool room heating up. I mean, it's ungodly hot.

Coming from a passively cooled 9600GT that has served me well and cool (42C idle), I just can't do it. Back to Best Buy with you, 4870; here's wishing for cooler stuff down the line with the die shrink that costs less, runs faster... and cooler.

I exchanged the 4870 for a new Raptor 300 HD, and that thing smokes! You can really tell the difference; I mean, it's faster than solid state, or so I read.

Anyway, just my .02, wrapping up my new vid card adventures.
 
Yeah, if you're going to compare a passively cooled 9600GT to an HD4870, it's going to run hot......

Did you not bother to read any reviews prior to buying?
 
Some previous-gen cards ran at similar temps. I'm throwing an aftermarket fan on mine for noise. I'm not really seeing the issue; it's not pumping THAT much heat into an air-conditioned room, really.
 
You're complaining about the heat of a video card and then go buy a Raptor? :confused:
 
I have the same problem as you bud, my room turns into a sauna with my PC running.
 
You are correct, not [H]ard enough. If you can't stand the heat.......:eek:

If you think the 4870 is hot, try SLI'd 8800 GTXs.;)
 
The 9600GT has a TDP of 95W and the Radeon 4850 is 110W, so it doesn't put out much more heat while producing much better framerates. You could have gone with that if you were worried about heating up your room. Yeah, it runs hot, but that's because of the conservative fan speeds and small cooler; that doesn't affect the amount of heat it dumps into your room. 110W is always 110W.

The 4870 is 160W, which is a little higher than the G92 cards but not that bad compared to G80 or the GTX 260/280. The difference compared to your 9600GT is only a little over 60W. Now, would placing a lamp with a 60W bulb in your room be too [H]ard for you? :)

Here in Sweden, we're currently experiencing the hottest few weeks of the year... This is the time when the stability of any overclock is really put to the test. No one in their right mind here would buy AC; we only get a couple of weeks of really hot weather in the entire year. I bet in a week or two it will be 10-12C and raining again :p
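Just to put numbers on it, here's a quick back-of-envelope in Python using the TDP figures above. TDP is a rough upper bound on sustained power draw, so treat it as ballpark:

```python
# Rough comparison of heat dumped into the room, using the TDP
# figures quoted above (all in watts).
tdps = {"9600GT": 95, "HD4850": 110, "HD4870": 160}

extra_heat = tdps["HD4870"] - tdps["9600GT"]
print(f"HD4870 vs 9600GT: about {extra_heat} W extra into the room")
print(f"That's roughly one {extra_heat} W incandescent bulb left on")
```

Practically every watt a card draws ends up as heat in the room, no matter what cooler is bolted to it.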
 
You're complaining about the heat of a video card and then go buy a Raptor? :confused:
VelociRaptors are surprisingly quiet and efficient. To the OP: solid-state drives have advantages and disadvantages. In certain usage patterns, SSDs are destroyed by just about every modern hard drive on the market; in other patterns, your VR could be slower. SSDs vary widely in performance, too.

I don't think the HD4870 draws that much more power than previous generations did. Certainly your 9600GT drew less total power, but I wouldn't say the HD4870 is inefficient because it draws more. In fact, I'd wager the HD4870's performance per watt is better than the 9600GT's, since it's built on a smaller process. If you want to reduce the amount of heat it generates, you could just severely underclock the card. Or perhaps consider the HD4850, since it operates at a lower voltage.

Or here's a thought: find some way to remove heat from your room. Failing that, you could always move to Canada, where summers are mild and winters are... well, cold, I guess, but you get used to it :p.
 
I went from a 3850 to a 4850 and didn't notice any heat difference in my room.
Still gets hot as shit in here, but no hotter than before.
:)
 
Yeah, if you're going to compare a passively cooled 9600GT to an HD4870, it's going to run hot......

Did you not bother to read any reviews prior to buying?

No, I'm very well versed and have the lowdown on all the video cards ATM. I thought I could deal with the heat; it's not about whether the card performs or not. I think the card can survive up to 100C (and we're close, aren't we?).

I was just wrong. I thought it would be worth it, but honestly this card is a toaster.
 
You're complaining about the heat of a video card and then go buy a Raptor? :confused:

I guess I'll chime in on this comment too... Dude, the Raptor 300 that just came out is freaking sweet. It's not hot; it's running at 40C.

Read that again: 40C.

Cooler than any other hard drive I have. The thing is smoking fast too; it's really a better upgrade for me than a vid card, hehe.

Cheers.
 
I went from a 3850 to a 4850 and didn't notice any heat difference in my room.
Still gets hot as shit in here, but no hotter than before.
:)
I could feel the heat increasing in the room within about 5 minutes... I was like, "No... I can't believe I can tell the difference!"

Ah well... this room gets hot enough anyway; in Texas, you know... 100-plus all week here.
 
Those overclocked quad cores run really cool too.

If you're trying to be sarcastic, I think you're off base here; mine runs in the 40s for the CORE, so it's not any hotter than anything else in my case.

Everything runs in the 40C range: hard drives, CPU, and vid card.
 
The 9600GT has a TDP of 95W and the Radeon 4850 is 110W, so it doesn't put out much more heat while producing much better framerates. You could have gone with that if you were worried about heating up your room. Yeah, it runs hot, but that's because of the conservative fan speeds and small cooler; that doesn't affect the amount of heat it dumps into your room. 110W is always 110W.

The 4870 is 160W, which is a little higher than the G92 cards but not that bad compared to G80 or the GTX 260/280. The difference compared to your 9600GT is only a little over 60W. Now, would placing a lamp with a 60W bulb in your room be too [H]ard for you? :)

Here in Sweden, we're currently experiencing the hottest few weeks of the year... This is the time when the stability of any overclock is really put to the test. No one in their right mind here would buy AC; we only get a couple of weeks of really hot weather in the entire year. I bet in a week or two it will be 10-12C and raining again :p

I hear you, but I don't know if your statements are fair, given my particular 9600GT, which has the Accelero S1 passive cooler on it. My card doesn't go over 52C at load, if that.

Now, I'm complaining about the heat from the 4870; don't trivialize my experience. It was like turning on the oven in the kitchen, not a light bulb, believe me.

Before I had this card I had an 8800GTS 640, so I know about the heat on these things; that idled in the 70C range. I think the 4870 was worse by about 10C.
 
Actually putting a better cooler on your video card would make your room heat up faster.
Less heat on your GPU = More heat coming out the back of the case.
:)

Plus, they haven't fixed the idle clock issue yet, which means the card is running much hotter than it should at idle. There's no reason the 4800s should idle hotter than the 9600GT.
 
I hear you, but I don't know if your statements are fair, given my particular 9600GT, which has the Accelero S1 passive cooler on it. My card doesn't go over 52C at load, if that.
What do you mean you don't know if his statements are fair?

Well, maybe they're not, since the Swedish public school education system is vastly superior to the American one.
And it shows....:eek:
 
Before I had this card I had an 8800GTS 640, so I know about the heat on these things; that idled in the 70C range. I think the 4870 was worse by about 10C.
The old 8800s are 90nm, whereas the 4800s are 55nm. Saying that a 4800 runs hotter than the old G80s (in terms of heating up your room) is a little ridiculous.

This entire discussion is about which hardware will heat your room, so it doesn't necessarily reflect the core temps you're seeing on your hardware. Technically speaking, and with proper PowerPlay support, the 9600GT easily puts off more heat (overall) than the 4800s.
 
I hear you, but I don't know if your statements are fair, given my particular 9600GT, which has the Accelero S1 passive cooler on it. My card doesn't go over 52C at load, if that.

Now, I'm complaining about the heat from the 4870; don't trivialize my experience. It was like turning on the oven in the kitchen, not a light bulb, believe me.

Before I had this card I had an 8800GTS 640, so I know about the heat on these things; that idled in the 70C range. I think the 4870 was worse by about 10C.

Not sure your statements are fair either: you're comparing a 9600GT with an S1 to a stock-cooled 4870.

Still, you know what I'd be doing in your situation, given that the very good cooler on your 9600 also just happens to fit your 4870...
 
Still, you know what I'd be doing in your situation, given that the very good cooler on your 9600 also just happens to fit your 4870...
I think his point is that even with the aftermarket cooler on the 9600, it still keeps his *room* cooler than the shitty stock heatsink on the 4870. Putting that cooler on the 4870 would make his *room* even hotter.
:D
 
I think his point is that even with the aftermarket cooler on the 9600, it still keeps his *room* cooler than the shitty stock heatsink on the 4870. Putting that cooler on the 4870 would make his *room* even hotter.
:D

Well this is true.

Perhaps graphics cards these days should have different captions on the box instead of the standard "blistering framerates" ones.

"1KW of gaming heat"

:D
 
What do you mean you don't know if his statements are fair?

Well, maybe they're not, since the Swedish public school education system is vastly superior to the American one.
And it shows....:eek:

Even still, I have a master's degree, so I still own you, lol.
 
The old 8800s are 90nm, whereas the 4800s are 55nm. Saying that a 4800 runs hotter than the old G80s (in terms of heating up your room) is a little ridiculous.

This entire discussion is about which hardware will heat your room, so it doesn't necessarily reflect the core temps you're seeing on your hardware. Technically speaking, and with proper PowerPlay support, the 9600GT easily puts off more heat (overall) than the 4800s.

This can't be true.
 
Try a thermometer. Honestly, this isn't even worth discussing; it's just plain ridiculous. My 4850 runs at a reasonable 73C, overclocked, at full load with the stock cooler.

Core temperature != dissipated heat.
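To illustrate that point: steady-state core temperature is roughly ambient plus power times the cooler's thermal resistance, so two coolers can show very different core temps while dumping the exact same wattage into the room. The R-theta values below are made up for illustration, not measured from any real cooler:

```python
# Illustrative only: thermal resistance values are assumptions.
def core_temp(ambient_c, power_w, r_theta_c_per_w):
    """Steady-state core temperature: T = T_ambient + P * R_theta."""
    return ambient_c + power_w * r_theta_c_per_w

power = 110  # same GPU, same heat into the room either way
stock = core_temp(25, power, 0.45)        # small stock cooler
aftermarket = core_temp(25, power, 0.25)  # big aftermarket cooler

print(f"stock cooler:       {stock:.1f} C core")
print(f"aftermarket cooler: {aftermarket:.1f} C core")
# Both configurations still dump the same 110 W into the room.
```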
 
Meh. All cards get hot when improperly cooled.

Your goal is to keep the air in the case moving and the air in the room moving. Using a better cooler and keeping your door/window open would help. All the new, faster cards are going to come out running hot, since companies and gamers don't want to wait; eventually they'll be refined into cooler, more power-efficient cards.
 
...Failing that you could always move to Canada where summers are mild and winters are... well cold I guess but you get used to it :p.

What part of Canada are you from? :p In Southern Ontario it gets Uber HOT in the summer and Uber Cold in the Winter :p
 
I think his point is that even with the aftermarket cooler on the 9600, it still keeps his *room* cooler than the shitty stock heatsink on the 4870. Putting that cooler on the 4870 would make his *room* even hotter.
:D

I disagree; the stock cooler isn't sh*t. Mine right now is idling @ 48C and loads @ 65C with the fan running @ 33%. I've had the core clocked up to 865 and the temp still didn't get above 75C.
 
I don't know if your statements are fair, given my particular 9600GT, which has the Accelero S1 passive cooler on it. My card doesn't go over 52C at load, if that.

Regardless of what cooler you have and the temperature your GPU reaches, it still puts out the same amount of heat (95W). That heat doesn't just magically "disappear" into nothing because you have a bigger heatsink. There are some great aftermarket coolers out there, but I've yet to see one that can actually bend the laws of physics.
 
Regardless of what cooler you have and the temperature your GPU reaches, it still puts out the same amount of heat (95W). That heat doesn't just magically "disappear" into nothing because you have a bigger heatsink. There are some great aftermarket coolers out there, but I've yet to see one that can actually bend the laws of physics.

I don't have a thermometer to put on there, but I'm saying those 9600GTs don't get that hot. The 4870 heated up my case and the room quickly; the difference was more than I'm willing to live with. Not to mention, when you try the be-all 'trick' of increasing the fan speed, the thing is loud as crap above 30%, IMHO.

If some of you think I'm 'soft' (LOL) for not wanting to bake in my room in Texas with a 4870 running IDLE at about 80C and up to about 93C at load, that's fine. Look at my sig; my rig kicks ass and I can play any game I like with my current card anyway.

I DID want to try a next-gen card, and all I'm saying is they have some work to do. I'm not willing to use a baking-hot card that will be obsolete in 6 months when NVIDIA comes out with something cooler/faster that has better cooling to boot.

Go ahead and play with your donkey, previous poster; you'd know about pony play, I wouldn't. Don't feel like I'm insulting anyone for buying a current card, that's great; I was just saying the negatives outweigh the positives of having a heater in my room, regardless of what games I could play faster at this time.

Maybe once some decent new games come out that I can't run on my 9600GT, I'll try again and see what's out there... Warhead, anyone?

Anyway, flame away peeps...:D
 
It's not about how hot the card gets but how much heat it pumps out into your room. 160W is equal to four 40W light bulbs. Take four 40W light bulbs, put them under your table by your legs, and you'll know how people with HD4870-equipped PCs under their desks feel.

Heck, even the Voodoo3 2000 ran at 100°C+ at worst, but no way in hell did it pump 160W of heat into your room's air.
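For a sense of scale, here's a rough back-of-envelope for how fast 160W would warm a small room. The room size is an assumption, and it ignores heat escaping through the walls, so real rooms warm slower:

```python
# Back-of-envelope: 160 W dumped into a small sealed room.
room_volume_m3 = 4 * 4 * 2.5          # ~40 m^3 bedroom (assumption)
air_mass_kg = room_volume_m3 * 1.2    # air density ~1.2 kg/m^3
cp_air = 1005                         # J/(kg*K), specific heat of air

power_w = 160
joules_per_hour = power_w * 3600
delta_t_per_hour = joules_per_hour / (air_mass_kg * cp_air)
print(f"~{delta_t_per_hour:.1f} C per hour in a sealed room")
```

Even with leakage through walls and doors, that's easily enough to feel within minutes in a small room.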
 
Ya, but when I touch the HS on my GT it's hardly warm, and the card isn't that warm; I can touch it no problem. I could have burned my hand trying to get the darn 4870 out of my case for the return, and I was afraid it would melt plastic. Maybe I underestimate the amount of heat it puts out normally, but it doesn't bother me too much.

I'm glad I got the VelociRaptor 300 HD instead; it makes a much bigger difference in my day-to-day computing experience than a vid card at this point in time anyway. Highly recommended.
 
My post wasn't aimed at you, but at those couple of guys who said the HD4870's heat output isn't a problem at all because older cards could run at higher temps... They just mixed up heat output and temperatures.
 
I disagree; the stock cooler isn't sh*t. Mine right now is idling @ 48C and loads @ 65C with the fan running @ 33%. I've had the core clocked up to 865 and the temp still didn't get above 75C.
This is a joke, correct?

Do you have the computer in a walk-in freezer? What other mods have you done to get 65C at only 33% fan at load with the stock cooler? What game is it that you're playing?
 