Way Too Hot: Goodbye Next Gen

Yeah, I always spell that one wrong, along with "imminent," but the judges never notice. :)

Done, errors fixed. So funny how a change in profession can give one a newfound appreciation for grammar tyrants.
 
I increased the OC on my card, and now my whole house is hot. I ACTUALLY HAD TO TURN THE A/C DOWN GODDAMN U ATI...

I just re-read some of the comments defending the "my card heats up the room" arguments.





"Please get a life..." - God
 
I increased the OC on my card, and now my whole house is hot. I ACTUALLY HAD TO TURN THE A/C DOWN GODDAMN U ATI...

I just re-read some of the comments defending the "my card heats up the room" arguments.





"Please get a life..." - God
So you actually think that having a video card at 200 degrees Fahrenheit has no effect on heating up a room? That's not even taking into consideration all the other PC-related hardware. I can tell you that, hell yes, a gaming PC with a hot-running gaming card can have a significant bearing on room temperature.
 
I think the part that's difficult to understand is that the TDP of the 9600GT is ~100 watts, while the TDP of the 4870 is ~160 watts (according to a quick Google search, unconfirmed though).

You're suggesting that a 60 watt increase in heat output has managed to overpower your AC...

Well, it's all relative. When someone says 60 watts, it doesn't seem like much, but when you put it into the context of what we're talking about, it's a significant difference. It is, after all, dissipating 60% more heat than the 9600GT, so comparatively speaking, it's a pretty good jump.

Add 60 hp to a 500 hp vehicle and you might be able to tell the difference, but it won't be earth-shattering. Add 60 hp to a 100 hp vehicle and the difference is quite significant.
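
For anyone who wants to see the numbers, here's a quick back-of-the-envelope sketch in Python, assuming the ballpark ~100 W and ~160 W TDP figures quoted above (actual power draw will differ):

# Rough comparison of the extra heat a 4870 dumps into a room versus a 9600GT,
# using the ballpark TDP figures quoted above, not measured power draw.
tdp_9600gt = 100  # watts, approximate
tdp_4870 = 160    # watts, approximate

extra_watts = tdp_4870 - tdp_9600gt
percent_increase = 100 * extra_watts / tdp_9600gt
extra_btu_per_hour = extra_watts * 3.412  # 1 watt is roughly 3.412 BTU/hr

print(f"Extra heat: {extra_watts} W ({percent_increase:.0f}% more than the 9600GT)")
print(f"That is roughly {extra_btu_per_hour:.0f} BTU/hr the AC has to remove on top of everything else.")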
 
Sapphire 4870 @ 800 / 1000

Stock cooler.

Highest load temp I have seen is 76C.

Idles @ 60.

I've done nothing to control the fan, so I assume it's set to auto.

Needless to say the temps are fine and the card is quiet.
 
Sapphire 4870 @ 800 / 1000

Stock cooler.

Highest load temp I have seen is 76C.

Idles @ 60.

I've done nothing to control the fan, so I assume it's set to auto.

Needless to say the temps are fine and the card is quiet.

Dude... whatever... it's bogus, and brownouts will occur REGARDLESS of your temp-savviness.

Perhaps you can put those clocks to better use; instead of wasting power, you can start Folding@home too. :rolleyes::rolleyes::rolleyes:
 
These newer graphics cards need better stock coolers and the ability to step down (like CPUs do) during non-gaming situations.
 
So you actually think that having a video card at 200 degrees Fahrenheit has no effect on heating up a room? That's not even taking into consideration all the other PC-related hardware. I can tell you that, hell yes, a gaming PC with a hot-running gaming card can have a significant bearing on room temperature.

Thanks man, that's all I was trying to say.
 
I increased the OC on my card, and now my whole house is hot. I ACTUALLY HAD TO TURN THE A/C DOWN GODDAMN U ATI...

I just re-read some of the comments defending the "my card heats up the room" arguments.

"Please get a life..." - God

You strike me as a middle-aged, very grumpy man grumbling about things out of his control, or mostly things inconsequential... but grumbling nevertheless, LOL.

Thanks but my life's just fine. :rolleyes:
 
My HD4870 was so damn hot an arc of fire shot out from my card (in the case, mind you) and singed my nut hair.

Needless to say, I am suing AMD for third-degree burns to my satchel.
 
Thanks man, that's all I was trying to say.

That really wasn't a statement you want to stand behind.

The combination of epic fail and logical thought in this thread is making my head hurt.

OP, if your Master's Degree is in anything even remotely technical, I would like to know what school you received it from so I can sue them for some form of educational malpractice.
 
That really wasn't a statement you want to stand behind.

The combination of epic fail and logical thought in this thread is making my head hurt.

OP, if your Master's Degree is in anything even remotely technical, I would like to know what school you received it from so I can sue them for some form of educational malpractice.
Well, I did not know that someone had to have a Master's degree in a specific field to realize 200 degrees was hot. Next time somebody bitches about it being 95 degrees outside, I will question their educational background. :rolleyes:
 
The combination of epic fail and logical thought in this thread is making my head hurt.
If this thread doesn't abuse you enough, try explaining the difference between temperature and windchill to someone.
 
So you actually think that having a video card at 200 degrees Fahrenheit has no effect on heating up a room? That's not even taking into consideration all the other PC-related hardware. I can tell you that, hell yes, a gaming PC with a hot-running gaming card can have a significant bearing on room temperature.

NO. The idle/load temp of the card is COMPLETELY IRRELEVANT. The *only* thing that matters with regard to heating up the room is how much heat it is producing, *NOT* what temperature it is at.

These newer graphics cards need better stock coolers and the ability to step down (like CPUs do) during non-gaming situations.

Video cards have been downclocking themselves when idle for a long time now. And why do you think they need a better stock cooler? From what I've read (as I don't have one myself), the 4850's cooler is very, very quiet and gets the job done. That is pretty much the ideal for 95% of the buyers out there...
 
Hey, I'm reading this post and some of the comments are funny.

Yes, I know this room is freaking hot; I've even had AC technicians increase the size of the duct going to it. It's my office, and I agree that's my first problem, but I still can't believe all the insults about my intelligence over the fact that the room heated up noticeably after I got this card, so I returned it.

Simple.

My plan is to wait until the refresh and some new products hit, watch prices fall, and eventually, if I have to, I'll deal with a hotter card. But there had better be a decent game out there to use the card on, since I don't really need an upgrade anyway and I've actually got this room pretty well balanced.

And to those of you who wished me well on the VelociRaptor, thanks. That is one smokin' HD! I partitioned about 220 GB for programs and 80 for Vista, got all my pics, music, etc. on the second partition, and it's really nice.


What really gets me here, Ooz, is that I have a 4870 running in an open room without AC, in a PC with 3 hard drives and a 24-inch LCD monitor. My PC runs only when I use it. When I'm not at the desk, I turn it off, and yes, it gets hot in here, especially when it's really hot outside, but never to the point where I would feel the need to return my card, LOL. That's the last thing I'd do. I would prefer to find alternatives rather than do that, God help me. This card is absolutely stunning. I've never experienced this kind of gaming before, and I'd rather add a fan, open a window, or even add AC before returning it for something inferior performance-wise.

So you see... most of the reactions you got here are normal, considering that most here are PC enthusiasts who have had to deal with heat before and will most certainly continue to do so as long as their gaming experiences improve with their overclocks, GPU upgrades, etc. Actually, you're the very first person I have ever heard of who has posted on a PC enthusiast forum that they returned a perfectly good working card, especially one with rave reviews, because it was overheating their air-conditioned room. It's kinda silly, don't you think?
 
NO. The idle/load temp of the card is COMPLETELY IRRELEVANT. The *only* thing that matters with regard to heating up the room is how much heat it is producing, *NOT* what temperature it is at.
And also how much heat it is dumping OUTSIDE the case. Slot blower heatsinks push all of the hot air out of the case, while single slot blowers or downward-blowing heatsinks keep a large portion of the hot air inside the case.

Still... a single video card should not produce so much heat that your AC can't keep up and you have to return the card... :(
 
And also how much heat it is dumping OUTSIDE the case. Slot blower heatsinks push all of the hot air out of the case, while single slot blowers or downward-blowing heatsinks keep a large portion of the hot air inside the case.
Not going to make much difference. The case will quickly reach equilibrium; if it's a few degrees hotter inside with a different cooler, that heat still ends up getting radiated into the room. Where else is it going to go?
 
Still... a single video card should not produce so much heat that your AC can't keep up and you have to return the card... :(

QFT.

I have never experienced such a radical change in room temperature due to a video card. Frankly, I have a hard time believing that.
 
And also how much heat it is dumping OUTSIDE the case. Slot blower heatsinks push all of the hot air out of the case, while single slot blowers or downward-blowing heatsinks keep a large portion of the hot air inside the case.

Still... a single video card should not produce so much heat that your AC can't keep up and you have to return the card... :(

Unless his AC was close to max capacity for his office. From what I understand, his AC outputs to his entire house through ducts, and apparently his office is too much for it. So the problem isn't the card, it's the AC unit that can't deliver. Usually, when you have one of those AC units installed, you get one that has more than enough power to cool the entire house plus a few more rooms.
 
QFT.

I have never experienced such a radical change in room temperature due to a video card. Frankly, I have a hard time believing that.

Aye, there must be something else there we don't know about. I had an X1900 XTX that blew significant amounts of hot air from the back of my PC, and it never became intolerable in my room. Even now with my 4870, I don't feel a change, but then again, I'm not in a small closed room without any windows or a way for the heat to get cooled off by a fan or something.
 
You strike me as a middle aged, very grumpy man grumbling about things out of his control or most things inconsequential.... but grumbling nevertheless, LOL.

Thanks but my life's just fine. :rolleyes:

Nope, 25, thanks. Sidestepping the ridiculousness of the issue once again, I see. I never post anything inflammatory or whatever, but the stupid little justifications that nerds make on here for their decisions never cease to amaze me.

oozish said:
I could feel the heat increasing in the room within about 5 minutes...I was like, "no....I can't believe I can tell the difference!"

Ah well... this room gets hot enough anyway, in Texas you know... 100-plus all week here.

:rolleyes::rolleyes::rolleyes::rolleyes::rolleyes:
 
Heat is the transfer of energy. Temperature, however, is just a relative measurement and is not necessarily related to the absolute amount of energy transferred. Does anyone believe that a single candle would ever heat up a room? No, because even though the candle is very "hot," the absolute amount of energy a candle can transfer is not enough to "heat up" a room. Only if that room were a closed system with no AC would it eventually begin to heat up, and then only after a very, very long time, assuming the candle never burns out. However, consider sunlight: the temperature felt from sunlight is "cooler" than that of a candle, but the total amount of energy transferred by sunlight through the window of a room is far greater than that of a single candle.

A single video card, no matter how "hot," does not transfer enough energy to heat up a room. I can't believe we have had so many people defending such a silly proposition, especially after the OP stated that the room itself is also air-conditioned.
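
To put some rough numbers on the candle/sunlight comparison, here's a small sketch in Python; every wattage below is a ballpark assumption for illustration, not a measurement:

# Compare heat sources by the power they deliver to the room, not by how "hot" they feel.
# All figures are rough assumptions for illustration only.
candle_watts = 80            # a single candle releases very roughly 80 W
gpu_load_watts = 160         # a 4870-class card under load, per the TDP quoted earlier
sunlight_watts = 700 * 1.5   # ~700 W/m^2 of sun through roughly 1.5 m^2 of window

sources = [
    ("single candle", candle_watts),
    ("GPU under load", gpu_load_watts),
    ("sunlight through a window", sunlight_watts),
]
for name, watts in sources:
    print(f"{name}: ~{watts:.0f} W delivered to the room")

The candle's flame is far hotter than any of these, yet its total power is the smallest, which is the point: temperature alone tells you nothing about how much energy is being delivered.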
 
Honestly, some of you have never taken physics a day in your life...
 
The average human, when sitting, outputs between 70 and 130 watts of heat. Are you saying that you don't allow visitors in your room because they make it unbearably hot?
(source: http://ergo.human.cornell.edu/studentdownloads/DEA350notes/Thermal/thcondnotes.html)


Second, heat dissipation: the temperature of an object has nothing to do with how much heat the object is putting into your room. The difference in temperature between two adjacent materials determines how quickly heat will transfer between them, but nothing more. If you fire up your video card, it starts at room temperature and begins to heat up. Unless you're rich, or you've got a card that doesn't generate much heat, your cooler will not be able to keep your card at room temperature, so your card will begin to heat up. As it heats, your cooler will be able to dissipate more and more heat from the card, because the temperature difference between the card and the surrounding air will increase, thus improving heat transfer. Eventually, the card will reach a temperature where heat transfer has increased to the point where your cooler is dissipating as much heat as the card produces. A card that was putting out 100 watts of heat with abysmal cooling (e.g., a passive, crappy heatsink) could easily heat up to 100C+, whereas a card with 200 watts of heat could be cooled to 45C with a good water-cooling setup. The 200-watt card is still obviously going to make your room warmer than the 100-watt card, even though it runs at a lower temperature.


Slightly more in-depth science: heat is a measure of energy. An object's temperature only tells you how hot it is at that moment; it doesn't tell you how much thermal energy is flowing out of it. There's no way of knowing how much heat is being dissipated from an object just by knowing its temperature. Watts, on the other hand, are a measure of power. When an object is dissipating some number of watts, it's generating a constant stream of energy that will heat up itself and everything around it.


Or, in simpler terms, when you're buying a space heater, do you check what temperature the coils heat up to, or how much heat (watts, or btus) gets put out?
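
The equilibrium idea above can be sketched with a simple lumped model: the card warms up until the cooler removes heat exactly as fast as the chip produces it, so T_card ≈ T_room + P / G, where G is how many watts the cooler can move per degree of temperature difference. The G values below are invented purely to illustrate the 100 W vs. 200 W comparison:

# Lumped steady-state model: the card settles at the temperature where
# heat removed by the cooler equals heat produced by the chip.
# T_card = T_room + P / G, with G in watts per kelvin (values invented for illustration).
T_ROOM_C = 25.0

def equilibrium_temp_c(power_watts, conductance_w_per_k):
    """Steady-state card temperature for a given heat output and cooler conductance."""
    return T_ROOM_C + power_watts / conductance_w_per_k

hot_but_low_power = equilibrium_temp_c(100, 1.2)      # 100 W card, weak passive heatsink
cool_but_high_power = equilibrium_temp_c(200, 10.0)   # 200 W card, strong water loop

print(f"100 W card on a poor cooler:  ~{hot_but_low_power:.0f} C")    # runs very hot
print(f"200 W card on water cooling:  ~{cool_but_high_power:.0f} C")  # runs cool, yet dumps twice the heat into the room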
 
Not to mention, cards like the 4850 and 8800GT are only hot because the heatsinks are small and the fans don't spin up. They don't actually output that much heat at all; it's just that the small heatsinks have to run hot to dissipate the modest amount of heat those GPUs put out. Just because the 4850 heats up to over 90 degrees on the stock cooler doesn't mean it's a space heater like the 8800 Ultra.
 
QFT.

I have never experienced such a radical change in room temperature due to a video card. Frankly, I have a hard time believing that.

I don't believe the OP at all. My card runs at 38C idle and 75C at load with Crysis on, and that's with the fan at 40% too; I'm not even pushing the card that much. When I installed two, I still didn't notice anything.

Then he goes on to explain that certain comparisons are not fair to him or trivialize his "problem," especially when a next-gen high-end card is compared to a LAST-GEN mid-range card WITH AN AFTERMARKET COOLER ON IT. So with that in mind, I don't believe the OP ever purchased the card.
 
Looks that way... seems to me the only RELEVANT temps should be taken at the system exhaust...

Not even then, because who knows how your machine is exhausting the heat. The only numbers that matter are the power going into the box and the temps of the devices inside. Beyond that, I really don't give a shit how the heat comes out, I just want it out. ;)
 
Heat is the energy flow from objects or systems of higher temperature to those of lower temperature.

Temperature is a measure of how hot an object currently is, not of how much energy it is transferring.

The card constantly generates heat flow into the heatsink, which then transfers the heat energy to the air, which gets blown outside the case.

Can this raise the air temperature of your room? Well, yes it can: if the GPU generates more thermal energy per second than the room loses to the outside world per second, the average amount of thermal energy in the room increases.

If you have the windows and door closed and a well-insulated house, it's only a matter of time before you feel the effect of something electronic heating up the room. If you have the windows open and a breeze blowing through, the room's temperature will likely tend towards the temperature of the air outside the house. With the AC turned on, which artificially cools the air, the AC will have to do more work to cool a room that has a video card dumping additional thermal energy into it.

As mentioned before, it doesn't really matter what temperature the chip is currently running at; it's generating the same amount of thermal energy per second as anyone else's card (of the same make, under the same load).

The only time the card isn't transferring the same amount of thermal energy to the room is the time it takes for the chip to reach its equilibrium temperature (max temperature) when you go from idle to load. That's only a few seconds; during this time the thermal energy is being used to raise the temperature of the card itself, so there's less dissipated into the room (during that time only).

Once the heatsink of the card has reached a steady temperature (say 90 degrees or 45 degrees), the system is outputting a set amount of thermal energy per second. If it were less for anyone here, that energy would need to go somewhere, and basically your chip would keep rising in temperature and explode. So unless you've actually melted or blown up your own kit, it's safe to say it's outputting the same amount of thermal energy per second.
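
As a crude illustration of the "closed, well-insulated room" case above, here is an upper-bound estimate of how fast a constant heat input warms the air alone; the room size and PC power are made-up assumptions, and real rooms warm far more slowly because walls, furniture, leakage, and the AC all absorb or remove heat:

# Upper-bound estimate: how fast would a sealed room's air warm up from a
# constant heat input? Ignores walls, furniture, air leakage, and AC, all of
# which soak up or remove heat, so this is a worst case, not a prediction.
room_volume_m3 = 4.0 * 4.0 * 2.5   # assumed 4 m x 4 m office with a 2.5 m ceiling
air_density_kg_m3 = 1.2
air_specific_heat = 1005.0          # J/(kg*K)
heat_input_watts = 300.0            # assumed whole gaming PC under load

air_mass_kg = room_volume_m3 * air_density_kg_m3
degrees_per_hour = heat_input_watts * 3600.0 / (air_mass_kg * air_specific_heat)
print(f"Worst case: about {degrees_per_hour:.0f} C per hour of air-only warming")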
 
The average human, when sitting, outputs between 70 and 130 watts of heat. Are you saying that you don't allow visitors in your room because they make it unbearably hot?
(source: http://ergo.human.cornell.edu/studentdownloads/DEA350notes/Thermal/thcondnotes.html)


Second, heat dissipation: the temperature of an object has nothing to do with how much heat the object is putting into your room. The difference in temperature between two adjacent materials determines how quickly heat will transfer between them, but nothing more. If you fire up your video card, it starts at room temperature and begins to heat up. Unless you're rich, or you've got a card that doesn't generate much heat, your cooler will not be able to keep your card at room temperature, so your card will begin to heat up. As it heats, your cooler will be able to dissipate more and more heat from the card, because the temperature difference between the card and the surrounding air will increase, thus improving heat transfer. Eventually, the card will reach a temperature where heat transfer has increased to the point where your cooler is dissipating as much heat as the card produces. A card that was putting out 100 watts of heat with abysmal cooling (e.g., a passive, crappy heatsink) could easily heat up to 100C+, whereas a card with 200 watts of heat could be cooled to 45C with a good water-cooling setup. The 200-watt card is still obviously going to make your room warmer than the 100-watt card, even though it runs at a lower temperature.


Slightly more in-depth science: heat is a measure of energy. An object's temperature only tells you how hot it is at that moment; it doesn't tell you how much thermal energy is flowing out of it. There's no way of knowing how much heat is being dissipated from an object just by knowing its temperature. Watts, on the other hand, are a measure of power. When an object is dissipating some number of watts, it's generating a constant stream of energy that will heat up itself and everything around it.


Or, in simpler terms, when you're buying a space heater, do you check what temperature the coils heat up to, or how much heat (watts, or btus) gets put out?

This is a very good summary of the issues. I was wondering how an extra 80 W would heat up a room so fast, since the previous video card was a 9600GT (assuming the 9600GT still puts out about 70 W; the 9600GT's TDP is actually 95 W). You must be really close to thermal equilibrium in your room, and yes, having an extra person in your room would have the same effect as, or more of an effect than, upgrading from a 9600GT. I leave you to draw your own conclusions.
 
It's my office, and I agree that's my first problem, but I still can't believe all the insults about my intelligence over the fact that the room heated up noticeably after I got this card, so I returned it.

Dude... have you not read one other thread on this forum prior to posting this bait? Almost all of the 107,599 users on this forum are waiting to pounce on threads like the one you started...

My plan is to wait until the refresh and some new products hit, watch prices fall, and eventually, if I have to, I'll deal with a hotter card. But there had better be a decent game out there to use the card on, since I don't really need an upgrade anyway and I've actually got this room pretty well balanced.

You are going to have to stay in the mid-to-low range of cards to keep the temperature in your office at an acceptable level. It doesn't matter who the manufacturer is; high-end cards = heat.
 
My 4870 heats up my room a little bit, but that's fine. That means I'm ready for winter. :p

On a serious note, my CPU temp rose 5C, from 31C, when I installed my 4870. During gaming it's about the same temp. My NB stays the same, but that could be because I have a fan on the back of the NB.

I have a true 120 on my Q6600.
 
My 8800 Ultra heats up my room like crazy, but I've found ways around it. First off, the simple solution:

OPEN YOUR FUCKING DOOR TO THE REST OF THE HOUSE :p

Second solution (and vastly more expensive): water-cool everything you can and use a low-RPM fan setup on the heater core. Otherwise you're pretty screwed. Needless to say, I love winter; I'm sick of it being 100+ outside and my room being 87. I used to have my 360, PS3, and Wii in my room as well, but I just couldn't take the heat being generated by all of it, so I moved them out into the family room.

Hell, I even have my bathroom fan going almost constantly to help pull out the idle heat, and it barely makes a dent. The rest of my house is a frosty 69 degrees... damn PC.
 
Your second solution is no solution at all. Water cooling does not get rid of heat; it just dissipates/transfers it more efficiently. You're not contributing any less heat to the room.
 
I bought this card AGAIN after returning it... to the one who said I didn't buy it in the first place, LOL, you are a moron with serious conspiracy issues. Watch out for UFOs at night too. This card gets to 103C running ATITool, lol. Also, just playing TF2 I've seen it get up to 99C; it floats around in the high 90s.

The stock cooling on this is not acceptable. I have good airflow in my case; my 9600GT (yes, it had an aftermarket S1, but still) was at 42C, so it's not impossible to get good temps in there.

So if I keep this 4870, I have to get an aftermarket cooling option. Why didn't ATI say something like that in the fine print: "great performance, but the price is actually 40 dollars more and an hour of your time to install"?

Thanks to those of you who are electrical engineers and actually tried to teach something in this thread about heat dissipation, etc. I have read every post and I appreciate that. I know I have a lot to learn about thermodynamics from reading your posts.

I'm going to buy a 260 today, see what sorts of temps we're looking at at idle and load in comparison, and weigh the two against each other. Another weird thing, regarding drivers or something: I can't run Crysis at 1920x1200 at all. It makes the resolution go screwy, like a thin column down my screen, and you miss the whole right-hand side of where the window should be. I can only run Crysis at 1600x1050; what, do the hotfix drivers not support 1920x1200 yet?

And to you guys who want to yell in caps and swear, like I need someone to tell me to open the door, LOL... Godmachine, feel tough now? That's a pathetic post; if you're trying to be funny, I didn't get it. You're just coming across like a jerk.

"Oh, I see, I should open my door! YES! That fixed it!" haha. Or, yes, just water cool everything...another bright idea. LOL
 
That seems a little too hot. In the [H] review (http://www.hardocp.com/article.html?art=MTUyNCw4LCxoZW50aHVzaWFzdA==), the 4870 reached 84C under load, while the 8800GT reached 90C and the GTX 260 also hit 84C.

The stock cooler on the 4870 is pretty good, actually. The problem (if there is a problem) is that ATI has set the fan speeds very low, trading low temperatures for low noise levels. If you turn up the fan speed a bit, the GPU will run much cooler and the card won't be much louder.
 