Way too Hot: Good Bye Next Gen

Huh?!

No, that's wrong... The total heat generated per second is the same no matter what is cooling it. The water may act as a buffer between the heat from the card and the room, the same way a heatsink does. However, overall the water is going to equalise with the new ambient temperature of the room, which will be higher than the original ambient temperature because thermal energy has been added overall.

L2Physics

Actually, he's correct. Water can absorb more heat energy before warming up than other substances such as metal or air. So the video card puts out a certain amount of thermal energy, which is absorbed by the heatsinks/water/air, and the water will heat up less than the heatsinks/air because it has a higher heat capacity.

If you had 100 ml of water at 5 degrees and put it in a room which was at 50 degrees, once everything balanced out the room would be cooler than if you had, for example, 100 ml of metal at 5 degrees. That's because the water requires more thermal energy to heat up than the metal does.

Remember, thermal energy isn't a measure of temperature; temperature is a result of thermal energy. Thermal energy itself is measured in joules, and the temperature depends on the heat capacity of the material it's going into.
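To put rough numbers on that 100 ml example (the specific heats and densities below are standard textbook figures; everything else is just the scenario above), Q = m·c·ΔT gives:

```python
# Heat needed to warm 100 ml of water vs 100 ml of aluminium
# from 5 C to 50 C, using Q = m * c * dT.
c_water = 4.186    # J/(g*K), specific heat of water
c_alu   = 0.897    # J/(g*K), specific heat of aluminium
rho_water = 1.00   # g/ml
rho_alu   = 2.70   # g/ml
volume = 100.0     # ml
dT = 45.0          # K (5 C -> 50 C)

q_water = volume * rho_water * c_water * dT   # ~18,837 J
q_alu   = volume * rho_alu   * c_alu   * dT   # ~10,899 J
print(q_water, q_alu)
```

So even though 100 ml of aluminium is 2.7 times heavier than 100 ml of water, the water still soaks up nearly twice the energy for the same temperature rise.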

However, that assumes the system is entirely closed, blah blah blah, and as another poster mentioned, the thermal energy put out by the video card is continually being renewed, so in reality the difference is probably sweet fuck all.

If the video card suddenly heated up to 200 degrees and then you turned off the computer, the difference between water-cooled and non-water-cooled would probably be measurable. But given that the card is constantly putting out thermal energy, that would more than likely overwhelm any effects of the water. At least I think so; too tired to try to do any maths on it right now.
 
Once you've reached steady state the thermal capacitance of the materials involved is no longer an issue.
 
Hmm, that makes sense, but I'd still like to think about it more before coming to that conclusion. The water COULD retain more heat energy in itself, so that at steady state the water holds more thermal energy than if no water were there, allowing the room itself to hold less thermal energy while still dissipating at a constant rate. Although even in my head that doesn't make sense... To be honest, I'm thinking of it kind of like an electric circuit, where the higher-resistance part of a series circuit "absorbs" more of the voltage that's across the entire circuit. Though I'm not sure whether that translates the same way into a "thermal circuit".

I say COULD because I haven't thought about it too in depth, and it's currently 4:22am and I have lectures starting at 9am tomorrow, so good night! If you prove me wrong I won't be offended, because I'm too tired to care. :p

EDIT: Also, that's why I said his reasoning assumes a closed system: if the video card were outputting constant thermal energy in a closed system, the water would make a difference... however, in such a system the temperature would keep increasing without bound.
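Actually, the electric-circuit analogy does carry over: temperature difference plays the role of voltage, heat flow plays current, and ΔT = P·R_th is the thermal Ohm's law. A sketch with made-up numbers (every value here is an assumption, purely to show the shape of the calculation):

```python
# Thermal "Ohm's law": deltaT = P * R_th, analogous to V = I * R.
# All numbers below are made up for illustration only.
P = 150.0              # W, heat output of the card
R_die_to_cooler = 0.1  # K/W, die -> heatsink or water block
R_cooler_to_air = 0.3  # K/W, heatsink/radiator -> room air
T_room = 25.0          # C

# At steady state heat capacity drops out entirely, just like a
# capacitor in a DC circuit: only the resistances set the temperatures.
T_chip = T_room + P * (R_die_to_cooler + R_cooler_to_air)
print(T_chip)  # 85.0
```

That's also why the thermal capacitance stops mattering once everything has warmed up: it's the capacitor in the circuit, and at DC steady state no current flows through a capacitor.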
 
The heat from the water still gets transferred to the room via the radiator. Once you reach equilibrium the water starts dissipating the same amount of heat as it's taking in from the GPU, which is the same amount of heat transfer that a normal heatsink would do.
 
1) I skimmed it...
2) I read that... but actually, it does change the amount of heat added to the room, albeit minimally. Because the water acts as a reservoir for thermal energy, it holds excess heat: heat that doesn't transfer to the room. (e.g. a cool glass of water in a room will cool the room slightly by absorbing heat into itself)

Of course, for practical purposes, he won't feel much of a difference, but it's there. lol
That same thermal energy in the water (or more) could be used to keep the GPU at 90C load instead of the 45C or whatever it would be with water cooling. So technically it could also be the same because it's all a giant heatsink :p.
 
The heat from the water still gets transferred to the room via the radiator. Once you reach equilibrium the water starts dissipating the same amount of heat as it's taking in from the GPU, which is the same amount of heat transfer that a normal heatsink would do.

At least someone here understands first year thermodynamics. :)
 
First year thermodynamics? I never took such a course. This is pure logic if you just attempt to think about it...logically.
 

Nothing I said was incorrect; the total thermal energy output is the same. Assuming the water in the reservoir is at room temperature when the system starts (which is a reasonable assumption), the water is going to rise in temperature and pass thermal energy to the rest of the room. It takes more thermal energy to raise the temperature of water compared to something like the metal of a heatsink, which is why I said it acts like a buffer for the heat.

But the statements I made earlier in the thread about the thermal output of the system specifically said that this was after the heatsink had reached equilibrium temperature-wise, where it's stopped heating up. At that point you're passing the same amount of thermal energy to the room as anyone else with the same card.

Remember, the opposite is also true of water: when the water passes thermal energy back to the room, it passes a larger amount of thermal energy back than, say, a metal heatsink dropping by the same number of degrees.

When a system is running for a significant amount of time, the time it takes to heat the cooling system to equilibrium is insignificant; for a HSF it's much less than a minute. All thermal output after that is identical.
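That transient can be sketched with a crude lumped model (the transfer coefficient and heat capacities below are guesses for illustration, not measured values). The water loop lags behind early on, but over a long session both coolers hand essentially all of the card's output to the room:

```python
# Crude lumped model: the card dumps a constant 150 W into a cooler,
# which leaks heat to the room in proportion to its excess temperature.
# All coefficients here are illustrative guesses.
P = 150.0      # W from the GPU
k = 2.5        # W/K, cooler-to-room transfer coefficient
T_room = 25.0  # C
dt = 1.0       # s, integration step

def heat_delivered(C, seconds):
    """Euler-integrate cooler temperature; return joules passed to the room."""
    T = T_room
    delivered = 0.0
    for _ in range(int(seconds / dt)):
        out = k * (T - T_room)    # W flowing into the room right now
        T += (P - out) * dt / C   # the cooler stores whatever it keeps
        delivered += out * dt
    return delivered

# Small aluminium heatsink (~500 J/K) vs a litre of water (~4186 J/K),
# over an eight-hour session:
hsf = heat_delivered(500.0, 8 * 3600)
loop = heat_delivered(4186.0, 8 * 3600)
print(hsf, loop, P * 8 * 3600)  # the water lags, but both end up close
                                # to the 4.32 MJ the card emitted
```

The only permanent difference is the one-off energy parked in the warm water, which stops growing as soon as the loop reaches its equilibrium temperature.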
 

Hahaha you're right. Thanks for destroying my argument.
 
Powercolor HD48x0 with 1 Gigabyte VRAM

"Powercolor releases the first HD 4870 (OC) with 1 GiByte
Jul 18, 2008 16:13

...Powercolor is the first producer to take advantage of the situation and releases an overclocked HD 4870 with one gigabyte of video memory and its own cooling system..."

http://www.pcghx.com/aid,652504/News/Powercolor_releases_the_first_HD_4870_OC_with_1_GiByte/


PowerColor Radeon HD 4870 PCS OC Emerges

Saturday, July 19 2008

http://www.techpowerup.com/66241/PowerColor_Radeon_HD_4870_PCS_OC_Emerges.html
 
When a system is running for a significant amount of time, the time it takes to heat the cooling system to equilibrium is insignificant; for a HSF it's much less than a minute. All thermal output after that is identical.
Maybe the 9600GT has a ridiculously massive heat capacity (e.g. twice that of the entire room) and ridiculously low thermal conductivity (e.g. 0.001 W/(m·K)). This would cause it to hoard energy, and the room would heat up more slowly.
 

That's exactly what I was thinking :eek: - SnailSink technology (like that used on the secret NASA space program, for transdimensional travel). The heat transfer is unidirectional: it soaks up large amounts of heat but releases it at an average snail's walking pace.
 
Not going to indulge in what-ifs; we're talking about real-world circumstances here.
 

In the real world you can't tell the difference in the heat of a room between a 260, a 280, or a 4870. You will not be able to tell the difference between water cooling and air cooling.

Hell, turning on a single 100-watt light bulb will make more of a difference than the difference between those cards.
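The arithmetic backs that up. Using commonly quoted board-power figures for those cards (approximate; check the actual spec sheets), and remembering that every watt a card draws ends up as heat in the room eventually:

```python
# Rough comparison of extra room heat between the cards vs one bulb,
# using commonly quoted board-power figures (approximate values).
board_power = {"GTX 260": 182, "GTX 280": 236, "HD 4870": 160}  # watts
bulb = 100  # a plain 100 W incandescent bulb

spread = max(board_power.values()) - min(board_power.values())
print(spread, bulb)  # the gap between the cards is smaller than the bulb
```

Even the worst-case gap between those cards is less than what one old-fashioned light bulb adds to the room.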
 
I would like to thank everyone here for participating in this thread; dear god, I felt like I was back in high school.

I've been ROFLing for almost 30 minutes reading eleven!!!! pages of absolute technical BS. Thank you guys! I really needed a laugh to forget my worries.

Would now be a bad time to ask a moderator to close this amazing thread?
 
So I just got my 4850, and DEAR GOD MY COMPUTER EXPLODED FROM THE HEAT!!!! Seriously guys, it was like 80C, omgz! I almost burnt myself installing it, it was so hot!

:p :rolleyes:
 