Better yet...op should get a day++++ ban....for starting a pointless thread about how a 4870 heats up his room more than a 280 and runs at higher temps.
Banned for stupidity.
Better yet...op should get a day++++ ban....for starting a pointless thread about how a 4870 heats up his room more than a 280 and runs at higher temps.
Huh?!
No, that's wrong... The total heat generated per second is the same no matter what is cooling it. The water may act as a buffer between the heat from the card and the room, the same way a heatsink does. However, overall the water is going to settle at the new ambient temperature of the room, which will be higher than the original ambient temperature because thermal energy has been added.
L2Physics
Actually he's correct. The water can absorb more heat energy before heating up than other substances such as metal or air. So the video card is putting out a certain amount of thermal energy, this is absorbed by the heatsinks/water/air, and the water will heat up less than the heatsinks/air because it has a higher heat capacity.
If you had 100ml of water at 5 degrees and put it in a room which was 50 degrees, once everything balanced out the room would be cooler than if you had, for example, 100ml of metal at 5 degrees. That's because the water requires more thermal energy to heat up compared to the metal.
Remember, thermal energy isn't a measure of temperature; the temperature is a result of thermal energy. Thermal energy itself is measured in joules, and the resulting temperature depends on the heat capacity of the material it's going into.
However that assumes the system is entirely closed, blah blah blah, and as another poster mentioned the thermal energy put out by the video card is continually being renewed, so in reality the difference is probably sweet fuck all.
If the video card just heated up to 200 degrees suddenly and then you turned off the computer, the difference between water cooled and non-water cooled would probably be measurable. But given the card is constantly outputting a certain thermal energy, it'd more than likely overwhelm any effects of the water. At least I think so, too tired to try to do any maths on it right now.
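The heat-capacity point above can be sanity-checked with a quick sketch. This is a rough comparison, not anyone's actual rig: the energy amount and the 100 ml figure are just illustrative, and the specific heats are standard textbook values.

```python
# How much does 100 ml of water vs. an equal mass of aluminum warm up
# when each absorbs the same amount of thermal energy?
# Uses Q = m * c * dT, rearranged to dT = Q / (m * c).

ENERGY_J = 10_000   # 10 kJ of heat dumped into each sample (arbitrary)
MASS_KG = 0.1       # 100 ml of water is about 0.1 kg

C_WATER = 4186      # specific heat of water, J/(kg*K)
C_ALUMINUM = 897    # specific heat of aluminum, J/(kg*K)

def temp_rise(energy_j, mass_kg, specific_heat):
    """Temperature rise of a lump of material absorbing energy_j joules."""
    return energy_j / (mass_kg * specific_heat)

dt_water = temp_rise(ENERGY_J, MASS_KG, C_WATER)
dt_alu = temp_rise(ENERGY_J, MASS_KG, C_ALUMINUM)

print(f"water:    +{dt_water:.1f} C")   # ~24 C
print(f"aluminum: +{dt_alu:.1f} C")     # ~111 C
```

Same joules in, much smaller temperature rise for the water, which is exactly the "buffer" behaviour being argued about.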
That same thermal energy in the water (or more) could be used to keep the GPU at 90C load instead of the 45C or whatever it would be with water cooling. So technically it could also be the same because it's all a giant heatsink.
1) I skimmed it...
2) I read that... but actually, it does change the amount of heat added to the room, albeit minimally. Because the water acts as a reservoir for thermal energy, it holds excess heat: heat that doesn't transfer to the room. (i.e. a cool glass of water in a room will cool the room by absorbing heat to itself)
Of course, for practical purposes, he won't feel much of a difference, but it's there. lol
The heat from the water still gets transferred to the room via the radiator. Once you reach equilibrium the water starts dissipating the same amount of heat as it's taking in from the GPU, which is the same amount of heat transfer that a normal heatsink would do.
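The equilibrium argument above can be shown with a toy lumped model. Every number here is assumed for illustration (the card's heat output, the coolant mass, and the radiator's conductance to the room are all made up); the point is only that once the loop stops warming, heat out equals heat in.

```python
# Toy model: a water loop absorbs constant GPU power and sheds heat to
# the room in proportion to how far it sits above room temperature.

GPU_POWER_W = 150.0       # heat the card dumps into the loop (assumed)
ROOM_TEMP_C = 22.0
LOOP_MASS_KG = 1.0        # ~1 litre of coolant (assumed)
C_WATER = 4186.0          # J/(kg*K)
RADIATOR_W_PER_K = 15.0   # radiator conductance to the room (assumed)

temp = ROOM_TEMP_C        # loop starts at room temperature
dt = 1.0                  # one-second time steps
for second in range(36_000):  # simulate 10 hours
    heat_out = RADIATOR_W_PER_K * (temp - ROOM_TEMP_C)
    temp += (GPU_POWER_W - heat_out) * dt / (LOOP_MASS_KG * C_WATER)

final_heat_out = RADIATOR_W_PER_K * (temp - ROOM_TEMP_C)
print(f"loop temp: {temp:.1f} C, heat to room: {final_heat_out:.1f} W")
# heat_out converges to GPU_POWER_W: the room receives the full 150 W
```

The loop levels off a fixed amount above ambient and from then on passes every watt the GPU produces straight into the room, same as an air cooler would.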
Nothing I said was incorrect; the total thermal energy output is the same. Assuming the water in the reservoir is at room temperature when the system starts (which is a reasonable assumption), the water is going to rise in temperature and pass thermal energy to the rest of the room. It takes more thermal energy to raise the temperature of water compared to something like the metal of a heatsink, which is why I said it acts like a buffer for the heat.
But the statements I made earlier in the thread about the thermal output of the system specifically stated that this was after the heatsink had reached equilibrium temperature-wise, where it's stopped heating up. At this point you're passing an equal amount of thermal energy to the room as anyone else with the same card.
Remember the opposite is also true of water: when the water passes thermal energy back to the room, it passes a larger amount back than, say, a metal heatsink dropping by the same temperature.
When a system is running for a significant amount of time, the time it takes to heat the cooling system to equilibrium is insignificant; for a HSF it's much less than 1 minute, and all thermal output after that is identical.
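A back-of-envelope check of that "insignificant" claim, with assumed masses and an assumed temperature rise: compare the one-off energy each cooler soaks up while warming to its working temperature against what the card emits over a single hour.

```python
# How much energy does each cooler "store" while warming up from room
# temperature, versus an hour of continuous GPU heat output?
# All masses, temperatures, and the power figure are assumptions.

C_ALUMINUM = 897.0   # J/(kg*K)
C_WATER = 4186.0     # J/(kg*K)

DELTA_T = 20.0       # assumed rise above ambient at equilibrium
GPU_POWER_W = 150.0  # assumed card heat output

hsf_joules = 0.5 * C_ALUMINUM * DELTA_T   # ~0.5 kg aluminum heatsink
loop_joules = 1.0 * C_WATER * DELTA_T     # ~1 kg of coolant
hour_joules = GPU_POWER_W * 3600          # one hour of gaming

print(f"heatsink stores:   {hsf_joules / 1000:.1f} kJ")    # ~9 kJ
print(f"water loop stores: {loop_joules / 1000:.1f} kJ")   # ~84 kJ
print(f"one hour of GPU:   {hour_joules / 1000:.1f} kJ")   # 540 kJ
```

The water loop does bank several times more energy than the heatsink, but both are tiny next to even one hour of steady output, which is the thread's point.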
Hahaha you're right. Thanks for destroying my argument.
Maybe the 9600GT has a ridiculously massive heat capacity (i.e. twice that of the entire room) and ridiculously low thermal conductivity (i.e. 0.001 W/(m·K)). This would cause it to hoard energy, and the room would heat up more slowly.
Not going to indulge in what-ifs; we're talking about real-world circumstances here.