6800 Ultra and Heat!!!

Oddly enough, I downloaded the 61.45 drivers today from guru3d and my idle temp is now 54C.
 
creedAMD said:
FUD, prove that it will look better than the x800. What are you basing this on?

Not FUD, truth. You can read more about it here:
http://forums.guru3d.com/showthread.php?threadid=91628&perpage=10&pagenumber=16

I could easily find more links, though, about the higher quality of HDR with FP blending versus HDR without FP blending.

Here is another:

http://tech-report.com/reviews/2004q2/geforce-6800ultra/index.x?pg=27

"In fact, I should mention in relation to high-dynamic-range lighting that the NV40 includes provisions to improve performance and image fidelity for HDR lighting techniques over what ATI's current GPUs support. John Carmack noted one of the key limitations of first-gen DirectX 9 hardware, including R300, in his .plan file entry from January 2003:

The future is in floating point framebuffers. One of the most noticeable things this will get you without fundamental algorithm changes is the ability to use a correct display gamma ramp without destroying the dark color precision. Unfortunately, using a floating point framebuffer on the current generation of cards is pretty difficult, because no blending operations are supported, and the primary thing we need to do is add light contributions together in the framebuffer. The workaround is to copy the part of the framebuffer you are going to reference to a texture, and have your fragment program explicitly add that texture, instead of having the separate blend unit do it. This is intrusive enough that I probably won't hack up the current codebase, instead playing around on a forked version.

So in order to handle light properly, the cards had to use a pixel shader program, causing a fair amount of overhead. The NV40, on the other hand, can do full 16-bit floating-point blends in the framebuffer, making HDR lighting much more practical. Not only that, but NV40 uniquely supports 16-bit floating-point precision for texture filtering, including trilinear and anisotropic filtering up to 16X. I'd hoped to see something really eye-popping, like Debevec's Fiat Lux running in real time using this technique, but no such luck yet. Perhaps someone will cook up a demo soon."
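Quick toy illustration of the point Carmack is making (my own Python on the CPU, nothing to do with FarCry or the drivers): adding light contributions into an 8-bit fixed-point framebuffer quantizes and clamps the sum at white, while a floating-point framebuffer keeps the whole high-dynamic-range value for the gamma ramp / tone map at the end.

[code]
# Toy CPU sketch of Carmack's point above -- my own illustration, not GPU code.
# Summing light contributions in an 8-bit fixed-point framebuffer quantizes and
# clamps at white; a floating-point framebuffer just keeps the full HDR sum.

light_passes = [0.9, 0.7, 0.55]   # three additive light contributions for one pixel

# (a) 8-bit fixed-point framebuffer: every blend rounds to 1/255 steps, clamps at 1.0
fb8 = 0
for c in light_passes:
    fb8 = min(255, fb8 + round(c * 255))
result_fixed = fb8 / 255.0        # -> 1.0, everything brighter than white is gone

# (b) floating-point framebuffer: contributions simply add up
result_float = sum(light_passes)  # -> 2.15, overbright range preserved for tone mapping

# Dark-precision side of the same argument: a correct display gamma ramp (1/2.2)
# applied after 8-bit quantization has already lost most of the dark steps.
dark = 0.003
dark_fixed = round(dark * 255) / 255.0
print(result_fixed, result_float)
print(dark_fixed ** (1 / 2.2), dark ** (1 / 2.2))
[/code]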

One thing you must realize creed is that I don't post something unless I can back it up :)
 
tranCendenZ said:
Not FUD, truth. You can read more about it here:
http://forums.guru3d.com/showthread.php?threadid=91628&perpage=10&pagenumber=16

I could easily find more links, though, about the higher quality of HDR with FP blending versus HDR without FP blending.

The proof is in the pudding. I don't foresee nvidia doing anything "higher quality" without tanking in performance, until they can get Far Cry to run without using FP16. You are marketing for games in the future when nvidia is struggling with performance/quality of games in the present. Take it one step at a time. High hopes are one thing, but spreading it around like it's a fact is another.
 
creedAMD said:
The proof is in the pudding. I don't foresee nvidia doing anything "higher quality" without tanking in performance, until they can get Far Cry to run without using FP16. You are marketing for games in the future when nvidia is struggling with performance/quality of games in the present. Take it one step at a time. High hopes are one thing, but spreading it around like it's a fact is another.

If you read the thread, the poster already has the higher-quality blending working on NV40 cards, in FarCry, at a very playable rate. It only takes about a 20% hit when enabled:

"Actually FarCry with fp16 HDR runs at 35-80 fps depending on scene.
It's about 20% slower than non-HDR if scene isn't CPU bound."

He was also using the Shader Model 3.0 patch.
 
tranCendenZ said:
One thing you must realize creed is that I don't post something unless I can back it up :)

Until the games come out using the technology and we compare side by side, no one can back it up.
 
Here are some screenshots of NV40/6800 Ultra using FarCry Shader Model 3.0 patch (1.2) and high quality FP blending HDR patch (1.3). Kind of puts rthdribl to shame ;)

farcryhd1.jpg

farcryhd2.jpg

farcryhd3.jpg

farcryhd4.jpg
 
Ver said:
No. No digital camera for me :(

However, I thought that might be a problem, so I unplugged everything and moved wires/cables around as best I could. That Tornado in my box is at the front of the case blowing air directly onto the 6800U. I might try reversing this later.

I have one of those Tornados, and if that thing is in your case you should have no trouble with heat or airflow (noise, maybe). Send the card back; it should not be running that hot in a case with four fans, one of them being a Tornado.
 
Ouch, the image host is a little slow.

But the x800 series should make it look the same way, correct? I thought this has been beaten to death.
 
creedAMD said:
Ouch, the image host is a little slow.

But the x800 series should make it look the same way, correct? I thought this has been beaten to death.

No, at least not with the lighting patch (1.3). The x800 will likely look similar to the 6800 with the SM3.0 patch (1.2), but should look worse with the HDR patch (1.3), assuming the dev posting those screens was informed correctly. According to the poster, those shots were taken with true HDR featuring FP blending. The x800 does not support this, only a lower-quality workaround, so its implementation will be lower quality, and might be lower performance too, for the 1.3 patch. (I'd assume the poster was informed correctly about the nature of the HDR in the patch he was using, since he seemed very knowledgeable, probably a dev who is a friend of one of the Far Cry devs.)
 
tranCendenZ said:
No, at least not with the lighting patch (1.3). The x800 will likely look similar to the 6800 with the SM3.0 patch (1.2), but should look worse with the HDR patch (1.3), assuming the dev posting those screens was informed correctly. According to the poster, those shots were taken with true HDR featuring FP blending. The x800 does not support this, only a lower-quality workaround, so its implementation will be lower quality, and might be lower performance too, for the 1.3 patch. (I'd assume the poster was informed correctly about the nature of the HDR in the patch he was using, since he seemed very knowledgeable, probably a dev who is a friend of one of the Far Cry devs.)

Do you have a screenshot of the x800 doing it? Something to compare by?
 
creedAMD said:
Do you have a screenshot of the x800 doing it? Something to compare by?

Nope, only one person has posted previews of the 1.3 patch, and that person was using an NV40.
 
Why don't you post other information from the thread discounting the false PR statements this guy is spreading?
 
SnakEyez187 said:
Why don't you post other information from the thread discounting the false PR statements this guy is spreading?

Because they aren't false, and the people "discounting" him were proved wrong in that thread. Here is another source detailing the importance of the 6800's FP blending (which the X800 does not support) in true HDR:

http://www.extremetech.com/article2/0,1558,1567089,00.asp

"The FP32 texture unit found in each pixel shader pipeline also tends to filtering chores, doing bilinear, trilinear, and up to 128-tap anisotropic filtering. However, these texturing units can also perform filtering on FP16 color values, such as those used in ILM's OpenEXR format. This preserves pixel color precision by not "dumbing down" the pixel color value to a fixed point 32-bit value, where filtering operations could introduce rounding errors. These errors can show up as banding or blotching in the image, particularly in areas with higher than normal dynamic range. It also means that an FP16 color value can be written into the frame buffer, and then be read back into the GPU without any loss of precision.

The top image uses a standard sRGB data representation to store texture data. Note the substantial loss of information in the left (darker) window. On the right, the overly bright light creates banding and saturation in the lit floor and ceiling area. The bottom image uses OpenEXR data. You can see more of the terrain and cloud detail in the left window, and the floor and ceiling texture detail is more apparent.

OpenEXR is a data format developed by Industrial Light and Magic, and uses 16-bit floating point precision for graphics data. OpenEXR is often used to light high dynamic range images. By having many more possible color values, 3D renderers can express much greater contrast ratios between the brightest values and the darkest values. This allows for better ramping toward the brightest values (as illustrated above), and better detail in dark areas, where surface detail can be better preserved.

Previous GPUs didn't have the ability to write FP color values out to the frame buffer, but instead could only write those values to temporary registers inside the GPU when additional processing still needed to be done. Floating-point values are key to being able to deliver convincing high dynamic range (HDR) lighting such as overly bright light blooms, and glare from sunlight."
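To make that precision argument concrete, here's a small numpy toy (my own sketch, not ExtremeTech's test scene): a texel brighter than white survives FP16 (OpenEXR-style) storage and filtering, while an 8-bit fixed-point texture clamps it to 1.0 before any filtering happens, which is where the washed-out, banded bright areas come from.

[code]
# Toy numpy sketch (my own, not ExtremeTech's test) of FP16 vs 8-bit texture storage.
import numpy as np

hdr_texels = np.array([4.0, 0.02], dtype=np.float32)   # bright sky texel next to a dark interior texel

fp16   = hdr_texels.astype(np.float16)                          # half-float storage keeps the 4.0
fixed8 = np.round(np.clip(hdr_texels, 0.0, 1.0) * 255) / 255.0  # 8-bit storage clamps it to 1.0

# A 50/50 bilinear-style blend between the two texels:
blend_fp16  = fp16.astype(np.float32).mean()   # ~2.01 -> still high dynamic range
blend_fixed = fixed8.mean()                    # ~0.51 -> bright detail already destroyed

# Simple exposure tone map after filtering, the way a renderer would finish the frame:
exposure = 0.5
print(1 - np.exp(-exposure * blend_fp16))   # clearly brighter, bloom-friendly result
print(1 - np.exp(-exposure * blend_fixed))
[/code]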
 
tranCendenZ said:
Because they aren't false, and the people "discounting" him were proved wrong in that thread. Here is another source detailing the importance of the 6800's FP blending (which the X800 does not support) in true HDR:

http://www.extremetech.com/article2/0,1558,1567089,00.asp

"The FP32 texture unit found in each pixel shader pipeline also tends to filtering chores, doing bilinear, trilinear, and up to 128-tap anisotropic filtering. However, these texturing units can also perform filtering on FP16 color values, such as those used in ILM's OpenEXR format. This preserves pixel color precision by not "dumbing down" the pixel color value to a fixed point 32-bit value, where filtering operations could introduce rounding errors. These errors can show up as banding or blotching in the image, particularly in areas with higher than normal dynamic range. It also means that an FP16 color value can be written into the frame buffer, and then be read back into the GPU without any loss of precision.

The top image uses a standard sRGB data representation to store texture data. Note the substantial loss of information in the left (darker) window. On the right, the overly bright light creates banding and saturation in the lit floor and ceiling area. The bottom image uses OpenEXR data. You can see more of the terrain and cloud detail in the left window, and the floor and ceiling texture detail is more apparent.

OpenEXR is a data format developed by Industrial Light and Magic, and uses 16-bit floating point precision for graphics data. OpenEXR is often used to light high dynamic range images. By having many more possible color values, 3D renderers can express much greater contrast ratios between the brightest values and the darkest values. This allows for better ramping toward the brightest values (as illustrated above), and better detail in dark areas, where surface detail can be better preserved.

Previous GPUs didn't have the ability to write FP color values out to the frame buffer, but instead could only write those values to temporary registers inside the GPU when additional processing still needed to be done. Floating-point values are key to being able to deliver convincing high dynamic range (HDR) lighting such as overly bright light blooms, and glare from sunlight."

Man, please stop, you are a PR machine. Please open your eyes and wait to see facts before you jump off the deep end. The 6800u currently has horrible quality issues in FarCry and has had since launch. Let them get those fixed first before believing that they are going to fix them, keep performance, deliver better image quality than the x800pro, and beat it in speed.

It is possible, but just give it some time. I just hate seeing people's hopes crushed by hype.
 
creedAMD said:
Man, please stop, you are a PR machine. Please open your eyes and wait to see facts before you jump off the deep end. The 6800u currently has horrible quality issues in FarCry and has had since launch. Let them get those fixed first before believing that they are going to fix them, keep performance, deliver better image quality than the x800pro, and beat it in speed.

It is possible, but just give it some time. I just hate seeing people's hopes crushed by hype.

I think the screenshots above speak volumes about where FarCry is headed for the 6800 series :)
 
tranCendenZ said:
I think the screenshots above speak volumes about where FarCry is headed for the 6800 series :)

We'll see. I know you have already placed your bet; I've yet to place mine. ;)
 
tranCendenZ said:
I think the screenshots above speak volumes about where FarCry is headed for the 6800 series :)

But they say nothing about where the X800 is, yet people like you and the guy posting those screenshots keep claiming things without showing any proof or even using first-hand experience. Recounting PR and what you hear on websites can only go so far when trying to prove a point that can't simply be proven with words. Just because it says "higher quality" doesn't definitively mean there is going to be a difference. Just look at the whole FP32 vs. FP24 argument: yeah, it's lower quality if you look at what they both can do on paper, but it really doesn't make any difference.
 
tranCendenZ said:
I think the screenshots above speak volumes about where FarCry is headed for the 6800 series :)


Funny since HDR has been available since the release of the 9700 Pro. :)
 
merlin704 said:
Funny since HDR has been available since the release of the 9700 Pro. :)

Tranz won't tell you that; you are only going to get one side of the story from him every time. Either he really hates ATI for some reason, has stock in Nvidia, works for Nvidia, or just likes to troll people who have made big purchases. I'm thinking a little of all of those. I just like to think of him as the undercover Nvidia PR guy.
 
creedAMD said:
Tranz won't tell you that; you are only going to get one side of the story from him every time. Either he really hates ATI for some reason, has stock in Nvidia, works for Nvidia, or just likes to troll people who have made big purchases. I'm thinking a little of all of those. I just like to think of him as the undercover Nvidia PR guy.


That's true with anything. No matter which side someone sits on, you are only going to hear that one side.

I find that it is better to have the facts from both sides of the conversation before making any accusations/assumptions about the topic at hand. :)
 
merlin704 said:
Funny since HDR has been available since the release of the 9700 Pro. :)

That's like saying "Funny since pixel shaders have been available since the GeForce 3 series." There are less advanced and more advanced implementations of HDR, with differing levels of quality. The numerous links I posted detail the quality improvements that can be offered by Nvidia's new implementation versus the older implementation that the 9700pro/x800pro use.
 
Does anyone actually research Far Cry information or do they just post screenshots they saw somewhere?

Look at #8

Crytek comments on Shader 3.0

"8) Is the same level of image quality seen when using Shader 3.0 possible using Shader 2.0? If so, what dictates which Shader you decide to use?

In current generation engine quality of PS3.0 is almost the same as PS2.0. PS3.0 is used for performance optimisation purposes."

There is mention in the Far Cry forums that patch 1.2 will likely add virtual displacement mapping to the game, and HDR could appear in 1.3. Maybe the 6800 Ultra will pull this off faster, but we will have to see. There is nothing else anywhere on any of the Far Cry sites that I can find mentioning 6800-only additions.
 
Arioch said:
Does anyone actually research Far Cry information or do they just post screenshots they saw somewhere?

As I stated a few posts ago, the 1.2 (SM3.0) patch will likely look similar on the 6800 and x800 series, since it is not using true displacement mapping but rather offset mapping (or virtual displacement mapping). OTOH, the HDR being used in the 1.3 patch looks to take advantage of the 6800's FP blending, which should yield a higher-quality result with higher precision and better lighting range than the X800, because the X800 does not support FP blending for HDR like the 6800 does. This isn't to say that the X800 won't have HDR, but if this is the case it won't look as good.

Note that FP blending for HDR (along with SM3.0) will most definitely be added to the R500, as it is a necessary step for "true" HDR.
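For anyone curious what "offset mapping" actually computes, here's a minimal sketch of the general idea in Python (my own guess at the technique, not CryTek's shader): the geometry never moves; the texture coordinate just gets nudged along the tangent-space view direction by the sampled height, which fakes parallax on a flat polygon. True displacement mapping would move the vertices instead.

[code]
# Minimal sketch of offset/parallax mapping (my own illustration, not CryTek code).
# The polygon stays flat; only the texture lookup coordinate is shifted.

def offset_mapped_uv(u, v, height, view_tangent, scale=0.04, bias=-0.02):
    """Nudge the UV along the tangent-space view direction by the height-map sample."""
    h = height * scale + bias        # remap the 0..1 height sample to a small offset
    vx, vy, vz = view_tangent        # normalized view direction in tangent space
    return u + vx * h, v + vy * h    # true displacement mapping would move geometry instead

# A tall texel viewed at a grazing angle slides further than one viewed head-on,
# which is what creates the illusion of depth:
print(offset_mapped_uv(0.5, 0.5, 1.0, (0.7, 0.0, 0.7)))   # grazing view -> bigger shift
print(offset_mapped_uv(0.5, 0.5, 1.0, (0.0, 0.0, 1.0)))   # head-on view -> no shift
[/code]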
 
Well, Crytek has delayed the 1.2 patch indefinitely now. It's anyone's guess if and when any of this will appear. It would not surprise me to see the virtual displacement mapping not make it into the next patch at all.
 
tranCendenZ said:
As I stated a few posts ago, the 1.2 (SM3.0) patch will likely look similar on the 6800 and x800 series, since it is not using true displacement mapping but rather offset mapping (or virtual displacement mapping). OTOH, the HDR being used in the 1.3 patch looks to take advantage of the 6800's FP blending, which should yield a higher-quality result with higher precision and better lighting range than the X800, because the X800 does not support FP blending for HDR like the 6800 does. This isn't to say that the X800 won't have HDR, but if this is the case it won't look as good.

Note that FP blending for HDR (along with SM3.0) will most definitely be added to the R500, as it is a necessary step for "true" HDR.

Yes, ATI will have it in the R500, when there are more games that support it, and the R500 will be able to take the performance hit as well. So far it's looking like the ATI gamble not to do SM3.0 is paying off.
 
Arioch said:
Well, Crytek has delayed the 1.2 patch indefinitely now. It's anyone's guess if and when any of this will appear. It would not surprise me to see the virtual displacement mapping not make it into the next patch at all.

I thought the patch would have been out a week after E3. They were parading around everywhere about how easy it was to implement SM3.0 in FarCry, yet we are still without it months later. I guess we'll know the reason for the early hype soon enough.
 
tranCendenZ said:
The numerous links I posted detail the quality improvements that can be offered by Nvidia's new implementation versus the older implementation that the 9700pro/x800pro use.

Um, no they didn't; all they did was describe how they are different. You can't quantify a quality difference with words.
 
creedAMD said:
I thought the patch would have been out a week after E3. They were parading around everywhere about how easy it was to implement SM3.0 in FarCry, yet we are still without it months later. I guess we'll know the reason for the early hype soon enough.

Some people think maybe the 6800 does not have enough power to push PS3.0 features effectively.

Well, the whole SM3.0-in-Far-Cry thing has many people believing the effects shown in the screenshots everyone is alluding to are 6800-exclusive. I still remember the launch pictures that showed supposed PS2.0 vs. PS3.0 shots, which were incorrect, as the screenshots were in fact PS2.0 and PS1.1 effects.
 
Arioch said:
Some people think maybe the 6800 does not have enough power to push PS3.0 features effectively.

Well, the whole SM3.0-in-Far-Cry thing has many people believing the effects shown in the screenshots everyone is alluding to are 6800-exclusive. I still remember the launch pictures that showed supposed PS2.0 vs. PS3.0 shots, which were incorrect, as the screenshots were in fact PS2.0 and PS1.1 effects.

Oh yeah! I think that may become a classic. What amazed me were all of the wannabe review sites that actually fell for it. LOL! The nvidiots were plastering that shit all over the place. That was indeed funny.
 
Arioch said:
Some people think maybe the 6800 does not have enough power to push PS3.0 features effectively.

Well, the whole SM3.0-in-Far-Cry thing has many people believing the effects shown in the screenshots everyone is alluding to are 6800-exclusive. I still remember the launch pictures that showed supposed PS2.0 vs. PS3.0 shots, which were incorrect, as the screenshots were in fact PS2.0 and PS1.1 effects.

Heh, well if many people still believe that about the SM3.0 patch, then the marketing for FarCry on NV40 worked well :) There would have been a difference if CryTek had used true displacement mapping, but what they implemented was simply offset mapping. SM3.0 itself will likely just give 6800 users a performance boost.

As for those screens of FarCry on the last page, though: if they are using FP blending for HDR as the user who posted them claimed (and they do look better than any in-game HDR screenshots I've seen to date), the 6800/NV40 will have an IQ advantage (and possibly a performance advantage) in HDR, as that is something the X800 cannot do.
 
creedAMD said:
Tranz don't tell you that, you are only going to get one side of the story from him everytime. Either he really hates Ati for some reason, has stock in Nvidia, works for nvidia, or just likes to troll people who have made big purchases. I'm thinking a little of all of the choices. I just like to think of him as the nvidia undercover PR guy.
Funny how the ONLY person posting links to back up the stuff they say is tranCendenZ. Then there's creed just screaming how it doesn't matter since the R500 will be out by the time it is really needed. Somehow I don't think he was arguing a year ago that NVIDIA would have a card that could handle PS2.0 fine by the time it was really needed.
 
obs said:
Funny how the ONLY person posting links to back up the stuff they say is tranCendenZ.

Nice way to come into the thread. :rolleyes:

I'm not debating whether the links are real or not; hell, I'm not even debating that the 6800u can do everything he says it can do. I am debating the claims that the 6800u is faster and has better IQ than the x800. His links do not show the x800pro as a comparison for all claims. Read the thread. Comprehend. Then post.
 
obs said:
Funny how the ONLY person posting links to back up the stuff they say is tranCendenZ. Then there's creed just screaming how it doesn't matter since the R500 will be out by the time it is really needed. Somehow I don't think he was arguing a year ago that NVIDIA would have a card that could handle PS2.0 fine by the time it was really needed.

Please find someone else to troll; your obsession with me is becoming unhealthy. It makes you seem kinda like a groupie. There are plenty of others here to play with. Either add to the thread or send me PMs to ignore. Don't clutter up the [H] with this shit.
 
obs said:
Then there's creed just screaming how it doesn't matter since the R500 will be out by the time it is really needed. Somehow I don't think he was arguing a year ago that NVIDIA would have a card that could handle PS2.0 fine by the time it was really needed.

Heh, that is a funny analogy. The number of SM3.0 games that will be out by the end of this year looks to be similar to the number of SM2.0 games that were out by the end of last year. So in a way, regarding SM3.0 support at least, the X800 is in a similar position games-wise to the FX5900 last year. Just as users debated then whether the faster SM2.0 performance of the ATI 9800 was "needed" for last year's games, people can similarly debate whether the potential speed increases of SM3.0 or the HDR IQ increases of FP blending on the Nvidia 6800 are "needed" for this year's games.
 
tranCendenZ said:
As I stated in another thread, there is no need to be condescending.


Hey can I get a free T-shirt? I mean since you are nVidia PR and all...
 
Yea, me saying that the 5950 with PS2.0 and the X800 with PS3.0 are in similar situations is being way too serious and just cluttering up the [H] with shit. Oh, and being a groupie too. Any other personal comments you want to make, or are you going to actually argue my post?
 
obs said:
Yea, me saying that the 5950 with PS2.0 and the X800 with PS3.0 are in similar situations is being way too serious and just cluttering up the [H] with shit. Oh, and being a groupie too. Any other personal comments you want to make, or are you going to actually argue my post?

What is there to argue? Can you actually convey a point without it being argumentative?
 