Half Life 2 Benchmarks @ HardOCP.com

burningrave101 said:
You cannot disable ALL of the optimizations in the ATI drivers through their new AI option. You're only able to disable what ATI wants you to disable.

I said go look at some game forums and see which video card has more issues.

With 4xAA TAA you get the "effect" of around 6xAA, not 8xAA. And if your FPS drops then you get a bad shimmering effect. And like I said already, TAA only applies AA to every other frame. It IS NOT as good as the real thing.
And what optimizations can't you disable?
Surely you aren't talking about AF?
I'm gonna play Need for Speed Underground 2.. see you in a couple of hours
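
About the temporal AA mechanics mentioned above: here's a minimal Python sketch of the idea (a toy illustration, not ATI's actual implementation; the sample offsets are made up). The driver alternates complementary sub-pixel sample patterns on even and odd frames, so at high frame rates the eye blends consecutive frames into a higher effective sample count, while at low frame rates the alternation shows up as the shimmering described above.

```python
# Toy model of temporal AA: alternate complementary sample patterns
# per frame (offsets below are hypothetical, not real hardware values).

PATTERN_A = [(0.25, 0.25), (0.75, 0.75)]  # 2x AA samples on even frames
PATTERN_B = [(0.75, 0.25), (0.25, 0.75)]  # complementary samples on odd frames

def sample_offsets(frame_index):
    """Sub-pixel sample positions used for a given frame."""
    return PATTERN_A if frame_index % 2 == 0 else PATTERN_B

# Any two consecutive frames together cover 4 distinct positions from
# 2x AA -- an "effective" 4x pattern, but only while the frame rate is
# high enough that the alternation isn't visible as shimmer.
for frame in range(4):
    print(frame, sample_offsets(frame))
```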
 
burningrave101 said:
Clock speeds are RECOMMENDED by nVidia and ATI, but the vendors are free to clock them at any speed they want, and as long as that card falls into the same price point as the other cards then it's just as permissible in a review as any other video card. :rolleyes:

AFAIK only some vendors close to nVidia can sell overclocked cards.

I agree with everything you have said about drivers, except you swapped ATI and nVidia. :D
There is an endless list of threads with complaints about drivers on nvnews.
Linux, heh, do you play games on Linux? Congrats man :D
 
Brent_Justice said:
I've got it on 2 computers, but I can only have one instance of it going at a time because it connects through Steam.
Thanks Brent, that's what I was hoping to hear. :D
 
The 6800NU did well in the test, and with mine having the 6th vertex shader unlocked and OC'ed to 390/900, I should be flying when I get the game on Saturday.
 
-=bladerunner=- said:
AFAIK only some vendors close to nVidia can sell overclocked cards.

I agree with everything you have said about drivers, except you swapped ATI and nVidia. :D
There is an endless list of threads with complaints about drivers on nvnews.
Linux, heh, do you play games on Linux? Congrats man :D

Any of the vendors can sell cards clocked past reference speeds. The vendors order the number of cores they want from nVidia and ATI and the rest is up to the vendor.

And do you suppose there are so many threads about nVidia cards instead of ATI cards on nV News because it's an nVidia site? Eh, maybe not. :rolleyes:

I doubt every single member on nV News together accounts for 1% of the PC gaming community lol. That's why I said check out the official game forums. Most of them have a technical support forum.

And how about you just go and try to install a few different Linux distros with an ATI video card.

SnakEyez187 said:
doesn't seem any different or more magical than the other reviews, well except for the scores :rolleyes:

The scores were more like what Anandtech and HardOCP found.

Most of these other reviews floating around are just about worthless because some of them didn't even use the new 67.02 drivers nVidia put up for HL2, while some are clearly biased like *cough* DH.
 
mappy said:
Since this thread is 20 pages long I skimmed through it in a bleak attempt to catch up. What I have found with the reviews so far is that the nVidia camp is only slightly behind the ATI camp in most benches unless water effects are in that scene. Sometimes though we see one like this http://www.xbitlabs.com/articles/video/display/half-life.html in which the high end ATI cards trounce nVidia's, sometimes doubling them. I think nVidia needs to get another driver set out soon because there is no reason the ATI cards should separate by that much. 20% is fine, but at 50%-100% something is wrong.

Yikes! The Pro is stomping the Ultra :eek:

I hope the next Forceware set has some serious hl2 opts... in some of those tests the GT was pulling like 45 fps :(
 
6800GTOwned said:
Yikes! The Pro is stomping the Ultra :eek:

I hope the next Forceware set has some serious hl2 opts... in some of those tests the GT was pulling like 45 fps :(

They didn't even use the new 67.02 drivers that HardOCP used, and their results are nowhere near what other sites like HardOCP and Anandtech found.

Over the last few months xbitlabs has put out some rather iffy reviews. Their last roundup a few months back showed the ATI cards whipping up on the nVidia cards in several OpenGL games like Call of Duty lol.
 
burningrave101 said:
They didn't even use the new 67.02 drivers that HardOCP used, and their results are nowhere near what other sites like HardOCP and Anandtech found.

Over the last few months xbitlabs has put out some rather iffy reviews. Their last roundup a few months back showed the ATI cards whipping up on the nVidia cards in several OpenGL games like Call of Duty lol.


Yeah, but they also summed it up with this:

"As usual, we set the graphics quality of the game to the same level so that the RADEON and the GeForce hardware produced an image of the same quality. We also wanted to check out the influence of NVIDIA’s new driver (ForceWare 67.02) on the performance, but found that the difference between the new version and ForceWare 66.93 was no more than 1-2fps in the three scenes we used in our tests. Considering that the absolute fps rates are about 60-100fps, this difference fits into the measurement error range."

We'll have to wait for more in-depth reviews I guess.
 
burningrave101 said:
If you had ever run Linux on an ATI video card then you would know exactly what I'm talking about lol. The performance is horrid. A 5950U could easily outpace an X800XT PE in Linux. The FX 5200 beats a 9800 Pro.

ATI might as well not even support Linux, because if you go and talk to some Linux gurus they will tell you which video card you want to have if you're into Linux.



Actually, I *currently* run Linux on an ATI card.

Pffft to you. I don't think of Linux as a good "gaming" operating system, seeing as how most video games aren't written for it and all.

And any serious video renderer wouldn't be using something so droll as a Radeon or a 6600..
 
I tried the 67.02 drivers on my FX53 system and they didn't work. The HL2 splash screen would start to appear, then freeze.
I went to 66.93 and had great results.

FX53 at stock speed, no OC.
BFG 6800 Ultra Waterblock @ 451/1200
HL2 at max settings, Reflect World, 1600x1200, 4xAA, 8xAF
This is the recommended setting for high end cards.

Map --------- [H] result ------- My Result
D1_Canals ----- 91.55 ----- 96.55
D2_Coast ------- 83.06 ----- 88.18

So I'm getting a few frames more with my OC on the video card. I'm happy.
Of course, you're all asking me to OC my CPU, and of course, I will.

FX53 OC'ed 10% to 2639MHz. The [H] result remains the same.
Map --------- [H] result ------- My Result
D1_Canals ----- 91.55 ----- 98.25
D2_Coast ------- 83.06 ----- 89.57

Only a couple frames from OCing the CPU. Not really worth it if you ask me. But having a faster video card does matter.
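
As a quick sanity check on those numbers, the relative gains work out like this (a small Python snippet over the figures posted above):

```python
# Relative gains from the posted results: video card OC vs. [H] reference,
# then the extra gain from the 10% CPU OC on top of the card OC.

results = {  # demo: ([H] reference, card OC only, card OC + CPU OC)
    "D1_Canals": (91.55, 96.55, 98.25),
    "D2_Coast":  (83.06, 88.18, 89.57),
}

for demo, (h_ref, card_oc, both_oc) in results.items():
    card_gain = (card_oc / h_ref - 1) * 100
    cpu_gain = (both_oc / card_oc - 1) * 100
    print(f"{demo}: +{card_gain:.1f}% from the card OC, "
          f"+{cpu_gain:.1f}% more from the CPU OC")

# D1_Canals: +5.5% from the card OC, +1.8% more from the CPU OC
# D2_Coast:  +6.2% from the card OC, +1.6% more from the CPU OC
```

So the card OC is worth roughly three times what the CPU OC adds, which matches the conclusion above.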
 
Jonsey said:
I really like my x800 pro vivo -> XT PE too. It was just a feel good moment when I flashed it to an XT PE and realized it would work.


Add me to that list, my modded VIVO does over XT-PE speeds flawlessly with NO errors or artifacts thanks to ViperJohn...makes me wonder why ATI can't get more of them out....kinda fishy.
:p
 
Laforge said:
BFG (as well as Sapphire) are as much MANUFACTURERS as Dell MANUFACTURES Pentium 4 processors.

Yeah they are, and you know why? Because they both have to do quality control to be sure what they're shipping out will be reliable. The only difference is BFG OCs their cards and in the process probably does a better job of quality control; they know the cards have to be solid at the speeds they're shipped at or they won't do business.

Nobody is claiming anything regarding stock or what have you here, Brent clearly stated he used BFG OC cards and you know what clocks they're running at (you can also deduce the insignificant difference it makes, unless you're on a crusade, then by all means, flame on). This is not a deception campaign, just a lot of whiny posters making mountains out of molehills.


Off-topic: It's nice to have someone from the manufacturers' driver teams like CATALYST MAKER around here (and elsewhere) sometimes; unless your head is up your rear end you can tell that he really is working for the benefit of all the ATI owners.

Sure, he's obviously gonna give you a PR line once in a while, but his best interests are still towards the improved performance of the ATI cards and the betterment of gameplay for their owners. He cares about what he's working on. That's more than you can say for some of the regular fanatics who just seem to duke it out for the sheer thrill of it.

P.S. I just thought I'd point that out. I do own a BFG 6800 OC and I still think nVidia has some of the lower/mid price points better covered (overall, not for HL2 strictly, 'least what I personally define as overall), but the competition is definitely hot at the top and that's a GOOD THING© for the industry and the consumer both.
 
TheRapture said:
Add me to that list, my modded VIVO does over XT-PE speeds flawlessly with NO errors or artifacts thanks to ViperJohn...makes me wonder why ATI can't get more of them out....kinda fishy.
:p

Maybe they're focusing on OEM supply more, which may earn them more $ at this point in time. The VIVO suppliers may also have stricter standards or higher priority than the rest of the chain. Nothing particularly wrong with either thing I guess; no veteran enthusiast really expects large companies to cater to them, they're just happy when they do. :p

mappy said:
Yeah, but they also summed it up with this:

"As usual, we set the graphics quality of the game to the same level so that the RADEON and the GeForce hardware produced an image of the same quality. We also wanted to check out the influence of NVIDIA’s new driver (ForceWare 67.02) on the performance, but found that the difference between the new version and ForceWare 66.93 was no more than 1-2fps in the three scenes we used in our tests. Considering that the absolute fps rates are about 60-100fps, this difference fits into the measurement error range."

We'll have to wait for more in-depth reviews I guess.

The supposed value of the newer 67.02s (other than the shimmering bug fix) is that the release notes said they had "increased shader performance". Basically, overall scores may not seem any higher with these drivers, but the lowpoints in a given scene may indeed be performing better; that's why I'm looking forward to a more in-depth article or my own testing (after I play some more :p ).

While on the topic of the 67.02s, Brent didn't make any mention of it, but I've heard one or two people saying they saw weird banding in the water with these drivers (maybe due to some opt or whatever). Anyone else seen this? I'm currently running the last official ones myself; I don't wanna bother testing them if the IQ isn't there.
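
To illustrate why an unchanged average can still hide better lowpoints, here's a toy Python example (the frame times are invented, not measured from either driver):

```python
# Two frame-time traces with the same average fps but different worst
# frames -- the kind of difference an overall benchmark score won't show.

def avg_and_low(frametimes_ms):
    fps_per_frame = [1000.0 / t for t in frametimes_ms]
    avg_fps = len(frametimes_ms) / (sum(frametimes_ms) / 1000.0)
    return avg_fps, min(fps_per_frame)

before = [10, 10, 10, 10, 40]  # one nasty 40 ms spike
after = [12, 12, 12, 12, 32]   # same total time, milder worst case

for name, trace in (("before", before), ("after", after)):
    avg_fps, low_fps = avg_and_low(trace)
    print(f"{name}: avg {avg_fps:.1f} fps, worst frame {low_fps:.1f} fps")

# Both traces average 62.5 fps, but the worst frame improves from
# 25.0 fps to 31.2 fps -- invisible in the average, obvious in play.
```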
 
Impulse said:
Maybe they're focusing on OEM supply more, which may earn them more $ at this point in time. The VIVO suppliers may also have stricter standards or higher priority than the rest of the chain. Nothing particularly wrong with either thing I guess; no veteran enthusiast really expects large companies to cater to them, they're just happy when they do. :p



The supposed value of the newer 67.02s (other than the shimmering bug fix) is that the release notes said they had "increased shader performance". Basically, overall scores may not seem any higher with these drivers, but the lowpoints in a given scene may indeed be performing better; that's why I'm looking forward to a more in-depth article or my own testing (after I play some more :p ).

While on the topic of the 67.02s, Brent didn't make any mention of it, but I've heard one or two people saying they saw weird banding in the water with these drivers (maybe due to some opt or whatever). Anyone else seen this? I'm currently running the last official ones myself; I don't wanna bother testing them if the IQ isn't there.

The only graphical difference I heard of was that the latest nVidia drivers were displaying thinner fog at a distance than ATI's. Whether it's nVidia's card displaying too little fog or ATI displaying too much of it I have no idea.
 
mappy said:
The only graphical difference I heard of was that the latest nVidia drivers were displaying thinner fog at a distance than ATI's. Whether it's nVidia's card displaying too little fog or ATI displaying too much of it I have no idea.

Well if it is a bug in CATALYST I will definitely have it looked at and get it fixed. I love this game too much to not let it present itself in its full glory.

:)
 
CATALYST MAKER said:
Well if it is a bug in CATALYST I will definitely have it looked at and get it fixed. I love this game too much to not let it present itself in its full glory.

:)

They have some screenshots here: http://www.xbitlabs.com/articles/video/display/half-life_5.html . You will see in some of the pictures that the fog is thicker in the ATI pics. I'm just guessing here, but normally fog should look thicker at a distance, so ATI looks to have it right and nVidia needs a revamp.
 
mappy said:
They have some screenshots here: http://www.xbitlabs.com/articles/video/display/half-life_5.html . You will see in some of the pictures that the fog is thicker in the ATI pics. I'm just guessing here, but normally fog should look thicker at a distance, so ATI looks to have it right and nVidia needs a revamp.


Ya, I just read the review. Nonetheless, it might be prudent for me just to check it out and make sure we are doing it right.
 
CATALYST MAKER said:
Ya, I just read the review. Nonetheless, it might be prudent for me just to check it out and make sure we are doing it right.

Whichever way it's supposed to be, I think it looks better on the Radeon; lighter just looks more natural...

*edit* Looking at some of the other screenies, it looks like just an all-around gamma issue... but it looks selective... in a couple of places the nVidia rendering looks more natural, in some places ATI's...
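
One rough way to tell a gamma shift from a fog difference, if anyone wants to check their own screenshots (an illustrative sketch using PIL; the file names are hypothetical): a near-uniform brightness difference across the whole frame points at gamma, while a difference concentrated around the distant scenery points at fog.

```python
# Compare matched screenshots: is the luminance difference uniform (gamma)
# or concentrated in one region of the frame (fog)?
from PIL import Image, ImageChops

ati = Image.open("ati_shot.png").convert("L")     # grayscale luminance
nv = Image.open("nvidia_shot.png").convert("L")   # must be the same size

diff = ImageChops.difference(ati, nv)
w, h = diff.size

def mean(pixels):
    pixels = list(pixels)
    return sum(pixels) / len(pixels)

# Distant, fog-heavy scenery tends to sit in the upper half of the frame;
# a lopsided top-vs-bottom difference argues against a pure gamma shift.
print("overall:", round(mean(diff.getdata()), 1))
print("top:    ", round(mean(diff.crop((0, 0, w, h // 2)).getdata()), 1))
print("bottom: ", round(mean(diff.crop((0, h // 2, w, h)).getdata()), 1))
```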
 
Laforge said:
Actually, I *currently* run Linux on an ATI card.

Pffft to you. I don't think of Linux as a good "gaming" operating system, seeing as how most video games aren't written for it and all.

And any serious video renderer wouldn't be using something so droll as a Radeon or a 6600..

Umm... I would disagree with you there... we have several small renderfarms at work which consist of racks of dual Xeons with 9800 Pros... and the Linux support is super shaky... we're trying to upgrade them to Fedora Core 3, which just came out... and we're having problems...

Basically, anyone who runs Linux thinks of ATI as a joke... like it's literally laughable...
 
^eMpTy^ said:
Umm... I would disagree with you there... we have several small renderfarms at work which consist of racks of dual Xeons with 9800 Pros... and the Linux support is super shaky... we're trying to upgrade them to Fedora Core 3, which just came out... and we're having problems...

Basically, anyone who runs Linux thinks of ATI as a joke... like it's literally laughable...

We are working on it. It's one of my missions to improve our Linux drivers big time.

Not that that topic has much to do with this thread.

Let me know if you need help with your upgrade; I can see if there is any help I can offer.

PM me if you want help
 
CATALYST MAKER said:
We are working on it. It's one of my missions to improve our Linux drivers big time.

Not that that topic has much to do with this thread.

Let me know if you need help with your upgrade; I can see if there is any help I can offer.

PM me if you want help

will do...we'll be knee deep in it tomorrow, hopefully all will go smoothly, but if not, I'll come try to pick your brain...

and thanks by the way...:)
 
CrimandEvil said:
There was a lot of hot air in his post. I guess he's just disheartened that Gabe's claim of a 40% performance advantage for ATI cards turned out to be a pile of horse sh**.. LOL

Exactly when was this claim made?
 
Spike23 said:
I hate ppl like you who don't buy an actual game but instead download it. The game is worth the money. :(

I downloaded it from Steam after paying $59.95+ taxes on my credit card, thank you very much.
 
Do you get the same results in benches with XT-PE if you overclock the XT to XT-PE speeds?
 
Great review - nice to see unbiased reporting.

I have to say I don't have a cutting edge video card or PC but I am enjoying HL2 immensely!

My PC is a 3GHz P4 (on the "old" 533 bus!!), 768 megs of single-channel PC2700, and a humble GF FX5900XT vid card, and the game runs lovely with everything on full at 1280x1024 on my LCD panel. I don't use AA, which I guess helps with performance, but I do have 8xAF switched on, which doesn't really impact performance at all. It is still the most beautiful-looking game I have ever seen and I reckon it pulls >30fps most of the time, so it's very playable.

So if you aren't lucky enough to own the latest uber ATI or nVidia card, the chances are you'll still get a great gaming experience, so just buy it and enjoy :D
 
Emret said:
Do you get the same results in benches with XT-PE if you overclock the XT to XT-PE speeds?

The X800XT and X800XT PE have the same core, so your results should be exactly the same. I'm pretty sure the vanilla X800XT uses the same 1.6ns RAM modules as the PE. If it uses 2.0ns then there will be a slight difference, although if you can get the 2.0ns clocked to PE speeds then it will be faster than the 1.6ns.

I probably would have bought one of the X800XT Phantom Editions back when they were the same price as the 6800GTs if there were actually any in stock. Right now people are on a waiting list a mile long to get one, and even the X800 Pro VIVOs are over $450. I only paid $405 for my XFX 6800GT.
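
For reference, the ns rating converts to a rated clock with a simple rule of thumb (a sketch; rated speed is the vendor's ceiling, not a hard overclocking limit):

```python
# Rule of thumb: physical MHz ~= 1000 / access time in ns, and DDR doubles
# the effective rate.

def rated_clock_mhz(ns):
    physical = 1000.0 / ns
    return physical, physical * 2  # (physical MHz, DDR effective MHz)

for ns in (1.6, 2.0):
    physical, effective = rated_clock_mhz(ns)
    print(f"{ns}ns modules: ~{physical:.0f} MHz physical, "
          f"~{effective:.0f} MHz effective")

# 1.6ns -> ~625 / 1250 MHz; 2.0ns -> ~500 / 1000 MHz. That's why 2.0ns RAM
# pushed to PE speeds is running much closer to its rating than 1.6ns RAM.
```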
 
fallguy said:
Exacly when was this claim made?

Jabba said 40%; Doug Lombardi (Director of Marketing) said 30%; Gary McTaggart (Software Developer) said 20%; and Gabe Newell claimed 40% as well.
 
Moloch said:
40-50 is kind of low for an FPS; most start to feel laggy around 40-50, and if it's 40-50fps there must be some drops to the 30s or lower during heavy battle. BTW, how many fps do you get at the default clock rate? Apples to apples, please.

If I am running a current gen -1 card (ATI or NV), why the HELL would I WANT to clock it back just to make the freaks happy? I am looking at it from the standpoint of what guys could do with systems they might have right now to make this puppy fly. Just last year anything over 35 fps was considered playable; now if you score less than 100, you suck. OK, that is logical.

You can have my overclock and my FPS results when you pry them from my cold dead fingers.

BTW, 50.3 FPS on a highly overclocked 5700 Ultra on the non-canals timedemo. I have yet to run the Canals demo, but will tonight.
 
CATALYST MAKER said:
Well if it is a bug in CATALYST I will definately have it looked at and get it fixed. I love this game too much to not let it present itself in its full glory

:)
I read stuff like this that you write and I just gotta say, I love you Terry!

Seriously, you get it and are into the gaming as much as I or anyone of us....'cept you actually make the stuff to make gaming better and actually help us to fix it.

Thank you, and I love you! :cool:

(Damn it man, there you go making me embarrass meself in public again, stop it!)
 
digitalwanderer said:
I read stuff like this that you write and I just gotta say, I love you Terry!

Seriously, you get it and are into the gaming as much as I or anyone of us....'cept you actually make the stuff to make gaming better and actually help us to fix it.

Thank you, and I love you! :cool:

(Damn it man, there you go making me embarrass meself in public again, stop it!)

Dude stop it, you're making me blush!!!!
 
from another thread:

Originally Posted by Brent_Justice
Our timedemos are unique it seems.

We are the only ones to use such long timedemos, 10 and 11MB each which come out to about 15-20 minutes of actual gameplay that was recorded, basically the entire level of canals_01 and coast_03. We are the only website to use such lengthy timedemos.

But for most of those 15-20 minutes you're limited by, and thereby testing, the CPU rather than the various GPUs, and that's not the intention of a GPU benchmark, is it? As a consequence, the GPU candidates cannot be ranked based on your study. You claim that you are "facing off NVIDIA’s and ATI’s latest AGP video cards in Half Life 2", but your choice of method fails to do that. What you have done is not a ranking/screening test, but a performance test, which in itself is very interesting.

One of the ground rules for experimental screening of candidates is to eliminate any parameters constraining or accelerating the phenomenon to be studied beyond the measurable range. For example, when ranking the corrosion resistance of different metallic alloys one would attempt to choose testing conditions that give severe corrosion on the worst alloy, some corrosion on the intermediates and no corrosion on the best alloy (rather than no corrosion on all alloys or massive corrosion on all alloys).

I think the above principle applies for CPU ranking tests as well, but not necessarily for CPU performance tests.
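
The screening argument can be boiled down to a few lines of Python (the fps figures and the noise threshold are invented for illustration): if the spread across cards is within run-to-run noise, the test has measured the platform, not the GPUs.

```python
# If a scene is CPU-limited, every card returns roughly the same fps and
# the benchmark cannot rank the GPUs.

MEASUREMENT_NOISE_FPS = 2.0  # assumed run-to-run variation

def can_rank(fps_by_card):
    """True only if the spread across cards exceeds the noise floor."""
    return max(fps_by_card) - min(fps_by_card) > MEASUREMENT_NOISE_FPS

cpu_limited_scene = [91.2, 91.8, 90.9]  # three GPUs, near-identical fps
gpu_limited_scene = [91.2, 77.4, 62.3]  # fps tracks the GPU under test

print(can_rank(cpu_limited_scene))  # False: no basis for ranking the cards
print(can_rank(gpu_limited_scene))  # True: the cards separate measurably
```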
 
Seems like HL2 leaves people happier the longest before the bitching starts. I think it went what, 15 pages before it started going downhill.

Swear it makes my head hurt.


HOLY CRAP PEOPLE. Both cards are awesome! WHAT ARE YOU ARGUING ABOUT?!

Stop with the bickering!
 
jon67 said:
from another thread:



But for most of those 15-20 minutes you're limited by, and thereby testing, the CPU rather than the various GPUs, and that's not the intention of a GPU benchmark, is it? As a consequence, the GPU candidates cannot be ranked based on your study. You claim that you are "facing off NVIDIA’s and ATI’s latest AGP video cards in Half Life 2", but your choice of method fails to do that. What you have done is not a ranking/screening test, but a performance test, which in itself is very interesting.

One of the ground rules for experimental screening of candidates is to eliminate any parameters constraining or accelerating the phenomenon to be studied beyond the measurable range. For example, when ranking the corrosion resistance of different metallic alloys one would attempt to choose testing conditions that give severe corrosion on the worst alloy, some corrosion on the intermediates and no corrosion on the best alloy (rather than no corrosion on all alloys or massive corrosion on all alloys).

I think the above principle applies for CPU ranking tests as well, but not necessarily for CPU performance tests.

You have a good point. But I think you're missing the idea of a [H] review. [H] likes to show video cards and games from the perspective of the gamer. The point I got from Brent's review is that the game is mostly CPU limited, and therefore, except for a few occasions, whatever high-end card you run will be great for the game.
 
-=bladerunner=- said:
So enjoy your check-box features and broken video processor, something only nVidia can offer :D :D


How are features I use often check-box features?

I use 8xSSAA often.
I use stereo 3d often.

If ATI had these features, then I might own one. I have no bias towards a certain company. I just need to have the features I want.

It amazes me that so many people ignore stereo 3d. The effect is way more compelling than AA or AF.
 
^eMpTy^ said:
Umm...I would disagree with you there...we have several small renderfarms at work which consist of racks of dual xeons with 9800 pros...and the linux support is super shaky...we're trying to upgrade them to Fedora Core3 which just came out...and we're having problems...

Basically, anyone who runs linux thinks of ATi as a joke...like it's literally laughable...

OK.
You use 9800 Pros (gaming video cards) for rendering?

Uhm, and you don't see a problem here?

Shouldn't a QUADRO or FIREGL or 3DLABS card be in your boxes?

And you expect ATI to have written a driver for an UNFINISHED OS?

Does nVidia do this? Did nVidia just come out with "Fedora 3 drivers"?

Part of the big problem with supporting *nix is that there are so damn many of them!

http://www.nvidia.com/object/linux_display_ia32_1.0-6629.html

I see no mention of Fedora 3.
Yeah.. the driver there should probably work.. just like the ATI ones SHOULD work.

But damn.. asking any hardware vendor to write for unfinished operating systems is like asking me to pick a motherboard for an 8GHz Pentium 6 CPU.
 
Jonsey said:
You have a good point. But I think you're missing the idea of a [H] review. [H] likes to show video cards and games from the perspective of the gamer. The point I got from Brent's review is that the game is mostly CPU limited, and therefore, except for a few occasions, whatever high-end card you run will be great for the game.
DING DING DING, we have a winner. :)
 
burningrave101 said:
Jabba said 40%; Doug Lombardi (Director of Marketing) said 30%; Gary McTaggart (Software Developer) said 20%; and Gabe Newell claimed 40% as well.

I asked when.
 
fallguy said:
I asked when.
IIRC..

It was before the source code was 'stolen' and an entire 'rewrite' was done.. so those claims could pertain to the 'original' Half-Life 2..
 