Any way to force a GeForce FX to run HL2 in DX9 mode?

Status
Not open for further replies.
Strange, reflect world/reflect all doesn't reflect the world or everything on my 9600. :p I'll have to try it later on my desktop.
 
pxc said:
Strange, reflect world/reflect all doesn't reflect the world or everything on my 9600. :p I'll have to try it later on my desktop.
That's because it has shitty DX9 performance.. oh wait.. a 9600 is faster than a 5900 ultra.. sorry..
Driver version etc?
 
Moloch said:
we could, due to the low frame rate, Ace!

LOL.... Your inexperience just keeps adding up.

Anyway, the only frame rate difference you would recognize would be in a results chart from a benchmark, yapping about how a 9800 card is 10 frames faster than a 5900 card. When someone like you saw the game running at, say, 80 fps vs. 70 fps, you'd have to guess which was which. And you'd have to guess again between DX9 vs. DX8.1.
 
Is shit Half-Life 2 performance for NV3x cards so hard to believe? Anybody remember last September when Valve released preliminary benchmark results, stating that it took three times the man-power to code Half-Life 2 for NV3x cards than for R3x0 cards? NV3x cards just aren't as good in DX9 as R3x0 cards, never have been (although that isn't the exact case here). Honestly, nVidia's current generation is a great product, but I personally wouldn't bother purchasing last-gen. This is one of the reasons why. You bought a 5900 or 5950 or whatever, and now you must live with that blind-faith consequence of having ghetto image quality in Half-Life 2 (which, I might add, is made worse by the NV3x's already inferior IQ compared to the R3x0).
 
dderidex said:
*sigh* ATI fan-bois out in force tonight, it seems.

In any case, I've been busily playing a PS2.0 game on my 5900xt. I suppose I should tell my card that it sucks and performs worse than ATI cards at shaders, otherwise it might keep winning benchmarks in the game (PS 2.0 support was added in this patch - notice how the 5950 beats the 9800s!) As a flight sim, 35 fps is *more* than perfectly playable, anyway.

And if you think nVidia is bothering to do shader replacement on THAT game, you are simply delusional and that's all there is to that. ANY kind of flight sims are a ridiculously niche market, and WW2 combat flight sims of non-American aircraft an even smaller niche of a niche market.



It seems you would be the only fan"boi" here, Deridex. Is the Pixel Shader 2.0 plug supposed to make you look intelligent? Just because you play one of the games in the 10% margin where the 59x0 cards do slightly better than 9800 cards does not make it a better over-all product. ;)
 
Matt Woller said:
It seems you would be the only fan"boi" here, Deridex. Is the Pixel Shader 2.0 plug supposed to make you look intelligent? Just because you play one of the games in the 10% margin where the 59x0 cards do slightly better than 9800 cards does not make it a better over-all product. ;)
Why do you people insist on putting words in my mouth?

I NEVER SAID IT WAS.

I was responding to the poster who keeps chanting "FX is crap and can't do anything, FX is crap and can't do anything, FX is crap and can't do anything". My point was merely that it is NOT crap, it can do a HELL of a lot of stuff very well, and EVEN SOME DX9 things very well!

I mean, if the "FX is crap and can't do anything" and it manages to beat a 9800xt in even a SINGLE game with no shader replacement or anything....isn't that pretty astonishing and worthy of note? (Oh, and the reason I used that resolution is that you can't go any higher with that game and still have playable FPS on anything but a 6800 or X800. It's so hard on graphics cards, that's all the more you can go. Although, if you MUST know, the 5950 still beats the 9800xt at 1280x1024, AND at 1600x1200 - but, as you can see, the framerate is unusable on any card less than the 6800s or X800s...and even then, it's marginal)
 
Badger_sly said:
LOL.... Your inexperience just keeps adding up.

Anyway, the only frame rate difference you would recognize would be in a results chart from a benchmark, yapping about how a 9800 card is 10 frames faster than a 5900 card. When someone like you saw the game running at, say, 80 fps vs. 70 fps, you'd have to guess which was which. And you'd have to guess again between DX9 vs. DX8.1.
Are you that stupid?
the FX cards suck in HL2.. get it through your skull?
http://www.xbitlabs.com/images/video/half-life/canals_1024_candy.gif
http://www.xbitlabs.com/images/video/half-life/town01_1024_candy.gif
And btw, check out this article for the IQ differences between different versions of DX.
http://gear.ign.com/articles/567/567437p1.html
 
CleanSlate said:
"single player games" yes I consider CS:S a single player game, as well as HL death match :rolleyes: . ;) :p

~Adam


What about a half-assed revision 2 Beta that's about 6 years out-dated and has slightly prettier graphics than Counter-Strike? You know... the only feature about CS: Source that makes it "appealing". Yep, the one that gives it the graphics of, say, UT2K3... which, in itself is 2 years old. ;)
 
dderidex said:
Why do you people insist on putting words in my mouth?

I NEVER SAID IT WAS.

I was responding to the poster who keeps chanting "FX is crap and can't do anything, FX is crap and can't do anything, FX is crap and can't do anything". My point was merely that it is NOT crap, it can do a HELL of a lot of stuff very well, and EVEN SOME DX9 things very well!

I mean, if the "FX is crap and can't do anything" and it manages to beat a 9800xt in even a SINGLE game with no shader replacement or anything....isn't that pretty astonishing and worthy of note? (Oh, and the reason I used that resolution is that you can't go any higher with that game and still have playable FPS on anything but a 6800 or X800. It's so hard on graphics cards, that's all the more you can go.)


But the FX IS crap and it CAN'T do anything.
 
Matt Woller said:
But the FX IS crap and it CAN'T do anything.
Sorry, but I just proved you wrong.

Again, since you seem to have missed it, let me present you with the list of games the 5950 beats the 9800 Pro (sometimes XT) in:
Note that IL2 Sturmovik: Forgotten Battles IS a DX9 game, using PS2.0 shaders. And Halo and Painkiller are ALSO both DX9 games (although I don't think either uses PS2.0 shaders).

For a card that is 'crap' and 'can't do anything', it sure does pretty well in my book.
 
dderidex said:
Why do you people insist on putting words in my mouth?

I NEVER SAID IT WAS.

I was responding to the poster who keeps chanting "FX is crap and can't do anything, FX is crap and can't do anything, FX is crap and can't do anything". My point was merely that it is NOT crap, it can do a HELL of a lot of stuff very well, and EVEN SOME DX9 things very well!

I mean, if the "FX is crap and can't do anything" and it manages to beat a 9800xt in even a SINGLE game with no shader replacement or anything....isn't that pretty astonishing and worthy of note? (Oh, and the reason I used that resolution is that you can't go any higher with that game and still have playable FPS on anything but a 6800 or X800. It's so hard on graphics cards, that's all the more you can go. Although, if you MUST know, the 5950 still beats the 9800xt at 1280x1024, AND at 1600x1200 - but, as you can see, the framerate is unusable on any card less than the 6800s or X800s...and even then, it's marginal)
Hey Ace, nice frame rate there.. I wouldn't want to use any of those if my monitor was capable of 1600x1200.
Btw, how many of those games you listed make heavy use of DX9, and how many of them show the 59XX series having FSAA as good as the 3 year old 9700?
None;)
The 9700/9800 was for people who played with the eye candy on, and it actually looked like FSAA was on; gamma corrected fsaa > FX crappy fsaa.
dderidex said:
Sorry, but I just proved you wrong.

Again, since you seem to have missed it, let me present you with the list of games the 5950 beats the 9800 Pro (sometimes XT) in:
Note that IL2 Sturmovik: Forgotten Battles IS a DX9 game, using PS2.0 shaders. And Halo and Painkiller are ALSO both DX9 games (although I don't think either uses PS2.0 shaders).

For a card that is 'crap' and 'can't do anything', it sure does pretty well in my book.
Again with the 3dmark.
Let's see how the FX series will do in supposed "next gen" DX9 games, shall we?
http://www.techreport.com/etc/2004q3/3dmark05/img/3dm-overall.gif
All your other tests don't feature the DX9 features which the FX was "made for".
 
3DMark is synthetic, hell nShittia even cheated its way into 03 (and partially into 05). The vast majority of the rest are DirectX 9 games. If you would look at actual reviews and comparisons you would see the 9800 Pro/XT walking all over the 5900U/5950U. That is, if your biased eyes can see that far. As far as nVidia is concerned, all they've really done is use Doom 3 as a reason to sell cards. And, might I mention, the only reason to sell cards.

It seems to me you're just trying to justify your purchase, are you not? It's obviously too late to go back in time and purchase the superior product, so you spend all this time trying to make your purchase seem less of a waste, correct? I know I would... hell, anybody would try and make as much as they could out of what... $450? $500? How much did you pay for your pile of crap? It seems to me you're just in denial. If you could do it all over again, you'd probably get a Radeon 9600XT... after all, it has about the same performance as the 5950 Ultra in current games... only the 9600XT looks much better (nVidia's shitty IQ with the NV3x coupled with the fact that anything over 2xAA isn't worth the time really sucks balls for NV3x users) and costs what, half as much? Currently a good 1/3 as much. For the same damn thing, only better. What a shame. =P

Even if you wanted to argue the fact that the 5950 is decent in some games (and even legitimately beats the 9800XT in a few) then fine, we could compare price-wise the 5950 Ultra to, say, a 9800 Pro. Roughly the same thing when combining performance with Image Quality (although the 9800 Pro beats the living crap out of the 5950 in IQ, as usual)... even the 9800 Pro is $190. Hell, the 9800 Pro has been hovering around the $200 price range for well over a year. Your games are slower, and have less Image Quality on a vastly more expensive product... you're trying to justify the spending of large sums of money to achieve results that a) your !!!!!!ishness made you buy, b) some idiot at Best Buy made you purchase, c) nVidia made you purchase, or d) a review site gave you the idea to purchase due to a skewed and biased review. Sounds about right... my friend, if I just realized I blew $250 I'd be pissed too. ;)
 
All the more reason why I have a 6800GT in my main rig. :D My 9800Pro and 5900 are now in other rigs. It's well known the FX series lacks in DX9 games.
 
Matt Woller said:
Sounds about right... my friend, if I just realized I blew $250 I'd be pissed too. ;)
Actually, if you took a look at my sig, you'd see I'm running an FX 5900xt - going for around $140-$150 - way faster than a 5950 Ultra.

The XFX 5900xts *all* clock up to 5950 Ultra spec, and are much cheaper than a 9800xt.
 
dderidex said:
Actually, if you took a look at my sig, you'd see I'm running an FX 5900xt - going for around $140-$150 - way faster than a 5950 Ultra.

The XFX 5900xts *all* clock up to 5950 Ultra spec, and are much cheaper than a 9800xt.

XFX GeForceFX 5900xt (518/985, Det 66.81)


Actually that is a damn good overclock for a 5900xt. My 5900 non ultra will max at about 450/950.

How did you manage that overclock with slower ram? You flash the bios?
 
Matt Woller said:
3DMark is synthetic, hell nShittia even cheated its way into 03 (and partially into 05). The vast majority of the rest are DirectX 9 games. If you would look at actual reviews and comparisons you would see the 9800 Pro/XT walking all over the 5900U/5950U. That is, if your biased eyes can see that far. As far as nVidia is concerned, all they've really done is use Doom 3 as a reason to sell cards. And, might I mention, the only reason to sell cards.

1) You might be right on the cheating, but it wasn't the only one. Almost all the major corporations have cheated at one time or another in the computer industry. ATi and nVidia are no exceptions.

2) I'm sorry.. but have you looked at anything but FarCry, Half Life 2 and Doom3 benchmarks? Regardless of whether they're the "hottest" games, they are by far not the only ones. nVidia has many selling points, as well as ATi. Neither company sells on one game...
 
number69 said:
Actually that is a damn good overclock for a 5900xt. My 5900 non ultra will max at about 450/950.

How did you manage that overclock with slower ram? You flash the bios?
See, that's the thing, XFX uses 2.2ns ram on their cards.

I had a BFG 5900nu too, and it topped out at 475/950 (and at that, artifacting a bit). In my experiences (BFG 5900nu vs MSI and XFX 5900xts) the XTs are better overclockers than the NUs are. Maybe because they are mostly newer parts? (The XT came to market MANY months after the nu)

Dunno. Anyway, no, didn't do any BIOS flash or anything. No cooling mods, either - that's just with the stock cooling.
 
DropTech said:
1) You might be right on the cheating, but it wasn't the only one. Almost all the major corporations have cheated at one time or another in the computer industry. ATi and nVidia are no exceptions.

2) I'm sorry.. but have you looked at anything but FarCry, Half Life 2 and Doom3 benchmarks? Regardless of whether they're the "hottest" games, they are by far not the only ones. nVidia has many selling points, as well as ATi. Neither company sells on one game...
Those games are DX9, and Doom3 is OGL.. whatever the equivalent to it is, that's what the FX should be able to run, genius, not a game that had the GeForce 3 or 8500 in mind.
You'd think with all the extra instructions the FX is able to do, nVidia would at least make the performance acceptable :rolleyes:
I don't remember ATi forcing FP16, and doing obvious texture filtering hacks.
Let me save you the trouble.. OMG QUACK3..
Uh huh.. and the "cheat" was disabled one driver release later, and with better performance.
 
http://www.anandtech.com/video/showdoc.aspx?i=2281&p=2

You really want to run in DX9?

"Overall, the move from DX9 down to DX8 isn’t horrible; while it does reduce some of the appeal of Half Life 2, the game still looks incredible in DX8 mode. There are some issues with forcing NV3x GPUs to run in DX9 mode mainly involving the water, but as you will see on the coming pages, if you've got a NV3x you're not going to want to play in DX9 mode."

Still want to argue?

OUCH even an X300SE is faster than your fx5900 in DX9 lol
 
DASHlT said:
Actually, it doesn't look bad, really. If a 'stock' 5900xt gets 27fps at 1024x768 on a 'worst case' test, my card which is a solid 50% faster should be absolutely playable at that resolution in DX9. And that really is the worst case test - in all their other levels, the 5900xt does much better - sometimes even reaching 40fps. Again, my 50% performance advantage on that card should give me an ace DX9 experience!

IF I was interested in HL2. Which, due to discussions in other threads about the type of game it is, I'm not. But, at least it's good to know that my "crap" card that "can't do anything" can still play HL2 in DX9 mode at High Quality just fine.
 
dderidex said:
Actually, it doesn't look bad, really. If a 'stock' 5900xt gets 27fps at 1024x768 on a 'worst case' test, my card which is a solid 50% faster should be absolutely playable at that resolution in DX9. And that really is the worst case test - in all their other levels, the 5900xt does much better - sometimes even reaching 40fps. Again, my 50% performance advantage on that card should give me an ace DX9 experience!

"IF I was interested in HL2. Which, due to discussions in other threads about the type of game it is, I'm not."

Then why did you start a post about HL2 then, if you're not interested??? DASHlT
 
dderidex said:
Actually, it doesn't look bad, really. If a 'stock' 5900xt gets 27fps at 1024x768 on a 'worst case' test, my card which is a solid 50% faster should be absolutely playable at that resolution in DX9. And that really is the worst case test - in all their other levels, the 5900xt does much better - sometimes even reaching 40fps. Again, my 50% performance advantage on that card should give me an ace DX9 experience!

IF I was interested in HL2. Which, due to discussions in other threads about the type of game it is, I'm not. But, at least it's good to know that my "crap" card that "can't do anything" can still play HL2 in DX9 mode at High Quality just fine.

How do you figure your card is 50% faster than a normally clocked 5900XT? While it's clocked a lot faster than stock, that doesn't mean it's going to get 50% more fps.

Note, those links are without AA/AF, and a low res. Maybe a good setting for some.. sure isn't for me.
 
DASHlT said:
Then why did you start a post about HL2 then, if your not interested??? DASHlT
Well, at the time I started the thread, I didn't KNOW that.

I had read all the reviews proclaiming how Half-Life 2 was so revolutionary, and it redefined the shooter genre, and blahblahblah. Turns out, it's just another 'on rails' shooter like Doom3. There is only one way past a level, and it's through every monster spawned in the same spot every time.

You don't make any choices that affect the outcome of the game. The game is level-based, and there aren't multiple different 'endings' for each level! Let alone multiple ways to play the level depending on your style, etc.

Deus Ex was a revolutionary shooter. Half-Life 2, it turns out, is the same ol', same ol' we've come to be used to, just with flashier graphics and more realistic death animations. Big freaking deal.

Seriously, aside from the improved graphics and improved death animations (which we call 'evolutionary' not 'revolutionary') how does the game play in HL2 differ from Doom3, Return to Castle Wolfenstein, Serious Sam, or Unreal? It's the same damn thing over and over, just with flashier graphics and 'improved physics models' each time.

So, IOW, my interest in the game that spawned this thread was generated entirely by pre-release hype, and now that I've had a chance to TALK with people about what the game is actually LIKE, I am not interested any more. Still, as with Doom3, I'll probably pick up the demo just to have something flashy to show off the graphics capability of the PC, but, like the Doom3 demo, won't actually PLAY it.
 
fallguy said:
How do you get your card is 50% faster than a normal clocked 5900XT? While its clocked a lot faster than stock, it doesnt mean its going to get 50% more fps.

Note, those links are without AA/AF, and a low res. Maybe a good setting for some.. sure isnt for me.
3dMark03. Yeah, yeah, it's 'not a valid game benchmark' - fine, whatever. It still shows how optimizations and tinkering with your system affect the score. My 'stock' score is under 5k by a bit, and I'm at 7k now (haven't updated sig in a test run or two). Ergo, 50% faster.

As to 'low res' - I'd hardly call 1024x768 "low res". Maybe this is just a matter of perspective, but I only have a 17" LCD - I can't even GO any higher than 1280x1024 if I wanted to. Using 1024x768 works *just fine* for me.
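For what it's worth, the "50% faster" figure only holds if the stock score really is a bit under 5k. A quick sanity check (the 4700 below is an assumed stand-in for "under 5k by a bit"; actual stock 5900xt scores vary by system):

```shell
# Percent speedup from the approximate 3DMark03 scores quoted above.
stock=4700        # assumed: "under 5k by a bit"
overclocked=7000
# Integer percent gain: (7000 - 4700) * 100 / 4700 = 48
echo "$(( (overclocked - stock) * 100 / stock ))% faster"
```

So roughly 48-50% in the synthetic score; whether that translates to 50% more fps in an actual game is a separate question.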
 
dderidex said:
3dMark03. Yeah, yeah, it's 'not a valid game benchmark' - fine, whatever. It still shows how optimizations and tinkering with your system affect the score. My 'stock' score is under 5k by a bit, and I'm at 7k now (haven't updated sig in a test run or two). Ergo, 50% faster.

As to 'low res' - I'd hardly call 1024x768 "low res". Maybe this is just a matter of perspective, but I only have a 17" LCD - I can't even GO any higher than 1280x1024 if I wanted to. Using 1024x768 works *just fine* for me.

You will not get 50% more frames than a stock clocked 5900XT in HL2. Unless it's different from any game I've played with an overclocked card.

Yes, "low res" is subjective. 1024x768 is probably more used than any other res, but this is [H], and we typically have the upper scale of hardware. At least more so than the common PC folk. 1024 shows jaggies very badly. But, that too, is subjective.

IF you do play HL2, I would just run it in DX8. Other than the water, there isn't any really big difference to me. Of course you could always try to run DX9, and see if the decrease in frames is worth it.
 
LOL, just as I wrote:
"Anyway, the only frame rate difference you would recognize would be in a results chart from a benchmark, yapping about how a 9800 card is 10 frames faster than a 5900 card."

Thanks for proving my point. ;)

Moloch said:
Are you that stupid?
the FX cards suck in HL2.. get it through your skull?
http://www.xbitlabs.com/images/video/half-life/canals_1024_candy.gif
http://www.xbitlabs.com/images/video/half-life/town01_1024_candy.gif
And btw, check out this article for the IQ differences between different versions of DX.
http://gear.ign.com/articles/567/567437p1.html
 
DASHlT said:
OUCH even an X300SE is faster then your fx5900 in DX9 lol

HOLY SHIT that's pathetic!!

You know, I often wonder if NV3x as a whole wasn't one big "cheat". nVidia would do it. Think about it... far inferior image quality... lack of AA above 2x... if saying that nVidia just made a shit product wasn't so fun, I think the majority of us would be pondering the intentional dumbing-down of image quality for performance gain... after all, nVidia has done it before. ;)

Seriously, it's cool that you've OCed the 5900XT to 9800XT+ speeds (I've even considered it before.. but then was slapped out of it with the glaring IQ differences) but honestly, in the end, isn't Image Quality a bit more important than speed? I'd rather have my games run slow as hell than look like ass... not that my games run slow as hell (on my 9700 Pro) but seriously... the Image Quality is a huge difference. nVidia f@cked up these last two years, nobody can deny it. It's ok though, they're making up for lost time... I recommend nVidia cards at certain price points.

However, with that being said... NV3x sucks balls, always has and always will... it took them 2 years to get their product's overall IQ up to the standards that ATI set with its 9700 Pro, so unless you're talking NV40+ then just stop talking. You're screwed as far as the vast majority of DirectX 9 games go... accept it and get over it. =P
 
Ok, so Half-Life 2 is "generic", right? So it should be like, oh, I don't know... Serious Sam 2, or Call of Duty, right? Oh shit, I know, 95% of all First Person Shooters involve fighting enemies that spawn in specific spawn points and you're pretty dumb if you didn't know that already. What did you expect Half-Life 2 to be? UT? UT uses bots in completely different types of gameplay... it's not really the same genre. Tribes? Same thing.

Perhaps if you stopped being a complete idiot for a few moments in your life (I know, hard) then you'd be able to realize that what's revolutionary about Half-Life 2 is the environment, the atmosphere... the sound, the visual quality and clarity, the physics, the engaging interaction. Hell, you don't have to love Half-Life 2 or any other game to witness such aspects of it. Hell, at least Half-Life 2 wasn't Halo for Christ's sake! God knows how repetitious that level design was. =P
 
LOL, just as I wrote:

I find it hard to believe you or anyone else on this board isn't guilty of the same thing, and judging from your posts you seem to act in an extremely condescending manner toward those with 9800 Pro cards, and toward Valve especially. Stop being so quick and rabid to judge and snap back with snide remarks, and try to be objective.
 
Matt Woller said:
but honestly, in the end, isn't Image Quality a bit more important than speed? I'd rather have my games run slow as hell than look like ass... not that my games run slow as hell (on my 9700 Pro) but seriously... the Image Quality is a huge difference.
Actually, image quality IS important to me. That's why I went WITH nVidia.

Wife and I still play Grim Fandango and The Longest Journey. I still play Jane's F/A-18 and Starfleet Command 1. ATI can't anti-alias any of those games at all. Heck, ATI can't even RUN Grim Fandango in 3d mode - you have to use the software renderer to play that game!

ATI's support for older games is abysmal, and I WANT to have anti-aliasing in them and I WANT to run them in 3d mode. Which means nVidia is my only option if image quality is a consideration.
Oh shit, I know, 95% of all First Person Shooters involve fighting enemies that spawn in specific spawn points and you're pretty dumb if you didn't know that already. What did you expect Half-Life 2 to be? UT? UT uses bots in completely different types of gameplay... it's not really the same genre. Tribes? Same thing.
Who is being dense here?

I mean, first off, Tribes IS an exception - as are America's Army and Planetside; I could go on. You aren't fighting through 'levels' with enemy monsters spawning in the same spot each time, with only one way 'through' the level. Deus Ex also is an exception, then. And NOLF and NOLF 2. Etc.

Secondly, if 95% of the shooters 'involved the same thing' and Half-Life 2 is supposed to be 'revolutionary', can't you understand how someone would expect a 'revolutionary' game NOT to be a carbon copy of the other 95%, just with incremental improvements to graphics, sound, and physics?

that what's revolutionary about Half-Life 2 is the environment, the atmosphere... the sound, the visual quality and clarity, the physics, the engaging interaction. Hell, you don't have to love Half-Life 2 or any other game to witness such aspects of it. Hell, at least Half-Life 2 wasn't Halo for Christ's sake! God knows how repititous that level design was. =P

Yeah, but those AREN'T "revolutionary". It's not like Half-Life 2 is the first game to have 'engaging interaction' with NPCs. It's not like Half-Life 2 is the first to use 3d accelerated environments. It's not like Half-Life 2 is the first game to use any kind of physics calculations.

Arguably, all Half-Life 2 offers are hardly more than incremental changes to the same ol' formula we've seen before. I'm not disputing that it has the BEST graphics in a shooter to date, maybe even the most 'realistic' physics in a shooter, or 'engaging interaction'. Those are still all things that have been done before; HL2 just improved them.

Again, not 'revolutionary' - the game doesn't take any risks, doesn't try anything NEW.
 
SnakEyez187 said:
I find it hard to believe you or anyone else on this board isn't guilty of the same thing, and looking from your posts you seem to act in an extremely condescending manner to those with 9800 pro cards and towards Valve especially. Stop being so quick and rabid to judge and snap back with snide remarks and try to be objective
Going from 50fps to 38 should be pretty noticeable, and from 60 to 47 in the 2nd benchmark, and while doing less work.
40fps is borderline playable in a FPS; when a 10 fps difference is the difference between 50 and 40 fps, that's gonna be noticeable, and from 60 to 47, again noticeable.
He needs to cling to his shit product, because I know if he were to let go, he'd die, since he loves his FX so much.
Check out the IQ diff - lighting is across the board better in DX9 mode, and the IQ on the FX is nowhere near the IQ of a 9700/9800 board, so between the extra jaggies and the lighting, it should be easy to tell the difference, as well as the water reflections.
 
I'm trying to see how many more times Matt Woller can repeat himself with reworded sentences and make it look like he's saying something different. We got it Matt, you hate the FX series and your two year old 9700 is the greatest thing on the planet.
 
number69 said:
I'm trying to see how many more times Matt Woller can repeat himself with reworded sentences and make it look like he's saying something different. We got it Matt, you hate the FX series and your two year old 9700 is the greatest thing on the planet.
9700 is much better than even the 5950 ultra for DX9.. ;)
 
Moloch said:
9700 is much better than even the 5950 ultra for DX9.. ;)

lol. We............ know.

How about we reword that into "the 5950 is not as good as the 9700 in DX9 applications".

:p

The topic of the thread was forcing the FX to run HL2 in DX9 mode and it was answered by aZn_plyR in the third post into the thread. This turned into a debate that would have been germane over a year ago.

Some people here are taking the fact that the guy likes his 5900xt as a personal slight. If he likes his card let him like it. Calling him an idiot and dumb is considered flaming.
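For anyone landing on this thread with the same question: the standard Source engine switch (assuming the retail Steam release; exact menus may differ) is the `-dxlevel` launch option, which overrides the DX8.1 fallback Valve applies to NV3x cards by default:

```
rem Set in Steam (right-click the game -> Properties -> Launch Options)
rem or append to a shortcut target:
hl2.exe -dxlevel 90

rem Alternatively, from the in-game console (enable it with -console):
rem   mat_dxlevel 90
rem To go back to the NV3x default DX8.1 path, use -dxlevel 81.
```

Expect the frame-rate hit discussed above when forcing 90 on an FX card.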
 
number69 said:
Some people here are taking the fact that the guy likes his 5900xt as a personal slight. If he likes his card let him like it. Calling him an idiot and dumb is considered flaming.
Indeed, and if you were to search on my email address, you'd find what I've owned. And that's just on 3dMark03. Checking out my 3dMark2001 scores, I go all the way back to the Voodoo5 (with a few Radeon 7200s and 9100s thrown in for good measure).

So, it's not exactly like I don't know what I'm talking about. I've used LOTS of ATI cards, and I've used LOTS of nVidia cards, and a bunch of 3dfx cards, too.

I gotta say, the 6800nus were easily the finest cards I've ever owned, but I just couldn't justify the price tag for them. And given the choice of the 9800 Pro/XT or a 5900....well, I've *owned* several versions of each, and I know which is better in the games I play:

games_installed.jpg


Although this isn't *quite* a current list, you'll notice there is no 'Far Cry' or 'Doom 3' or 'Half-Life 2' there. As I've said, oh, a million times - if 'Far Cry' and 'Half Life 2' are the ONLY games you play, HELL YEAH ATI has the best cards for you. If you have a broader gaming scope than that...well....I do, and I've owned both, and I know which is better overall.
 
dderidex said:
That's a little unbelievable somehow.

I mean, the 5950 Ultras do *so well* in Doom3, and are certainly quite excellent in...well, hell, virtually everything else out there right now. I mean, in Doom3 they handle 'High Quality' settings with 1024x768, 4xFSAA, and 8xAniso and still keep the average over 30fps.

And 3dMark03 (while everyone proclaims it's not a "valid benchmark") was certainly the heaviest use of shaders abusing the GeForce FX line to date, yet my 5900xt @ 5950 is pushing 7k in it.

Hard to believe a card that can do both those things can't manage playable framerates in HL2 using NO anti-aliasing with the DX9 path.

Doom3 is OpenGL. 3dmark is pointless and useless.

FarCry is running in a DX8 path for your card too, IIRC.

The FX series was a mistake, IMHO...
 
dderidex said:
That's a little unbelievable somehow.

I mean, the 5950 Ultras do *so well* in Doom3, and are certainly quite excellent in...well, hell, virtually everything else out there right now. I mean, in Doom3 they handle 'High Quality' settings with 1024x768, 4xFSAA, and 8xAniso and still keep the average over 30fps.

And 3dMark03 (while everyone proclaims it's not a "valid benchmark") was certainly the heaviest use of shaders abusing the GeForce FX line to date, yet my 5900xt @ 5950 is pushing 7k in it.

Hard to believe a card that can do both those things can't manage playable framerates in HL2 using NO anti-aliasing with the DX9 path.

Have you run 3DMark 2005 at all? Hell, a 9600XT kicks a 5950 all over the place. :eek:
It's a FACT that the FX line is very poor at DX9, everyone knows it.
 
Badger_sly said:
LOL, just as I wrote:
"Anyway, the only frame rate difference you would recognize would be in a results chart from a benchmark, yapping about how a 9800 card is 10 frames faster than a 5900 card."

Thanks for proving my point. ;)

Moloch said:
Are you that stupid?
the FX cards suck in HL2.. get it through your skull?
http://www.xbitlabs.com/images/vide..._1024_candy.gif
http://www.xbitlabs.com/images/vide..._1024_candy.gif
And btw, check out this article for the IQ differences between different versions of DX.
http://gear.ign.com/articles/567/567437p1.html

1. It's 14 FPS faster, not 10.
2. The 5950U is in DX8 mode, not DX9.
3. It would be even worse at a higher res than 1024x768.
4. If you notice here the impact the 5900 series takes when running DX9, it's not even going to be close to a 9800 series card.
 
dderidex said:
Actually, image quality IS important to me. That's why I went WITH nVidia.


Right there you lost all credibility...the FX 5900 has the WORST image quality of all the nvidia cards out there....wow.....how pathetic lol

DASHlT
 