Mister E said: It was a shame that Doom3 wasn't coded more appropriately for Radeon cards too.

palabared said: lol funny he hasn't said a word since
tazz said: half life 2 looks sweet on my gt 6800 card. it looks even better than the pic that's on the first post

That's cause you're running it in full precision, not FP16.
starhawk said: p3n00b... if you recall... the fx5950's were neck-and-neck with the radeon 9800's... i personally planned on getting an albatron fx5950 ultra (second best of the pack) before the 6800's came out - and i'd be loath as h*** to go with anything worse than second-best.

Neck and neck on anything NOT DX9.
Met-AL said: That's cause you're running it in full precision, not FP16

FP16 shouldn't make that big of a difference. I'm a little disappointed on my Mobility 9600 because there's visible banding in some places, but it's not a big deal.
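For reference on the precisions being argued over here: FP16 carries a 10-bit mantissa, so near 1.0 the gap between adjacent representable values is about 2^-10 ≈ 0.001, versus 2^-16 for ATI's FP24 and 2^-23 for full FP32. A single FP16 step is still finer than an 8-bit framebuffer step (1/255 ≈ 0.004), which is why one pass of FP16 math usually looks fine; visible banding like the above tends to show up only after error accumulates across several dependent shader operations.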
Neck and neck on anything NOT DX9.

true.dat. The 9800 had far superior DX9 performance to the NV35/NV38. No point in arguing that.
gordon151 said: The point that a lot are missing (especially ruined, who is going around to multiple forums posting ridiculous statements on this) is that the performance boost is measured against the default performance after they have *forced* the game to run in DX9 mode, using the 9800 device ID. Performance in that mode was already unbearably low, most likely because many of the shaders were running in full precision, which is why forcing partial precision improved performance. The thing I'm wondering is how performance after doing that compares to the default NV3x mode for HL2. If it compares well and the IQ gain is worth the performance loss, then the question people should be asking is "why did they remove the mixed mode?".
{HLH} said: [screenshot comparison omitted]
dderidex said: Well, the funny thing is that Valve coded Half-Life 2 to use FP24 shaders all the time, every time. And it's really not needed. Nope. In fact, FP16 seems to do the trick all the time - as seen in that above pic. FP16 and FP24 are indistinguishable in Half-Life 2.
amdownzintel said: Sorry, I must be a nub, but how do you run HL2 in DirectX 9.0? Can anyone help me?

Add -dxlevel 90 to the HL2 shortcut, after the \hl2.exe" part.
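For example, the shortcut's Target line would end up looking something like this (the install path below is just a placeholder - point it at wherever your hl2.exe actually lives):

"C:\Program Files\Valve\Steam\SteamApps\half-life 2\hl2.exe" -dxlevel 90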
fallguy said: No, the funny thing is that Valve went with the DX9 minimum spec at the time, which was FP24. NV chose to use FP32 and FP16 with the FX cards. It's not Valve's fault for going with the spec. It was NV's (bad) decision to not use FP24. The FX cards are slower at DX9, get over it. Upgrade to a newer/better card.

No, it's Valve's bad decision to use FP24 all the time (sticking to the DX9 spec) but then not ALSO use 'partial precision hints' to let cards know when FP24 was not needed (and that's ALSO in the DX9 spec).
chrisf6969 said: Instead of putting a tag in there that forces FX cards to run DX8, why not properly code the game to use FP16 where 16 will suffice, instead of forcing 24, which makes all of NV's cards use FP32, which hurts them all, but MOSTLY the FX line, b/c they were fairly crappy cards!

The point of this thread is why HL2 runs slowly. Speculation about the motive Valve had is all over the place. So let me add my theory.
GabooN said: Ok, what we need here is some benchies and IQ testing. And what FP does ATi's 9800/X800 series use in HL2?

ATI does all PS2.0 calculations in FP24, IIRC.
fallguy said: No, the funny thing is that Valve went with the DX9 minimum spec at the time, which was FP24.

Actually, according to Microsoft, partial precision is part of the PS2.0 standard: http://msdn.microsoft.com/library/d...ixelShaders/Instructions/Modifiers_ps_2_0.asp - I posted that on the first page. _pp is only a hint and can be ignored, which is what ATI does when it runs a shader with that hint.
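To make the "hint" part concrete: in HLSL, declaring values as half is the standard way to get the compiler to emit the _pp modifier described on that MSDN page. A minimal sketch (my own illustration, not Valve's actual shader code) might look like this - an NV3x part can run the hinted math at FP16, while ATI hardware is free to ignore the hint and keep running at FP24:

sampler BaseMap : register(s0);

// Hypothetical ps_2_0 pixel shader. 'half' asks the compiler to tag the
// generated instructions with _pp (partial precision); hardware that
// gains nothing from the hint simply ignores it.
half4 TintPixel(float2 uv : TEXCOORD0) : COLOR
{
    // uv stays full-precision float: texture coordinates are exactly
    // where FP16 visibly falls apart.
    half4 albedo = tex2D(BaseMap, uv);   // plain color math is FP16-safe
    half3 tinted = albedo.rgb * half3(0.9, 0.9, 1.0);
    return half4(tinted, albedo.a);
}

Compiled against the ps_2_0 target, the color math comes out as _pp-tagged instructions in the assembly; the same shader without the half declarations runs at full precision everywhere, which is effectively what the thread says HL2 ships with.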
^eMpTy^ said: It's Valve's fault for not delivering the best gaming experience for nVidia users. While they were spending all that time making "ATi levels" and bragging about the performance of ATi cards, they could have been flipping some switches to try to boost performance on the FX series, which is an extremely popular line of cards. But hey, if marketing dollars are more important to you than your customers getting the most out of your game, then go ahead... stick "ATi" all over the box and CD and include vouchers and coupons for ATi cards all over the place...
ajm786 said: I think it is now apparent that this is beside the point for the 6800 series. The 6800s run in FP32, where they have strong performance. The whole point of the argument is that the FXs run very poorly in FP32, which they default to since they can't run FP24.
So again, I think it's totally beside the point for the 6800 series. For those of us who have it, carry on.
fallguy said: Yeah, dang that Valve for going with DX9 specs.
mohammedtaha said: Maybe they shoulda made Doom 3 more forgiving for the ATi cards... maybe, just maybe, the same thing was done on Doom 3... they intentionally made X800 cards run slower... but that doesn't matter, because ATi people don't care... their cards run the game WELL...
dderidex said: Look, I keep bringing this point up, and you keep ignoring it.
VALVE DIDN'T FOLLOW THE DX9 SPEC!!!
So they went with FP24, big deal, DX9 spec ALSO calls for using partial precision hints when possible, and they DIDN'T do that.
AGAIN: VALVE "PICKED AND CHOSE" WHICH PARTS OF THE DX9 SPEC TO FOLLOW TO HURT THE FX CARDS THE MOST.
dderidex said: Doom3 just followed the OpenGL spec, they didn't do anything WITH it to intentionally harm ATI cards... it's just that ATI cards suck in OpenGL. (Seriously - in ALL OpenGL titles they are slower: Quake 3, Call of Duty, etc. ATI is slower than nVidia in anything OpenGL.)

Actually, there's a thread over at Beyond3D which explains how Doom3 was coded for cards which didn't have strong FPU performance (read: the FX series), and that approach actually hurts performance on both the 6800 and the X800.
DaveBaumann said: Now, given that Valve had already stated that the low-end FX series would be treated as DX8, the only boards that they intended the mixed mode to operate on would be the 5800/5900/5950. If you take a look at the Steam video card stats, this constitutes 2.55% of their install base. Given the time for coding and validation within the game environment, especially in a title that has already slipped 14 months further than it should, are the further coding and support requirements worth it for 2.5% of your userbase?

if valve increased performance with nv3x based cards, i'd be willing to bet that that 2.5% would jump up significantly, because more people would want to try it out, knowing that it would run somewhat decently.
DaveBaumann said: Given the time for coding and validation within the game environment, especially in a title that has already slipped 14 months further than it should, are the further coding and support requirements worth it for 2.5% of your userbase?
(cf)Eclipse said: i'm quite curious as to how much this tweak will help out the nv4x based nvidia cards, since they are already superior to the nv3x in every way, and not terribly far behind ati in most benchmarks.
mohammedtaha said: Maybe they shoulda made Doom 3 more forgiving for the ATi cards... maybe, just maybe, the same thing was done on Doom 3... they intentionally made X800 cards run slower... but that doesn't matter, because ATi people don't care... their cards run the game WELL...

You do notice that some companies are selling Doom 3 bundled with their nVidia cards, right? So why can't ATi do the same?

Keep this debate smart... and stop attacking the wrong people... sponsoring is one thing and cheating is another...
ajm786 said: How would this help out any of the NV4X based video cards? We don't want to run HL2 in FP16; we want to run it in FP32.

but if you really don't need fp32... even if the nv40 is way stronger with fp32 than the nv30/35 is, fp16 is still faster. if it can be proven that there is no discernible image difference, why not do it?
fallguy said: "when possible"? I guess you're a coder now, right? Just get over the fact the FX cards are not very good compared to their ATi counterparts in DX9 games, and upgrade or stop crying about it every week.

Actually, yes, I am.
(cf)Eclipse said: if valve increased performance with nv3x based cards, i'd be willing to bet that that 2.5% would jump up significantly, because more people would want to try it out, knowing that it would run somewhat decently.
kind of a catch-22 there huh?
(cf)Eclipse said: but if you really don't need fp32... even if the nv40 is way stronger with fp32 than the nv30/35 is, fp16 is still faster. if it can be proven that there is no discernible image difference, why not do it?