Valve sucks


dderidex
First (before I explain how this is relevant to the "video card" thread), guess what graphics card this is rendered on? It's not my pic (I'm just hosting at the moment). Do pay attention to the FPS counter in the lower right.

[Attached image: guess_who.jpg]
 
Another hint....some tweak was done....

Using the Valve CS:Source stress test, this tweak boosted performance on a certain card from 36 FPS to 64 FPS with virtually no image quality loss.

Guess the card yet?

OR the tweak?
 
heh, can this tweak be performed on other nvidia cards? Not that I'd need it, but it'd be nice.
 
Oh, it's pretty sad really.

Basically, some guys on Guru3d figured out what Valve did to cripple nVidia cards.

First off, you need 3dAnalyze. I'm assuming everyone knows that you can force HL2 to run in DX9 mode on FX cards, right? Only, you get artifacts in the water and other areas?

Well, that's pretty easy to fix. Just have the 3dAnalyze util report your card as an ATI Radeon instead of a GeForce FX.

*taddah* All the artifacts go away, and you get true DX9 reflections!

Okay, but there IS a performance hit doing that. How to get around that?

Well, the funny thing is that Valve coded Half-Life 2 to use FP24 shaders all the time, every time. And it's really not needed. Nope. In fact, FP16 seems to do the trick most of the time - as seen in the pic above. FP16 and FP24 are indistinguishable in Half-Life 2 for the most part.

Again, using 3dAnalyze you can test this. It is capable of forcing a card to use only FP16 shaders no matter what is requested. You'll see virtually no image quality difference doing that - just a HUGE performance boost. Why? Well, because while FP16 is all that Half-Life 2 *needs* almost all the time, if they let the GeForce FX cards do THAT, they might have been competitive! So, instead, they forced full precision in every shader op (unneeded), which caused the GF-FX cards to render DX9 mode in FP32 all the time, with the obvious associated performance hit.
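
If you're wondering why FP16 is "enough" most of the time, it comes down to mantissa bits versus what an 8-bit framebuffer can even display. Here's some quick back-of-the-envelope math - purely illustrative, assuming the usual s10e5 / s16e7 / s23e8 layouts for FP16 / FP24 / FP32:

```cpp
#include <cmath>
#include <cstdio>

// Rough step size near 1.0 for each shader precision, compared to the
// smallest step an 8-bit framebuffer can display (1/255).
int main() {
    const double fp16_step = std::ldexp(1.0, -10); // 10 mantissa bits -> ~0.00098
    const double fp24_step = std::ldexp(1.0, -16); // 16 mantissa bits -> ~0.000015
    const double fp32_step = std::ldexp(1.0, -23); // 23 mantissa bits -> ~0.00000012
    const double lsb_8bit  = 1.0 / 255.0;          // ~0.0039

    std::printf("FP16 step near 1.0: %g (8-bit color LSB: %g)\n", fp16_step, lsb_8bit);
    std::printf("FP24 step near 1.0: %g\n", fp24_step);
    std::printf("FP32 step near 1.0: %g\n", fp32_step);

    // FP16's step is already ~4x finer than anything an 8-bit display can
    // show, which is why simple color math looks identical at FP16. Long
    // shader chains (water, texcoord math) accumulate error, and that's
    // where the occasional banding from forcing FP16 everywhere comes from.
    return 0;
}
```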

Try it yourself. The link to the article is here. Download 3dAnalyze, and follow these instructions:
Presi said:
Open it and follow the numbers:
1. select HL2.exe file in half-life 2 folder
2. select any file inside the folder half-life 2\bin
3. select Steam.exe
then check these options:
- Under the section Pixel and Vertex Shader: FORCE LOW PRECISION PIXEL SHADER
- Under the section Remove stuttering: PERFORMANCE MODE
- on the bottom left: FORCE HOOK.DLL

If you haven't changed the file dxsupport.cfg with the method described at the beginning of this thread, you can obtain the same result by typing the ATI Vendor and Device ID into the DIRECTX DEVICE ID'S section; there are just two devices though.
....
In the end 3D Analyze gives me an error, CREATEPROCESS FAILED, but I launch HL2 anyway and the water looked awesome - awesome detail - and I noticed a boost in performance too, around 20-30%, which allowed me to play the WATER HAZARD level with these settings: 1024x768, everything max, water reflection set to ALL, 2xAA, 4x anisotropic, with framerates ranging from 40 to >150.

Amazing, huh?
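
For the curious, the device-ID part of that isn't magic. Conceptually, a tool like 3dAnalyze (via that FORCE HOOK.DLL option) just intercepts Direct3D's adapter query and reports ATI's vendor ID instead of nVidia's, so the game picks its Radeon code path. A minimal sketch of the idea - this is NOT 3dAnalyze's actual code, the DLL-injection/hooking machinery is left out, and the Radeon device ID shown is an assumption:

```cpp
#include <windows.h>
#include <d3d9.h>
#include <cstring>

// Hypothetical wrapper: if IDirect3D9::GetAdapterIdentifier is routed through
// this instead of going straight to the driver, the game believes it is
// talking to a Radeon and enables its full DX9 path.
HRESULT SpoofedGetAdapterIdentifier(IDirect3D9* d3d, UINT adapter, DWORD flags,
                                    D3DADAPTER_IDENTIFIER9* id)
{
    HRESULT hr = d3d->GetAdapterIdentifier(adapter, flags, id);
    if (SUCCEEDED(hr)) {
        id->VendorId = 0x1002;                      // ATI's PCI vendor ID
        id->DeviceId = 0x4E48;                      // assumed Radeon 9800-series device ID
        std::strncpy(id->Description, "RADEON 9800 PRO",
                     sizeof(id->Description) - 1);  // cosmetic; some games check the string
    }
    return hr;
}
```

Which is presumably also why editing dxsupport.cfg (keyed on vendor/device IDs) gets you a similar result without any hooking at all.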

AND NOW, AN EDIT:
With more data!


Frosteh said:
While writing that marathon I decided to try out some of this stuff.

I downloaded the program, installed it and tried it out. It wouldn't load HL2 itself; however, testing showed that it was in fact affecting HL2 when it was loaded and HL2 was run from Steam.

GameSettings:
1024x768
Maximum everything
DX9.0 enforced with fixed bugs (using method of ATI 9800 product ID)


Driver Settings:
Driver Version 61.77
2xQAA (acts like 4xAA with the speed of 2xAA - AA that I have grown to love)
4xAF
High Quality
V-sync: OFF
Trilinear Optimisations: OFF
Anisotropic Filtering: OFF

http://www.hackerz.tc/hzinternal/tot/HL2-PixelShader-16bit-and-32bit.jpg

The picture is around 700k, and is a side-by-side comparison of the two 1024x768 screenshots, added together with Adobe Photoshop, saved to JPG with maximum quality (100) and no other alterations made.

The "cl_showfps 1" command was used to display the current average FPS and the screenshots were taken with the in game screenshot capture.

32-bit is on the left, 16-bit is on the right; frame rates are roughly 29 FPS and 41 FPS respectively, and performance was obviously a lot better in game with 16-bit forced. While this area ran particularly badly compared to most other areas, I considered 30 FPS playable, but with 41 FPS I could easily up the resolution one step to 1280x960.

Machine specs for anyone who missed them:

XP3000 @ 11.5x200
1Gb PC3200 Ram @ 200
FX 5900 Ultra Modded to FX 5950 Ultra, further overclocked to 550/975

Let me know if the screenshot method is not accurate enough; if you guys want it done again with other methods it will have to wait until tomorrow, I'm afraid.

Now, this tweak is GREAT for proving what kind of performance hit Valve kindly provided GeForce FX users....but it's obviously not usable to play the game with. Why?

1) Well, first, 3dAnalyze is simply not stable enough to use as a workaround to play the whole game
2) In the case of some specific shaders (some windows, a few surfaces), there ARE visible artifacts - color banding - as a result of forcing partial precision. Is this a problem? Not really - the whole point of this observation is that Valve should have allowed partial precision MOST of the time, not ALL of the time. GeForceFX cards have a more-than-is-needed full precision mode (which they are stuck with running full-time currently) that would be perfectly suitable for the few times full precision is actually NEEDED in the game.

So, in short, Valve handicapped the GeForce FX cards by 'picking and choosing' which parts of the DX spec to follow - they chose not to implement the partial precision hints that the spec allows for, that are obviously usable in almost every case in the game, and that would have made the GeForce FX cards competitive!
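
To make the 'partial precision hints' point concrete: in HLSL a shader author literally just writes `half` instead of `float` wherever the extra precision isn't needed, and when compiling for ps_2_0 those become the _pp instruction modifiers the spec provides. A hypothetical before/after - NOT Valve's actual shader code, just a sketch of the technique:

```cpp
#include <cstdio>

// The same made-up "fog over a texture" pixel shader, written twice.
// Compile each with fxc /T ps_2_0 and compare the assembly listings.

// Version 1: everything float -> every instruction runs at full precision
// (FP24 on a Radeon, FP32 on a GeForce FX).
static const char kFullPrecision[] = R"(
sampler2D baseTex : register(s0);
float4 fogColor;
float  fogAmount;

float4 main(float2 uv : TEXCOORD0) : COLOR
{
    float4 c = tex2D(baseTex, uv);
    return lerp(c, fogColor, fogAmount);
}
)";

// Version 2: color math declared as half -> the compiler can emit _pp ops
// (texld_pp, lrp_pp, ...), which an FX card runs at FP16 while a Radeon
// still runs at its native FP24. Texture coordinates stay float on
// purpose - that's one place where low precision genuinely shows.
static const char kPartialPrecision[] = R"(
sampler2D baseTex : register(s0);
half4 fogColor;
half  fogAmount;

half4 main(float2 uv : TEXCOORD0) : COLOR
{
    half4 c = tex2D(baseTex, uv);
    return lerp(c, fogColor, fogAmount);
}
)";

int main()
{
    std::printf("%s\n%s\n", kFullPrecision, kPartialPrecision);
    return 0;
}
```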
 
Ouch, tsk tsk Valve. Anyone going to do some in depth benches / results on this? Brent? :p
 
lol i called this so long ago

funny even at FP32 all the time, 6800 still kicks this game's ass. But it would be even faster if Valve used good shader programming and put in partial precision calls where appropriate.
 
hmmm, even as a 6800 GT owner I need some side by side screenshots before I jump on this bandwagon. Maybe I'll have time to test it out, but it is sad if true, considering that the 6800 GT still ties with the X800 Pro.
 
Hey Kyle and Brent are you going to investigate this? I would like a second opinion on this.
 
Top of the line video cards from both companies are supposed to be neck-and-neck with each other this round of the fight anyways. Good discovery though. :D
 
tranCendenZ said:
lol i called this so long ago

funny even at FP32 all the time, 6800 still kicks this game's ass. But it would be even faster if Valve used good shader programming and put in partial precision calls where appropriate.
Yeah, that's the real kicker.

All 3dAnalyze can do is force FP16 *all the time*. Although all the screenshots taken to date of it show *no* image quality difference....you gotta wonder.

All that means is that it's a damn shame Valve didn't code this right. Let's assume we DO somewhere find some artifacts by forcing it to FP16 (none found yet, just saying). Let's say as much as 5% of the rendering needs at least 24-bit floating point instructions to actually render properly.

If they'd used _pp hints for the REST, then 95% of the time, the FX cards would be running FP16, and 5% of the time running FP32 (instead of 100%, as they are now). There would be literally *NO* image quality difference in this hypothetical situation - anywhere, at all - and the performance would still be 95% of the boost we are seeing by forcing it to always use FP16.
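
Quick sanity check on that, using the 36 and 64 FPS numbers from the first post (the 95/5 split itself is hypothetical - nobody has actually counted HL2's shaders):

```cpp
#include <cstdio>

// Frame time is what adds up, not FPS. If 95% of the shading work ran at the
// FP16 rate (64 FPS) and 5% at the FP32 rate (36 FPS), the blended frame time
// estimates what proper _pp hints would buy an FX card.
int main()
{
    const double fps_fp16 = 64.0, fps_fp32 = 36.0;
    const double t_mixed  = 0.95 / fps_fp16 + 0.05 / fps_fp32; // seconds per frame
    std::printf("Mixed-precision estimate: %.1f FPS\n", 1.0 / t_mixed); // ~61.6 FPS
    return 0;
}
```

So even under that worst-case 5% assumption, you'd keep over 90% of the gain from forcing FP16 everywhere, with no image quality difference at all.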

Valve could have done this, and then the FX cards would be running in DX9 mode perfectly competitively instead of using DX8 mode. I'll grant the FX 5900s would probably still lose to the Radeon 9800 Pros and XTs....but we would be talking about playable framerates still, and identical image quality - rather than the 9800s slaughtering the FX cards we 'see' now.

(Course, it's *entirely* possible Half-Life 2 really doesn't EVER need more than FP16, and forcing it to use that 100% of the time will come up with no artifacts at all. In which case....shame on Valve, seriously!)
 
The 9700-9800 line and relatives do 24-bit precision, while the FX line does 16 or 32 depending.

so yes.. it's official. By forcing the program to do 24 and not letting the card decide, it's like Valve locked in ATI and excluded nVidia. Can you say blackmail for not bidding the highest?

haah what bs.

fuck valve..and fuck steam.

too bad for the fx owners..

but the 6800 or 6600 owners should be just fine.
 
Unfortunately, because the competition is so tight between both vendors, I personally think we are gonna start seeing more instances like this :( It really is not on when companies start to disadvantage one card in preference to the other :mad:
I just REALLY hope to god that game devs don't adopt this trend more often.
 
Mister E said:
It was a shame that Doom3 wasn't coded more appropriately for Radeon cards too.

Er, in what way?

As I understand it the performance delta between 6800 and X800 in D3 is more to do with ATI's inferior OpenGL implementation, rather than how D3 is coded.

Or are you suggesting otherwise?

Chip
 
He's just trying to cover for ATi having horrendous OpenGL support at the time. D3 uses the same code path for both NV and ATi, which is something he should have known by now. :rolleyes:

Thanks for the tip-off, Dderidex - really makes you wonder what the fuck Valve was doing with all that time. :rolleyes: Maybe Valve will now release a patch to fix the numerous bugs as well as their intentional crippling of NV cards..... yeah right. Not after they took all that Red ATI money. ;)

So has anyone run any extensive benchmarks, as well as played through the game fully, to make sure there weren't any problems?

Honestly I would feel sort of conflicted - this is pretty much the same thing as the "Humus tweak" (except that NV didn't release the info this time), since it gives a performance increase while it "lowers" image quality - but I really don't like Valve, nor do I respect them enough to worry about it myself. This isn't really like the whole D3 thing anyways. ;)
 
If Valve did really tweak to improve performance for ATi .. then that's their own decision .. they can do whatever they want to improve performance for the company that sponsors them .. or did u miss the ATi sign on the CD or whatever u used to purchase the game ...

If this is indeed what happened .. yes I do agree .. Shame on Valve .. but U CAN'T say they suck ... cause HL2 is an AMAZING game ... I love the music .. they brought back the feel of the old HL to the new one with amazing physics and graphics ..

there's no other game out there that can prove itself against the HL2 engine .. not even D3 ... cause all I've seen in D3 is shadows and skin texture and lighting ... everything else was dark or hidden in shadows ... oh ..

There's no war to start over the different engines ... it's obvious which one is better NOW ...
 
mohammedtaha said:
If Valve did really tweak to improve performance for ATi .. then that's their own decision .. they can do whatever they want to improve performance for the company that sponsors them .. or did u miss the ATi sign on the CD or whatever u used to purchase the game ... ...

Can't you read....they didn't just tweak performance on ATI cards, they crippled performance on nVidia cards. That's just low.

On a side note, I think this thread should be stickied and renamed to something like "Half-Life 2 Performance Tweak for nVidia Cards" so people actually know what it's about.
 
defiant said:
Can't you read....they didn't just tweak performance on ATI cards, they crippled performance on nVidia cards. That's just low.

On a side note, I think this thread should be stickied and renamed to something like "Half-Life 2 Performance Tweak for nVidia Cards" so people actually know what it's about.

Er...can you read?

First of all, you're assuming something that hasn't been proven yet, by you especially or anyone for that matter.......

Also, I'm in agreement that they didn't "cripple" performance on nVidia cards if this does turn out to be true. They optimized for ATI, just like software companies do for Intel processors. So you're saying that if X software company makes their product work better if a processor uses SSE3 (which is done, btw) that means they're "crippling" Athlon 64 cpu's? Didn't think so.

I think you're right about the sticky part, though, if this does turn out to work.
 
Legend said:
Er...can you read?

First of all, you're assuming something that hasn't been proven yet, by you especially or anyone for that matter.......

Also, I'm in agreement that they didn't "cripple" performance on nVidia cards if this does turn out to be true. They optimized for ATI, just like software companies do for Intel processors. So you're saying that if X software company makes their product work better if a processor uses SSE3 (which is done, btw) that means they're "crippling" Athlon 64 cpu's? Didn't think so.

I think you're right about the sticky part, though, if this does turn out to work.

This in no way helps out ATi, as they can run variable FP also. They hindered the lower-end graphics users because they made it run at 32 all the time. This isn't a proprietary issue (which you're saying it is with your instruction example), it's an issue that Valve basically went out of their way (FP32 I assume would take longer than variable 16-32.. or do they have to write code sets for both?) to cripple the FX series when they didn't have to.
 
Legend said:
Er...can you read?

First of all, you're assuming something that hasn't been proven yet, by you especially or anyone for that matter.......

Also, I'm in agreement that they didn't "cripple" performance on nVidia cards if this does turn out to be true. They optimized for ATI, just like software companies do for Intel processors. So you're saying that if X software company makes their product work better if a processor uses SSE3 (which is done, btw) that means they're "crippling" Athlon 64 cpu's? Didn't think so.

I think you're right about the sticky part, though, if this does turn out to work.
There is a reasonable expectation in this industry that all video cards are supported the best they can. Sure, in some cases extra features that may only exist on one brand are utilized much like SSE2/3 with processors. However, when you have to tell the game that your card is some other card in order to get the game to run properly, there is a problem. If this problem is true, I really hope Valve issues a patch to fix it. I am going to be optimistic here and think they will.
 
mohammedtaha said:
there's no other game out there that can prove itself against the HL2 engine .. not even D3 ... cause all I've seen in D3 is shadows and skin texture and lighting ... everything else was dark or hidden in shadows ... oh ..

There's no war to start over the different engines ... it's obvious which one is better NOW ...

The "war" over which engine is technically better was over before it was started... and the winner was Doom 3. Half-life 2 looks excellent... because Valve has amazing texture artists. id Software's just don't compare. If you search around (I believe I saw it in these forums), there is a thread wherein someone used the respective editors for each game to remove textures from HL2 and place them into a D3 level. The resulting screenshots (to me, at least) look phenominal: D3's engine lighting HL2's world.

Don't get me wrong, I'm not saying that Source is a bad engine. It's just that a lot of people are saying things about the engines when they are really commenting on things not related to the actual engines. HL2 is, imo, a much better game than D3. But D3's engine is capable of much more, I think.
 
So, it's Valve's fault that they used the DX9 spec? FP16 is not DX9 spec. FP24 or HIGHER is DX9 spec. FP16 is LOWER than FP24.

It isn't like anyone was forcing you to buy an FX card. It was well known that they sucked in this respect and in shader performance. There was no conspiracy to pull a fast one. Go back a year ago and these boards were full of information pointing to the fact that the FX line had problems..big problems. You chose not to take that advice and now you're mad that you have to run a hack to get your card to run a game in DX9 mode.
 
DanK said:
The "war" over which engine is technically better was over before it was started... and the winner was Doom 3. Half-life 2 looks excellent... because Valve has amazing texture artists. id Software's just don't compare. If you search around (I believe I saw it in these forums), there is a thread wherein someone used the respective editors for each game to remove textures from HL2 and place them into a D3 level. The resulting screenshots (to me, at least) look phenominal: D3's engine lighting HL2's world.

Don't get me wrong, I'm not saying that Source is a bad engine. It's just that a lot of people are saying things about the engines when they are really commenting on things not related to the actual engines. HL2 is, imo, a much better game than D3. But D3's engine is capable of much more, I think.

That really depends on what you're thinking about when you talk about the WHOLE engine. I really, really love the way Half-Life 2 looks graphically, but I think that INDOORS Doom 3 probably has the better engine; outdoors I don't know, since D3 has none.

BUT:
If you think about the physics engine, Source can't be matched by D3.
When you think about the facial expressions, again Source can't be matched, and this time by anyone.

Obviously, not everyone wants to see or even cares about these things, but for those who do, like me, Source is a better engine than D3. Not that I didn't like the engine of D3 (the engine, because D3 itself sucked for me), but Source is better in my opinion.

P.S.: Source, as far as I can see, is way more forgiving while still producing a good image.
I can run HL2 really fine at 1280 on my 9700 PRO, but I can't even get good fps at 1024 in D3. :(
 
Met-AL said:
So, it's Valve's fault that they used the DX9 spec? FP16 is not DX9 spec. FP24 or HIGHER is DX9 spec. FP16 is LOWER than FP24.

It isn't like anyone was forcing you to buy an FX card. It was well known that they sucked in this respect and in shader performance. There was no conspiracy to pull a fast one. Go back a year ago and these boards were full of information pointing to the fact that the FX line had problems..big problems. You chose not to take that advice and now you're mad that you have to run a hack to get your card to run a game in DX9 mode.
Read the post. First, he had to tell HL2 he had a different card to get it to run without artifacts. That is the real issue here, as of course the Radeon card he was pretending to be would use FP24, as that is its native mode. However, the parent found a perfectly acceptable way to run DX9 on an FX card, and all you can say is "he shouldn't have bought that card."
 
There is no comparison to D3. That game is coded very efficiently and runs the same path on both cards. It just comes down to ATI's OpenGL support sucking compared to Nvidia's.

Games are supposed to be coded as efficiently as possible and only use resources when needed. That would be like a game developer forcing SM3.0 at all times when only ps1.1 is needed.

Valve purposely coded everything to 24fp, which is not really that much better than 16fp. You can notice a difference if you compare pixels, but not really from gameplay. The main difference is on FX cards... 24fp is bumped up to 32fp, because they don't have a 24fp mode. They could have coded most of the game in 16fp, and only used 24fp (32) where needed. For efficiency's sake, like many games do.

More people could probably be playing with higher resolutions, etc... It should help ATI cards (and Nv 6x00 cards) some, and Nv FX 5x00 cards A LOT.
 
True, there are areas that I didn't consider, such as facial animation, that Source wins hands down.

As far as Source vs. D3 performance, Source was made to run on ATi's 9xxx series, so it's not surprising that it works well on that hardware.

I think both engines will give us some great games in the future, though.

On an unrelated note, I thought DX9 allowed for partial precision shaders in situations where it would help performance, so long as the original shader length was maintained (i.e., by using temporaries)? Can someone provide a link to relevant documentation? (I'd look for it, but I have class! gotta go.)
 
Funny, everything I throw at my X800XT PE runs super sweet :D - I'm sure if I had a 6800 Ultra I'd be saying the same.

Get over it and just play the f'n game :p
 
Valve did have a mixed mode for FX cards. They showed the mixed mode results during the Shader Day event last year. They also showed the results of running it in full DX9 mode. In the end they said it was faster to run HL2 on an FX card in DX8 (8.1) mode. That's something we all knew about LAST year. Why is this news now? Are you sure when you force something to run you're not breaking something else? I mean, you force it to use an ATI ID and then force 3D Analyze to run something? Are you 100% sure that's not breaking something in the chain? Are those results valid? Are you 100% sure the IQ is the same? But no, instead of trying to make 100% sure on all of this you start a post. Nice :confused:
 
obs said:
Read the post. First, he had to tell HL2 he had a different card to get it to run without artifacts. That is the real issue here, as of course the Radeon card he was pretending to be would use FP24, as that is its native mode. However, the parent found a perfectly acceptable way to run DX9 on an FX card, and all you can say is "he shouldn't have bought that card."

No, you read the posts. The problem is that DX9 requires FP24. Since the FX can do 16 or 32, it has to do 32, which kills the performance. The hack makes the Source engine render in FP16 which is not DX9 spec. Complaining that Valve stuck to the DX9 spec and saying they suck is stupid. The problem is not with Valve. It is with the hardware of the FX cards. Same goes with the D3 engine on ATi cards.
 
DanK said:
The "war" over which engine is technically better was over before it was started... and the winner was Doom 3. Half-life 2 looks excellent... because Valve has amazing texture artists. id Software's just don't compare. If you search around (I believe I saw it in these forums), there is a thread wherein someone used the respective editors for each game to remove textures from HL2 and place them into a D3 level. The resulting screenshots (to me, at least) look phenominal: D3's engine lighting HL2's world.

Don't get me wrong, I'm not saying that Source is a bad engine. It's just that a lot of people are saying things about the engines when they are really commenting on things not related to the actual engines. HL2 is, imo, a much better game than D3. But D3's engine is capable of much more, I think.

I think you need to clarify your perception of what an "engine" is. I guess your definition of "engine" excludes everything that HL2 does better than D3, i.e. textures, water, physics simulation, AI, long-distance rendering... everything but the lighting of course, since D3 is nothing but lighting effects.
 
Met-AL said:
So, it's Valve's fault that they used the DX9 spec? FP16 is not DX9 spec. FP24 or HIGHER is DX9 spec. FP16 is LOWER than FP24.
Uh, actually DX9 does allow FP16 as part of the spec for PS2.0 (_pp = partial precision, which is what 3D-Analyze is forcing): http://msdn.microsoft.com/library/d...ixelShaders/Instructions/Modifiers_ps_2_0.asp
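
For reference, here's what that modifier looks like at the ps_2_0 assembly level - a made-up fragment just to show the syntax, not anything pulled from HL2:

```cpp
#include <cstdio>

// _pp is a per-instruction modifier in ps_2_0 assembly (the MSDN page above
// documents it). Each marked instruction may run at reduced (>= FP16)
// precision; unmarked ones stay at full precision.
static const char kAsm[] =
    "ps_2_0\n"
    "dcl t0.xy\n"             // incoming texture coordinate
    "dcl_2d s0\n"             // 2D sampler
    "texld_pp r0, t0, s0\n"   // texture sample at partial precision
    "mul_pp r0, r0, c0\n"     // modulate by a constant, also partial precision
    "mov oC0, r0\n";          // write the final color

int main()
{
    std::printf("%s", kAsm); // feed to D3DXAssembleShader to verify it assembles
    return 0;
}
```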

nvidia will probably just force the option in future drivers anyways through app detection. Nice trick to see though and interesting that Valve intentionally tanked performance.
 