Left 4 Dead Gameplay Performance and IQ @ [H]

I played it with 8xQ AA and 16x AF and it was very smooth on my PC at 1680x1050.
The atmosphere from the Source engine leaves a lot to be desired, though; having played Gears of War and Crysis, I've gotten too used to good lighting and shadowing.
 
I play at 1680x1050 with everything maxed in game, transparency supersampling enabled in the NVIDIA Control Panel, 8x AA and 16x AF, and 99% of the time it doesn't go below 60 FPS. The lowest I've seen is 40 FPS, but that's very, very rare. I'm getting my 1080p LCD soon and can't wait to play on it.
 
Yeah, the Source engine was never known for very advanced graphics. They can talk about HDR all they want, but in the end it's just a 2D animated glow effect.

It sucks a bit that in L4D and the latest build of the Source engine (also used for Episode Two and Day of Defeat: Source), only your flashlight casts dynamic shadows. Nobody else's does, which is why things don't quite look so amazing. If at least explosion, flashlight and gunshot lights cast dynamic shadows, the games would look a lot nicer.

When it comes to Gears of War, though, it looks good because of the great normal maps they used for all the models in the game, not because of advanced lighting. That engine crawls as soon as you turn on a few lights that cast dynamic shadows, so it's a lot like Source in that respect.
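For what it's worth, the "2D glow" being described is essentially a bright-pass-plus-blur post-process over the rendered frame. Below is a minimal sketch of that idea in Python/NumPy; the threshold, blur radius and sample image are arbitrary illustration values, not anything taken from Source's actual HDR/bloom implementation.

# Minimal sketch of a bright-pass + blur "glow" post-process, the kind of
# screen-space effect the poster is describing. Threshold and blur width
# are arbitrary illustration values, not taken from Source itself.
import numpy as np

def bloom(frame: np.ndarray, threshold: float = 0.8, radius: int = 4) -> np.ndarray:
    """frame: HxWx3 float image in [0, 1]. Returns the frame with a glow layer added."""
    luminance = frame.mean(axis=2)                     # crude brightness estimate
    bright = np.where(luminance > threshold, luminance, 0.0)

    # Separable box blur: smear the bright pixels horizontally, then vertically.
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    blurred = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, bright)
    blurred = np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, blurred)

    # Add the blurred highlights back onto every colour channel and clamp.
    return np.clip(frame + blurred[..., None], 0.0, 1.0)

# Example: a dark frame with one bright window ends up with a soft halo around it.
frame = np.zeros((64, 64, 3))
frame[28:36, 28:36] = 1.0
print(bloom(frame)[24, 30])   # pixels near the bright patch pick up some glow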
 
L4D is not, by any stretch of the imagination, the DX10 lighting goodness that Clear Sky achieves in 0.2 FPS mode. On the other hand, L4D is a HELL of a lot of fun, and there's really something to be said for that. I'd much rather more developers worked on making games fun and less on uber graphics of doom.

That being said, I wouldn't mind better graphics, but I don't really want to see another Assassin's Creed: insane graphics, a giant open sandbox, and... about two hours' worth of gameplay before it starts redressing and repeating itself.
 
Agreed. I would rather have passable graphics with amazing gameplay than amazing graphics with passable gameplay.
 
I just bought an 8800 GS (overclocked to GT speeds) and this game is SMOKING! No lag whatsoever at super high settings. Not bad for a $50 card (after rebate), lol.
 
I personally love the Source engine. After all these years it has still proven to be a beautiful and downright powerful graphics engine. Best of all, it's so easy on the hardware, the way ALL PC games should be.

L4D is beautiful even though it's ugly. Maybe that's because I'm a filmmaker and I love the look and feel (not literally) that film grain produces. When Mass Effect used film grain, it was very gimmicky and didn't add to the overall experience; it was just thrown in there for that "wow, cool" factor. L4D, however, applies the gritty 16mm film look perfectly, and it makes sense in every way.
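As an aside, the grain itself is a simple post-process: per-pixel noise layered over the rendered frame. Here is a rough, purely illustrative sketch in Python/NumPy; the strength value and the choice to weight the grain toward shadows are assumptions for the example, not how L4D's filter is actually tuned.

# Minimal sketch of a film-grain overlay as a post-process, just to illustrate
# the effect being discussed; the strength and shadow weighting are arbitrary
# illustration choices, not L4D's actual tuning.
import numpy as np

def add_film_grain(frame: np.ndarray, strength: float = 0.06, seed: int = 0) -> np.ndarray:
    """frame: HxWx3 float image in [0, 1]. Adds monochrome noise, heavier in shadows."""
    rng = np.random.default_rng(seed)
    noise = rng.normal(0.0, 1.0, frame.shape[:2])[..., None]   # one grain value per pixel
    shadow_weight = 1.0 - frame.mean(axis=2, keepdims=True)    # darker pixels get more grain
    return np.clip(frame + strength * shadow_weight * noise, 0.0, 1.0)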
 
L4D is not, by any stretch of the imagination, the DX10 lighting goodness that Clear Sky achieves in 0.2 FPS mode. On the other hand, L4D is a HELL of a lot of fun, and there's really something to be said for that. I'd much rather more developers worked on making games fun and less on uber graphics of doom.
Well, I won't argue with you there. I didn't buy Clear Sky, but I did pre-order Left 4 Dead the day it was available for pre-order (and I played the demo, then the full game, as soon as they were out). I have all of Valve's games, so I definitely enjoy them, and not for their graphics. I do enjoy making 3D graphics for games, though, so I'll still notice things they could have done a lot better.

Oh, and the reason Source engine games run well is that they use simpler graphics, not that the engine is somehow superior to others. If you loaded an L4D map in CryEngine 2 and turned off all the dynamic shadows and advanced effects (object motion blur, depth of field, parallax occlusion mapping, etc.), it would run just as fast, if not faster, than in the Source engine. Valve's games use lower-resolution textures, fewer textures, far fewer objects, no advanced dynamic shadows (except for one light source, your flashlight), and you rarely see too many objects on screen at once. Any forest scene from Crysis or Clear Sky will have way more objects and shadows than anything in a Valve game, so of course it'll run slower.
 
I've not read any of these features before, but I thought this one was great. The only question I have concerns the average/maximum/minimum frame rate measurements and the corresponding graphs.

How do you make sure the comparisons are fair? Unless the player input is identical from one session to the next, and the AI does the same things at the same time, surely the images being rendered will differ from one gameplay session to the next, so different loads will be put on the system at any given point?
 
They do it by relying on the law of large numbers. They aren't taking a one-minute "snapshot" of the game; they play an entire level, generally ~5-10 minutes. Even so, the margin of error is still probably +/-5%. Then again, even with timedemos you'll get different results every time you run them. If you look at the graphs, though, you can see the load on both GPUs is similar, as they both dip and rise at the same times.
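To make that concrete, here is a minimal Python sketch of pulling min/avg/max FPS out of a frametime log, assuming a hypothetical one-frametime-in-milliseconds-per-line capture (real FRAPS-style logs differ in detail); the sample numbers below are stand-ins, not measured data.

# Minimal sketch of how min/avg/max FPS can be pulled from a frametime log,
# assuming a list of per-frame times in milliseconds (a hypothetical format).
def run_stats(frametimes_ms):
    fps = [1000.0 / ft for ft in frametimes_ms if ft > 0]
    avg = len(fps) / sum(1.0 / f for f in fps)        # average FPS = frames / total seconds
    return min(fps), avg, max(fps)

# Two runs of the same level differ frame by frame, but over ~5-10 minutes
# of play the summary numbers land close together.
run_a = [16.7, 18.2, 15.9, 25.0, 17.1] * 2000         # stand-in data, not real captures
run_b = [17.0, 16.4, 19.5, 24.1, 16.8] * 2000
for name, run in (("run A", run_a), ("run B", run_b)):
    lo, avg, hi = run_stats(run)
    print(f"{name}: min {lo:.0f} / avg {avg:.0f} / max {hi:.0f} FPS")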
 
The graphs are somewhat disingenuous then, surely? At any given point you are not comparing like with like. They're only really useful for general trends, and yet a lot of the graphs look awfully close, with peaks and troughs in the same places.

Peak, low and average frame rates, on the other hand, should be absolutely fine.
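As for the trend graphs themselves, the plots are essentially per-second FPS over the length of the run. Here is a sketch of that binning, under the same hypothetical frametime-list assumption as above.

# Sketch of turning the same per-frame log into a per-second FPS trend line,
# which is roughly what the run-time graphs plot: how many frames finished
# inside each elapsed second of play.
def fps_timeline(frametimes_ms):
    timeline, elapsed_ms, frames_this_second = [], 0.0, 0
    for ft in frametimes_ms:
        elapsed_ms += ft
        frames_this_second += 1
        if elapsed_ms >= 1000.0 * (len(timeline) + 1):
            timeline.append(frames_this_second)
            frames_this_second = 0
    return timeline                                   # one FPS sample per second of play

# Comparing two runs point-by-point is meaningless, but overlaying their
# timelines still shows whether both cards dip in the same heavy scenes.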
 
Can anyone give me a ballpark for L4D performance on:

AMD X2 5000+
4GB PC6400
2x Sapphire 3850s with 512MB each, in CrossFire
22" widescreen (he likes 1440x900)
Dual-boot XP Pro/Vista, both 32-bit

It's my brother's system, and he wants this game. He may be adding a Phenom II when they surface in his price range. I just want to have some idea before I get a phone call, ya know, the 24-hour family help desk.
 
Heck yeah... especially at that resolution, you may even be able to max everything out.

Don't be scared to try this game out, people. If you've been able to run the HL2 episodes, you shouldn't have any issues with this game... except finding time to sleep. You'll be up late, lost in the addiction. :D
 
I've not read any of these features before, but I thought this one was great. The only question I have concerns the average/maximum/minimum frame rate measurements and the corresponding graphs.

How do you make sure the comparisons are fair? Unless the player input is identical from one session to the next, and the AI does the same things at the same time, surely the images being rendered will differ from one gameplay session to the next, so different loads will be put on the system at any given point?

The graphs are somewhat disingenuous then, surely? At any given point you are not comparing like with like. They're only really useful for general trends, and yet a lot of the graphs look awfully close, with peaks and troughs in the same places.

Peak, low and average frame rates, on the other hand, should be absolutely fine.

I highly suggest you guys rely on canned benchmarks and ePenisMark 06 for your needs. Obviously our content has no worth to you.
 