Eh... TF2 slower than Crysis?

Crysis? Um, you picked the wrong game to claim 4-core support for. Four cores really don't do anything for Crysis, and a slightly faster dual core will beat four cores. Here you can see a 2.93GHz dual core beating a 2.66GHz quad, so the extra cores are basically useless in Crysis and clock speed is what actually matters.

http://www.gamespot.com/features/6182806/p-6.html

Also, the Q9300 is 2.5GHz, not 2.0, and it's hardly a slouch.

Sorry, ebuyer had the chip listed as 2GHz when I checked; I assumed they'd be right.

Anyhow, the point remains: Crysis has better multicore support than the Source engine. It dumps more work across the cores, whereas the Source engine doesn't really use more than one core. I think that's a valid point.
 
V-sync does not lower your FPS; it just caps it at 60 or whatever your refresh rate is.


True, but I've found that in Source-based games it adds a weird laggy feeling to the game, so I never use it.
 
V-sync does not lower your FPS; it just caps it / gives it a ceiling at 60 FPS or whatever your refresh rate is.

If your rendered FPS drops to 59, double-buffered V-sync on a 60Hz display turns that into 30 FPS. Check your facts.
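To spell out the mechanism both posters are circling: with plain double-buffered v-sync, a finished frame still has to wait for the next vertical blank, so frame time rounds up to a whole number of refresh intervals. A minimal Python sketch of that quantization, assuming a 60Hz display; the sample frame rates are just illustrative numbers:

import math

REFRESH_HZ = 60.0
INTERVAL_MS = 1000.0 / REFRESH_HZ  # one refresh interval, ~16.7 ms

def displayed_fps(render_fps):
    # A frame that misses a vblank waits for the next one, so its
    # effective frame time rounds UP to whole refresh intervals.
    frame_ms = 1000.0 / render_fps
    intervals = math.ceil(frame_ms / INTERVAL_MS)
    return REFRESH_HZ / intervals

for fps in (120, 61, 60, 59, 31, 29):
    print(f"render {fps:>3} fps -> displayed {displayed_fps(fps):.0f} fps")

Rendering at 59 FPS really does display at 30, while 61 stays at 60. Triple buffering avoids that cliff, and the extra buffering either way adds some input latency, which is probably the laggy feel mentioned above.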
 
Whew, glad I'm not the only one that sees low FPS in TF2.
I'm on an E6600 + 8800GTS 512 with 4xAA, but the FPS can dip as low as the 20s-30s during heavy battles with tons of people, such as the openings of Dustbowl and Goldrush.

I'm quite skeptical of people who claim they've 'never dipped below 60'.
They must either 1) have REALLY fast CPUs, or 2) have never actually measured true FPS with cl_showfps 1 or FRAPS.
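If you want to settle it with hard numbers, FRAPS's benchmark mode logs per-frame times to a CSV, and a few lines of Python will pull the dips out of it. A rough sketch, assuming a simple two-column log (frame index, cumulative time in ms) like the frametimes file FRAPS writes; the filename and column layout here are assumptions, so adjust for your own log:

import csv

def fps_stats(path):
    # Expects rows of (frame index, cumulative time in ms), header first.
    with open(path) as f:
        rows = list(csv.reader(f))[1:]
    times = [float(t) for _, t in rows]
    deltas = [b - a for a, b in zip(times, times[1:])]  # per-frame ms
    fps = sorted(1000.0 / d for d in deltas if d > 0)
    worst = fps[:max(1, len(fps) // 100)]               # slowest 1%
    print(f"avg {sum(fps) / len(fps):.0f} FPS, min {fps[0]:.0f} FPS, "
          f"1% low {sum(worst) / len(worst):.0f} FPS")

fps_stats("frametimes.csv")  # hypothetical filename

Averages hide the dips; it's the minimum and the 1% lows that show those 20-30 FPS moments.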
 
I use the showfps flag all of the time. My FX 5200 setup, my 6600GT setup, and even my current setup with an 8600 have all fallen into the red. But my current setup, with the 4850 now in place of the 8600, never does. GREEN 100% of the time, 200 on the CS:S test and 130 on the Lost Coast test. It's pretty awesome. :D
 
I actually don't have problems with CS:S, which is also on the Source engine. My FPS in CS:S is too high to be worth posting here, but that's probably because it has fewer open environments than TF2.
 
I use the showfps flag all of the time. My FX 5200 setup, my 6600GT setup, and even my current setup with an 8600 have all fallen into the red. But my current setup, with the 4850 now in place of the 8600, never does. GREEN 100% of the time, 200 on the CS:S test and 130 on the Lost Coast test. It's pretty awesome. :D

I just did those two tests. Got 140 FPS in the Lost Coast test and 235 in the CS:S test (4xAA, everything else max, @ 1280x1024).
TF2 is another story: when there are loads of people firing everywhere, the FPS does dip into the 20-30 area in worst-case scenarios.

This doesn't make sense. People with GTX 260s and 280s are reporting dips into the 30s, sometimes even the 20s. How does an X2 5200+ with a 4850 stay above 60 FPS 100% of the time?
Even on full 24-32 man servers during the Dustbowl/Goldrush opening, you never ever dip below 60 FPS?
 
He said CS:S and Lost Coast, not TF2. TF2 is heavy on the CPU, like all Source games. If you play on a 50+ player CS:S server, your FPS will drop just the same. Nature of the beast, I guess.
 
Anything HD3870/9600GT or better is enough not to be a GPU bottleneck in TF2, and any dips in frame rate are almost entirely because of the CPU, mainly because TF2 has very poor multi-processor support. Not sure if the same holds true for other Source-based games, but that's the way it is for TF2.
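A quick way to see why a faster GPU doesn't fix those dips: the CPU and GPU work as a pipeline, so throughput is set by the slower stage. A toy Python model; the millisecond costs are made-up numbers for illustration:

def frame_fps(cpu_ms, gpu_ms):
    # In a pipelined renderer, frame rate is limited by the slower stage.
    return 1000.0 / max(cpu_ms, gpu_ms)

# A heavy TF2 firefight costing 25 ms of CPU work per frame:
print(frame_fps(cpu_ms=25.0, gpu_ms=12.0))  # 40 FPS on a mid-range card
print(frame_fps(cpu_ms=25.0, gpu_ms=6.0))   # still 40 FPS on one twice as fast

That max() is also the standard bottleneck test: drop the resolution, and if the FPS barely moves, the CPU is the limit, not the card.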
 
I just did those two tests. Got 140 FPS in the Lost Coast test and 235 in the CS:S test (4xAA, everything else max, @ 1280x1024).
TF2 is another story: when there are loads of people firing everywhere, the FPS does dip into the 20-30 area in worst-case scenarios.

This doesn't make sense. People with GTX 260s and 280s are reporting dips into the 30s, sometimes even the 20s. How does an X2 5200+ with a 4850 stay above 60 FPS 100% of the time?
Even on full 24-32 man servers during the Dustbowl/Goldrush opening, you never ever dip below 60 FPS?

Those numbers I gave are at 1680x1050 with max AA/AF, etc.

I will load up TF2 tonight and do some testing; I haven't played it much since I got the new card.
 
I played TF2 all weekend on my new E8400 + 8800GT rig and it never dropped below 60 FPS on 32-player Goldrush and Dustbowl servers (I played with net_graph 1 on). My X2 frequently goes down to 30, though.
 
One thing that really kills TF2's framerate is when people shoot into water with shotguns and create large splashes and ripples. I made a timedemo with a friend (in 2fort) that will stress your system immensely. Yes, the game is heavily CPU-dependent, but a faster video card also helps greatly in these water areas: a 4850 is nearly twice as fast as a 3870 in this timedemo with all settings cranked to the max, even on the same system with the same CPU.
 
Oops. Well, now that I've seen his post, I'm also questioning those FPS. I get 93 FPS at 1680x1050 in Lost Coast. I don't know how he gets higher FPS than me at the same settings, because I'm pretty damn sure my E6550 @ 3.0GHz + 4850 would beat his system.
 
It's impossible to ALWAYS have over 60 FPS in TF2. In 2fort, go to the basement tunnels that have a little stream of water running through them. Crouch down, take out your shotgun, and start spraying directly downward at your feet. This will bring your system to its knees.
 
The main problem in TF2 is that its multithreaded support is nearly nonexistent. Most of the work is being done on a single core.

The engine is now very, very well threaded and scales well to numerous cores... unfortunately, support is still buggy. Some threading is enabled by default now. However, while the FPS gains are enormous with all the threading stuff enabled, stability goes down the drain.
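For context, the queued threading being described works roughly like a producer/consumer pair: the game thread keeps simulating and building draw commands while a worker thread submits the previous batch. (In Source this is the queued material system, toggled by the mat_queue_mode cvar, if I recall correctly.) A toy Python sketch of the shape of it; this is my illustration of the idea, not Valve's code:

import queue, threading

work = queue.Queue(maxsize=2)  # at most two frames in flight

def render_worker():
    while True:
        cmds = work.get()
        if cmds is None:        # sentinel: shut down
            return
        # ... submit cmds to the GPU here ...

t = threading.Thread(target=render_worker, daemon=True)
t.start()
for frame in range(600):
    cmds = f"draw list for frame {frame}"  # simulate + build commands
    work.put(cmds)   # only blocks if the worker falls two frames behind
work.put(None)
t.join()

The stability complaints fit the same picture: once game state is shared between threads, any path that isn't properly synchronized can crash, so the FPS gains and the crashes tend to arrive together.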

It's impossible to ALWAYS have over 60 FPS in TF2. In 2fort, go to the basement tunnels that have a little stream of water running through them. Crouch down, take out your shotgun, and start spraying directly downward at your feet. This will bring your system to its knees.

This used to happen to me, but it got fixed for the most part in one of the updates. Some update had notes about particle optimizations, and I guess that did it.
 
The engine is now very, very well threaded and scales well to numerous cores... unfortunately, support is still buggy. Some threading is enabled by default now. However, while the FPS gains are enormous with all the threading stuff enabled, stability goes down the drain.

That's a bit of a contradiction. It can't be that well threaded if it's completely unstable. And IIRC, it's not all that well threaded unless you add a flag to the shortcut or something like that. Obviously, if it were designed to properly use multi-core CPUs, it would be enabled and stable by default.
 
Heh, it works wonderfully until it crashes.

It does work and is enabled by default, but some of the things you can tweak will usually make it crash. I'm pretty sure I recall Valve stating that some of the stuff in there was mainly for the 360, which is why it's not functioning 100% on the PC yet.
 
I'm not seeing a difference between -32bit and -64bit TF2 in Vista.

Compared to XP SP3, I'm getting a 30 FPS decrease in Vista.
 
This used to happen to me, but it got fixed for the most part in one of the updates. Some update had notes about particle optimizations, and I guess that did it.
Well, I just ran my timedemo again and don't see any improvement. The water ripples and splashes still destroy your framerate.

Here's my timedemo. Try it out if you want. I get around 47 FPS in this at 1680x1050 with all settings maxed.

http://www.sendspace.com/file/8a4t1v
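For anyone who wants to record their own stress test rather than use mine: the standard Source workflow, as I understand it, is to type record mydemo in the console, play through the heavy section, type stop, and later run timedemo mydemo; the console prints the average FPS when playback finishes (mydemo is just a placeholder name).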
 
Wow, jeez, Source does love MHz. I've been using an E8400 on a P5N-E for over a month without updating the BIOS. Turns out the BIOS I was on didn't even support the E8400's 45nm design: it was only letting me use one core @ 2GHz, and my FPS was dipping to around 24 with an overclocked 260 in TF2.

Yesterday I updated the BIOS and OC'd the E8400 to 3.825GHz, and now TF2 never dips below 55 FPS. This is with all settings at max, including AA and 16x AF, at 19x12.

I even tried going to 2fort and shooting the water in the tunnel; that crippled my system down to 40 FPS.

I posted on the first page of this thread a while ago, back when I "thought" I had an E8400 @ 3GHz.
 
Oh, and my old 3.0GHz Pentium 4 with a 6600GT @ 1440x900 on low details runs this at a staggering 24 FPS.

It just seriously blows my mind when I see it running at nearly 200 FPS in the starting area and averaging 100+ FPS outdoors.

I only hit 55ish FPS when there's a World War 3-like game going on.
 
Try my timedemo if you want to stress your system and see what the "worst case scenario" looks like.
 
I have that same annoying stutter problem with my Nvidia 6800 Ultra. I have all the settings turned down to low @ 1280x1024 and it's really fucking annoying... the stutters happen every 5-10 minutes and last for about 2-5 minutes each... ugh.
 
Wow, jeez, Source does love MHz. I've been using an E8400 on a P5N-E for over a month without updating the BIOS. Turns out the BIOS I was on didn't even support the E8400's 45nm design: it was only letting me use one core @ 2GHz, and my FPS was dipping to around 24 with an overclocked 260 in TF2.

Yesterday I updated the BIOS and OC'd the E8400 to 3.825GHz, and now TF2 never dips below 55 FPS. This is with all settings at max, including AA and 16x AF, at 19x12.

I even tried going to 2fort and shooting the water in the tunnel; that crippled my system down to 40 FPS.

I posted on the first page of this thread a while ago, back when I "thought" I had an E8400 @ 3GHz.


Ouch, 2GHz... My E8400 at 3.0GHz was still experiencing FPS drops to 25-35 on an 8800GT, no AA/AF, at 1680x1050. At 4GHz, my FPS drops no lower than 60-65. It's wonderful...
 