Holy crap my Crysis framerate!!

I have the demo set as high as it will go, no AA, 1680x1050, and my frames are min 21 / average 26 / max 31.
 
This is what I have to say about Crysis after reading all your comments and frustrations:

failxx3.jpg

:mad:
 
Anyone else with SLI noticed that it seems like SLI is not working with this SP demo? I get OK frames @ 1920x1080 with everything on medium, but as soon as I enable AA or anything else it gets pretty choppy.. hrm... looks very pretty though!
 
ok, thanks for the confirmation.. SLI or not, it's effin pretty man.. wow.
 
Anyone else with SLI noticed that it seems like SLI is not working with this SP demo? [snip]

SLI does not work, as has been stated a few times.

Anyway, I've been playing it on Vista.

Played through it twice, the first time without FRAPS installed, as Crysis was one of the first things I installed and I just wanted to play.

Ran it @ 1280x1024 with everything on High. Ran very smoothly and never noticed any stuttering except in the cutscenes where there was some tearing.

Anyway, ran it again with fraps installed and did a very long run through of the level.

Time: 76.58 minutes
Min: 0
Avg: 31.042
Max: 61

Think the min was from loading, as I never stopped the benchmark even though I died (wanted to max Strength + punch a bunch of guys and lost... badly :p)

Overall I'd say the game ran very smoothly, though; without FRAPS I wouldn't have thought it was running that low, and my god is the game beautiful.

Anyway, that's with my 8800GTS 640MB card @ 621/1458/900.

Also, my core temp was ~60C max, so I'm not sure why people are getting such high temps.
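
About that min of 0 from loading: if you dump per-frame times instead of just the summary, you can filter the load stalls out yourself. A quick sketch in plain Python (everything here is hypothetical, not an actual FRAPS log parser):

```python
# Rough FRAPS-style stats from per-frame times in milliseconds,
# dropping load hitches (any frame over hitch_ms) so a single
# level-load stall doesn't report "min: 0".

def fps_stats(frametimes_ms, hitch_ms=1000.0):
    frames = [t for t in frametimes_ms if 0 < t <= hitch_ms]
    fps = [1000.0 / t for t in frames]
    avg = len(frames) / (sum(frames) / 1000.0)  # frames per second overall
    return min(fps), avg, max(fps)

# Example: a hundred ~32 ms frames (about 31 fps) plus one 5 s load stall.
lo, avg, hi = fps_stats([32.0] * 100 + [5000.0])
# lo, avg and hi all come out to 31.25 here; the stall is ignored.
```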
 
so no one else has noticed that when you enable AA, the shader graphics option automatically goes to high?...
 
so no one else has noticed that when you enable AA, the shader graphics option automatically goes to high?...

Probably because AA requires shaders to be on High. Maybe Crysis uses shader based AA or for some other reason needs it.
 
I say shelve it until the next generation of cards is out. Right now at playable frame rates it looks a little better than Oblivion, but you don't really get into the "next gen" stuff until you get into the very high settings, especially for shaders and post-processing. I think it has a lot of potential, and I can see that they really are taking things to the next level, but we just don't have the hardware for it yet, contrary to statements by Crytek, Nvidia, etc.

I think what it comes down to is that this isn't a corridor crawler like Bioshock. You can't compare it with TF2 either. In Crysis we are asking the videocard to render huge outdoor scenes and tons of foliage. That type of game just kills GPUs, always has. Trees and foliage are the ultimate frame rate killers. That is why so few games take place in the jungle like this.

Some of the stuff that Crysis tries to do we've only seen in 3DMark06. There are several effects and sequences that remind me of specific 3DMark06 sequences, and the frame rates are very comparable. It is like running a playable 3DMark.

I'm going to treat it as a benchmark until there is hardware that can run it 1680 x 1050 4xaa with most of the settings on very high with like 40-50 fps. I'm guessing that the true replacement for the 8800GTX will be able to handle that.

I would rather wait for the hardware to mature than play it on my current setup and spoil it, at least the single player part of it. I wouldn't mind playing the multiplayer dialed down, but I like to run single player games maxed out the first time through.

I'd like to know how much better it runs in 64-bit mode with 4GB of ram.
 
I say shelve it until the next generation of cards is out. [snip]

QFT
 
I say shelve it until the next generation of cards is out. [snip]

I agree. This game is nothing more than a benchmark. The game is not playable on current hardware. 20 FPS? Come on, that's not right, especially for those of us who are accustomed to running nearly every game out there at 1680x1050 with settings maxed, and having them run smooth. This game feels like I am playing on my old computer and messing with settings just to get the darn thing to run right. Why can't we have something that allows us to just set everything to the highest and have some fun? I can't believe that they are releasing this game when even the absolute best hardware (8800 Ultra) can't run the game correctly.

Imagine what the average consumer who buys this game is going to do? LOL, god have mercy.
 
SLI does not work as has been stated a few times.
[snip]

ok. got it.. as stated twice now. ;)

I found the best IQ/performance balance for my setup is:

168.01 drivers @ 1920x1080 (video oc 605/950 / cpu oc 3.2) on winXP
no aa
texture qual = high
object qual = high (fixed)
shadow qual = low
physics qual = med
shader qual = med (edit: as I see others have reported, this on high really kills the fps)
vol fx qual = med
game fx qual = med
postproc qual = high
particle qual = med
water qual = med
sound qual = high

I might try to force vsync in the drivers, as there is a bit of tearing.. but all in all I think this game looks/runs/plays great so far..
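
For anyone who'd rather not click through the menus every time, settings like these can supposedly also go in an autoexec.cfg in the game folder. A sketch of my list in cvar form (the sys_spec_* names are from memory and may not match the demo exactly, so treat them as assumptions):

```
-- autoexec.cfg sketch: 1 = Low, 2 = Medium, 3 = High (cvar names assumed)
r_VSync = 1                      -- force vsync to deal with the tearing
sys_spec_Texture = 3
sys_spec_ObjectDetail = 3
sys_spec_Shadows = 1
sys_spec_Shading = 2             -- shader quality; High really kills the fps
sys_spec_Physics = 2
sys_spec_VolumetricEffects = 2
sys_spec_GameEffects = 2
sys_spec_PostProcessing = 3
sys_spec_Particles = 2
sys_spec_Water = 2
sys_spec_Sound = 3
```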
 
How do you know it's in DX10 mode?

According to the tweakguide guy, Vista 32 bit defaults to DX10.

I cannot get DX10 at all in Vista 64-bit; it won't start, so I have to run it in DX9 mode.

I can say that, running the FPS counter from the console, I average 17-25 FPS in Vista 64-bit with a Quad 6600, 4GB RAM and an 8800 GTX in SLI (although I guess SLI isn't working in the demo). All settings on high, 4x AA in game. The game is pretty smooth and looks beautiful. I just hope Nvidia figures out how to harness SLI before the retail box comes out.

In XP my frame rates are much better, although I haven't run the FPS counter. In Vista the sound seems much better as well... but that's subjective anyway.
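
The console FPS readout mentioned above is, if I remember the cvar right (so, an assumption), just:

```
r_DisplayInfo 1    -- overlay fps / frame time; 0 hides it again
```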
 
I say shelve it until the next generation of cards is out. [snip]

Please. On my 8800GTS it almost never drops below 30fps with almost everything set to high, and it looks miles better than anything we've played before. Surely on a GTX it would look even better. Couple that with the excellent destructible environments and fun gameplay and you've got yourself one good game.
 
Got about 20fps with everything on medium except for textures, shaders and models on high @ 1024x768.

Anyone know how to force DX9 mode?
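
Re: forcing DX9 — I believe (an assumption, I haven't tested it on the demo myself) you can append a switch to the shortcut target for the 32-bit exe:

```
Crysis.exe -dx9
```

The demo on Vista 32 defaults to DX10 per the tweakguide comment earlier, so this should drop it back.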
 
I just got hooked after picking up the Korean troops and launching them into huts. That sold the game for me, shitty frames or not.
 
I just got hooked after picking up the Korean troops and launching them into huts. That sold the game for me, shitty frames or not.

I thought it was more fun with the chickens :p

oh i'm cruel :D

Well, because when I alt-tab out it says "Crysis DX10" on the taskbar.

Also, it's playable in DX9 at 25-60 fps, BUT still... I think I'll wait for a 9800GTX first.

Can't believe I forgot; now that you said it, I remember.
 
Seriously, who is bitching here? I have an x800xt and a Barton 2500+ overclocked to 2.2GHz and the game ran fine ..................................................... at 800x600 with everything low!
 
I just blew my own mind. It was running OK and looked great on my setup:

5600+ @ 3.5Ghz, 2G OCZ 1066SLI, X1900XT with a good OC on it.

Was a bit slow, but played nice, though I'd prefer the gun dynamics be realistic like America's Army (totally accurate action/physics for Army training purposes), but then reading this I realized.... I had 4x AA, 8x AF forced in the ATI control panel.... OOPS :eek::rolleyes::p

Guess I should go play through again, huh :D
 
One thing about this game: I think people ought to compare the visual quality per unit of performance rather than blindly wanting to set everything to Very High.

Even on High, I believe it's already comparable to any title out there, which is as far as the current gen of cards can go...

If a card can't run everything on Very High at the LCD's native resolution, that's because the current gen of cards is not meant to run at that visual quality.

To me, it's an amazing game, and it's meant to harness the brute power of next-gen hardware. That being said, it scales very well with current hardware, even for budget-conscious gamers like me.

As someone stated earlier in this thread, maybe the developers should have named the settings High, Very High, Ridiculously High, and Reserved-For-Next-Gen High, and probably everyone would be happy then :D
 
As someone stated earlier in this thread, maybe the developers should have named the settings High, Very High, Ridiculously High, and Reserved-For-Next-Gen High, and probably everyone would be happy then :D

Heh! I agree. Or maybe we should start blaming the monitors. If everyone was still on a 14" bubble screen there wouldn't be any complaints! ;)
 
It plays pretty well for me at 1680x1050, high settings except Object Detail set to Very High, DX10 on my sig rig.

It looks better than any game I've seen, yet a number of people argue over whether it's bad code or state-of-the-art code; I tend to believe the latter. This game was designed to be a beast, and that's what it is. Could it have been designed differently and worked better on current-gen hardware? Anything is possible, but then how much greater would the effort and cost have been, and what would the result have been? Who knows.

For most people on this forum, all that's needed to get this game into the realm of performance most are accustomed to is a next-gen GPU, which should be available soon. I know that it's on my list.

I love PC gaming because of titles like Crysis. It's like an art form, taking gaming to the next level. So many complain about the lack of PC titles compared to consoles, though it seems like this year the PC has gotten tons of great titles.

With my current hardware this is still a great, fun-to-play game in single-player mode, and a new GPU takes it to the next level.

PC gaming hardware is far more expensive than consoles, but it advances more quickly and as a result is more bleeding edge, and with games like Crysis, tends to be a bit more satisfying for me.

I'd like to thank Crytek for pushing the envelope and delivering a next-gen game on the PC. People love to criticize so much that we forget how to say thanks!
 
It plays pretty well for me at 1680x1050, high settings except Object Detail set to Very High, DX10 on my sig rig. [snip]

QFT
 
After testing my aging PC, a Barton 2500+ CPU overclocked to 2.2GHz and an x800xt @ stock, the Crysis demo runs fine at 1024x768, med textures, med shaders, med water effects, everything else low. The video card did better than expected and the fps were above 25 most of the time at these settings, which is not optimal, but it was playable.

The CPU didn't fare as well. It required all physics and audio settings set to low, and it still paused every once in a while during heavy firefights with explosions. The CPU had a hard time with the autosaves too.

Anyway, I don't see why people are complaining if my aging PC can run the demo decently. My system is over 2 years old. The game looks great even at medium settings. Reduce the resolution a little if needed. I am glad to see high-end video cards struggling with a game. It actually means there's a reason to purchase them. You shouldn't be able to run new games at resolutions above 1600x1200 with any video card. People that buy an 8800GTX for WoW should be shot............ok, I don't know where that came from.
 
After testing my aging PC, a Barton 2500+ CPU overclocked to 2.2GHz and an x800xt @ stock, the Crysis demo runs fine at 1024x768. [snip]


The hard time comes from the fact that people who spent $600+ on the video card alone want the game to run better than simply "decently."

Personally, I agree with you and a few others on here. You can't expect a card that's been around as long as the 8800 series has to play a brand new game that was designed from the ground up to not only push, but break the envelope, in all its glory. Anyone who purchased an 8800 for the sole purpose of being able to play Crysis at its highest settings isn't too bright. Every single first-generation DirectX card has blown away the competition when running older DirectX titles but has been brought to its knees when running the new games that make use of the new DirectX feature set. It isn't until the second-generation cards come out that they can really run the games the way you expect them to.
 
OK, turned off AA and set things to "application controlled" in the ATI control panel. Used High on all but textures and shadows, which are on medium as suggested here.

CPU bench got 22fps avg, GPU bench got 20fps avg.

More playable; went through it again with more time on foot. Plays great, but I still think the gun behavior in America's Army (being designed to be realistic) would be the perfect cherry on top of Crysis.

looking forward to buying and playing it.... on my new 8800GT.
 
Opteron 170 @ 2.2, 4GB RAM, dual 7900GTX SLI in x64. Runs OK on a mix of High and Medium settings at 1280x1024. Don't want to think what the potential Christmas present monitor upgrade is gonna do to those frames...
 
Opteron 170 @ 2.2, 4GB RAM, dual 7900GTX SLI in x64. Runs OK on a mix of High and Medium settings at 1280x1024. Don't want to think what the potential Christmas present monitor upgrade is gonna do to those frames...

SLI is not supported in the Demo, so the full version should see some significant gains on your system.
 
People complaining along the lines of 'I paid $600 for this card and this game does not run smoothly at max res, therefore it sucks' need to do a reality check.

If you paid $600 for a video card then you're either an utter moron, or you're a smart person with money, means *and* the understanding that you paid premium price for that extra bit of performance over a $350 part because you wanted to.

As we all know, the price:performance ratio dwindles at the highest end. You basically bought a part that's at least 7 or 8 months old now, a part that was peddled as a DX10 part but came out when there were no DX10 games to speak of.

Obviously those of us who picked up a 88XX part knew we were getting excellent DX9 performance, but the DX10 hype was just that, hype, which you should have known.

You'd have to be silly to think that your performance with game engines that have been around for 2-3 years, like HL2 and Doom 3, would be comparable to one that hadn't even been released yet, like Crysis.

I bought my 8800 GTS without any illusions about DirectX 10 bullshit. I expected - and received - top-end DX9 performance, and expected that a better, revised DX10 part would come out when actual DX10 games came out, and that my part would perform pretty much like shit for DX10 if I wanted all the eye candy.

The situation was just the same with the 68XX people bought, and then complained about Doom 3 performance when it came out. Performance wasn't smooth at high settings unless you went SLI or got the later-released 78XX.

Those who decided to pay a 40% premium for 10% more performance with the 88XX Ultra knew full well that all they were purchasing was the status of having the top of the line card of a given chipset; it doesn't give imaginary license to be entitled to good performance at high settings for something down the road.

Edit: And lo and behold, it looks like a single 8800 GT, which should retail for $250, pulls in almost the same frame rate as an 8800 GTX in Crysis at 19x12, according to the Tweaktown.com article linked elsewhere. Once again proving that if you're buying hardware, buy it for the games that are already around, not for imaginary hype about what will and won't come out and how it will or won't perform.
 
I think people are forgetting the "Pre-Release Demo" text that is displayed. This is not a "gold" build; optimizations may well be made before the final version, not to mention driver improvements between now and then.
 
I think people are forgetting the "Pre-Release Demo" text that is displayed. [snip]

One really big bug is that if I run into a stationary barrel I can die somehow. Other than that I freakin love the game.
 
I think people are forgetting the "Pre-Release Demo" text that is displayed. [snip]

Almost all demos are "pre-release"; that does not mean there are going to be optimizations. Pre-release simply means the game isn't released yet, which is the case for just about all demos.

Now if it were a beta, that's another story; then optimizations are still to be made. With an official demo, though, you've pretty much got what you're going to get, for the most part.
 
One really big bug is that if I run into a stationary barrel I can die somehow.

I managed to surf a turtle for a good 10 feet when I first arrived on the beach (post opening sequence). Maximum Strength, run, look down, throw turtle, win.
 
Absolutely no difference for me with the 169.01 drivers. If anything the game runs slower now after installing them. Not sure what frame rate I am getting with everything on high, but it can't be more than maybe 15-20 with a Core 2 E6600 at 3.2GHz and an 8800GTS 640 o/c'd 20%. Was hoping for better than that. This runs about 20% slower for me than the multiplayer demo did. Bummer.
 
I want some screenies, people!
I'm going to try running it on my setup soon; I just have to finish downloading it.
I know this will spur an upgrade very quickly.
 
People complaining along the lines of 'I paid $600 for this card and this game does not run smoothly at max res, therefore it sucks' need to do a reality check. [snip]

I'm guessing people are expecting Crysis to run really well because the game was developed on Nvidia 8-series cards; it was quoted somewhere, don't ask me to find it, but it was.

So everybody thought it would run great, as that's what it was developed on. That's the major issue for most people with 8800s: the game was developed on the card but can't be played maxed out. And when you select optimal settings with an 8800GTX/Ultra you are given all settings on Very High, so Crytek knows this card should be able to do it; either something isn't right somewhere, or the folks at Crytek think 20fps on Very High is good.

Personally I'm fine with how it plays; I'm getting 33-odd fps average, but I haven't turned it all on Very High.
 