Anyone else with SLI noticed that it seems like SLI is not working with this SP demo? I get OK frames @ 1920x1080 with everything on medium, but as soon as I enable AA or anything else it gets pretty choppy... hrm... looks very pretty though!
so no one else has noticed that when you enable AA, the shader graphics option automatically goes to high?...
well in DX10 mode I get about 15fps at 1920x1200... with the latest drivers
the 9800GTX better be coming out soon
I say shelve it until the next generation of cards is out. Right now at playable frame rates it looks a little better than Oblivion, but you don't really get into the "next gen" stuff until you get into the very high settings, especially for shaders and post-processing. I think it has a lot of potential, and I can see that they really are taking things to the next level, but we just don't have the hardware for it yet, contrary to statements by Crytek, Nvidia, etc.
I think what it comes down to is that this isn't a corridor crawler like Bioshock. You can't compare it with TF2 either. In Crysis we are asking the videocard to render huge outdoor scenes and tons of foliage. That type of game just kills GPUs, always has. Trees and foliage are the ultimate frame rate killers. That is why so few games take place in the jungle like this.
Some of the stuff that Crysis tries to do we've only seen in 3DMark06. There are several effects and sequences that remind me of specific 3DMark06 sequences, and the frame rates are very comparable. It is like running a playable 3DMark.
I'm going to treat it as a benchmark until there is hardware that can run it at 1680x1050 with 4xAA and most of the settings on very high at around 40-50 fps. I'm guessing that the true replacement for the 8800GTX will be able to handle that.
I would rather wait for the hardware to mature than play it on my current setup and spoil it, at least the single player part of it. I wouldn't mind playing the multiplayer dialed down, but I like to run single player games maxed out the first time through.
I'd like to know how much better it runs in 64-bit mode with 4GB of RAM.
SLI does not work as has been stated a few times.
[snip]
How do you know it's in DX10 mode?
Please. On my 8800gts it almost never drops below 30fps with almost everything set to high, and it looks miles better than anything we've played before. Surely on a GTX it would look even better. Couple that with the excellent destructible environments and fun gameplay and you've got yourself one good game.
How do you know it's in DX10 mode?
I just got hooked from picking up the Korean troops and launching them into huts. Sold the game for me, shitty frames or not.
well because when I alt-tab out it says "crysis DX10" on the task bar
also it's playable in DX9 at 25-60 fps, BUT still... I think I'll wait for a 9800GTX first
As someone stated earlier in this thread, maybe the developers should have named the settings High, Very High, Ridiculously High, and Reserved-For-Next-Gen High, and probably everyone would be happy then.
It plays pretty well for me at 1680x1050, high settings except Object Detail set to very high, DX10 on my sig rig.
It looks better than any game I've seen, yet a number of people debate whether it's bad code or state-of-the-art code; I tend to believe the latter. This game was designed to be a beast, and that's what it is. Could it have been designed differently and worked better on current gen hardware? Anything is possible, but then how much greater the effort and cost, and what would have been the result? Who knows.
For most people on this forum, all that's needed to get this game into the realm of performance that most have been accustomed to is a next gen GPU, which should be available soon. I know that it's on my list.
I love PC gaming because of titles like Crysis. It's like an art form, taking gaming to the next level. So many complain about the lack of PC titles compared to consoles, though it seems like this year the PC has gotten tons of great titles.
With my current hardware this is still a great, fun-to-play game in single player mode, and a new GPU takes it to the next level.
PC gaming hardware is more expensive than consoles, but it advances more quickly and as a result it's more bleeding edge, and with games like Crysis, tends to be a bit more satisfying for me.
I'd like to thank CryTek for pushing the envelope and delivering a next gen game on the PC. People love to criticize so much that we forget how to say thanks!
After testing my aging PC, a Barton 2500+ CPU overclocked to 2.2GHz and an X800XT at stock, the Crysis demo runs fine at 1024x768 with medium textures, medium shaders, medium water effects, and everything else low. The video card did better than expected and the fps stayed above 25 most of the time at these settings, which is not optimal fps, but it was playable.
The CPU didn't fare as well. It required all physics and audio settings to be set to low, and it still paused every once in a while during heavy firefights with explosions. The CPU had a hard time with the auto-saves too.
Anyways, I don't see why people are complaining if my aging PC can run the demo decently. My system is over two years old. The game looks great even at medium settings; reduce the resolution a little if needed. I am glad to see high-end video cards struggling with a game. It actually means there's a reason to purchase them. You shouldn't be able to run brand-new games at resolutions above 1600x1200 with any video card. People that buy an 8800GTX for WoW should be shot............ok, I don't know where that came from.
Opteron 170 @ 2.2, 4GB RAM, dual 7900GTX SLI in x64. Runs OK on a mix of High and Medium settings at 1280x1024. Don't want to think what the potential Christmas present monitor upgrade is gonna do to those frames...
I think people are forgetting the "Pre-Release Demo" text that is displayed. This is not a "gold" build; between now and the final version it is possible optimizations will be made, not to mention improvements from drivers in the meantime.
One really big bug is that if I run into a stationary barrel I can die somehow.
SLI is not supported in the Demo, so the full version should see some significant gains on your system.
People complaining along the lines of 'I paid $600 for this card and this game does not run smoothly at max res, therefore it sucks' need to do a reality check.
If you paid $600 for a video card then you're either an utter moron, or you're a smart person with money, means *and* the understanding that you paid premium price for that extra bit of performance over a $350 part because you wanted to.
As we all know, the price/performance ratio dwindles at the highest end. You basically bought a part that's 7 or 8 months old now at least, a part that was peddled as a DX10 part but came out when there were no DX10 games to speak of.
Obviously those of us who picked up a 88XX part knew we were getting excellent DX9 performance, but the DX10 hype was just that, hype, which you should have known.
You'd have to be silly to think that your performance with game engines that have been around for 2-3 years, like HL2 and Doom 3, would be comparable to one that hadn't even been released yet, like Crysis.
I bought my 8800 GTS without any illusions about DirectX 10 bullshit. I expected - and received - top-end DX9 performance, and expected that a better, revised DX10 part would come out when actual DX10 games came out, and that my part would perform pretty much like shit for DX10 if I wanted all the eye candy.
The situation was just the same with the 68XX parts people bought, and then complained about Doom 3 performance when it came out. Performance wasn't smooth at high settings unless you went SLI or got the later-released 78XX.
Those who decided to pay a 40% premium for 10% more performance with the 88XX Ultra knew full well that all they were purchasing was the status of having the top-of-the-line card of a given chipset; it doesn't grant some imaginary entitlement to good performance at high settings for whatever comes down the road.
Edit: And lo and behold, it looks like a single 8800 GT, which should retail for $250, pulls in almost the same frame rate as an 8800 GTX in Crysis at 19x12, according to the Tweaktown.com article linked elsewhere. Once again proving that if you're buying hardware, buy it for the games that are already around, not for imaginary hype about what will and won't come out and how it will or won't perform.