Another Crysis performance moan. GTS to GT

Quick history lesson.
HL2's delay had nothing to do with Valve feeling benevolent, wanting our hardware to catch up with their software. Ditto for Doom3.

If you recall, Valve claimed that their code had been compromised and that it fell into the hands of some hacker, forcing them to delay the entire project. Whether that is true or not is an entirely different subject.

Doom3 simply didn't launch when expected.

Their delays had no connection to the release of hardware, or the lack thereof in this case.

It just so happens that by the time they did eventually come out, another game, FAR CRY, had already stolen their graphical thunder by about 7-8 months, and had already benefited from cards such as the X800 XT PE and 6800 Ultra.

The simple fact of the matter was that they (id and Valve) were late to the game, and their respective titles had already lost much of their "wow appeal" to Far Cry.

You're the only person I've come across who has such a distorted version of history with respect to the 2004 FPS era.


And for the record, as a software developer, your game is on a set schedule, and you have publishers who are looking to make a profit off of your product.
You can't just sit around and wait for the unpredictable hardware world to catch up to your software.
Id, Epic, Bethesda, Crytek, amongst a slew of others have never done that.

Even if we go as far back as Quake 1, we come to realize what I say is true, as I noted in my previous post as well.

Their software is always more cutting edge than the hardware of the time.
That's why with PC games, there are always advanced settings, tweakable options, config files that can be modified, etc., to tap the unused graphics potential of games such as Crysis or even Doom3.

Hell, if you recall correctly, it was id in 2004 who stated that Doom3 had an ULTRA mode for textures that couldn't be realized unless you had a minimum of 512MB of VRAM, even citing 1GB as preferable.
Yeah, as if there were any such cards in '04. The best we had were 256MB cards. :eek:

Seriously... why does that even matter? You can speculate on why the game was delayed or not, but that doesn't change the fact that the game was playable on top of the line systems @ max settings. So really, that doesn't matter. And how is it that you know exactly why these games were delayed? Frankly, if you didn't work for id/Valve at those given times, you don't have accurate information as to why they were delayed.

Honestly though, you must have some affiliation with Crytek or something, because what myself and countless other people have said about Far Cry is that while it looked great, it got pretty damn boring, especially in the latter half of the game. I enjoyed Doom 3 more than Far Cry, and HL2? Yeah, I loved that game.
 
Have you even played the game? :rolleyes::confused:

(a) The demo = the 1st level of the full game.

(b) The demo was based on a final build.

(c) Nothing changed from the demo to the final game

(d) Connect the dots...

Really, why do I even bother sometimes.... :eek:


Furthermore, just because you may get 5 fps during a firefight (which I doubt; it doesn't even sound like you played the game, since being unable to tell the difference between the demo and the 1st level kind of gives it away), it certainly doesn't mean the whole world does too.
And, certainly not on a Quad Core, 4GB RAM, 8800GT/GTX/Ultra system.
So, stop trying to make it sound as if your system is representative of the whole. I know mine is not either, but at least I make it a point every time to explicitly state under what conditions one can expect to attain very respectable performance (which I define as no noticeable slowdowns, i.e. smooth gameplay, and excellent graphics).
Even during the firefights with multiple NPCs on screen at once, MY frames remained > 20 FPS.
Really, the hyperbole, the sophistry, and the sheer amount of misinformation that some of you post on here is just shameful.

I'm definitely not in the minority here with these kinds of problems. Besides, when have I said my problems are YOUR problems, or anyone else's for that matter? Everyone has different issues and varying performance; just because I don't put Crytek up on a pedestal doesn't mean you need to take that as me claiming you have the same issues as I do.

And you really think I'd come in here bringing up these issues if I hadn't tried to play the game? I know that after hours and hours of tweaking, formatting, reinstalling, driver updates, etc., the game plain won't run at respectable levels for me to enjoy it. I'm the type who will spend a lot of time trying to fix a PC-related issue, but I've tried all I can think of or find here, and I've gotten nothing. And I'm not going to make it look like ass and then go play it. My gaming time is better spent on games that look and perform better. I'll wait for a patch, or sell the game.
 
I find it to be a steaming pile. Sure, I can run in the 30's, but way too many times the game is not smooth, and 25fps is not smooth to me. This is with the demo, not the retail, which I have no intention of purchasing given its lackluster performance on a high-end machine.

Rig is in sig
 
Seriously... why does that even matter? You can speculate on why the game was delayed or not, but that doesn't change the fact that the game was playable on top of the line systems @ max settings.

It matters a lot.

HL2 and Doom3 were heralded as being these graphical gifts to gamers (honestly, we knew little of their gameplay, so much of the hype was around their looks). However, they both arrived late to the show, and it was the unexpected and comparatively low-profile Far Cry which stole the show from them in terms of graphics, simply because it came out 7 months before them.

They lost their luster, their pizzazz, that unique quality which made them stand out above the rest, because one game had already beaten them to the punch, at least in graphics.

People tend to remember that, whether you do or not.


Furthermore, HL2/Doom3 in their "Max" modes certainly did not look significantly better than the modes directly beneath them.

Take Doom3 for example: its ballyhooed "Ultra" mode looked almost identical to its "High" mode, with almost no difference whatsoever.

Crysis on High, in DX9, can be custom edited to look much like the DX10 Very High mode for a nominal performance hit as well.

So really, once you enable parallax occlusion mapping, sunshafts, object motion blur (you don't even have to apply it to yourself, which imho is even better), edge AA on foliage, and 3D waves, then all of a sudden a quasi-"High" mode looks just about as good as DX10 Very High, but performs a hell of a lot better.
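
For anyone who wants to try it, this is roughly what that looks like in an autoexec.cfg, one cvar per effect in the same order I listed them above. I'm going from memory here, and the exact cvar names and values can vary between patches, so treat this as a starting point rather than gospel:

r_UsePOM = 1
r_sunshafts = 1
r_MotionBlur = 2
r_UseEdgeAA = 2
e_water_ocean_fft = 1

Drop those lines into an autoexec.cfg in the game's root folder (or punch them into the console) while running High in DX9, then compare screenshots against DX10 Very High and judge for yourself.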

So in a manner of speaking, rigs even today can run it virtually maxed out, with very playable framerates. If I can more or less do that on a single 8800GT, then I'm absolutely positive an Ultra SLI rig won't have any problem.
 
I find it to be a steaming pile. Sure, I can run in the 30's, but way too many times the game is not smooth, and 25fps is not smooth to me. This is with the demo, not the retail, which I have no intention of purchasing given its lackluster performance on a high-end machine.

Rig is in sig

No way is that statement true.

I seriously doubt you are experiencing 25 fps in that case, because 25 fps is smooth in Crysis, so long as it doesn't dip more than a few frames below that.

A constant 25-35 FPS experience has been par for the course for me with the game, rarely, if ever, dipping below that threshold, or going too high above it. As such, it has been a very smooth ride from beginning to end.

I'd like to see you post some screenshots using FRAPS as you are playing it.

I very much doubt you would be complaining about 25 FPS, because most everyone can agree that in Crysis, 25+ FPS feels smooth compared to most games out there.

The complaints are usually centered around the 15 - 24 FPS range that many people experience, which I did too when I had a GTS.

Back then, I would agree that the performance left much to be desired. However, with my new card, the performance/looks ratio is just right.
 
Seeing you guys whine is just sad. The fact that we currently CANNOT run this game at the highest settings is a testament to how big of a step this game takes in terms of graphics. Sure, you can say that the game is "coded badly", but until you have the source code, you cannot make that assumption. When we can run it at its maximum, it will look and play as good as advertised. I think most of you guys with 8800s are just peeved cuz not being able to run this game at its max shrunk your e-peen. Solution: lower your settings and STFU.
 
I don't believe the "coding" of the game has anything to do with the lack of performance. I think the bottom line is that we have graphics cards with 128 stream processors or whatever and we really need more like 256 or something.
 
No way is that statement true.

Really? why's that? because it doesn't side with what you believe?

I seriously doubt you are experiencing 25 fps in that case

Actually I'm just making it up to shame my system.

I very much doubt you would be complaining about 25 FPS, because most everyone can agree that in Crysis, 25+ FPS feels smooth compared to most games out there.

Looking at most messages, this is not the case.

I'm not sure why you're hell-bent on trying to convince people this game runs great, when all I see is performance issues. I don't know if you're trying to inflate the *greatness* of your card, or just to increase the size of your e-penis.

Either way, it's been a long time since I've seen someone carry on like this, sporting the attitude of *if it works for me, then it's gotta work for you.*
 
Seeing you guys whine is just sad. The fact that we currently CANNOT run this game at the highest settings is a testament to how big of a step this game takes in terms of graphics. Sure, you can say that the game is "coded badly", but until you have the source code, you cannot make that assumption. When we can run it at its maximum, it will look and play as good as advertised. I think most of you guys with 8800s are just peeved cuz not being able to run this game at its max shrunk your e-peen. Solution: lower your settings and STFU.

I am sorry I am going to have to be the one to tell you, but you, my friend, are an idiot! With such a dumb-ass statement, I wonder why you ever thought opening your mouth would make any difference. First of all, us people with 8800's are mad because the game performs like crap, period. When a new card comes out with a faster clock and less RAM but more stream processors, and it outdoes two slightly slower cards in SLI, something is wrong. I'm not taking anything away from the GT, because it is a great card, but having a card that is just a bit better than the GTS match the performance of two of them in SLI warrants some major bitching from the community!

SO DO YOURSELF A FAVOR AND STFU!
 
I am sorry I am going to have to be the one to tell you, but you, my friend, are an idiot! With such a dumb-ass statement, I wonder why you ever thought opening your mouth would make any difference. First of all, us people with 8800's are mad because the game performs like crap, period. When a new card comes out with a faster clock and less RAM but more stream processors, and it outdoes two slightly slower cards in SLI, something is wrong. I'm not taking anything away from the GT, because it is a great card, but having a card that is just a bit better than the GTS match the performance of two of them in SLI warrants some major bitching from the community!

SO DO YOURSELF A FAVOR AND STFU!

Or, you can do yourself a favor, calm the hell down, think rationally for a second (that's all it takes), and sell your GTS while the selling is still good on places such as eBay, then add a paltry $20 or so and get a higher-performing 8800GT, which performs better across the board. And instead of carrying on a PMS fest like many of you, I am actually enjoying the game as it was intended.

You people make it sound like it's the end of the world however.

OH TEH NOES, teh 8800GTS that i boughts 1 yearz ago iz not gud enuf for teh Crysis, Cryteks = teh SUCK!

Get over it.
The GTS is one year old tech by now.

The GT, although a refresh of that tech, and although selling at a lower MSRP, is by no means something you should be holding contempt towards.

Hell, in 3 months time, the 9800 series cards are expected to be launched, so some of you people can shut up once and for all.

It can't come soon enough either.

I swear, it's like some of you are teenagers, or new to PC gaming or something.

It has ALWAYS been the case that every generation, a new game will be launched that pushes existing hardware to the limit and cannot be truly maxed out until a subsequent generation of hardware is released.

Have some of you been living under a rock these last 12 years?

I mean, comparatively speaking, Crysis averaging ~35-40+ FPS on GT/GTX/Ultra systems in HIGH mode, all the while putting every single other game on the market to utter shame graphically, is a testament to the fact that it runs well, not poorly, contrary to what GTS owners and owners of ATI cards believe.

There are cards today, TODAY, that can run it on HIGH with a 35-40+ FPS average, and yes, that bears repeating, because some people appear to still be ignorant of that fact. As such, if your 1-year-old graphics card (whether you just purchased it or not, it's still 1+ year old tech) can't run it properly, that is not the fault of the game.

No, it's your fault for expecting too much out of already outdated technology.
 
Let's all buy the fastest car ever invented now and wait 10 years for the fuel to be invented, so you can drive it.
 
Let's all buy the fastest car ever invented now and wait 10 years for the fuel to be invented, so you can drive it.

Once again, posting fallacious and sensational claims like that only serves to highlight the ignorance of some posters on this board with respect to the performance of this game on currently existing technology, such as 8800GT / GTX / Ultra cards.

If you don't consider 35-45 FPS average on High at 1680x1050 to be playable, then the problem does not lie with the game, the problem lies with you.
 
Or, you can do yourself a favor, calm the hell down, think rationally for a second (that's all it takes), and sell your GTS while the selling is still good on places such as eBay, then add a paltry $20 or so and get a higher-performing 8800GT, which performs better across the board. And instead of carrying on a PMS fest like many of you, I am actually enjoying the game as it was intended.

You people make it sound like it's the end of the world however.

OH TEH NOES, teh 8800GTS that i boughts 1 yearz ago iz not gud enuf for teh Crysis, Cryteks = teh SUCK!

Get over it.
The GTS is one year old tech by now.

The GT, although a refresh of that tech, and although selling at a lower MSRP, is by no means something you should be holding contempt towards.

Hell, in 3 months time, the 9800 series cards are expected to be launched, so some of you people can shut up once and for all.

It can't come soon enough either.

I swear, it's like some of you are teenagers, or new to PC gaming or something.

It has ALWAYS been the case that every generation, a new game will be launched that pushes existing hardware to the limit and cannot be truly maxed out until a subsequent generation of hardware is released.

Have some of you been living under a rock these last 12 years?

I mean, comparatively speaking, Crysis averaging ~35-40+ FPS on GT/GTX/Ultra systems in HIGH mode, all the while putting every single other game on the market to utter shame graphically, is a testament to the fact that it runs well, not poorly, contrary to what GTS owners and owners of ATI cards believe.

There are cards today, TODAY, that can run it on HIGH with a 35-40+ FPS average, and yes, that bears repeating, because some people appear to still be ignorant of that fact. As such, if your 1-year-old graphics card (whether you just purchased it or not, it's still 1+ year old tech) can't run it properly, that is not the fault of the game.

No, it's your fault for expecting too much out of already outdated technology.

Look, you start by saying we are crying and have no right to be pissed off, yet when you get called on it, you say I need to calm down! Ok, fine, whatever, I will calm down, and you should think before you speak. I didn't say in any way that I would expect the 8800GT to outperform one GTS, but two? You've got to be kidding me. And to say it's because the game is so advanced, yet it runs great on the GTX, makes no sense. I am pissed because Crytek explained how wonderfully their engine would run on systems of today, and it was designed with the GTS as a nice mid-range card. Yet it doesn't. Oh, and you say sell and buy another card; sorry, but I have bills, a bitching annoyed wife, a child, and a baby on the way. I thought that $800 for GPUs would be enough to give me good visuals (not top of the line, but close) for at least two years. These damn things have caused me more grief than I can explain. So I won't dump them to buy another mid-range card. I will wait for the 9800 GTX, get those in about a year, and then call it a day. As for coming off so brash to you, I apologize, but I don't like being told that venting your frustrations shouldn't be allowed because it is just crying. As a consumer, I have a right to be pissed and not be a mindless drone who buys for bragging rights instead of for value.
 
Look, you start by saying we are crying and have no right to be pissed off, yet when you get called on it, you say I need to calm down! Ok, fine, whatever, I will calm down, and you should think before you speak. I didn't say in any way that I would expect the 8800GT to outperform one GTS, but two? You've got to be kidding me. And to say it's because the game is so advanced, yet it runs great on the GTX, makes no sense. I am pissed because Crytek explained how wonderfully their engine would run on systems of today, and it was designed with the GTS as a nice mid-range card. Yet it doesn't. Oh, and you say sell and buy another card; sorry, but I have bills, a bitching annoyed wife, a child, and a baby on the way. I thought that $800 for GPUs would be enough to give me good visuals (not top of the line, but close) for at least two years. These damn things have caused me more grief than I can explain. So I won't dump them to buy another mid-range card. I will wait for the 9800 GTX, get those in about a year, and then call it a day. As for coming off so brash to you, I apologize, but I don't like being told that venting your frustrations shouldn't be allowed because it is just crying. As a consumer, I have a right to be pissed and not be a mindless drone who buys for bragging rights instead of for value.


Well, if my MBA taught me anything (much of it is common sense, really), it's that you should always try to minimize your losses and maximize your margins.

In this case, now would be the ideal time to sell your GTS's while they are still worth something. You could even turn a profit! ;)

Hell, just 1 week ago, Dell was selling the MSI 8800GT (factory overclocked to 675 MHz -- performing on par with an 8800GTX) for $209.

Now I know for a fact that you could sell your 8800GTS's for more than that on eBay.

I just sold a 320MB card 2 weeks ago for $220 for cryin out loud.

In doing so, you could invest in an 8800GT, or even a pair of them in SLI for that matter.
The money you would make off the sale of the 8800GTS's would cover the costs.

It's simple economics really. Instead of holding onto something that is depreciating at a higher rate, you should sell while you can still get a good price for them.

Heck, even if you decide to invest in a 9800GTX somewhere down the line, I guarantee you that the resale value of the 8800GT will be higher than that of the 8800GTS by then.
So, you can't really lose money on this transaction if you play your cards right. You can only stand to reduce the costs you will incur for future upgrades by getting rid of those 8800GTS cards now.

Thus, it's a win-win situation. It's very very simple logic.
You don't stand to lose money, you would break even for now, and end up with a huge performance gain on top of it.

How such simple logic can escape anyone is beyond me. :confused:
Nothing personal.
 
Once again, posting fallacious and sensational claims like that only serves to highlight the ignorance of some posters on this board with respect to the performance of this game on currently existing technology, such as 8800GT / GTX / Ultra cards.

If you don't consider 35-45 FPS average on High at 1680x1050 to be playable, then the problem does not lie with the game, the problem lies with you.

:rolleyes:

http://hardocp.com/article.html?art=MTQxMCwzLCxoZW50aHVzaWFzdA==

And that's not even with everything on High, and it still doesn't manage a 35 FPS average.

Before you tell people that the way they play and think of games are wrong, you should maybe get some facts straight first. I'm pretty sure attacking the person instead of the person's claim is "fallacious" as well.

In either mode, you never hit this 35-45 FPS average with "high" settings at 16x12 res (I know you said 16x10, but who benchmarks that res anyway?).

Can't even use AA or AF. Sounds pretty unplayable to me on the latest and greatest with "high" and best visual settings.
 
Well, if my MBA taught me anything (much of it is common sense, really), it's that you should always try to minimize your losses and maximize your margins.

In this case, now would be the ideal time to sell your GTS's while they are still worth something. You could even turn a profit! ;)

Hell, just 1 week ago, Dell was selling the MSI 8800GT (factory overclocked to 675 MHz -- performing on par with an 8800GTX) for $209.

Now I know for a fact that you could sell your 8800GTS's for more than that on eBay.

I just sold a 320MB card 2 weeks ago for $220 for cryin out loud.

In doing so, you could invest in an 8800GT, or even a pair of them in SLI for that matter.
The money you would make off the sale of the 8800GTS's would cover the costs.

It's simple economics really. Instead of holding onto something that is depreciating at a higher rate, you should sell while you can still get a good price for them.

Then, in 3-4 months' time, when the 9800GTX (or whatever label they give it) is primed for release, the resale value of the 8800GT will be higher than that of the 8800GTS.

Thus, it's a win-win situation. It's very very simple logic.
You don't stand to lose money, you would break even for now, and end up with a huge performance gain on top of it.

How such simple logic can escape anyone is beyond me. :confused:
Nothing personal.

I understand what you are saying, and I am not saying you are wrong, because that wasn't the purpose of the statement. I am pissed that a card that was supposed to be seen as a card Crysis could push isn't, and that, for that matter, two of them give the same performance as 1 GT. I will take your advice and start looking at getting two GTs, but my wife is highly annoyed with my PC foray and has me on a short leash :mad:!
 
I kinda agree with the other guy about Crytek. They said this game would be easily playable on a GTS with high settings, a low resolution, and no filtering. No matter what CPU you have, what they said is false.

They sold this game saying that a GTS was good enough for high settings, which is just BS. A GT/GTX/Ultra is cutting it, but not by much, and only if you're lucky.
 
:rolleyes:

http://hardocp.com/article.html?art=MTQxMCwzLCxoZW50aHVzaWFzdA==

And that's not even with everything on High, and it still doesn't manage a 35 FPS average.

Before you tell people that the way they play and think of games are wrong, you should maybe get some facts straight first. I'm pretty sure attacking the person instead of the person's claim is "fallacious" as well.

In either mode, you never hit this 35-45 FPS average with "high" settings at 16x12 res (I know you said 16x10, but who benchmarks that res anyway?).

Can't even use AA or AF. Sounds pretty unplayable to me on the latest and greatest with "high" and best visual settings.

:rolleyes::confused:

Do you even read your own links?

The DX9 results for 1600x1200 indicate that on all HIGH settings, the GTX nets 30 FPS, while on mostly HIGH settings, the GT nets 33.3 FPS average.

Also, for the record, the system as a whole makes a difference, and in the [H] benchmark, they were using a slower clocked C2D versus a faster clocked C2Quad, as in my benchmarks or as in the Tweaktown one.

Furthermore, the [H] benchmark system was using only 2GB of RAM, versus 4GB of RAM as in my benchmarks.

Thirdly, 1680x1050 is a VERY VERY common wide-screen resolution intrinsic to 22" monitors, which seem to be all the rage these days.

And on that note, 1600x1200 renders about 8.8% more pixels to the screen than 1680x1050 does (1,920,000 vs 1,764,000).

So, taking that into account, if we scaled the aforementioned FPS averages up accordingly, we would yield roughly:
32.7 FPS for the GTX
36.2 FPS for the GT

at their respective settings (High, and mostly High).
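
To spell out the back-of-the-envelope math, assuming framerate scales roughly inversely with pixel count (a fair assumption when you're GPU-bound):

1600x1200 = 1,920,000 pixels; 1680x1050 = 1,764,000 pixels
1,920,000 / 1,764,000 = ~1.088, i.e. ~8.8% fewer pixels to render at 1680x1050
GTX: 30.0 FPS x 1.088 = ~32.7 FPS
GT: 33.3 FPS x 1.088 = ~36.2 FPS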

That also doesn't take into account the higher clocked Quad Core versus the lower clocked C2D.

That also doesn't take into account the 4GB vs 2GB of RAM.

That also doesn't take into account the 169.01 vs 169.09 drivers for that matter.

My point clearly stands.

I already posted the benchmarks before to corroborate that as well. :eek:
 
I understand what you are saying, and I am not saying you are wrong, because that wasn't the purpose of the statement. I am pissed that a card that was supposed to be seen as a card Crysis could push isn't, and that, for that matter, two of them give the same performance as 1 GT. I will take your advice and start looking at getting two GTs, but my wife is highly annoyed with my PC foray and has me on a short leash :mad:!

:p Good luck to ya.

It's a vicious circle this upgrade hobby of ours, but it can also be a rewarding one if you play your cards right.

Now, perhaps more than ever, is a great time to put down cash for a card.

I'm not sure if it's ATI's resurgence and aggressive pricing with their 3800 series or what, but not since the days of the Voodoo2 can I remember a time when you could purchase a TOP-rated card from either manufacturer for less than $250 street price.

It's a good time to invest. :)
 
:p Good luck to ya.

It's a vicious circle this upgrade hobby of ours, but it can also be a rewarding one if you play your cards right.

Now, perhaps more than ever, is a great time to put down cash for a card.

I'm not sure if it's ATI's resurgence and aggressive pricing with their 3800 series or what, but not since the days of the Voodoo2 can I remember a time when you could purchase a TOP-rated card from either manufacturer for less than $250 street price.

It's a good time to invest. :)

Thanks, but I wouldn't really call this an investment, seeing that you lose money hand over fist at an almost scary pace. I still don't see why stepping down to step up makes sense when you run the specs. I know we can see the GT outperforming the GTS, but will there be a time when I need the wider bus (320-bit vs. 256-bit) and the extra RAM (640 vs. 512)? I do game at 1920 x 1080. What is your take on that?
 
Because I'm a glutton for punishment and my laptop TECHNICALLY can run the game. I loaded it up and decided what it could do...

Pentium M 1.6 - Radeon Mobility 9600 64meg w/ Omega Drivers - 1GB DDR-333 - Windows 2K modded files.

It ran... 5-15 FPS @ 800x600. Got to the dude you're supposed to meet up with on the beach.

Didn't think 1024x768 would be that much of a jump... it crawled... it moaned... 22 minutes, 600megs of ram and running out of video memory for textures later, it said no more.

It was an interesting experiment and I had fun posting this for anyone interested.
 
Thanks, but I wouldn't really call this an investment, seeing that you lose money hand over fist at an almost scary pace. I still don't see why stepping down to step up makes sense when you run the specs. I know we can see the GT outperforming the GTS, but will there be a time when I need the wider bus (320-bit vs. 256-bit) and the extra RAM (640 vs. 512)? I do game at 1920 x 1080. What is your take on that?

It sort of feels that way, though, doesn't it? But it's really not a step backwards, rather an incremental step forwards, one that can on occasion reward the user with up to 50% gains in performance.

I guess it's the pricing and the lesser amount of RAM that fool many people into thinking that way, but my take is that Nvidia wanted to release a part that was easy to produce, cheap to manufacture, and gave them a competitive edge over ATI's recently released 3800 series cards, and the GT does just that.

It is also my belief that they purposely didn't change the name to, say, the 8900 GT, perhaps to get rid of excess stock of 8800 GTS's, in a sense to fool unsuspecting customers into thinking that an 8800GTS with 640MB of RAM valued at $350 is a better performer/deal than an 8800GT with 512MB of RAM valued at $250.

For the unsuspecting customer, NV stands to make cash and get rid of excess stock of 8800GTS's.
Those that know better, the well-informed, will purchase the cheaper 8800GT, which NV knows can easily compete with even ATI's best offerings at around the same price.
Win-win situation for NV: make money off the uninformed and the informed alike :cool:


Anywho, marketing and economics aside, from a performance perspective, I personally haven't seen any benchmarks which would give the GTS 640MB an edge over the 512MB GT at 1920x1200.

It could be any number of things:
- the 112 shader processors of the GT vs the 96 of the GTS cards
- the faster shader clock
- the faster core clock
- the faster memory clock
- the optimized ROP compression algorithms
- God :D


As far as the memory bus goes (320-bit vs 256-bit), most benchmarks have shown the difference to be nominal at best.
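
If you run the raw bandwidth numbers (stock clocks quoted from memory, so double-check them), the gap is smaller than the bus widths alone suggest, because the GT's memory is clocked faster:

8800 GTS: 320 bits / 8 x 1600 MHz effective = 64.0 GB/s
8800 GT: 256 bits / 8 x 1800 MHz effective = 57.6 GB/s

So the GT only gives up about 10% of raw memory bandwidth, while gaining more shaders at higher clocks, which would explain why the narrower bus rarely shows up in the benchmarks.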

Here are a few reviews that pit the cards against one another.
http://www.guru3d.com/article/Videocards/468/
http://www.vr-zone.com/articles/Nvidia_GeForce_8800_GT_Review/5369-6.html

The GT beats the GTS at 1920x1200 across the board, and at 2560x1600 too for that matter. So apparently the 128MB of extra RAM and the 320-bit memory bus don't really make much of a difference.
Although at that res the GPU itself is taxed quite heavily as well, so evidently the faster clocks on the GT play a role here.
 
Because I'm a glutton for punishment and my laptop TECHNICALLY can run the game. I loaded it up and decided what it could do...

Pentium M 1.6 - Radeon Mobility 9600 64meg w/ Omega Drivers - 1GB DDR-333 - Windows 2K modded files.

It ran... 5-15 FPS @ 800x600. Got to the dude you're supposed to meet up with on the beach.

Didn't think 1024x768 would be that much of a jump... it crawled... it moaned... 22 minutes, 600megs of ram and running out of video memory for textures later, it said no more.

It was an interesting experiment and I had fun posting this for anyone interested.

:eek: :p
 
It sort of feels that way, though, doesn't it? But it's really not a step backwards, rather an incremental step forwards, one that can on occasion reward the user with up to 50% gains in performance.

I guess it's the pricing and the lesser amount of RAM that fool many people into thinking that way, but my take is that Nvidia wanted to release a part that was easy to produce, cheap to manufacture, and gave them a competitive edge over ATI's recently released 3800 series cards, and the GT does just that.

It is also my belief that they purposely didn't change the name to, say, the 8900 GT, perhaps to get rid of excess stock of 8800 GTS's, in a sense to fool unsuspecting customers into thinking that an 8800GTS with 640MB of RAM valued at $350 is a better performer/deal than an 8800GT with 512MB of RAM valued at $250.

For the unsuspecting customer, NV stands to make cash and get rid of excess stock of 8800GTS's.
Those that know better, the well-informed, will purchase the cheaper 8800GT, which NV knows can easily compete with even ATI's best offerings at around the same price.
Win-win situation for NV: make money off the uninformed and the informed alike :cool:

Or how about this: I step up the card to an 8800GTX and sell the other card to get a Cooler Master Stacker (I can barely fit the GTS's in my case), and wait to get another GTX for cheap when the 9800's come out :p?

Anywho, marketing and economics aside, from a performance perspective, I personally haven't seen any benchmarks which would give the GTS 640MB an edge over the 512MB GT at 1920x1200.

It could be any number of things:
- the 112 shader processors of the GT vs the 96 of the GTS cards
- the faster shader clock
- the faster core clock
- the faster memory clock
- the optimized ROP compression algorithms
- God :D


As far as the memory bus goes (320-bit vs 256-bit), most benchmarks have shown the difference to be nominal at best.

Here are a few reviews that pit the cards against one another.
http://www.guru3d.com/article/Videocards/468/
http://www.vr-zone.com/articles/Nvidia_GeForce_8800_GT_Review/5369-6.html

The GT beats the GTS at 1920x1200 across the board, and at 2560x1600 too for that matter. So apparently the 128MB of extra RAM and the 320-bit memory bus don't really make much of a difference.
Although at that res the GPU itself is taxed quite heavily as well, so evidently the faster clocks on the GT play a role here.

Thanks for the info. Much easier to talk when both parties are calm and on the same page :p! Anywho, I have checked EVGA's site to see if I can "step down" to "step up" and indeed I can, but the site is freaking out and crashes when I try to upgrade. I will figure it out soon enough and try to dump my other 8800GTS on eBay or Craigslist for $200 to make up the difference for the other card.
 
Thanks for the info. Much easier to talk when both parties are calm and on the same page :p! Anywho, I have checked EVGA's site to see if I can "step down" to "step up" and indeed I can, but the site is freaking out and crashes when I try to upgrade. I will figure it out soon enough and try to dump my other 8800GTS on eBay or Craigslist for $200 to make up the difference for the other card.

Step down to step up, that is hilarious. :D

I'm just hoping the 9800 series launches within 70 days. I fear that I'll miss it by a few days though :(

Anyhow, best of luck to ya in your dealings with EVGA. :)
Oh, and sorry for coming off as abrasive earlier.
 
Step down to step up, that is hilarious. :D

I'm just hoping the 9800 series launches within 70 days. I fear that I'll miss it by a few days though :(

Anyhow, best of luck to ya in your dealings with EVGA. :)
Oh, and sorry for coming off as abrasive earlier.

Not a problem, we both said our piece, then acted like adults, apologized, and had a great informative discussion. I have 65 days left on my step-up, so I know I will miss it, because it would be too much like right to just announce and then release the damn cards within that time. I am cool with it, but if I go up to a GTX and wait for everyone to dump theirs, I think I still will be happy. At 1920 x 1080 I saw people do amazing things with just one of those cards. In SLI I will be as happy as my wife in a boot store! I don't ever plan to go with a higher res, so for the next two years I would be very content with the GTX in SLI if I can make that happen. On the other hand, I have to jump ship and go with Intel; I hate the performance in games I am getting from my current 5200+. Not to mention the freaking wall I hit at 2.808GHz. Bring on the Quad Core!

P.S. The 5200+ is amazing for everything else but hardcore gaming; I will give the mobo, case, and proc to my son. He will be one happy 9-year-old! Lol
 
:rolleyes::confused:

Do you even read your own links?

The DX9 results for 1600x1200 indicate that on all HIGH settings, the GTX nets 30 FPS, while on mostly HIGH settings, the GT nets 33.3 FPS average.

Also, for the record, the system as a whole makes a difference, and in the [H] benchmark, they were using a slower clocked C2D versus a faster clocked C2Quad, as in my benchmarks or as in the Tweaktown one.

Furthermore, the [H] benchmark system was using only 2GB of RAM, versus 4GB of RAM as in my benchmarks.

Thirdly, 1680x1050 is a VERY VERY common wide-screen resolution intrinsic to 22" monitors, which seem to be all the rage these days.

And on that note, 1600x1200 renders about 8.8% more pixels to the screen than 1680x1050 does (1,920,000 vs 1,764,000).

So, taking that into account, if we scaled the aforementioned FPS averages up accordingly, we would yield roughly:
32.7 FPS for the GTX
36.2 FPS for the GT

at their respective settings (High, and mostly High).

That also doesn't take into account the higher clocked Quad Core versus the lower clocked C2D.

That also doesn't take into account the 4GB vs 2GB of RAM.

That also doesn't take into account the 169.01 vs 169.09 drivers for that matter.

My point clearly stands.

I already posted the benchmarks before to corroborate that as well. :eek:


Only one test shows it playable on all High settings... and even that test doesn't reach your 35-45 FPS... 1 test out of all of them, and you're saying I'm blind?

4GB vs 2GB of RAM doesn't make that significant of a difference for Crysis. It doesn't matter if I have 4GB or 8GB if Crysis doesn't use the RAM.
I also doubt Quad makes that big of a difference over Duo. I smell marketing.
I also disagree with your claim that 22" widescreen is "VERY VERY common." I'd bet 16x12 is still more common thus making 16x10 uncommon.

Even if I concede all those improvements, your system still BARELY nets the 35-45 FPS average on HIGH settings that you claimed. And I'd bet it still would drop below 25 FPS at times, making it playable only most of the time.

In fact, never mind, it doesn't net your 35-45 FPS average, because the average is barely over 35. How the hell is 36 FPS a 35-45 FPS conclusion?

Even if I concede overclocking, that makes an overclocked state-of-the-art system (meaning faster than anything anyone can buy stock) able to run Crysis at over a 35 FPS average most of the time.

/yay
 
Oh, and that's with 2x edge AA, sunshafts, a 2x increase in materials detail view distance, and 16x AF at 1600x1200.

As can be seen in the last shot, I had 6 NPCs shooting at me simultaneously from the front, one somewhere off on the side who naded me and brought down the house, and I was also shooting back sporadically throughout this sequence (emptied out my pistol).
That's about as intense as it gets in terms of simultaneous firefights at any given part of the game (up until some parts of the last few levels, of course), and yet my framerates did not waver.

you're easily getting twice what I get there. More than.
 
I just went from a 320 GTS to a GT and my frames went up pretty considerably. I went from 1280x720, all Medium, and about 15-25 fps, to 1280x768, all Very High (or High + 4x AA), with a little better fps: 15-35, 40ish. I am very pleased. It has made the game really enjoyable, where it was not before.
 
I just went from a 320 GTS to a GT and my frames went up pretty considerably. I went from 1280x720, all Medium, and about 15-25 fps, to 1280x768, all Very High (or High + 4x AA), with a little better fps: 15-35, 40ish. I am very pleased. It has made the game really enjoyable, where it was not before.

Yep, same experience here.
I just happen to play it at 1600x1200 and at mostly High settings (with a few Very High tweaks thrown in and a couple of touches of Medium), and the difference between my 8800 GTS 320 and this GT is like night and day.

:)
 
Now, after screwing around with it, I can run @ 1920x1200 all High, as long as I stay in DX9 mode.
 
Not sure if anyone's thought of this yet... but I imagine Crytek was banking on there being at least one more high-end generation of cards released by now. I mean, AMD/ATI has really let the pressure off of Nvidia, so the 6-month cycle is way out of whack.

It only figures that right about the time a developer finally decides to try to get the most out of the rapid-paced graphics card market, the GFX makers let them down.

I say kudos to Crytek for pushing this thing out without compromise. Not that they probably had much choice :p
 
I'm sure it's been said... but I think Crysis' makers are idiots... they clearly have a poor engine which needs improvement.
Why is it that other games can pack very high quality graphics and they can't?
It's all software... the designer cards they use don't convert very well to conventional gaming cards... and that's a software issue, not hardware! :eek:
I'm tired of poorly written software (not that the game itself is bad), just poorly executed! Especially with all the quad cores and SLI technology that we have, and we have to play this game in (LOL) slow-mo? I suspect they are already improving their graphics engine significantly. :eek:
 
I'm sure it's been said... but I think Crysis' makers are idiots... they clearly have a poor engine which needs improvement.
Why is it that other games can pack very high quality graphics and they can't?
It's all software... the designer cards they use don't convert very well to conventional gaming cards... and that's a software issue, not hardware! :eek:
I'm tired of poorly written software (not that the game itself is bad), just poorly executed! Especially with all the quad cores and SLI technology that we have, and we have to play this game in (LOL) slow-mo? I suspect they are already improving their graphics engine significantly. :eek:

I think that post of yours contains quite a bit of idiocy.

Every other game engine on the planet pales in comparison to CryEngine 2, and Crysis as a game, looks leaps and bounds better than any game out there, period.

Just because you can't run it with most of the eye candy turned up, and with good performance to boot, doesn't mean the engine is poorly coded.
If I can, if many a GTX/Ultra user can, and you can't, then that once again is YOUR fault, not Crytek's.

Why should they cater to the lowest common denominator?
It's those types of devs, and people like you, who hold the industry back, whether you are consciously aware of it or not.

I celebrate the fact, on the other hand, that a very select few configurations out there can actually run this game close to what was intended by the devs (we still need another generation of cards before we can step into the realm of "Very High" settings).

Quite simply, this is one of those rare gems released once every few years that raises the bar for graphics so high, that it will take other devs a year or more to catch up.

You can sit there and sulk over it, but excuse me while I go play my Crysis on HIGH, at 1600x1200, and get 30+ FPS average.
 
Not sure if anyone's thought of this yet....But I imagine Crytek was angling on there being at least one more high end generation of cards released by now. I mean, AMD/ATI has really let the pressure off of Nvidia. So the 6month cycle is way out of whack.

It only figures that about time a developer finally decides to try and get the most out of the rapid paced graphics card market, the GFX makers let them down

I say kudos's to Crytek for pushing this thing out without compromise. Not that they probably had much choice:p

Definitely a very plausible theory.

Although I am quite pleased with my performance, I'd love to test out those Very High settings. I wonder if NV can deliver a card with 2 times the performance of an 8800 Ultra with the presumed 9800GTX.
 
Not sure if anyone's thought of this yet... but I imagine Crytek was banking on there being at least one more high-end generation of cards released by now. I mean, AMD/ATI has really let the pressure off of Nvidia, so the 6-month cycle is way out of whack.

It only figures that right about the time a developer finally decides to try to get the most out of the rapid-paced graphics card market, the GFX makers let them down.

I say kudos to Crytek for pushing this thing out without compromise. Not that they probably had much choice

I believe a different theory, which is: EA put pressure on Crytek to release ahead of schedule to beat the holiday shopping season. Some unacceptable bugs that leave the game unpolished, i.e. AA blowing up on FF on servers, sound bug loops, etc., are all evidence of this.
 
I'm sure it's been said... but I think Crysis' makers are idiots... they clearly have a poor engine which needs improvement.
Why is it that other games can pack very high quality graphics and they can't?
It's all software... the designer cards they use don't convert very well to conventional gaming cards... and that's a software issue, not hardware! :eek:
I'm tired of poorly written software (not that the game itself is bad), just poorly executed! Especially with all the quad cores and SLI technology that we have, and we have to play this game in (LOL) slow-mo? I suspect they are already improving their graphics engine significantly. :eek:

I would say that compared to Oblivion, this game is much more efficient, considering how much memory the textures are using and how much geometry is on your screen at any given time. The post-processing is really what puts the whuppin' on your rig, though. It's not an essential thing.
 