GeForce GTX 280 = Yawn.....Welcome to the Summer Refresh

Yes I want 8AA/32AF @ 2600x1900

... or I will cry. A lot. With tears.

Honestly, 99% of games are completely playable with a 9600GT / 3870. Under $150 gets you some good gaming. Crysis is ONE GAME out of thousands to choose from; play something else?
Point noted. But think about this one for a sec. Name one game that cannot be played at absolute max (besides Crysis) on the 3 most common resolutions (1280x1024, 1680x1050, 1920x1200) on the 9800GX2. Kinda tough, huh? That is the problem here. The GTX280 does not feel like a step forward to a lot of us. What would be the point of spending extra cash on the blazing fast card if the current cards hold up just fine? This is why I feel like nVidia is shooting themselves in the foot by releasing ANY card that cannot bring REAL performance boosts in Crysis. We need a SINGLE card that can do what Tri-SLI GTX 280s are doing in Crysis. I've already heard every argument against this, but the facts still remain...we do not need a new card if current cards are already maxing out games. Like I said in other threads, this card is so incredibly niche it is crazy. Only those who are playing Crysis @ 2560x1600 really need to upgrade to these cards. For everyone else anything from the 8800GT up to the 9800GTX is perfect, and in some cases complete overkill, for the PC games that are out currently.
 
When asked about Warhead, Cevat Yerli stated it was already in the works prior to the "no more PC exclusives" whining.
Gotcha. Makes sense then. I honestly think that Yerli is making a big mistake. I really believe that Crysis should stay exclusive. I think that if Yerli wants to do a CryEngine2 game on consoles then more power to him, but just leave Crysis on PC.
 
I agree with those points; the controls, even after fiddling with them for a while, still felt like they were designed for a console controller. I disagree with your other points though. I think the AI was alright and the physics seemed very good to me.
I disagree. Crysis did not feel console-like at all to me. When I think of a console-ported game, I think of Bioshock. I hated that game for PC. Even though they attempted to tweak the experience, it still reeked of a console port. Anyway, to each his own.
 
If Crysis was something special, Crytek would not have bailed on it so fast to work on something new.
 
I do not understand why they are changing to consoles, it really is beyond me.

The whole idea behind Crysis was to make something revolutionary, and from a graphical standpoint they did. Why would you take that same mentality and then use tools (consoles) that limit what you can achieve? I do not understand it at all.
 
My guess: it takes much less investment to develop a game for consoles than for PCs.
 
why blame nvidia for crysis's crappy engine?

Mainly because they're stupid. It's obvious it's going to be another generation before we can play Crysis at high resolutions maxed out, and claiming it's Nvidia and ATI's job to optimize just displays ignorance. Crytek clearly made some poor design choices with Crysis.

People seem to expect another 8800GTX or R9700Pro, but cards that can provide those kinds of increases are rare.

When comparing the GTX280 against an 8800GTX the performance gains are very nice; at 1920x1200 with AA, reviews have reported a 75% average increase, which is bloody good. Also the price really isn't that bad; prices I'm seeing are a little lower than the 8800GTX's when that card launched.

I do think nvidia scored an own goal with the 9800GX2 though. They really didn't have to make it as fast as they did, and now they're competing with their own product and losing in many benchmarks. All they had to do was beat the 3870X2. lol
 
What resolution are you running? Apparently Crysis doesn't need to be run at very high framerates to be playable. Many users state that 30fps is sufficient in Crysis, unlike in other games.


That is "30fps MINIMUM" because for me, frankly, even 25fps is getting notchy......

....Show me a card out, TODAY, that can maintain a MINIMUM of 30fps at 1920x1080 in Crysis with even 2xAA......no?

I play it at 1920x1080 with no AA, a mixture of medium and high settings, and it plays well enough that spending 400-600 bucks over my current G92 8800GTS is a total waste of cash. And yes, I can afford it, but I won't spend it; it's a matter of principle.

Besides, all I play online is CoD4 and my GTS rocks the house with that.....:p
 
... I feel like nVidia is shooting themselves in the foot by releasing ANY card that cannot bring REAL performance boosts in Crysis. We need a SINGLE card that can do what Tri-SLI GTX 280s are doing in Crysis. I've already heard every argument against this, but the facts still remain...we do not need a new card if current cards are already maxing out games. Like I said in other threads, this card is so incredibly niche it is crazy. ...

Don't forget the strategic importance of the GT200, nor the risks Nvidia has taken by choosing that strategy.

Think about how mass-market PC video cards have developed: the GPU was 2D; the GPU took on 3D; 3D split into general 3D and CAD/CAM 3D but with the same base GPU; video processing was added in too, but often as a specialised hardware unit; unified shaders came and the GPU started moving into the HPC arena, but with only single-precision floating-point ops. To make serious headway into the HPC market, a processor should be able to handle double-precision floating-point ops.

The GT200 is the first [edit]Nvidia[/edit] GPU to do double-precision floating point, and Nvidia has clearly stated that CUDA and going beyond gaming is where they are heading as a company. Given its heritage it is to be expected that it also excels at gaming, but just as the 8800GTX and 8800Ultra were powerful, power-hungry, expensive, bleeding edge products, so are the GT200-based products.

Nvidia had committed to giving a ~1TFLOP GPU to the gamers plus double precision to their HPC customers. Meeting these commitments was clearly more important strategically than doing a die-shrink. Generally speaking, gamers and HPC users at the bleeding edge have a different definition of "value" and are less concerned about heat and cost than the rest of us. We'll just have to wait for the refresh -- or settle for a competitor's "value" product. :)

To put it another way, even if the 280 & 260 (in 65nm parts) don't sell well in the consumer space, it will be a pity for Nvidia, but the GPUs won't go to waste; they'll get put into Tesla products, which have fatter profit margins anyway.

Of course, in pursuing this strategy Nvidia risks alienating its largest customer base ... and it risks not capturing its intended new market.

To me, the danger to Nvidia is that they may have tried to hit too many targets with the GT200. The GT200 only has 1 double-precision floating-point unit for every 8 single-precision units. This means that whilst it can do ~1TFLOP at single precision, it can probably only do about 100-130GFLOPs at double precision. Now HPC is not my area, so I don't know what proportion of compute problems may require double precision (it isn't going to be 100% of course; there are plenty of integer and single-precision things for a program to run), but it seems to be a bit of a risk for a company to tout its latest and greatest as a 1TFLOP lion that does double precision if the reality is that it is a 120GFLOP double-precision puma.
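
If you want to sanity-check that estimate, here's the back-of-envelope arithmetic as a quick Python sketch. This is a minimal illustration assuming the commonly cited GT200 layout (8 SP ALUs and 1 DP unit per Streaming Multiprocessor) and the "~1 TFLOP" headline figure quoted above, not official specs:

[code]
# Rough sanity check of the 8:1 SP:DP argument above.
# Figures are the commonly cited GT200 numbers, not official specs.
sp_units_per_sm = 8      # single-precision ALUs per Streaming Multiprocessor
dp_units_per_sm = 1      # double-precision FPU per Streaming Multiprocessor
sp_peak_gflops = 1000.0  # Nvidia's headline "~1 TFLOP" single-precision figure

# If peak DP throughput scales with the unit ratio, it is roughly SP/8.
dp_peak_gflops = sp_peak_gflops * dp_units_per_sm / sp_units_per_sm
print(f"Estimated peak double precision: ~{dp_peak_gflops:.0f} GFLOPS")
# -> ~125 GFLOPS, squarely in the 100-130 GFLOP range quoted above.
[/code]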

On the other hand, as Anandtech pointed out, games have barely begun to move from 16-bit to 32-bit precision, so making all the FP units in a GPU double-precision clearly doesn't make sense. What could happen is another split in the GPGPU path, with the mass-market & entry-level workstation versions having a variable number of Streaming Multiprocessors, each with a high SP-FPU:DP-FPU ratio, and the HPC versions having the same high-level design but with a lower SP-FPU:DP-FPU ratio in the Streaming Multiprocessors.

[edit]I missed the news that the AMD FireStream 9250 has double-precision FPUs (although apparently not full 64-bit DP) which do 200GFLOPS -- which makes things look even worse for the strategic positioning of the GT200.[/edit]
 
I also have to say I don't give a shit about Crysis. Why make a game that can barely run on the day's hardware? Valve has done a good job of making an engine that runs well on any computer. Granted they don't have the best graphics ever, but it scales well.

Best post in the thread...well said.


The way I see it, if you spent $600 on a video card just to see better framerates in Crysis, you're just a fucking moron.


Who gives a shit about Crysis anyway?? What else are you wanting to see? Motion blur? I hated it and turned it off anyway. More leaves swaying in the wind? Pointless. More detailed textures? Who gives a damn how good the rocks look in a firefight. More reflections? Like I haven't seen good water in a game before.

A game can only look so good, and last time I checked we played games for fun, not to ohhh and ahhh over fucking graphics and framerates.

I played Crysis from start to finish on an 8800GT and it looked fine, ran fine, and I beat it and uninstalled it already. I still, however, play TF2 for hours every night with no bitching about graphics or framerates. Crysis is doing nothing but putting more money in nVidia's pockets because people are shelling out big bucks on the new cards still trying to see that magic number of 60fps on very high settings. If people would stop with all the Crysis shit, we'd probably see some good cards from nVidia.


Perhaps this is a good point. The current hardware is dandy with every other thing you throw at it, yet we're seeing very small gains with one game? A game they've mysteriously decided to stop supporting...? At a certain point you've got to look at Crysis and think that maybe it's Crytek's fault. Yes, it looks great and does a lot of things right, but does the game's slightly superior look equate to the gigantic performance hit it takes in comparison to everything else?

Don't even try...your logic will get nowhere with the Crysis fanboys.
 
Mainly because they're stupid. It's obvious it's going to be another generation before we can play Crysis at high resolutions maxed out, and claiming it's Nvidia and ATI's job to optimize just displays ignorance. Crytek clearly made some poor design choices with Crysis.
nVidia worked practically side by side with Crytek in order to make sure that nVidia cards would perform the best in that game. nVidia literally paid for ALL of Crysis' advertising. You know what is funny? Crysis is just a small taste of what is to come. By the end of this year there will be other powerhouse games that nVidia and ATi will fail to max. It is because nVidia and ATi are falling behind. The developers are ready to move forward.

Another interesting tidbit that I can add, and I am sure it is going to get me flamed: right now the best looking game IMO is Metal Gear Solid 4. I can't run Crysis at the resolution and quality settings I want to, and I am not prepared to drop $2,000 in order to play Crysis the way we should have been able to play it when the 9800 cards came out, so I stand by my claim that Metal Gear Solid 4 is the best looking game out there currently. You have to understand that I have ALWAYS stood behind PC gaming and defended it hardcore, but the facts remain. The PS3 and Xbox 360 have some of the best looking games available. Sure, the hardware is not as nice as what's on PC, but that is not holding them back from getting stunning looking games. With nVidia refreshing the same technology over and over and continually falling short of expectations, it is no wonder PC gaming is dying...I mean it's uhhh...changing...yeah that's it...it's changing...PC gaming could never die. :rolleyes:
 
Well then say hello to lots of price gouging and overpriced CPUs/mobos/GPUs.
All it will do is FORCE nVidia and ATi to wake up and start delivering cards that can keep up with the developers out there working on the cutting edge.
 
All it will do is FORCE nVidia and ATi to wake up and start delivering cards that can keep up with the developers out there working on the cutting edge.


Force? FORCE? You're going to force someone to make a technological innovation....

Ok, so I'm going to lock you in a room. That will FORCE you to develop Jedi mind powers to unlock the door. It doesn't work like that.
 
Force? FORCE? You're going to force someone to make a technological innovation....

Ok, so I'm going to lock you in a room. That will FORCE you to develop Jedi mind powers to unlock the door. It doesn't work like that.
If Intel can do it then nVidia and ATi should be able to do it...right? All I am saying is that if Intel releases some kind of monster GPU then nVidia and ATi will be forced to compete. That's how stuff works. Glad I could teach you this. I think I learned this in like 5th grade.
 
If only the people that complain about Crysis knew everything that is going on and being rendered, they'd recant their "poor coding" remarks.
:)
 
If only the people that complain about Crysis knew everything that is going on and being rendered, they'd recant their "poor coding" remarks.
:)

Yep, it looks spectacular and is highly efficient for what it's putting out. Silly kids don't know what programming or software design is at all, yet they act like they're guys that could make Bjarne Stroustrup look like an amateur :rolleyes:.
 
If Intel can do it then nVidia and ATi should be able to do it...right? All I am saying is that if Intel releases some kind of monster GPU then nVidia and ATi will be forced to compete. That's how stuff works. Glad I could teach you this. I think I learned this in like 5th grade.

I think the chances of Intel releasing a monster GPU are somewhere between slim and none.
 
The way I see it, if you spent $600 on a video card just to see better framerates in Crysis, you're just a fucking moron.

I don't think anyone does this. People want better frame rates in all their games, to be able to crank the AA/AF in all their games. I don't know why you'd bring this up; has anyone claimed that's all they want a new card for?

Who gives a shit about Crysis anyway?? What else are you wanting to see? Motion blur? I hated it and turned it off anyway.

Why the hell do we care what your personal preference is? Learn to see past your own opinion and personal preference to what others might prefer; the world doesn't revolve around you.

More leaves swaying in the wind? Pointless. More detailed textures? Who gives a damn how good the rocks look in a firefight.

I do for one, and many others do as well. Graphical fidelity is a wanted commodity. If you don't want it then FINE, but don't assume everyone is exactly the same as you. Some people want good graphics, deal with it!

A game can only look so good, and last time I checked we played games for fun, not to ohhh and ahhh over fucking graphics and framerates.

Again, your own opinion/point of view. People play for many reasons, such as fun/relaxation, escapism, for a challenge and the rewards of that, or socially (MMOs); pro gamers play to compete, and sometimes for money. All these groups have different requirements, ranging from completely uninterested in graphics and performance to highly interested in one or the other, or both.

I played Crysis from start to finish on an 8800GT and it looked fine, ran fine, and I beat it and uninstalled it already. I still, however, play TF2 for hours every night with no bitching about graphics or framerates. Crysis is doing nothing but putting more money in nVidia's pockets because people are shelling out big bucks on the new cards still trying to see that magic number of 60fps on very high settings. If people would stop with all the Crysis shit, we'd probably see some good cards from nVidia.

I'm not sure what you're implying here...that people putting money into video cards and keeping Nvidia in business is a bad thing? The only thing that stagnates a market is lack of competition (AMD in this case); as soon as they get their thumb out we'll see competitive price dropping and better revisions (e.g. the 9800GTX's 30% price cut).

I do agree that this idea of trying to get Crysis to run at max settings is a bit daft; it's an arbitrary goal which is completely meaningless. But in my opinion, if you can ignore the bitching about how the Crytek engine is "badly optimised" (HAHA, yeah right!), having the bar set high is a good thing, as it drives hardware developers to beat it and it drives the industry forwards. Remember we need demanding games to justify new hardware advances.

One last thing about your TF2 comment: my modest [email protected], 4GB RAM, 8800GTX system cannot run TF2 at max settings at 2560x1600 with a decent frame rate; I'm usually running max settings with low/med AA at 1920x1200. Age of Conan isn't any better: 2560x1600 with no AA and half the settings at lowest. There are plenty of games out there, even old Source engine ones, which still need a powerhouse to run them well, regardless of the state of Crysis.

Sorry for the rant, but a few of these things struck a nerve.
 
<snip>Sorry for the rant, but a few of these things struck a nerve.

Don't apologize. It was a very good rant. And frankly, it was right. I'm not going to apologize for having more money than them, or for choosing to spend it on video games rather than whatever they spend their money on.

We all have choices in life, I went to school and now have a good job and buy the things I want. If you didn't go to school, or would rather spend your money on anything from fast cars to cigs, drugs, and/or booze, that's fine too. Please don't use your fast car and drugs/booze at the same time though. :p
 
GeForce GTX 280 < Crysis

http://enthusiast.hardocp.com/article.html?art=MTUxOCw0LCxoZW50aHVzaWFzdA==

"If we can sum up our experiences in Crysis, we can say that the GeForce GTX 280 is not the “Crysis killer,”<
-----------
As far as other games go, we already have/use cards that will run any other game fine, w/AA/AF at high rez.
Once again Nvidia lives up to their reputation: evolutionary changes, not revolutionary.
Remember a game called Far Cry?
If we use it as the example, we will be able to play Crysis properly around Q4 2009.

Welcome to the Summer Refresh,
see you around Christmas when I "may" buy another card

P.S.- Ati, you out there?

T-man, still running 8800Ultra and still waiting to play Crysis :)

summer refresh huh?

so people actually think that a company can put out a video card that's TWICE as fast as the previous one EVERY single time they release a video card?

i don't think people actually know how hard it is to make video cards. we're lucky to get a 10-20% increase every time a new card comes out.

just wait a year, i'm sure there will be cards out that are twice as powerful as now. :)
 
all good points, frostex.

people are quick to jump on the "crysis is shit" bandwagon nowadays... and that in itself is getting old.
 
I think the chances of Intel releasing a monster GPU are somewhere between slim and none.
If anyone could release a monster GPU other than ATi or nVidia, it would be Intel. Intel does have Larrabee coming. Sure, it is probably some time off, but I think it would be nice to have a 3-way competition going on.

Just an FYI as well: Intel is working on releasing a serial graphics port technology that can scale WAY better than what nVidia and ATi are using. Apparently nVidia has tried to license this technology, but Intel refused. Not only that, but Intel bought the company that was working on Project Offset. Project Offset is arguably the most advanced looking game/game engine ever, perhaps beating out CryEngine2. I think this lends A LOT of credibility to the idea that Intel may be gearing up to release an ATi/nVidia killer GPU. Like always, take this with a grain of salt...
 
how many people really play at 2560x1600?? I suspect not many. And can anyone really tell the difference running 1920x1080 with 16xAA say over 8xAA or even 4xAA???

I'm sorry that you don't run at 2560x1600. Stop complaining that Nvidia put out a card that fits those people's needs. Seriously, stop whining. They release more than enough cards to fit the needs of people who game at any other resolution.

You want to play at 1920x1200? Go grab a pair of 9800GTX+ cards and SLI them. They will fit your needs just fine.

If you don't play at a high enough resolution to take full advantage of Nvidia's latest offerings, that just means you're not [H]ard enough for the bleeding edge of technology. Get over it already.
 
Best post in the thread...well said.


The way I see it, if you spent $600 on a video card just to see better framerates in Crysis, you're just a fucking moron.


Who gives a shit about Crysis anyway?? What else are you wanting to see? Motion blur? I hated it and turned it off anyway. More leaves swaying in the wind? Pointless. More detailed textures? Who gives a damn how good the rocks look in a firefight. More reflections? Like I haven't seen good water in a game before.

A game can only look so good, and last time I checked we played games for fun, not to ohhh and ahhh over fucking graphics and framerates.

I played Crysis from start to finish on an 8800GT and it looked fine, ran fine, and I beat it and uninstalled it already. I still, however, play TF2 for hours every night with no bitching about graphics or framerates. Crysis is doing nothing but putting more money in nVidia's pockets because people are shelling out big bucks on the new cards still trying to see that magic number of 60fps on very high settings. If people would stop with all the Crysis shit, we'd probably see some good cards from nVidia.

Don't even try...your logic will get nowhere with the Crysis fanboys.

OK, here's my take,

Far Cry was the same as Crysis, i.e.,
all the current hardware puked trying to run it,
but it was good for the industry in that it made the consumers (and as a result the manufacturers) want a product to make it run "as it should".
So now we have Crysis...and yes, it is a "good" thing; it's pushing the hardware.

One thing that bothers me is that it's the only game out there that does in fact stress our hardware. I WANT a game, more games, to actually show the weakness of my system. (Call of Juarez anyone? Kickass game, btw.)
I mean, geez, run Quake 4; it was a great looking game, but now it runs at 300+ FPS.

Crysis is closer to a benchmark of graphics than anything else.
Whether you like it as a game isn't the point; the fact that it makes most $2000 systems from 2007 seem slow IS the point. We should want MORE games like Crysis...

So, for me, I won't upgrade till they make a single-card solution that runs Crysis @ 1920x1200 4x/16x at around 80fps. THEN I'll upgrade; otherwise, why would I need to?


(BTW guys, thanks for the thread action, 3900+ Hits wow)
 
As an enthusiast I have my expectations of hardware manufacturers, and I try to keep them well grounded and realistic. To understand why there is no "Crysis killer" yet, you must understand the big picture. Nvidia and ATI have budgets to stick to, and when the market does not demand anything more powerful than what is already offered, they will not mothball mainstream sales to produce an exotic, ultra-powerful GPU that is simply not affordable by the mainstream consumer. If all new games that were released today were as demanding as Crysis, everyone would run out of business overnight. You guys remember what happened to Voodoo, right? So far both Nvidia and ATI are duly following Moore's law.
 
As an enthusiast I have my expectations of hardware manufacturers, and I try to keep them well grounded and realistic. To understand why there is no "Crysis killer" yet, you must understand the big picture. Nvidia and ATI have budgets to stick to, and when the market does not demand anything more powerful than what is already offered, they will not mothball mainstream sales to produce an exotic, ultra-powerful GPU that is simply not affordable by the mainstream consumer. If all new games that were released today were as demanding as Crysis, everyone would run out of business overnight. You guys remember what happened to Voodoo, right? So far both Nvidia and ATI are duly following Moore's law.

What the? ...you call $600 affordable and mainstream, uh OK
yawn....
 
Will there ever be a single-card solution that will be able to run Crysis maxed out?

How long has Crysis been out? Video card performance has more than doubled since Crysis was released (probably tripled) and we still can't run it maxed out? Crysis seems to be a cool game, but I think that we are chasing something that may not even exist.

It's perfectly fine to base your video card purchase on Crysis performance. But personally I'd like to actually see Crysis perform at high settings before I decide on a video card, not the other way around.
 
Yes, just wait another 20 months or so. Then the GTX 380 will come out and it will be able to max Crysis.
 
What the? ...you call $600 affordable and mainstream, uh OK
yawn....

It's nvidia's high-end card. These are not designed for the mass market, and nvidia knows that not nearly as many people will buy 280s as, say, an 8800GT. The ~$600 price range has been the same for many high-end nvidia GPUs for a while (hell, the Ultra was $829), so why so much bitching?

I honestly don't understand.
 
It's nvidia's high-end card. These are not designed for the mass market, and nvidia knows that not nearly as many people will buy 280s as, say, an 8800GT. The ~$600 price range has been the same for many high-end nvidia GPUs for a while (hell, the Ultra was $829), so why so much bitching?

I honestly don't understand.
Because many people are tired of the very high prices and want to go back to the days when $350-400 was the norm for the introductory price of a high-end card. Even in those days, there were increasing concerns that prices were getting way out of hand. As it stands now, a lot of consumers sometimes have to wait months for the prices of their intended purchases to drop, or settle for inferior products sooner...

ATI has more realistic prices, and if they continue to release quality products, we will also see Nvidia lower their introductory pricing due to competition. Lack of competition is the primary reason we have seen absurd pricing in the high end. $700 after all is said and done for a single video card is freaking ridiculous however you try to justify it (please, no Crysis justifications).
 