The reason the 8800GT looks so good is that there's no 9800GTX out yet

Yup, even if the 3870 underwhelms, the 8800GT will be more readily available and cheaper by then

Yep, and my comments previous to that were in relation to the market strategy. I think nvidia is trying to sop up all the remaining gamers they can with their 8 series before they get some competition from ATi and release the 9 series. I was just trying to help a few avoid the inevitable announcement that will seed several "8800GT owners, grab your tissues" threads.
 
That's because the GTS uses PureVideo 1, and the GT uses PureVideo 2

Not sure what you mean. I know the previous G80 8800GTS models use VP1 or whatever, but they have three models of the 8800GTS listed: the 8800GTS 512 (listed with the 8800GT 512 next to it too), and then the 8800GTS 640 / 320. It says the 8800GTS 512 rates EXCELLENT, whereas the 640/320 rate GOOD.
 
Yes, [the cards have shown] MASSIVE increases in power consumption and computational power, BUT the demands of games have increased by a MUCH larger amount in comparison.
That seems to be the case, yeah. I think developers are just over-eager, or are using graphical fidelity as the primary selling point. Look at Crysis, for instance: a game that's tremendously difficult to run at the second-highest in-game settings, at typical enthusiast resolutions, on very, very high-end machines. That's not necessarily a bad thing by itself, but the game has been so zealously over-hyped by both Crytek and the public, purely because of its class-leading graphics (which the majority of average users could never hope to see at playable frame rates), that we're being led to feel the graphics cards are somehow "behind the curve". In some ways, they are. In many ways, they are not.

I've always felt that it's the task of developers to keep pace with the hardware, and not the other way around. Graphics sell games, and it's almost misleading to demonstrate mind-blowing visuals when there isn't sufficient hardware to drive them. This falls in line with the practice of publishing "recommended specs" that are grossly insufficient -- the goal, I think, is to hook unsuspecting buyers into purchases they aren't fully prepared for, and because you typically can't return a retail game, you're already fucked. There are some respectable developers out there, those shooting at manageable performance targets (Valve, Epic, id, and a couple of other AAA developers), but I'd say that few achieve the goals they set for themselves.

What Crytek does, and what other developers do, seems irresponsible in some ways, yet they're admired for "pushing the envelope", when the envelope is already ready to burst. Why NVIDIA and AMD are expected to respond more quickly to these kinds of scenarios, when the developers can respond almost instantly, is beyond me.
 
AMD and NVIDIA aren't behind the curve; Crysis is easily playable at previously acceptable resolutions. It's just that people are starting to use ridiculously high resolutions and expecting the same high-end performance previously offered to them.
 
I agree. I'd say any Core 2 CPU clocked at 2.4+ GHz and an 8800GT or GTX can run the game with all options on High at 1280x1024 or so very smoothly (with the motion blur, you only need ~35 fps for a fully smooth gaming experience; any higher and you won't notice the increase). That's not too bad, considering the visuals you're getting.
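
For anyone wondering what that ~35 fps figure means in frame-time terms, here's a rough Python sketch (the fps values are just the ones quoted above, illustrative rather than benchmarks):

[code]
# Convert frame rates into per-frame render budgets, to show why going
# from ~35 fps to 60 fps buys little perceived smoothness once motion
# blur is already smearing movement across each frame.
def frame_time_ms(fps: float) -> float:
    """Milliseconds available to render one frame at the given rate."""
    return 1000.0 / fps

for fps in (25, 35, 60):
    print(f"{fps:>2} fps -> {frame_time_ms(fps):5.1f} ms per frame")

# Output: 25 fps -> 40.0 ms, 35 fps -> 28.6 ms, 60 fps -> 16.7 ms.
# The 35 -> 60 jump saves only ~12 ms per frame, which blur largely hides;
# the 25 -> 35 jump saves ~11 ms right where stutter is most visible.
[/code]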

I've tried some of the other latest games like FEAR, Prey, and such; the visual complexity there (mostly small indoor scenes) doesn't even begin to compare to the amount of trees/foliage/terrain/water you see in the average Crysis scene, not to mention the fully interactive/destructible nature of the environment. Do you think trees that break where you shoot them and buildings that can be completely flattened cost nothing, when 3D rendering is usually optimized with static objects in mind? Add the parallax/bump mapping and dynamic soft shadows everywhere. The complexity grows exponentially; I'm surprised Crysis runs as well as it does already. Imagine in a few months, when it'll be better optimized for quad-core CPUs and SLI/CF.
 
What Crytek does, and what other developers do, seems irresponsible in some ways, yet they're admired for "pushing the envelope", when the envelope is already ready to burst. Why NVIDIA and AMD are expected to respond more quickly to these kinds of scenarios, when the developers can respond almost instantly, is beyond me.
They've already responded. Notice the 'Low' video graphics setting available? Click that. Ignore the rest. Now you have a game that's not ahead of the curve visually. But something tells me you'll be complaining that you can't shoot down trees and the game doesn't look as good as it could.

PS. I'm not expecting NVIDIA or ATI to respond quicker than game developers can push their games to require better hardware. It's much harder for the hardware folks to make faster hardware than for software developers to create stuff that requires better hardware. But I disagree with saying that they shouldn't do that at all - then we wouldn't be making progress. Someone has to push, and someone has to try and follow. In this case, it's the software and hardware, respectively.
 
They've already responded. Notice the 'Low' video graphics setting available?
Answer me this: is the 'Low' setting what Crytek uses to demonstrate Crysis, or is it 'Very High'? If you aren't aware, the latter is typically what's shown in the videos and screenshots.
 
People in the business are amused by these enthusiast analyses, which have no bearing on the true market forces that generate their revenue.

The GT simply exists because it's cheaper to produce and provides a larger profit margin, all while being next in line to the performance of their top-tier cards.
 
The GT is an amazing card because it allows the average Joe to finally get their hands on some serious performance.
It's a $250 card, people. Get over it. It was never meant to eclipse the GTX/Ultra cards. And to the people complaining that they can't game on their 30" Dells... BOO FUCKING HOO. Get over yourselves; that resolution is ridiculous. You don't like the performance of your Ultra? Suck it up or get a second one.
You have to realize that 90% of the market for video cards does not have the money for a $600 card. Most of us can muster up $250. And the vast majority of gamers do not have monitors that exceed 1680x1050/1600x1200, and at those resolutions the GT, GTX, and Ultra will excel at all but maybe Crysis.
Stop whining and complaining that Nvidia won't release something to satisfy Mr. Spoiled Rich Asshole who has too much money to blow. 90% of the population is just catching up to the 8800 crowd. You early adopters will just have to sit back and wait for a while.
 
Everyone is freaking out about what a great deal the new 8800GT is: oh my god, the best geek card ever made, it's cool, fast, and cheap :rolleyes:


One reason for that: no 9800GTX out yet. If the 9800GTX were here on sale for $500 or so, with twice the performance of the 8800GTX, the 8800GT wouldn't look as good as it does today, because right now it's only being compared to the $500 8800GTX.

Otherwise the 8800GT would look average; in SLI it would most likely be just a little above half the performance of the 9800GTX. Maybe that is why nVidia has not released a 9800 series yet?

Also, my point is that the 9800GTX was supposed to be here around now, one year after the 8800GTX's release.

I have sat back and watched all of this unfold and have to agree with you. What I would call it is smart marketing by Nvidia... the price is what really drives this card... and now they have your money!!!

I have a feeling there will be some very unhappy people when the new 9800GTX comes out, having already jumped on the 8800GT bandwagon.
 
I have a feeling there will be some very unhappy people when the new 9800GTX comes out, having already jumped on the 8800GT bandwagon.

And why exactly would they be unhappy?

Must really suck to spend $250 and get $500 performance. There will always be a new, better card on the way; why wait when you can enjoy yourself now?

Just remember, don't buy the 9800GTX when it comes out, because you'll be unhappy when the 10800GTX comes out :rolleyes:
 
Exactly, I completely understand the marketing standpoint for why it's done; it's business. I'd be arguing that exact point against myself if I were you. The reason I made this evident is that the 8800GT owners are attacking someone who is bringing it to light. Don't get pissed off when you see a new 9800GTS come out and smoke your 8800GT; sure, it'll be more expensive, but it's going to be damn near the same card. The reason you have an 8800GT and not a 9800GT is that ATi has not brought anything competitive to the table. The 8800GT is just there as a placeholder to test their next architecture, and to sop up the little bit of the market that hasn't already bought a G80 card. There is good in it: you get close to the same performance for a damn good price. BUT don't be fooled about what you're really purchasing: a nerfed 9800.

I hope you're aware the 8800GT is the next architecture - it's 65nm technology with half its pipeline turned off.


Totally agree, two of the better posts in this godforsaken thread.


I tend to agree; we'll all soon be forgetting the 8800GT in light of these new cards.


I don't think so, as the GT is one hell of a deal.


Haha... man, if you own a 22" LCD and you're trying to play Crysis, that is hilarious. The GTX can't even run it at full detail whatsoever... it also can't run World in Conflict for shit at full detail either... that's definitely not accurate.


Exactly, never mind what it's like at 1920x1200 on my 24" NEC. :( We need a new $500 to $600 card. Next spring can't come fast enough for me. :eek:
 
Originally Posted by cerebrex:
I hope you're aware the 8800GT is the next architecture - it's 65nm technology with half its pipeline turned off.



No and no. The G92 is just a tweak to the current G80 architecture, and calling it the "next" architecture is going too far. Also, neither the G92 nor the G80 has simple "pipelines". And it does not have "half" of anything turned off; if anything, one of its eight shader blocks has been disabled to yield 112 SPs instead of 128.
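
To put rough numbers on that (a back-of-the-envelope sketch; the 16-SPs-per-cluster layout follows from eight blocks totaling 128, and is assumed here rather than sourced):

[code]
# Stream-processor math behind the "112 SPs instead of 128" point above.
# Assumes the commonly cited layout: 8 shader clusters of 16 SPs each.
CLUSTERS = 8
SPS_PER_CLUSTER = 16

full = CLUSTERS * SPS_PER_CLUSTER           # 8 * 16 = 128 SPs (8800GTX/Ultra)
one_off = (CLUSTERS - 1) * SPS_PER_CLUSTER  # 7 * 16 = 112 SPs (8800GT)

print(f"Fully enabled:     {full} SPs")
print(f"One cluster off:   {one_off} SPs")
print(f"Fraction disabled: {1 - one_off / full:.1%}")  # 12.5%, nowhere near "half"
[/code]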
 
I have sat back and watched all of this unfold and have to agree with you. What I would call it is smart marketing by Nvidia... the price is what really drives this card... and now they have your money!!!

I have a feeling there will be some very unhappy people when the new 9800GTX comes out, having already jumped on the 8800GT bandwagon.


I won't be one of them. I will just buy that too, since there will be plenty of people eager to buy my GTs second-hand. For me it's a hobby; I look at my GTs as rentals.
 
I won't be one of them. I will just buy that too, since there will be plenty of people eager to buy my GTs second-hand. For me it's a hobby; I look at my GTs as rentals.

That is the best way to look at things. I have bought and sold video cards on eBay, and if you sell them before they get too obsolete you can usually get a reasonable price for a used card. I got my 8800GTX used for $340 and my 8800GTS 640MB brand new for $300.
 
The GT is a great buy right now, but it's unfortunately not anything new, save for the smaller die size and thus lower power requirements.

It's the same tech as the GTX and the GTS, with a bit more clock speed.

Do I wish I'd waited and bought these instead of my GTXs? Nope.
You still can't run anything better than I can with my cards; you just got it cheaper, and I do appreciate that. But in a short while we'll start all over again.

There was a good article on bit-tech about the GT. It really is a great card, in that it will bring excellent gaming to anybody who can fork out $250 to upgrade their card. Unfortunately it will still be owned by Crysis and other yet-to-be-named games, like it always is... and we'll be spending more money in about 3-6 months to get that extra performance, together, GT or GTX owner alike.
 