Does it feel like gfx cards are not advancing that fast now?

TheBuzzer

HACK THE WORLD!
Joined
Aug 15, 2005
Messages
13,005
I feel like there's been a huge decrease in the speed of new gfx cards. By new I mean where the card is really different, not that SLI and dual-core crap.

If you remember, 3dfx tried putting many cores on their gfx cards and got beaten by a single core. Or have the engineers finally reached their brain limit to build something faster?
 
The GTX280 and HD4870 seemed like pretty good performance jumps over the previous generation to me. Especially for the HD4870 versus the HD3870. Add in SLI or CrossfireX and it's even better. Whether you like it or not multi-cpu and multi-gpu setups are the way of the future.

Look at Larrabee...
 
The 280 was 18 months after the 8800 GTX and roughly 80% faster than it, all in a single core. I don't really understand what you mean. 18 months after the 280 won't be until Q1 2010, and the GT300 series is due out in Q4 2009.
 
GPUs can only move as fast as the manufacturing available to produce them. Engineers can draw up incredibly advanced and powerful chips, but if the manufacturing isn't there to produce them in a cost-, power-, and thermally-effective manner, they won't become a reality. Also, as we get to smaller and smaller processes, eventually we will reach a point where the equipment doesn't exist yet to produce components at that level. We are already getting into territory where it is very difficult to maintain cohesion; leakage is a problem at the very small nanometer sizes that future technologies are pushing.
 
I think the OP might be talking about all the refreshes; if that's the case then I agree with him. But in terms of coming out with new cores/architectures, I think there's been plenty of advancement recently (e.g. 8800GTX -> GTX 280 was a huge jump in performance).
 
I think the OP is referring to generations.

The 9000/5000 -> X000/6000 -> X1000/7000 generational transitions happened relatively quickly (about a year each, if I remember correctly), versus now, where it seems to be close to a year and a half between generations (look how long it took to go from the 8000 series to [its actual successor] the 200 series). I agree with him on that point; however, it could very well be bias from us being in the current time.
 
I think the OP might be talking about all the refreshes; if that's the case then I agree with him. But in terms of coming out with new cores/architectures, I think there's been plenty of advancement recently (e.g. 8800GTX -> GTX 280 was a huge jump in performance).

It was? I thought those benchmark reviews showed it wasn't too big of an improvement, well, only at big screen sizes, higher than 1920x1600.

Yeah, before it seemed like each year there was a more powerful generation of video card. Now it is like they are putting two cards in one and saying it is new.
 
It was? I thought those benchmark reviews showed it wasn't too big of an improvement, well, only at big screen sizes, higher than 1920x1600.

Yeah, before it seemed like each year there was a more powerful generation of video card. Now it is like they are putting two cards in one and saying it is new.

I think you mean 1920x1200, and frankly, if you're not running 1920x1200, a 260 will run 60+ fps on anything maxed. Most games don't push the latest graphics card that hard below 1920x1200, mostly because a lot of them are console ports pushing a mere 720p, which is less than half the pixels of 1080p.
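For what it's worth, the pixel math behind that 720p claim checks out. A quick sketch (nothing card-specific assumed, just the standard resolutions mentioned in this thread):

```python
# Compare pixel counts of the resolutions people keep bringing up.
def pixels(width, height):
    return width * height

p720  = pixels(1280, 720)    # 921,600 pixels
p1080 = pixels(1920, 1080)   # 2,073,600 pixels
p1200 = pixels(1920, 1200)   # 2,304,000 pixels

print(p720 / p1080)   # ~0.44, so 720p really is less than half of 1080p
print(p1200 / p720)   # 2.5, a 1920x1200 panel pushes 2.5x the pixels of 720p
```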
 
I think you mean 1920x1200, and frankly, if you're not running 1920x1200, a 260 will run 60+ fps on anything maxed. Most games don't push the latest graphics card that hard below 1920x1200, mostly because a lot of them are console ports pushing a mere 720p, which is less than half the pixels of 1080p.

Well, my 8800GTX has been handling games at 1920x1200 fine. I am guessing another reason is that most games now are designed for consoles, so they don't push PC power. After all, the Xbox 360 and PS3 gfx chips are older than the 8800GTX.
 
We seem to have traded pure performance for bang for the buck at the non-top ranges. The performance gaps between price ranges are more compressed than ever, and IMO that's a good thing for the accessibility of PC gaming. The fact that current cards, even in the $100-150 range, can handle almost all games, combined with the penetration of console ports into the current PC stable, pretty much allows the top-end graphics card tech to slow down.
 
Well, my 8800GTX has been handling games at 1920x1200 fine. I am guessing another reason is that most games now are designed for consoles, so they don't push PC power. After all, the Xbox 360 and PS3 gfx chips are older than the 8800GTX.

I've got two 8800 GTS 512s in my system and I've been wanting to upgrade, but I can't find a valid reason because they are still crushing most everything. Only in a couple of the most recent titles have I been forced to drop to 4x AA with everything maxed to keep 60fps.
 
GPUs can only move as fast as the manufacturing available to produce them. Engineers can draw up incredibly advanced and powerful chips, but if the manufacturing isn't there to produce them in a cost-, power-, and thermally-effective manner, they won't become a reality. Also, as we get to smaller and smaller processes, eventually we will reach a point where the equipment doesn't exist yet to produce components at that level. We are already getting into territory where it is very difficult to maintain cohesion; leakage is a problem at the very small nanometer sizes that future technologies are pushing.

True, but I have to admit that I was hoping for more. With the 55nm process doing so well now, I would have liked to have seen a larger version of the R700, say the size of the GT200? I know it doesn't fit ATI's business model, but I was thinking 1600 shaders :D
 
It is due both to games today needing to be console-aware and to the fact that most people are waiting for DX11.
I think we'll see many PC-centric releases soon and a bit of focus on PC gaming yet again in the near future.
There is a bit of a gap between the upcoming DX11 graphics cards and the next console cycle. We should start seeing DX11 cards on the market around summer/autumn, while the consoles should be one and a half to two years from the next major release; according to sources, they are pointing to 2011 for the next console cycle.
What this means is that once DX11-capable graphics cards are on the market this year, DX10 cards will become extremely cheap and still give huge leaps of performance compared to consoles.
With Core i7 getting even more market penetration, Core 2 Duos will become so cheap that they turn into low-end hardware.

My point is: computers at console prices are already providing more performance than consoles.
A new graphics card will only widen this price vs performance gap.
Many people are already switching back to the PC because it has once again become the good choice for the best overall gaming experience at a good price.

You guys are saying this yourselves. A GeForce 8800GTX can handle today's games just fine, and they are below $100. Actually, even a 9000 is around the $100 range. Couple this with a Core 2 Quad or Core 2 Duo that is also around $100, add a board, a sound card if you really don't want to use the onboard sound, or whatever, and you've got a gaming machine that is about the price of an Xbox 360, gives you better performance in games, and does a lot more things.

And then the developers will start pushing graphics again, because the lowest common denominator will be a graphics card like a GeForce 9000 or something like that. DX10 starts becoming the norm and your console begins to gather dust.

And you guys know what happens next? Two years pass, a new console comes along, and you see it all happen again.

That is why I'm buying a PC now. I buy my gaming machines around the PC cycle, not the console cycle. I get better gaming with more options than a console, at a good price vs performance ratio.
 
I thought things have been moving rather well on the hardware side. I would have liked to see more from OpenGL, since DX10 is Vista-only.
 
I thought things have been moving rather well on the hardware side. I would have liked to see more from OpenGL, since DX10 is Vista-only.

Or you could move to an OS that is still receiving mainstream support. :rolleyes:;):D
 
2 things:

1. Graphics card memory bandwidth hasn't increased proportionately with shading power. We need faster memory connected to a wider bus. ATI uses faster memory on a narrow bus, while Nvidia uses a wider bus with slower memory.

2. Very few game engines are designed originally for the PC nowadays. Developers are mostly designing games from the ground up for consoles, so we have few PC games that utilize current hardware. We have what? Crysis and Stalker: Clear Sky. So PC games have become console ports that run at 1920x1200 or 2560x1600 with varying amounts of AA and AF applied. Though it is nice to run most of these games at 60fps with max eye candy.
 
This is a good thing; needing to buy a $200-300 vid card every year to keep up was a big hit to PCs versus a console you buy once and play for 3+ years.

Now a $100 card (my 4830) can run the newest games at 1920x1200 at max/near-max settings. I have no problem buying a new $100 card every year.
 
2 things:

Graphics card memory bandwidth hasn't increased proportionately with shading power. We need faster memory connected to a wider bus.

It's gone pretty well hand in hand.
The 8800 GTX had 86.4GB/s (103.7GB/s for the Ultra) and the 280 had 141.7GB/s, plus better compression for an even higher effective bandwidth.
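Those figures are easy to sanity-check: GDDR bandwidth is just bus width in bytes times the effective data rate. A quick sketch, using the published bus widths and memory clocks for these cards as I remember them:

```python
# Memory bandwidth = (bus width in bits / 8) bytes * effective data rate (GT/s).
def bandwidth_gb_s(bus_bits, data_rate_gt_s):
    return bus_bits / 8 * data_rate_gt_s

print(bandwidth_gb_s(384, 1.800))   # 8800 GTX:   384-bit @ 1.8 GT/s   -> 86.4 GB/s
print(bandwidth_gb_s(384, 2.160))   # 8800 Ultra: 384-bit @ 2.16 GT/s  -> ~103.7 GB/s
print(bandwidth_gb_s(512, 2.214))   # GTX 280:    512-bit @ ~2.214 GT/s -> ~141.7 GB/s
```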
 
The GTX280 and HD4870 seemed like pretty good performance jumps over the previous generation to me. Especially for the HD4870 versus the HD3870. Add in SLI or CrossfireX and it's even better. Whether you like it or not multi-cpu and multi-gpu setups are the way of the future.

Look at Larrabee...

Larrabee is *nothing* like SLI and CF. Larrabee really isn't much different from the GPUs we know and love today - the only thing "unusual" about it is that it runs x86 instead of some secret instruction set, but even then that doesn't grant it magical powers or anything, and it was a decision by Intel to reduce costs (re-use old designs), not because it's super awesome.

And I disagree that multi-GPU setups are the future. Neither ATI nor Nvidia seems to be rushing to offer compelling SLI/CF midrange products (sure, you can SLI/CF midrange cards - it's just a waste of money, since you could buy a faster single-GPU card for less).

I feel like there's been a huge decrease in the speed of new gfx cards. By new I mean where the card is really different, not that SLI and dual-core crap.

I think it all comes down to games. When a $130 card (4850) can max out popular games (like Call of Duty 4/5 and Left 4 Dead) at 2560x1600, it's not like there is much of a demand for more performance. Aside from those with 2560x1600 monitors, what use is there for more power? Honestly, the only game I can't run maxed with AA is Crysis, and I'm certainly not going to upgrade for one game. The games I actually play I run with everything maxed at native res (1680x1050) with 8xMSAA or 24xCFAA.
 
No, it's just that ATI has caught up. Before, the gfx market was like a monopoly; they offered the best-performing VGAs without competition.
 
I think the OP might be talking about all the refreshes; if that's the case then I agree with him. But in terms of coming out with new cores/architectures, I think there's been plenty of advancement recently (e.g. 8800GTX -> GTX 280 was a huge jump in performance).



Unless you are into flight sims - then you are better off with the 8800GTX/Ultra or even 8800GTS 512 for reasons I can't understand.
 
I do agree that advances have slowed a bit, but I think it's due to software. I only have a 4850; I play all my games at 19x12 and can max out the vast majority at that resolution, the only exceptions being Crysis, Far Cry 2, and Stalker: Clear Sky. There never was a HUGE demand for super expensive, high-end cards, but now there is hardly any demand for them: no games to take advantage of them, and with the world economy the way it is, far fewer people willing/able to fork over the $$$ for them. It just doesn't make sense for AMD/nVidia to invest in a market that is almost non-existent these days.
 
Unless you are into flight sims - then you are better off with the 8800GTX/Ultra or even 8800GTS 512 for reasons I can't understand.

Not sure, but it was explained to me that the engine they use is DX7-based and doesn't take advantage of modern architectures, thus the old but still pretty fast 8800GTX will do better than newer cards that rely on modern standards.
 
I went from a 9800 Pro setup to a GTX 260. To say that the graphics are better is an understatement; it's absolutely phenomenal. I have much slower refresh rates than the rest of you guys, though.
 
The issue is how much power people need - since most people have 22" LCDs (or less) a 4850/9800GT is really all that is needed. That's why AMD/nVidia are making such a fuss over using the GPU for processing, as they hope people will buy the high-end (and high margin) cards instead of the mid-range.
 
The issue is how much power people need - since most people have 22" LCDs (or less) a 4850/9800GT is really all that is needed. That's why AMD/nVidia are making such a fuss over using the GPU for processing, as they hope people will buy the high-end (and high margin) cards instead of the mid-range.

I doubt you can max AA and achieve 60 FPS on the more demanding games on a 22" LCD with a 9800GT.

Then again, our definition of "how much power people need" may be different as well.
 
I doubt you can max AA and achieve 60 FPS on the more demanding games on a 22" LCD with a 9800GT.

Then again, our definition of "how much power people need" may be different as well.

This. I like being able to max my AA out. I really don't NEED a GTX 280 to play DMC4, but it's prettier at 1920x1200, max settings, with 16xQ AA and vsync for 60fps, and I was willing to pay for it. If there was a card that could do that with some of my other games, I would probably get that as well. But it's hard to recommend that to others; this is my toy and I am willing to spend the money on it.
 
Seems to me CPU advancement is more prevalent nowadays. It seems to swing back and forth. Some years it’s all about graphics, others it's all about processing power.
 