GTX Owners say Ouch!!

I'm running BF2 and its expansions with full AA and max settings and my computer's handling it well. It's a newer card, what do you expect, worse FPS?
 
I'm running BF2 and its expansions with full AA and max settings and my computer's handling it well. It's a newer card, what do you expect, worse FPS?

You said you run all games with full AA/AF and max settings. That only works if you're playing 2+ year-old games, I suppose. BF2142 would run like shit if you tried full AA on it, yet it runs great on the 8800 series; that was my point.
 
I do run all games on full AA/AF. I don't have 2142 but I'm sure I can handle it since it uses the same engine as BF2, doesn't it?
 
I don't know what you said but these cookies are mighty tasty!

About the vent server thing, I'd rather not listen to prepubescent kids fight all day, throwing obscenities at each other.

Also, this is the end of my posts in this thread; talk behind my back all you want!

You'll be back. It's in your nature.
 
I am back, but I won't be after 4:30 if you're that worried. I don't see why everyone's trying to prove me wrong about playing older games on full settings, but it's not working. A higher FPS in a game is obvious when getting a more advanced card, but I didn't need it till now. I'm very grateful my 7900GS lasted this long and hope it still lasts till I get my new card and hop on the DX10 bandwagon. I'll then play newer games like Bioshock over again with better eye candy.
 
I am back, but I won't be after 4:30 if you're that worried. I don't see why everyone's trying to prove me wrong about playing older games on full settings, but it's not working. A higher FPS in a game is obvious when getting a more advanced card, but I didn't need it till now. I'm very grateful my 7900GS lasted this long and hope it still lasts till I get my new card and hop on the DX10 bandwagon. I'll then play newer games like Bioshock over again with better eye candy.

I knew you'd be back.
 
I do run all games on full AA/AF. I don't have 2142 but I'm sure I can handle it since it uses the same engine as BF2, doesn't it?

I guess you never bothered to click the links above proving that a 7900GS can't handle AA in 2142 while providing a good frame rate on a 20" screen.
 
A 50 fps average frame rate is great without AA. If I enabled 2xAA or even 4x, it would probably still be good. Also, my resolution is 1680x1050, not x1200.

Well I'm out.
 
There are dozens of games that are not what I would call playable at 1680x1050 at "max AA/AF". Heck, even at 4xAA, there are a lot. At 8xAA? It would take games that are several years old to be playable to me, and even some of those would struggle. Take Far Cry: at 8xAA/16xAF and at your res, it would be slowwww. You'd have to turn down every other setting to get decent frames. Unless you like slide shows... which I do not.

The 8 series was/is a huge leap over the 7 series for NV. In some games and settings, it doubled performance.
 
What is "Maxed" settings for you? 0xAA, regular AF? Or at least 4xAA (usually 8x) and 16xAF? What resolution?

1280x1024 w/ 4xAA and 8xAF. (Modest settings in here, I suppose, but I honestly can't tell much of a difference past those settings, so even if I could exceed them I would rather have the higher frame rate; and my monitor is an LCD, so I can't go past that res.) Not sure whether this is what you were thinking, but my logic has always lain in the "point of diminishing returns".
 
It's apparent ATI needs to get their act together to provide some competition for Nvidia. $450-500 for a card that is over half a year old is pretty absurd.

Agreed.

Also, I agree with the guy who said that if you waited to buy because there's always something great around the corner, you'd never buy anything.

I was on the fence with my P4 and outlived the single-core AMDs, the Pentium Ds and the X2s, the first wave of C2Ds, and now the Q6600. I was going to wait for Penryn, but then I won't buy that either because I'll be waiting for something else, so screw it: I'm getting a Q6600 or an E6600.
 
A 50 fps average frame rate is great without AA. If I enabled 2xAA or even 4x, it would probably still be good. Also, my resolution is 1680x1050, not x1200.

Well I'm out.

1680x1050 is close to 1600x1200 (see the quick pixel-count sketch below).

If you think that 50 fps is "great" for an FPS w/o AA, and we are talking about what it would be with AA... I guess you don't mind slideshows as playable in FPS games.

Also, the point still stands: the 8 series has had its place since those cards were released. They have been able to run games with maxed settings (AA/AF included) which other cards could not.
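(Since "close" is doing some work there, here is the arithmetic spelled out as a trivial Python sketch. Nothing in it comes from a benchmark; it is just the two resolutions multiplied out.)

[CODE]
# Pixel-count comparison: at identical settings, GPU load scales roughly
# with the number of pixels rendered per frame.
for w, h in [(1680, 1050), (1600, 1200)]:
    print(f"{w}x{h}: {w * h:,} pixels")

ratio = (1680 * 1050) / (1600 * 1200)
print(f"1680x1050 pushes about {ratio:.0%} as many pixels as 1600x1200")
# Output: 1,764,000 vs 1,920,000 pixels, i.e. about 92%, so 1600x1200
# benchmarks are a slightly pessimistic but fair proxy for 1680x1050.
[/CODE]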
 
I think there are enough consistent leaks to lend credence to the rumors that:

The G92 (a 256-bit interface 8800) and the RV670 (a 256-bit HD 2900) are coming soon. Whether they are called 8800GT or 8850, or the ATI card a 2950 or 3850, is just a minor, irrelevant detail. These will be brilliant midrange cards. Anyone betting these aren't coming before year end would likely lose that bet.

As far as the lifespan of a card goes, well, that depends on what demands you place on it. I don't believe in buying new hardware every year, so I just turn things down and don't get games that I know will choke my system. I was using an Athlon/9700 Pro until recently. :) Once spectacular, now below mediocre.

I am only going to use a 7900GS, which is my new card, until I really feel the need for something better.

I might have splurged on an 8800GT if it were available now, but nothing out there right now made me want to spend more than the $99 for a 7900GS, which plays all my OLD games cranked. But I won't be running to buy Crysis either (not my genre).

I like reading about new tech and what is coming whether I buy it or not, whether it makes my stuff look slow or not. I don't understand why people get so testy when new info leaks in rumors. If you don't like this kind of stuff, don't read it.

Maybe we could have a rumor forum so it wouldn't hurt the tender sensibilities of the rumor haters.
 
I think there are enough consistent leaks to lend credence to the rumors that...

That's the problem, though: these aren't different sources of information. Most can be tracked back to SINGLE sources (others don't cite sources), which are from sites not even in English and with no credibility.

I have yet to see shots of this card in systems, or in people's hands. Most of the shots are of the exact same photo, with different watermarks from different sites on it. These photos also have very strange lighting and look rendered, which again leads to doubt that they are real.

I wouldn't be surprised if the fake 9800GT U2 edition I made up in a few secs and posted here a few times ended up as front-page material for one of these sites.
 
Just about every person who owns an LCD and doesn't want tearing in their games?

You don't enable vsync unless your system can consistently hold 60 FPS in games (see the sketch at the end of this exchange). Enter the 8800GTX, which was able to do this in nearly ALL games a year ago; the next best thing was the 7950GX2, which STILL struggled in the games that were current.

So you're saying it's impossible to max out AA and AF with the 7 series cards a year ago? I thought you claimed you have no 8800; do you not have a 7 series card either?

No, you're twisting my words. Before the 8800s, no card on the market could run full settings with full AA and AF in all games like you claim your 7900GS can do. There were in fact games that brought the 7 series to its knees even before the 8 series.

If you took a moment and read what I said, you'd notice that I was talking about... well, just about that: the advantage of multiple cores in games.

No, you made a vague, uninformed statement about quad-core performance in games years before there was such a thing as a dual-core processor for the PC.

By the same logic nobody should have been buying multicore processors for gaming. Games that truly take advantage of multiple cores are still few and far between; we're just now starting to see better support.

Of course you can do that, but multitasking isn't the only thing a processor is used for either, buddy. Back then very little software actually took advantage of multicore CPUs, but anyway, this has nothing to do with the subject at hand.

Back when? In 2001, like you said before? The first desktop dual-core processor came about 4 years after that, so what bullshit are you trying to claim again? No wonder there was poor support for multicore processors back then... they didn't exist yet.

No, I can't handle games that came out recently on fully maxed-out settings at my res, but I will eventually when I upgrade. This is why I waited: so I can take full advantage of the current card I have now, and by that time I can buy a new card for cheap! Makes sense; I know it's hard for you to understand.

You couldn't a year ago either.

So you're saying the whole [H] community is calling you stupid for not spending more than $150 on a card? Does everyone here agree with this guy? I'll agree on one part though: you are pretty stupid.

See, you love to twist words, and it's funny. Like I said, the average Joe thinks it's stupid to drop that kind of cash on one part of a computer...

The same way most people would be shocked at what musicians spend on equipment etc.

Anyway, I don't recall criticizing anyone, actually, so who the hell are YOU to put words in my mouth and the whole [H] community's? Look around, kid; there are people on these very forums looking for $100 video cards.

You don't recall criticizing anyone?!?! Read your first essay; all you're doing is criticizing people that bought an 8800 card when they were released...

I don't see what you're trying to prove. I never said anything is wrong with anyone, so you can keep arguing, but you're talking to a wall here.

I don't need to prove anything; you have proven yourself that you know jack shit about what you're talking about. That's pretty clear.

As I said, they're relatively new, and demos don't mean jack.

Think what you like. It's not like those are the first games to start pushing the 8800s. Just because the DX10 versions are now coming around, you think it takes DX10 to push an 8800?

Good, I hope they're being pushed so I can put them to good use. The new games that just released or are coming out are what's inspiring me to get a new card.

Good for you. That's what people said a year ago when they bought the 8800s.

I think my point there pretty much stated the need for such a card. World in Conflict is a new game taking full advantage of the GPU's capabilities. I know I probably can't handle the game on high, which is why I'm in need of an upgrade soon. If games like this had released last year, I'd have upgraded to a new GPU as well. I never once stated image quality isn't worth ogling; I said previous games didn't need such cards to handle them.

The power of the 8800 series was needed last year. The games that have come out over the last year have brought the 7 series and X1k series to their knees, and without the 8800, running many of them at 100% with AA and AF at any respectable resolution would have been impossible. It seems that since they were not DX10 games, they don't count to you for some reason.

Damn, after all that you still can't understand anything? Maybe you should read it over a few times. Take it slow, word for word. YOU CAN DO IT!

Jokes aside, if anyone's feelings were hurt by any of my posts, just tell me and I'll change it. I really don't see how my posts were full of negative comments like those of the sophomoric retard I'm responding to.

To clear this up, all I'm saying is that now's the right time for me to buy a new card, not when the 8 series came out, because the technology didn't meet those standards then. By doing this I saved some money and I'm still in the same position as everyone else who already has a card. End this!! PLEASE!

Now may be the right time for you, but criticizing people that felt last year was the right time for them is stupid and childish. Just wait until some uninformed fool is talking out his ass this time next year about how stupid it was to buy an 8 series card now.
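(The sketch promised above, on the vsync point: with plain double-buffered vsync, a frame that misses one refresh has to wait for the next, so the delivered frame rate drops to a whole divisor of the refresh rate. The Python sketch below assumes exactly that model; the render times are invented examples, and triple buffering behaves differently.)

[CODE]
import math

REFRESH_HZ = 60.0
interval_ms = 1000.0 / REFRESH_HZ  # one refresh every ~16.7 ms

for render_ms in (10.0, 16.0, 17.0, 25.0, 40.0):
    # A finished frame is only shown on a refresh boundary, so a frame
    # occupies however many whole refresh intervals it took to render.
    intervals = math.ceil(render_ms / interval_ms)
    effective_fps = REFRESH_HZ / intervals
    print(f"render time {render_ms:4.1f} ms -> ~{effective_fps:.0f} fps with vsync")

# 16 ms still gives 60 fps, but 17 ms drops straight to 30 fps: miss 60
# by a hair and you halve your frame rate, which is why you want headroom
# above 60 fps before enabling vsync.
[/CODE]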
 
Maybe we could have a rumor forum so it wouldn't hurt the tender sensibilities of the rumor haters.

I think if these asshats just put Next Gen Rumor #n in the title of their post, as opposed to "GTX OWNERS SAY OUCH NOW!" or some inflammatory BS like that that is certain to trigger illicit responses... well, I think you see what I mean. It's not the material; most of us know it's just rumors until a press release. It's the way these little kids come on here and phrase things that just begs for ridicule.
 
Wow, you're still going at it, pfunkman? I proved you wrong a while ago but you're still yapping away. I'm not gonna prove anything to you anymore because you constantly put shit in my mouth about me criticizing this and criticizing that. If everything I say is either wrong, vague, or illogical to you, maybe you should just stop arguing, or perhaps try to grasp the English language a little better so you can at least respond in a civil fashion.

First of all, of course the 8 series was a large leap from the 7 series; the whole world knows that. Secondly, up until a while back, the 7 series could handle most games with full settings (depending on the res) decently; obviously not as well as the 8 series, but still playable. I really don't know where you're getting your facts from.

Now onto the multi-core processing part; I think you're beginning to understand where I'm coming from. Think back on when you said "By the same logic nobody should have been buying multicore processors for gaming. Games that truly take advantage of multiple cores are still few and far between; we're just now starting to see better support." This is what I've been trying to say about the 8800s, but to a smaller extent. Of course the 8 series was a beast of a card a year ago, but only now (by now I mean within the past few months) are we starting to see better support for these cards. Only now are we seeing games that truly take advantage of their capabilities. Of course they can handle last year's games like nothing, but the 7 series cards handle them decently as well.

The 8 series came out before DirectX 10 released, so I didn't feel the need to step up to those cards, and only now am I starting to feel otherwise. I don't think I missed anything in terms of graphics this past year up until now. I really can't stand Vista, but I really want to play Crysis at its full potential, so I may have a go at it.
 
The thread has not been locked because no one with an 8800 GTX has really said "Ouch" going on a year now. If last year's card is still relevant a year later, then we all have something to celebrate. I, for one, have zero regrets buying my 8800 GTX cards. Best investment I have ever made. And, not to insult anyone with a 7900 series card, but the 8800 series is a huge leap over that generation of card. I tried 7800 GTX 512 SLI, 7900 GTX SLI and 7950 GX2 quad SLI, and NONE of them played games at an acceptable level. Personally, the 8800 GTX is the first card that came along that really delivered.
 
First of all, of course the 8 series was a large leap from the 7 series; the whole world knows that. Secondly, up until a while back, the 7 series could handle most games with full settings (depending on the res) decently; obviously not as well as the 8 series, but still playable. I really don't know where you're getting your facts from.

Of course they can handle last year's games like nothing, but the 7 series cards handle them decently as well.

The 8 series came out before DirectX 10 released, so I didn't feel the need to step up to those cards, and only now am I starting to feel otherwise. I don't think I missed anything in terms of graphics this past year up until now. I really can't stand Vista, but I really want to play Crysis at its full potential, so I may have a go at it.

I thought you were talking out of yours till you said your resolution was 1920x1200. In that case I'm sure you couldn't handle most games on max till you upgraded. I tried my measly 7900GS on my 40" HDTV at the same resolution and I could barely handle Bioshock on low, but I can assure you I wasn't lying when I said I can handle every game (handle as in a decent frame rate) I put on it on max at a 1680x1050 native res.

Like I pointed out before, a year-old game (BF2142, which came out before the 8800) takes a more powerful card than the 7900GS to run with AA at 20" resolutions. Just because you are playing games from 2005 doesn't mean that games from 2006/2007 don't need 8800 cards to run with all settings maxed and AA.

Also, I've been playing games with AA+HDR for almost a year now, and the AF IQ is much better on the 8 series vs the 7. So not only do I get superior image quality, but I get much better frame rates as well.

I can't see how you are still trying to say that the 8 series hasn't been needed when it's been proven that the 7 series can't handle games maxed, and hasn't been able to for over a year.

And when I say "Maxed" I mean with AA/AF as well, because games need AA no matter what your resolution/monitor size.

Also, re: Bioshock, since you played that... it shows a whole 22 fps @ 1600x1200. Doesn't sound very playable to me. The 8800GTS gets 53, over 2x the speed, and in DX10.

http://www.firingsquad.com/hardware/bioshock_directx10_performance/page6.asp
 
Wow, you're still going at it, pfunkman? I proved you wrong a while ago but you're still yapping away. I'm not gonna prove anything to you anymore because you constantly put shit in my mouth about me criticizing this and criticizing that. If everything I say is either wrong, vague, or illogical to you, maybe you should just stop arguing, or perhaps try to grasp the English language a little better so you can at least respond in a civil fashion.

First of all, of course the 8 series was a large leap from the 7 series; the whole world knows that. Secondly, up until a while back, the 7 series could handle most games with full settings (depending on the res) decently; obviously not as well as the 8 series, but still playable. I really don't know where you're getting your facts from.

Now onto the multi-core processing part; I think you're beginning to understand where I'm coming from. Think back on when you said "By the same logic nobody should have been buying multicore processors for gaming. Games that truly take advantage of multiple cores are still few and far between; we're just now starting to see better support." This is what I've been trying to say about the 8800s, but to a smaller extent. Of course the 8 series was a beast of a card a year ago, but only now (by now I mean within the past few months) are we starting to see better support for these cards. Only now are we seeing games that truly take advantage of their capabilities. Of course they can handle last year's games like nothing, but the 7 series cards handle them decently as well.

The 8 series came out before DirectX 10 released, so I didn't feel the need to step up to those cards, and only now am I starting to feel otherwise. I don't think I missed anything in terms of graphics this past year up until now. I really can't stand Vista, but I really want to play Crysis at its full potential, so I may have a go at it.


Yet again, just somehow, I still knew you'd be back. I bet you'll be back again, too.
 
I had to give up my SLI 7900GT cards for an 8800GTS 640MB a long time ago... the 7000 series cards can't handle even DX9 games at 1920x1080 with any decent frame rate. I'm currently at nearly twice the performance I had with the SLI 7900 cards. No comparison. The 8800s are far better at the higher resolutions.

If you are still gaming at 1024x768 or 1280x1024, I suppose a 7950GT would still run pretty much everything out there at near max (not counting DX10 effects).

If you want to throw in some AA though, forget it. 8800 or nothing.
 
Like I pointed out before, a year-old game (BF2142, which came out before the 8800) takes a more powerful card than the 7900GS to run with AA at 20" resolutions. Just because you are playing games from 2005 doesn't mean that games from 2006/2007 don't need 8800 cards to run with all settings maxed and AA.

Also, I've been playing games with AA+HDR for almost a year now, and the AF IQ is much better on the 8 series vs the 7. So not only do I get superior image quality, but I get much better frame rates as well.

I can't see how you are still trying to say that the 8 series hasn't been needed when it's been proven that the 7 series can't handle games maxed, and hasn't been able to for over a year.

And when I say "Maxed" I mean with AA/AF as well, because games need AA no matter what your resolution/monitor size.

Also, re: Bioshock, since you played that... it shows a whole 22 fps @ 1600x1200. Doesn't sound very playable to me. The 8800GTS gets 53, over 2x the speed, and in DX10.

http://www.firingsquad.com/hardware/bioshock_directx10_performance/page6.asp

I've played many games besides BF2142 that came out in 2006/early 2007, and from my experience I handled all of them on high settings with ease at my res. I've played through the whole of Bioshock (no DX10 settings, though) on the highest settings possible. As I said, and I'll say it again: Bioshock is a game that does show the true potential of the 8 series, and I respect that. I know the 8 series will run my card down when compared, but I can still play older games without the DX10 eye candy just fine.

If I had thought 2142 was a great game to play and buy, I'd be willing to agree with you if the benchmarks were true, but from what I've seen these benchmarks don't mean anything: they "proved" Bioshock to be unplayable, yet I still beat it on max settings.

Also, HDR lighting is not always superior. It depends on the game; this has only been a standard feature in games for the past year or so, and not all games have the outstanding, superior image quality you describe. Some notable games that do look good are some Half-Life 2 Source games and UT3. Oblivion and Far Cry have somewhat good HDR too. This =/= superior image quality in the least, IMO.
 
Well, do some fucking work instead of writing essays. It seems you do fuck all at work.

QFT. I wrote my essay. I'm done.

Manbearpig, all I can say is that you're partially right. My 6600GT kicked the bucket, which is the main reason I jumped onto the 8800. My upgrade cycle is once every 3 years and I stick to it like a hobo to a Christmas turkey... well... I'll probably stick in another 2 gigs and might even SLI if Nvidia ever gets this mixed-card SLI thing down... just as the hobo on my theoretical turkey would accept gravy and mashed potatoes ;p

And yeah, games even as lowly as CS:S can give the 8800 a run for its money. And having played on a 7950GT, I can tell you, the 8800 did outperform it at the time.

Edit: oh, and one more thing: you're not listening to me. On an LCD monitor, you cannot change the refresh rate. I don't care how much you try; the monitor's hardware means it cannot physically change the voltage on the bars fast enough to get the pixel to change color more than 60, or sometimes 70, times a second. Whatever is telling you that you can push something to 140Hz is lying.
 
Like I pointed out before, a year-old game (BF2142, which came out before the 8800) takes a more powerful card than the 7900GS to run with AA at 20" resolutions. Just because you are playing games from 2005 doesn't mean that games from 2006/2007 don't need 8800 cards to run with all settings maxed and AA.

Also, I've been playing games with AA+HDR for almost a year now, and the AF IQ is much better on the 8 series vs the 7. So not only do I get superior image quality, but I get much better frame rates as well.

I can't see how you are still trying to say that the 8 series hasn't been needed when it's been proven that the 7 series can't handle games maxed, and hasn't been able to for over a year.

And when I say "Maxed" I mean with AA/AF as well, because games need AA no matter what your resolution/monitor size.

Also, re: Bioshock, since you played that... it shows a whole 22 fps @ 1600x1200. Doesn't sound very playable to me. The 8800GTS gets 53, over 2x the speed, and in DX10.

http://www.firingsquad.com/hardware/bioshock_directx10_performance/page6.asp

I've played many games besides BF2142 that came out in 2006/early 2007, and from my experience I handled all of them on high settings with ease at my res. I've played through the whole of Bioshock (no DX10 settings, though) on the highest settings possible. As I said, and I'll say it again: Bioshock is a game that does show the true potential of the 8 series, and I respect that. I know the 8 series will run my card down when compared, but I can still play older games without the DX10 eye candy just fine.

If I had thought 2142 was a great game to play and buy, I'd be willing to agree with you if the benchmarks were true, but from what I've seen these benchmarks don't mean anything: they "proved" Bioshock to be unplayable, yet I still beat it on max settings.

Also, HDR lighting is not always superior. It depends on the game; this has only been a standard feature in games for the past year or so, and not all games have the outstanding, superior image quality you describe. Some notable games that do look good are some Half-Life 2 Source games and UT3. Oblivion and Far Cry have somewhat good HDR too. AA + HDR is a DirectX 10 feature using Shader Model 4.0, which is a slight upgrade from SM3.0. It makes bright things really bright and dark things really dark. This =/= superior image quality in the least, IMO.
 
Man, I'm good at the double post these days...

Oh, and how could I forget, I played this game too...

Oblivion much?
 
Like I pointed out before, a year-old game (BF2142, which came out before the 8800) takes a more powerful card than the 7900GS to run with AA at 20" resolutions. Just because you are playing games from 2005 doesn't mean that games from 2006/2007 don't need 8800 cards to run with all settings maxed and AA.

Also, I've been playing games with AA+HDR for almost a year now, and the AF IQ is much better on the 8 series vs the 7. So not only do I get superior image quality, but I get much better frame rates as well.

I can't see how you are still trying to say that the 8 series hasn't been needed when it's been proven that the 7 series can't handle games maxed, and hasn't been able to for over a year.

And when I say "Maxed" I mean with AA/AF as well, because games need AA no matter what your resolution/monitor size.

Also, re: Bioshock, since you played that... it shows a whole 22 fps @ 1600x1200. Doesn't sound very playable to me. The 8800GTS gets 53, over 2x the speed, and in DX10.

http://www.firingsquad.com/hardware/bioshock_directx10_performance/page6.asp



Let me point out some things you guys should all agree on:

1. Right now is not the time to buy an 8800 series card, because DX 10.1 is coming out and the 8800 series does not support it. Wait until November or December to see what nVidia or ATI comes out with that supports DX 10.1. There is no point buying a card that will be obsolete this soon.

2. Buy a graphics card as an upgrade when you feel you cannot comfortably play a certain game, e.g. BF2142 or Crysis, at a reasonable frame rate and image quality. I love BF2, but I only started playing BF2142 around May 2007. At that time, my system was an S939 X2 3800+ (OC'd) and 7800 GT SLI running at 1280x1024, everything maxed except 2xAA and trilinear AF. The average frame rate was between 55-75, which is respectable, but if there was a Titan mission and I got into the Titan, the frame rate dropped so much (somewhere in the 40s) that I couldn't hold the gun straight. There was so much lag that I got killed easily. Therefore, I upgraded to an Opty 185 (OC'd) and an 8800 Ultra. Now with everything maxed at 1600x1200, my average frame rate is 90+. Too bad I just don't play BF2142 nowadays, because I got all the unlocks, including the Northern Strike and War College ribbons. When the 8800 Ultra first came out, it was CDN$900. A few weeks later, it dropped to CDN$720. That is when I made the purchase in July. I don't regret it, because right now the 8800 Ultra is still selling for around CDN$600-620. So just buy the video card when you need it for a game, but just not right now.

3. Depending on what resolution you run the game at, you can decide if you need a top-of-the-line graphics card or a better CPU. 1600x1200 and above is GPU-dependent, so a faster graphics card will make a big difference. 1280x1024 and below is CPU-dependent, so even an average graphics card will still give a decent frame rate, comparable to the fastest graphics card. This is evident from running DiRT on my sig system. At 1280x1024, CPU usage is a constant 100%, while at 1600x1200 the CPU usage actually drops to around 80%. Since my CRT monitor is a high-res monitor (it can go up to 2500+x1600+), I can adjust the res to suit the game; most people have fixed-res LCD monitors, and running at a different res will look ugly. Since my OC'd dual core is mediocre by today's standards, I need a fast graphics card at high res to relieve the CPU bottleneck.
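(Point 3 in rough numbers: frame time is set by whichever of the CPU or GPU takes longer per frame, and only the GPU cost grows with pixel count. The Python sketch below is a toy model; both per-frame costs are invented for illustration, not measured from any real hardware.)

[CODE]
# Toy bottleneck model: frame time ~ max(CPU time, GPU time).
CPU_MS = 12.0          # hypothetical CPU cost per frame (resolution-independent)
GPU_MS_PER_MPIX = 8.0  # hypothetical GPU cost per million pixels

for w, h in [(1280, 1024), (1600, 1200), (2048, 1536)]:
    gpu_ms = GPU_MS_PER_MPIX * (w * h / 1e6)
    frame_ms = max(CPU_MS, gpu_ms)
    bound = "CPU-bound" if CPU_MS >= gpu_ms else "GPU-bound"
    print(f"{w}x{h}: ~{1000 / frame_ms:.0f} fps ({bound})")

# At 1280x1024 the CPU is the limit, so a faster GPU changes nothing;
# at 1600x1200 and up the GPU is the limit, so a faster GPU pays off.
[/CODE]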

Here is a great comparison article between different CPUs + cache, RAM, and video cards running the UT3 demo:

http://anandtech.com/video/showdoc.aspx?i=3127

In their tests, an ATI 2900XT actually beats the 8800 Ultra at 1280x1024, but not at higher res. The captions on the graphs on page 6 are typos; they should read 1600x1200, I believe.

EDIT: Nice pissing match here. Even I am writing essays on this topic. Damn, I've got a medical license exam to take next weekend.
 
Yet again, just somehow, I still knew you'd be back. I bet you'll be back again, too.

Thank you for clarifying to me that I was back; I actually didn't know. It's usually what forum members do: they write back.

Yeah Wizard, I stick to my shit too, but I've only had my 7900 for a year and I need to upgrade soon, which kinda sucks. Also, are you talking to me about the refresh rate thing? I never once mentioned changing an LCD refresh rate :confused:
 
1. Right now is not the time to buy an 8800 series card, because DX 10.1 is coming out and the 8800 series does not support it. Wait until November or December to see what nVidia or ATI comes out with that supports DX 10.1. There is no point buying a card that will be obsolete this soon.

Not particularly true. An interesting interview with Cevat Yerli from Crytek states otherwise:

http://www.gameinformer.com/News/Story/200710/N07.1004.1210.04388.htm

It's a good read.
 
OK well, manbearpig, enough words...

Here are some images from the "GeForce 8800 preview" article from firingsquad.com. Keep in mind that the drivers for the 8800 were, at that point, pretty bad, to say the least.
[benchmark graphs: coh1600.gif (Company of Heroes), lomac1600.gif (LOMAC), fear1600-1.gif (F.E.A.R.), cod21600-1.gif (Call of Duty 2), aamts1280.gif]

even the lowly Source!
[benchmark graph: c1600-1.gif]

Damn manbearpig, where did you get your 7900GS? Seems to me it kinda defies the laws of physics...
 
AA + HDR is a DirectX 10 feature using Shader Model 4.0, which is a slight upgrade from SM3.0.


Bullshit... AA+HDR is a feature that, until the 8 series, was only supported by the architecture of ATI's X1k series. The 7 series supported both independently, but due to hardware limitations could not do both at once.

My X1950 Pro supports AA+HDR; it is completely unrelated to SM4.0 and DX10 :rolleyes:


If I had thought 2142 was a great game to play and buy, I'd be willing to agree with you if the benchmarks were true, but from what I've seen these benchmarks don't mean anything: they "proved" Bioshock to be unplayable, yet I still beat it on max settings.

I do not believe you one bit, considering in the review [H] did, a system with an X6800, 2GB of DDR2-800, and an 8800GTS 320 can't even maintain 30 FPS at max settings at 1600x1200, and that card's nearly double the power of yours. Claiming you run "max" with your 7900GS when systems that are nearly twice as fast as yours cannot? Come on, man :rolleyes:

I couldn't even play at full settings, and I know my system's faster than yours.

I guess it is possible that your definition of playable is 15 FPS, though.

Either way, I'm done here. 90% of what you said is misinformed nonsense, and maybe you should do some reading before trying to pass your bullshit off here. People here can see through that kind of bullshit in a heartbeat.

Good luck to you; I'm done responding.
 
Everything after the Ti4600 up until the 8800 series was utter crap. The 8800 series was/is the most amazing line of cards nVidia has ever made. ATI dominated against the 5800/5900, 6800, 7800, and 7900. In fact, in numerous situations (not the majority, of course), the X1950 XTX still holds its own against stock 8800 GTS cards. I can't see ATI taking back the performance crown any time soon, though; maybe when DX11 cards ship.


And yes, the 7900 GS is crap.
 
OK, if you want to play that game, here are some 7900 benchmarks:

http://www.tech-hounds.com/review25/ReviewsPage5.html

Look at it and compare the 7900GS and the 7950GT. You barely see a 10 FPS difference with full AA/AF etc. at those resolutions. How is 40-50 fps bad with max settings at a high res? Even 30 fps is acceptable in video games.

Quake 4... 1024x768... a 14" monitor? Yeah... I'd just be generally pissed if someone were using something like that at all.

You calling 30 fps acceptable is ignorant. It's all about preferences. And none of my articles mention minimum FPS.

Edit: guess I've officially invited myself into this flame war. I've said all I want to say; I'll leave the readers of this flame war (if there are any) to decide for themselves.

...Yeah Wizard, I stick to my shit too, but I've only had my 7900 for a year and I need to upgrade soon, which kinda sucks. Also, are you talking to me about the refresh rate thing? I never once mentioned changing an LCD refresh rate :confused:

manbearpig said:
MrWizard6600 said:
This is the paragraph that tells me you don't know what you're talking about. The subpixels of an LCD monitor cannot physically change color more than, on a good panel, 70 times a second. The FPS read-out above that is theoretical. Your actual FPS cannot exceed your monitor's "refresh" rate.

Thank you for proving my point. Let me explain further: because the typical LCD frame rate is usually 60 frames per second, it would be a waste to get the best possible graphics card on the market for the best possible frames. Once you enable V-sync, the highest frame rate you can get is the highest your monitor can handle, which, like you said, is 70 Hz on a good panel. But no one's stopping you from getting a 120 Hz HDTV...

That's what I was responding to when I mentioned refresh rate. I read it wrong, but no HDTV, rayathon, LCD, or plasma can handle an FPS of 120.
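(For what it's worth, the cap works the same on any display: the FPS counter reports how fast frames are rendered, but a panel presents at most one full image per refresh, so render rates above the refresh rate never reach the screen intact. A minimal Python sketch, assuming the common 60 Hz LCD case.)

[CODE]
def frames_shown(render_fps: float, refresh_hz: float) -> float:
    # The panel can present at most one complete image per refresh;
    # with vsync off, the excess renders only show up as tearing.
    return min(render_fps, refresh_hz)

for fps in (45, 60, 140):
    shown = frames_shown(fps, 60.0)
    print(f"counter says {fps} fps -> at most {shown:.0f} distinct frames/s on a 60 Hz panel")
[/CODE]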
 
Does this card support SLI? Because if it does, you could buy 2 and still end up cheaper than a GTX, and probably blow its doors off.
 
Not particularly true. An interesting interview with Cevat Yerli from Crytek states otherwise:

http://www.gameinformer.com/News/Story/200710/N07.1004.1210.04388.htm

It's a good read.

Yes, it is a good read.

It is true that DX 10.1 won't be fully accepted or prevalent until at least 12 months after its release. It's just that the 8800 series is so near the end of its production cycle that it is a tough call. A couple years ago, I helped my brother build a top-of-the-line system using a 7800 GTX 512MB. It was in limited supply and cost more than any other video card at the time, at $720. Two weeks later, nVidia came out with the 7900 series. The performance of the 7900 series vs the 7800 series made doing 7800 GTX 512 SLI impractical.

If I read the rumors correctly, nVidia is coming out with the 8800GT (a G92 version of the 8800) in November, which replaces or is on par with the 8800 GTS, so buying a GTS right now would be ill-advised. Assuming nVidia's high-end cards are coming out in January/February 2008 (there are conflicting reports that nVidia is moving the release date up to 2007), you might still get a few months of usage out of an 8800GTX/Ultra if you bought it now.

I suppose if we use Crysis as a standard: someone in this thread mentioned that his OC'd Q6600 + 8800 GTX can do 60+ fps at 1920x1200 with everything maxed, so I say with that kind of system he can hold off for another year before upgrading.
 