BFGTech GeForce 9800 GTX @ [H]

@Sharky974

The [H] shows how a card will perform in actual game play.
Have you ever played a game that ran well overall, but then in some parts of the game the FPS dropped so low that it ruined the smoothness and maybe caused a pause or stutter?

So the bottom line is, the [H] is just giving you their opinion of how a card performs during gameplay, and you should take it as that and nothing else.
You can always go to Anandtech or Techreport, etc. for other reviews.
 
Not really surprising this rebadged GTS was released on April Fools' Day then. =/
 
Some of the "Static" reviews are hitting the web and showing similar results as [H]ard OCP. Toms has a bunch more titles reviewed, there are some interesting results, especially in Flight Simulator and UT3, ( http://www.tomshardware.com/2008/04/01/nvidia_geforce_9800gtx_review/page11.html ) .. In UT3, they get a 40% boost over thee 8800GTS 512.. Flight Simulator also shows the 9800GTX with a lead over the 8800 Ultra. (Small though). Tom's is attributing it to better drivers for the 9800GTX..

Could a boost come in the future with better drivers for the 9800 over the 8800s? Just like what happened when the 8600GTS was released.
 
Tom's review certainly shows the 3870X2 in a much better light, often outrunning the Nvidia cards at the same settings. Sometimes not, but overall it certainly looks a hell of a lot better. As well, as you noted in your own review, few games challenge these cards, so it seems that often when the 3870X2 loses, it's in a situation where the FPS are so high it doesn't matter.

And you can't use the "timedemo" excuse, because Tom's doesn't use timedemos; they use gameplay and FRAPS.

Whose review is right? I don't know, but I do know this site has a history of favoring AMD in CPUs (remember the Core 2 debacle) and Nvidia in GPUs.
 
Some of the "Static" reviews are hitting the web and showing similar results as [H]ard OCP. Toms has a bunch more titles reviewed, there are some interesting results, especially in Flight Simulator and UT3, ( http://www.tomshardware.com/2008/04/01/nvidia_geforce_9800gtx_review/page11.html ) .. In UT3, they get a 40% boost over thee 8800GTS 512.. Flight Simulator also shows the 9800GTX with a lead over the 8800 Ultra. (Small though). Tom's is attributing it to better drivers for the 9800GTX..

Could a boost come in the future with better drivers for the 9800 over the 8800s? Just like what happened when the 8600GTS was released.

They are not showing "similar results"; they often show the 3870X2 in a better light, and they are not using "static" tests either, but gameplay and FRAPS.

And I'm not here to crap on this site, but Tom's shows me an actually useful thing in that review: they are benching at two common LCD sizes. Such a simple thing, but so many sites get it wrong. I have a 22" monitor at 1680x1050, and whaddya know, they have the exact benchmark for me. No more guessing whether a 1600x1200 bench is roughly, hopefully comparable to my setup, like at most reviews, which could be a big deal in a game like Crysis that often hovers right at playable. 1600x1200 obviously tells me they're benching on a CRT, and who uses a CRT anymore?

I just think if reviews are trying to be innovative, they should focus on USEFUL innovations. I think the gameplay-only thing from [H] is great, but not the way it's presented.

Well anyway, even if it's a little faster according to Tom's review, the 3870X2 is fairly borked by this card. The 9800 is superior in almost all respects, not suffering from all the dual-card issues and being cheaper, and Crysis is the only major game where performance really matters, and the two are pretty even there.
 
Tom's review certainly shows the 3870X2 in a much better light, often outrunning the Nvidia cards at the same settings. Sometimes not, but overall it certainly looks a hell of a lot better. As well, as you noted in your own review, few games challenge these cards, so it seems that often when the 3870X2 loses, it's in a situation where the FPS are so high it doesn't matter.

And you can't use the "timedemo" excuse, because Tom's doesn't use timedemos; they use gameplay and FRAPS.

Whose review is right? I don't know, but I do know this site has a history of favoring AMD in CPUs (remember the Core 2 debacle) and Nvidia in GPUs.

Every Tom's review I've read used recorded custom timedemos and FRAPS ;) Still no real work going on there. Tom's is above and beyond the worst tech site in existence.

THE ONLY GAME where the FPS were so high it didn't matter was COD4, which was so wonderfully summed up like this:

Hmm, They All Kick Ass

Nuff said.

Yea, shut up troll.

-------------

I really wish this were a cruel April Fools' joke, but I know it isn't :(
 
And I'm not here to crap on this site, but Tom's shows me an actually useful thing in that review: they are benching at two common LCD sizes. Such a simple thing, but so many sites get it wrong.

If every tech site used the same benchmark procedure (resolutions, timedemos) on the latest hardware, then they would all show the same results. I think it is much more useful to see different types of tests, specifically real gameplay at realistic settings and not some canned test. If you really like Tom's so much, well, the internet has more than one website:

http://www.tomshardware.com/2008/04/01/nvidia_geforce_9800gtx_review/
 
I have a 22" monitor 1680X1050, and whaddya know they have the exact benchmark for me. No more guessing that a 1600X1200 bench is roughly hopefully comparable to me like at most reviews, which could be a big deal in a game like Crysis that often hovers right at playable. 1600X1200 obviously tells me they're benching on a CRT, and who uses a CRT anymore?

20" standard aspect-ratio LCD displays run natively at 1600x1200.

I don't think it is an unreasonable logical leap to conclude that, if video card X runs game Y well at 1920x1200 or 1600x1200, it will run it well at 1680x1050.

And anyway, 1600x1200 is 1.92 million pixels per frame, and 1680x1050 is 1.764 million pixels per frame. Of the two, 1600x1200 is more demanding, albeit by a very small amount.
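
For what it's worth, here's that pixel-count comparison as a minimal Python sketch (the percentage at the end is just my own arithmetic on the two resolutions above, not a number from either review):

[code]
# Pixels rendered per frame at the two resolutions being compared above.
def pixels(width, height):
    return width * height

widescreen = pixels(1680, 1050)  # 1,764,000 pixels
standard = pixels(1600, 1200)    # 1,920,000 pixels

# 1600x1200 pushes roughly 9% more pixels per frame, so if anything it is
# slightly more demanding than 1680x1050, all else being equal.
print(f"1680x1050: {widescreen:,} pixels")
print(f"1600x1200: {standard:,} pixels")
print(f"1600x1200 renders {standard / widescreen - 1:.1%} more pixels per frame")
[/code]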
 
I have a 22" monitor 1680X1050, and whaddya know they have the exact benchmark for me. No more guessing that a 1600X1200 bench is roughly hopefully comparable to me like at most reviews

1680x1050=1,764,000 pixels
1600x1200=1,920,000 pixels

There you go, I used fifth-grade math to help you along in your struggle.

I noticed that Benchmark Reviews mentions why they used the resolutions they did in their 9800 GTX review; they explain it here. As it turns out, not everyone has your 22" widescreen monitor, so don't be so shocked. I'm still a little angry that Tom's didn't test with the triple 24" display that I'm using... what were they thinking?
 
I have a 22" monitor 1680X1050, and whaddya know they have the exact benchmark for me. No more guessing that a 1600X1200 bench is roughly hopefully comparable to me like at most reviews, which could be a big deal in a game like Crysis that often hovers right at playable. 1600X1200 obviously tells me they're benching on a CRT, and who uses a CRT anymore?

Um... I'm running a 20.1" 1600x1200 LCD...
 
20" standard aspect-ratio LCD displays run natively at 1600x1200.

I don't think it is an unreasonable logical leap to conclude that, if video card X runs game Y well at 1920x1200 or 1600x1200, it will run it well at 1680x1050.

And anyway, 1600x1200 is 1.92 million pixels per frame, and 1680x1050 is 1.764 million pixels per frame. Of the two, 1600x1200 is more demanding, albeit by a very small amount.


True, but on a game like Crysis that often hovers RIGHT at playable, say 26 or 28 or 30 FPS, it could make a difference. I just hate having to guesstimate.

Didn't know that about 20" LCDs, but 22" is surely a lot more common. When I was LCD shopping, 1680x1050 seemed to be one of the major resolutions, but you don't see a lot of reviews at that res; you more often see 1600x1200 at all sites.
 
1680x1050=1,764,000 pixels
1600x1200=1,920,000 pixels

There you go, I used fifth-grade math to help you along in your struggle.

I noticed that Benchmark Reviews mentions why they used the resolutions they did in their 9800 GTX review; they explain it here. As it turns out, not everyone has your 22" widescreen monitor, so don't be so shocked. I'm still a little angry that Tom's didn't test with the triple 24" display that I'm using... what were they thinking?


Tell me about it, I have yet to read a review at a 1366x768 res. (Which I'm running, and it looks great to me.. Plus I get wicked performance at that res..)
 
Whose review is right? I don't know, but I do know this site has a history of favoring AMD in CPUs (remember the Core 2 debacle) and Nvidia in GPUs.

You mean the “Core 2 debacle” where we showed our gaming enthusiast base that there was no need for them to rush out immediately and buy a new Core 2 CPU if they already had high end Athlons?

“We have proven here that the flurry of canned benchmarks based on timedemos showing huge gains with Core 2 processors are virtually worthless in rating the true gaming performance of these processors today. The fact of the matter is that real-world gaming performance today greatly lies at the feet of your video card. Almost none of today’s games are performance limited by your CPU. Maybe that will change, but given the trends, it is not likely. You simply do not need a $1000 CPU to get great gaming performance as we proved months ago in our CPU Scaling article.”

Or where we showed that the Core 2 had immediate benefits in other desktop applications?

“There is no doubt that when it comes to editing video, manipulating images, or encoding music, the Intel Core 2 Duo and Extreme processors at 2.66GHz and above currently enjoy a healthy performance advantage over AMD’s Athlon FX and Athlon 64 line of processors.”

Or the part of the debacle that covered the fact that Intel has fixed their scaling and power consumption issues?

“Intel has fixed their broken power consumption legacy and the Core 2 now fits inside a power envelope that will give it a very competitive performance per watt position.”

Or were you pointing to where we have given AMD’s ATI GPUs multiple awards lately?

“For the first time this year we can recommend AMD ATI graphics cards for gaming. We highly recommend the ATI Radeon HD 3850 for gaming under $200. With the average price of a GeForce 8800 GT currently at $293 and stock hard to find, the Radeon HD 3870 represents a healthy $70 savings while providing the best gaming experience at the $219 price point. Both the HD 3850 and HD 3870 represent the best gaming experience at their suggested retail prices.”

Are those the areas you are referring to? I think you have a very selective memory when you find the need to prop up your poorly constructed arguments.


As for "Who is right?" We at HardOCP have NO PROBLEM standing next to our track record proudly.
 
I love the drama a video card review brings. Same rhetoric every single time of "this testing system sucks, zomg you guys are teh biased". Strangely, the bias seems to change with whichever company's card is doing better at the time, so figure that one out. :confused::eek:
 
Yeah, Sharky974 seems to take the review too personally. He seems to forget that everything is POV, so if he doesn't like it, he can go somewhere else. I find it hard to believe someone gets so worked up over something so trivial.
 
I love the drama a video card review brings. Same rhetoric every single time of "this testing system sucks, zomg you guys are teh biased". Strangely, the bias seems to change with whichever company's card is doing better at the time, so figure that one out. :confused::eek:

yeah seriously :p nVidia has been winning this generation's war, no problem. Nothing ATi has come out with has competed very well :rolleyes:

Go kyle, I needed a laugh :D
 
Just curious... the last few video card reviews used the quad core, and this one uses the dual core.
While I too believe that the CPU is not a limiting factor with either chip, why use the quad and then switch back?
 
First, great review. I find real gameplay stats much more useful than iterations of 3dMark06 and the like.

Second...
So take note: A high frame rate does not necessarily mean a smooth gameplay experience.

QFT.

Third... to whoever is upgrading from the 6600GT. I'm right there with ya, so I agree with your decision 100%, though I'm going a slightly different way. Upgrading from my trusty 6600GT SLi (from when SLi was actually useful)... to the 9600GT that I just picked up on the cheap. I just can't turn down good performance @ 1920 in CoD4 and UT3 for <$150.
 
"If you were waiting for a monster gaming class &#8220;green&#8221; video card to support HD video decoding so you can finally make that living room HTPC into an HD gaming powerhouse: You have that in the GeForce 9800 GTX. Not only does it have PureVideo HD technology including HDCP and HDMI (with an adapter, not included)"

For playing HD videos, does the 9800 GTX have an advantage over the GX2?

Making my tough decisions today, and I do like to play HD videos on my PC too.
 
Just curious... the last few video card reviews used the quad core, and this one uses the dual core.
While I too believe that the CPU is not a limiting factor with either chip, why use the quad and then switch back?

We upgraded Brent's system as he usually does the high end cards, but he had to leave town. Mark's box is not upgraded yet.

That said, and I did specifically address this on the test setup page...

We have done internal testing here lately to compare dual core X6800 and quad core QX6850 and using the high resolutions that we do in the titles that we pick for video card evaluations, we see no in-game performance differences. This of course might impact timedemo benchmarks, but not real world gameplay in these specific titles.
 
For playing HD videos, does the 9800 GTX have an advantage over the GX2?

Making my tough decisions today, and I do like to play HD videos on my PC too.

My kneejerk reaction would be to say that the only advantage the GTX has over the GX2 for HD decoding purposes is that it uses less power and creates less heat, since it has only one GPU and one set of video RAM.

Both 9800 GX2 and 9800 GTX have PureVideo HD technology built into the GPU. - Kyle

Heh, yes that too. They can both do it fine, but the GTX does it with less wattage, so there you go.
 
In before the lock!

Kidding!!! Seriously though. Who would really care to see a review at 1024x800 or 1280x1024? Let me help you out and write it real quick.

Introduction, meat, and conclusion:
The 9800 at max settings and max AF, just like any 8800 series card, runs at 60+ FPS (sometimes breaking 300FPS on games like CS:S) on every game we tried it on.


Hmm, yeah not worth much is it? XD
 
The incontrovertible fact is that [H] has always been Nvidia biased...

No wait, I mean AMD biased..., or was it Intel biased?... Ah yes, now I get it, it's VIA biased
 
Oh, I read that, hence my comment about agreeing;
I was just curious. Didn't realize there were separate test setups!!

We have two full time video card editors now, Brent and Mark. We also have a part time guy, Matthew. All have their own test setups. Brent has moved to Quad, but Mark and Matt are still on Core 2 Duo X6800s. All have 30" monitors now as well.

I still personally test and benchmark all the motherboards we cover, so I keep an eye on where CPU power is really taking us in a gaming sense. We steer clear of anything close to CPU limited in a GPU evaluation. SupCom and Flight Sim X come to mind, but even games like Lost Planet, which you can see scale into even 8 cores, get GPU limited very quickly.
 
Ugh. It is true that not many games out there currently can really push some of the refresh 8800 cards and now the 9800 cards, but a little part of me would be willing to buy a new card if I could play Crysis at the highest settings.
 
So what does all of this mean for someone with 8800gts 320mb? That card hasn't made an appearance in a benchmark since 8800gt came out :p
 
So [H] confirms it, I'll just sleep right through the 9800s. :)

Kyle, I am curious as to why you had physics (in the Crysis comparison) set to medium on the 9800GTX (1600x1200) but the 8800GTX had physics set to high. Do you think the results would have been closer if the settings were identical? Are physics all that GPU intensive anyway?
 
So what does all of this mean for someone with 8800gts 320mb? That card hasn't made an appearance in a benchmark since 8800gt came out :p
What resolution do you play at? 19" 1280 resolution? I'd probably keep waiting if I were you. At 1600+ resolution you'd probably see some increase.
 
@sharky
Then just look at the apples-to-apples tests. I prefer seeing what settings I can enable and still have an enjoyable gaming experience. If at all possible, the highest levels of shaders and textures greatly enhance the gameplay imo. Looking at the Crysis settings will help me predict the worst-case scenario for several upcoming DX10 titles; I'm not going to buy a new graphics card if I can't keep the shader or texture levels up. And if it just so happens that your ATI crumbles under these higher settings, then I don't want to see it taking up chart space with 10fps min and 20fps avg, so we might as well turn that down a notch to keep it playable. If you just want the raw power tests, just look for the occasional apples-to-apples sections.

Who would really care to see a review at 1280x1024?
I wouldn't mind seeing what the fps and settings would be like at that resolution for DX10 games, but in DX9 it's a no-brainer.
 
So what does all of this mean for someone with 8800gts 320mb? That card hasn't made an appearance in a benchmark since 8800gt came out :p
I think you may have answered your own question in there somewhere... ;)
 
Even though I'm not impressed with the 9800GTX, I am impressed with the G92 architecture. The 9800GTX has fewer ROPs and much less bandwidth than the 8800GTX, and it still manages to perform pretty much the same. Plus, it seems to be a nice overclocker; I've seen reports of 800+ on the core.
 
I love the drama a video card review brings. Same rhetoric every single time of "this testing system sucks, zomg you guys are teh biased". Strangely, the bias seems to change with whichever company's card is doing better at the time, so figure that one out. :confused::eek:

Kind of like gaming websites doing console reviews. They're always called biased when one side doesn't agree with a review, and it always changes depending on the review. Oh yes, and when they give a game a good review, that side sings their praises and denies they ever said anything different.

Anyway, I'm wondering what kind of boost the 9800GTX would give over a single 3870. At my current resolution of 1440x900 I know it would be worthless, but I'm looking more towards later this year when I get a larger monitor. I was planning to just get a second 3870 or a 3870X2 and do Crossfire, but I'd honestly rather stick with a single-card setup if I can.
 
What resolution do you play at? 19" 1280 resolution? I'd probably keep waiting if I were you. At 1600+ resolution you'd probably see some increase.

Heh, I have an FW900. I acquired this GTS thinking that the next gen must be right around the corner; that was sometime last year.

Now watch me go through selling my GTS and buying the 9800GTX, only to have a card 100x better drop in July!
 
Count me as one of the "who is running a CRT these days" people. Even with my POS 19" Envision monitor (16x12 @ 75), I find that it's better quality-wise than the average LCD I've seen. I can't say ALL LCDs, because I haven't seen/used any of the hyped panels (the ones that *aren't* in the affordable range). I also have a second one downstairs in the garage (Envision as well), but that's not used because I can't really justify the extra real estate on my desk at the moment. And I don't see upgrading until I see a ~24" with decent (subjective!) IQ @ $250ish OTD. $300 panels on holiday fire sales are nice, but only for the nutters that get up at 3am to wait in line.

Oh, and Sharky, wake up. You haven't been around long enough. Every fucking release cycle, whether it be a new CPU or GPU architecture, Kyle and friends get accused of whoring for the 'other side'. If you weren't so damn thick-headed, you'd know that it's not true. They whore out WHAT IS BEST AT THE TIME. It was the case with the K7, then the Northwood Intels, then the A64, now the Core variants. The graphics side is a bit more muddied in my memory, but it went something like the NVIDIA 4 series (Ti4x00), then R300/R350 (i.e. Radeon 9500/9700/9800 [X800 and pals?]), then the NVIDIA 6 series (NVIDIA's first decent DX9 cards - the 5900s were SLAMMED as dustbuster POS [and came way late to market]), then the X1900 (? not sure here), G7x (7800/7900 cards), G80/G92. The more recent ATI cards get a nod, but really only at their price levels (in comparison to NVIDIA cards at the time), since as we all know the performance tanks @ higher res + adding AA/AF. Bottom line: there's NO fucking favoritism towards either side here, at least not in the way the DAAMIT apologists are arguing. If there were, I'd join any of the (name the "losing" company's tech) defenders in the exodus. And besides, anyone willing to smash a never-to-be-seen "console" and put up hundreds of thousands of his own money defending what is right and true, rather than give up just because it's cheaper, earns some points in my book. Hell, if he didn't have any scruples, he could have hit up the users for PayPal donations for his defense.. Did he? Nope. Although, ad dollars may have helped (not as much as you'd think, I'm *sure*).

On to the topic at hand. Big surprise, no real improvement here. Now maybe all the defenders can STFU. Where's the MASSIVE increase in overclocks you guys were talking about? A bit faster memory speeds due to different modules (I assume), big WHOOP. I don't see anything WAY over 800 clocks like some were spouting (900? eh, not there either; and W/C or LN2/etc don't count). EVGA has the card available for step-up @ $340... which is ONLY good for those of you that got the card pre-drop. I paid ~$290 for mine, so for me it's wholly debatable whether or not to upgrade (since there's not *that* much increase in overhead).. Made even worse by the fact that I sent in for my $30 MIR, so I'd only get credit for $260. An upgrade would then cost me ~$100-150, factoring in shipping costs/taxes (although I didn't get charged tax on the G70 -> G71 upgrade, go figure) - SO not worth it. And it only gets worse for those who just bought their cards in the last week or two - they'd get only $200-220 credit.
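
For anyone following along, here's the step-up math above laid out as a quick sketch (the shipping/tax range at the end is only a rough placeholder implied by the ~$100-150 estimate, not anything quoted by EVGA):

[code]
# EVGA step-up math, using the figures stated in the post above.
price_paid = 290        # roughly what was paid for the current card
mail_in_rebate = 30     # $30 MIR already claimed, so it reduces the credit
step_up_price = 340     # EVGA step-up price for the 9800 GTX

step_up_credit = price_paid - mail_in_rebate   # $260 credit
base_cost = step_up_price - step_up_credit     # $80 before extras

# Shipping/taxes are a guess; the post pegs the all-in total at ~$100-150.
low_extras, high_extras = 20, 70
print(f"Credit toward step-up: ${step_up_credit}")
print(f"Upgrade cost before shipping/taxes: ${base_cost}")
print(f"Estimated total: ${base_cost + low_extras}-{base_cost + high_extras}")
[/code]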

Bottom line, anyone who's been holding off upgrading from the 6/7/whatever series, and won't wait for the "real" new cards.. Grab a GTS @ $200 while you can (or a GT @ ~$20 less; [edit] but you'll probably want to grab the now-not-so-hot-a-deal $30 Accelero S1 (I got the rev 1 @ $17..) due to the shit-tastic stock cooling [/edit]). You can bet those babies are going to disappear faster than Eliot "dumb dumb" Spitzer.
 
Kyle, I am curious as to why you had physics (in the Crysis comparison) set to medium on the 9800GTX (1600x1200) but the 8800GTX had physics set to high. Do you think the results would have been closer if the settings were identical? Are physics all that GPU intensive anyway?

The Physics option is not all that GPU intensive, no, but Crysis performance is a delicate balancing act. I found that by setting Physics to Medium, I could set Shaders to High and improve the experience.

On page 4:

At 1600x1200, the extra horsepower afforded by the BFGTech GeForce 9800 GTX's higher clock speed, compared to the GeForce 8800 GTS 512MB, allowed us to turn Shader Quality to High, at the reasonable expense of reducing Physics Quality to Medium from High. Physics quality is mostly a CPU-intensive option, but it does in fact introduce some load onto the GPU as well and it does impact gameplay differently depending on the video card.
 
In for two... Was hoping for more, but it's still faster than the 8800GTX, and 20C less under load (x2 = 40C less heat) makes for much more comfortable room and case temps. This never used to bother me, but after having 2x 8800GTX running for the last month, the computer room is a freaking 85 degrees while the rest of the house is 72 degrees, LOL. It will be nice to be able to sit at the computer without sweating for a change :)
 