BFGTech GeForce 9800 GTX @ [H]

Well, I guess for someone with an 8800 GTX/GTS (G92)/GT, the 9800 GTX does not seem like a considerable upgrade.

But for someone like me, with an 8800 GTS 640MB... how does it compare? Ever since the introduction of the 8800 GT I rarely see the 640MB GTS appear in reviews, which makes me wonder where exactly I stack up.

And since I play on my 37 inch TV, I can only go up to 1366 x 768. Heck, I can turn several of Crysis' settings to high and I still get very playable frame rates.

Yet, I wonder... would I get an advantage with a 9600? Or a 9800? *shrugs*
 
Yeah, I 100% agree... I bought my card the day they became available, and at the time I thought I was spending too much money for one component... but damn... Nov 8th 2006 (just looked) and it's still chugging along merrily. 17 months (and counting) out of 1 piece of "high end" hardware is amazing in my book....
[snip]

lol..

11/10/06
"This email confirms that you have paid ZipZoomFly.com ([email protected]) $706.08 USD using PayPal."

That was the most I have ever spent on one video card, but after I popped it into my machine in place of my two X1900 XTX cards and it beat them with no OC in benches and gaming, well, I was pretty OK with the price tag after that.. I think it was OOS the next day too.. For the record though, you can now get two 9800GTXs for the same price, shipped: $329 * 2 + shipping (and tax if applicable).. but with the GT200 around the corner I doubt the 9800GTX will have the same staying power..
 
Very nice review, Mark.
But this blows... Even if I didn't expect a miracle, I still hoped it would be somewhat worth it.

I've been waiting for the 9800GTX (I have a 6600GT) to decide what to do for my new rig, and now I'm lost.
Everything else was decided (Iiyama 24", E8400, 4GB DDR2, Abit P35....)

Now I feel like I should really wait for Nehalem, but a whole rig based around that is going to cost a lot of money,
and DDR3 still kinda sucks.

I want to upgrade damn it! :/

I just went ahead and bought the Newegg deal on the GTX and will start picking up parts now to build around it. I hope that in less than 90 days something "better" will show up and I can just step up. If not, I know I have a solid card that performs at the resolution I want (1920x1200).
 
Not really. I can take up 100% of a CPU with a single line of code... while(1), anyone?
Of course you can push anything like that, but the point is that we are looking for performance in Crysis. Our 8800 cards are already supplying plenty of power for current games. The point of a new line of cards is to provide solid performance in current games and be able to provide good performance in games yet to come. nVidia and ATi have done their jobs for the most part. But then there is Crysis. nVidia's problem is that they have now rehashed the same technology for the SECOND time. I wouldn't put it past nVidia to do ANOTHER G92 refresh @ 55nm. If that happens I will be pretty upset.
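The while(1) quip above is easy to demonstrate. A minimal sketch (the `busy_spin` function is a made-up name for illustration; it is bounded so it terminates, whereas the real one-liner spins forever):

```python
import time

def busy_spin(cpu_seconds):
    """A tight loop -- the Python analogue of C's while(1) -- keeps
    one core at 100% for as long as it runs. Bounded here so the
    demo terminates; the unbounded version pegs the core forever."""
    start = time.process_time()
    iterations = 0
    while time.process_time() - start < cpu_seconds:
        iterations += 1
    return iterations

# Spin for ~0.05 s of CPU time; every iteration is pure busy-work,
# so the scheduler reports the process at full load while it runs.
print(busy_spin(0.05) > 0)
```

Which is exactly the point being made: maxing out a chip is trivial; doing useful work while maxing it out is the hard part.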
 
Your post didn't really make a lot of sense, but let me explain my line of thought...

With SLI, you have available to you only as much RAM as is on one card. So when you take two 9800GTXs, you have all the shader horsepower of two cards (not getting into losses here..) but the RAM of one.

We can already see there are situations where the 9800GTX is memory starved, which is where the 8800GTX will outpace it, as it has 50% more RAM and a 50% wider bus to access it with.

So, if one lonely 9800GTX is somewhat memory starved, then two 9800GTXs are very memory starved, and three are incredibly memory starved. Which means the benefits of SLI will not be huge, and probably not worth the investment.

You can only fit so much into your framebuffer, and once you hit the limit, it doesn't matter much how fast you can churn the data, because a lot of time will be wasted waiting for the framebuffer to catch up.

And when you only have 512MB, in a day and age when people are gaming at 1920x1200 and higher and wanting top performance while doing so, that framebuffer runs out really fast.
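The arithmetic behind "runs out really fast" can be sketched roughly. This is a deliberately simplified model (made-up function name; assumes double-buffered 32-bit color plus 32-bit depth/stencil, each scaled by the MSAA sample count, and ignores textures, geometry, and driver overhead, which all come on top):

```python
def framebuffer_mb(width, height, msaa_samples=1):
    """Very rough render-target budget in MB: 4 bytes color +
    4 bytes depth/stencil per sample, times two buffers
    (front + back). Textures and geometry are extra."""
    bytes_per_sample = 4 + 4          # color + depth/stencil
    buffers = 2                       # front + back buffer
    total = width * height * bytes_per_sample * msaa_samples * buffers
    return total / (1024 ** 2)

# At 1920x1200 with 4x MSAA, this crude model alone eats ~140 MB
# of a 512 MB card before a single texture is loaded.
print(round(framebuffer_mb(1920, 1200, msaa_samples=4)))
```

Crank the resolution or AA further and the render targets plus texture working set overrun 512MB quickly, which is the starvation scenario being described.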


You're not making sense...

1 9800GTX = 1 512MB Frame Buffer
2 9800GTX = 2 512MB Frame Buffers
3 9800GTX = 3 512MB Frame Buffers

Two or three cards are not any more starved than a single card...
 
Very nice review, Mark.
But this blows... Even if I didn't expect a miracle, I still hoped it would be somewhat worth it.

I've been waiting for the 9800GTX (I have a 6600GT) to decide what to do for my new rig, and now I'm lost.
Everything else was decided (Iiyama 24", E8400, 4GB DDR2, Abit P35....)

Now I feel like I should really wait for Nehalem, but a whole rig based around that is going to cost a lot of money,
and DDR3 still kinda sucks.

I want to upgrade damn it! :/
Yeah, I am about to just say screw it on the upgrade for a while. This line of video cards has actually discouraged me from doing a new build now. I was gonna go all out, but nVidia really ruined this for me. I guess I'll just wait for Nehalem, or whatever is king of the hill when the TRUE next-gen cards show up.

Edit: Maybe by the time Nehalem comes out, Intel will have a serial video adapter interface that will let us actually increase performance 100% with each video card we add. I think it is about time we demand more from these companies.
 
nVidia may pull a Creative, though, and just stop driver support for the older cards and make driver updates much less frequent.
 
You're not making sense...

1 9800GTX = 1 512MB Frame Buffer
2 9800GTX = 2 512MB Frame Buffers
3 9800GTX = 3 512MB Frame Buffers

Two or three cards are not any more starved than a single card...

3 x 512MB != 1 x 1.5GB buffer

That's the problem. The cards cannot share memory, so everything is duplicated between them. So anything that would make a single 9800GTX memory starved will make 2 or 3 of them VERY memory starved, since you're trying to use twice or three times the 'horsepower' to churn the same amount of data.

SLI helps a lot when you need more GPU power to crank the settings up, and without a doubt SLI'd 9800s will be a nice setup, but we already see situations where a SINGLE 9800GTX could use more memory. In those situations, two, three, or four 9800GTXs are going to perform roughly the same, because it's not the GPU that's lacking, it's the memory.
 
Yeah, I am about to just say screw it on the upgrade for a while. This line of video cards has actually discouraged me from doing a new build now. I was gonna go all out, but nVidia really ruined this for me. I guess I'll just wait for Nehalem, or whatever is king of the hill when the TRUE next-gen cards show up.

I'm kind of feeling the same way. I am looking at the 790i with SLI'd 9800GTXs, but with the new nVidia cards coming in July, Nehalem, and current DDR3 prices, it seems like a waste to go all out knowing the next gen is coming in less than a year. Decisions, decisions...

Good thing I already splurged on CM Stacker 830 w/ Real Power Pro 1000W PSU after reading this review:
http://enthusiast.hardocp.com/article.html?art=MTQ2NSwxLCxoZW50aHVzaWFzdA==

Now I will be forced to fill it with parts; I'll probably get the 9800GTX anyway.
 
Has there been any other time when one game was far beyond what the current top hardware could handle?
Unreal and Falcon 4 come to mind. Especially Unreal, "slideshow" was a term used in a lot of reviews.

Things are a bit different now with most people having LCDs and wanting to run at native res. Back in the CRT days being able to run a game well at high graphics settings almost never meant running at the highest res your monitor supported. When Unreal came out there was no such thing as a system capable of running it maxed out at the 1600x1200 or 1920x1440 res a big monitor was capable of. I think that was a pipe dream even for Voodoo 2s in SLI.
 
nVidia may pull a Creative, though, and just stop driver support for the older cards and make driver updates much less frequent.

Highly doubt it. Creative can get away with that kind of thing (barely) because they own the EAX standards and have bought out most of their real competitors, so they hold almost a monopoly power over the shrinking market of discrete gaming sound cards. NVIDIA still has AMD/ATI to contend with, doesn't own the DirectX or OpenGL standards, and if they piss too many people off, people will buy even a slightly slower AMD/ATI. NVIDIA may be messing with ignorant people (who, with their lack of knowledge, won't complain) with their silly naming schemes this generation, but they haven't screwed around with those in-the-know at least, and I doubt they will.
 
This review makes a lot more sense than the TweakTown one, where the 9800GTX seemed to be twice as good as an overclocked 8800GT in some situations, and often challenged the 9800GX2. This is more what I'd expect from what amounts to a higher-clocked 8800GTS 512MB.
 
3 x 512MB != 1 x 1.5GB buffer

That's the problem. The cards cannot share memory, so everything is duplicated between them. So anything that would make a single 9800GTX memory starved will make 2 or 3 of them VERY memory starved, since you're trying to use twice or three times the 'horsepower' to churn the same amount of data.

SLI helps a lot when you need more GPU power to crank the settings up, and without a doubt SLI'd 9800s will be a nice setup, but we already see situations where a SINGLE 9800GTX could use more memory. In those situations, two, three, or four 9800GTXs are going to perform roughly the same, because it's not the GPU that's lacking, it's the memory.

Correct on all accounts! They do not share memory; each renders one whole frame just as a solo card would, but every other frame in the stream is handed off to the other card in a round-robin alternation (assuming AFR or some flavor of it is used, which is pretty much always the case nowadays for multi-GPU rendering). This puts a lot of stress on the main system too: you need a fast memory and I/O subsystem to feed textures to the cards in SLI. I think this is why some people see lower FPS minimums with multi-GPU enabled, when the system just can't keep up with all the texture-fetching requests. But as long as the system can hold up, two-card SLI really does improve GPU horsepower when using lots of AA at higher resolutions. If you do get into a video-memory-limited situation, it's possible an SLI'd pair might perform a little better from a brute-force perspective, but only if the system can handle the card management, and it would not be much of an improvement... methinks. :)
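The round-robin hand-off described above can be sketched in a few lines. This is a toy model, not driver code (`dispatch_afr` and its arguments are hypothetical names):

```python
def dispatch_afr(frame_ids, num_gpus):
    """Round-robin alternate-frame rendering: frame i goes to
    GPU (i % num_gpus). Each GPU still needs its own full copy
    of the scene data in VRAM, which is why SLI duplicates
    memory rather than pooling it."""
    schedule = {gpu: [] for gpu in range(num_gpus)}
    for i, frame in enumerate(frame_ids):
        schedule[i % num_gpus].append(frame)
    return schedule

# With two cards, GPU 0 renders frames 0, 2, 4 and GPU 1 renders 1, 3, 5.
print(dispatch_afr(list(range(6)), 2))
```

Note that the schedule splits the per-frame GPU work but not the memory footprint: every GPU in the dictionary must hold the same textures, which is the crux of the memory-starvation argument in this thread.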
 
Besides the "meh" response, my thought was wtf at "processor cores"? :confused:
 
Thanks for the review.

At this point, if you have an NV 6 or 7 series / ATI pre-3800 series card, this is a good card to upgrade to. But if you have the 8 series, the software has not caught up enough to make an upgrade necessary.

Too much focus has been on consoles as of late. Are there any titles on the horizon that will push the PC envelope? And no I do not mean Crysis 2 and 3 :rolleyes:

The PureVideo HD decoding is an interesting feature. I wonder, with today's quad-core processors, if it is even necessary to offload it to the GPU.
 
Paid $320 for the 8800GTS 512 back in the day! (2 months ago)

Thinking of stepping up. But wasn't there word of 9900's? Or was that some april fools joke?
 
Paid $320 for the 8800GTS 512 back in the day! (2 months ago)

Thinking of stepping up. But wasn't there word of 9900's? Or was that some april fools joke?

By the time they come out, your step-up will be over, so you might as well go for it. (The rumor, assuming it's not an April 1st joke, was July.)
 
3 x 512MB != 1 x 1.5GB buffer

That's the problem. The cards cannot share memory, so everything is duplicated between them. So anything that would make a single 9800GTX memory starved will make 2 or 3 of them VERY memory starved, since you're trying to use twice or three times the 'horsepower' to churn the same amount of data.

SLI helps a lot when you need more GPU power to crank the settings up, and without a doubt SLI'd 9800s will be a nice setup, but we already see situations where a SINGLE 9800GTX could use more memory. In those situations, two, three, or four 9800GTXs are going to perform roughly the same, because it's not the GPU that's lacking, it's the memory.

It does not make them more memory starved, lol. If anything, it's the exact opposite of what you think. In situations where a single card would struggle due to lack of memory, an SLI setup will alleviate the burden because the load is now split between two cards, each rendering every other frame, so the burden is essentially cut in half versus having only one card.

This is why you sometimes don't even notice the gains UNTIL you get to higher resolutions and situations where one card is really straining to keep up.
 
Bluehaze, when they say that a card becomes memory starved in SLi, they are implying that the amount of memory will not be enough for the high resolution + AA that one would be expected to run with his SLi setup. A single card is not starved because it doesn't have the horsepower to run those settings, so the issue only becomes apparent in SLi.
 
The 9800GTX is an April Fools joke, you guys have been so sucked in! :p

Nvidia will release the REAL card tomorrow!
 
What's killing me is: we have the 9800 GTX and we're still talking about playing Crysis at Medium settings at high res with <30 FPS.

Has there been any other time when one game was far beyond what the current top hardware could handle? As far as I remember, we always used to complain about dropping hundreds of dollars on the latest GPUs at launch without any software able to max out the cards' capabilities until months down the road.

A couple..

Rebel Assault
Wing Commander 3 and 4
Doom 3 (For a short time)
Oblivion
Fear (for a short time)
 
Bluehaze, when they say that a card becomes memory starved in SLi, they are implying that the amount of memory will not be enough for the high resolution + AA that one would be expected to run with his SLi setup. A single card is not starved because it doesn't have the horsepower to run those settings, so the issue only becomes apparent in SLi.

Wha? SLi does not cause a system to become any more memory starved than the same system would be with one single card. Both cards render the exact same things, each through its own memory bus. If one card is not starved, adding a second or third card will in no way overload a memory bus. It alleviates the load on each card, since each now only has to render half as many frames. So in effect, even though the two cards don't share their 512MB of memory, they will still have the effective output of one 1024MB card.
 
Thanks for the review.

At this point if you have a NV 6 or 7 / Ati <38 series this is a good card to upgrade to. But if you have the 8 series the software has not caught up enough to make it necessary to upgrade.

Too much focus has been on consoles as of late. Are there any titles on the horizon that will push the PC envelope? And no I do not mean Crysis 2 and 3 :rolleyes:

The pure video HD decoding is an interesting feature. I wonder with today’s quad core processors if it is even necessary to offload to the gpu.

Far Cry 2, Left 4 Dead
 
Well I was right about this entire line of cards. Whodathunkit? The ONLY game that people care to see big increases in is Crysis. Even the 9800GX2 can't pull adequate numbers in that game. nVidia should make DAMN sure they get 50-60fps AVG in Crysis @ 1920x1200 Very High on their next flagship card or I PROMISE they will see a huge drop in game sales and hardware sales. People will straight up leave PC behind and take up console gaming. I love gaming on my PC, but if nVidia and ATi aren't going to deliver the performance I want to see in my games I am just not going to bother. If you think I am in a minority you have another thing coming. Trends are already showing people moving to consoles from PC. Give people even more incentive to game on consoles over PC and you will see the PC gaming market dry up. I have said this numerous times before and I have been flamed for it, but it is the truth.

You really don't need 60fps in Crysis... 30fps is more than sufficient and 40fps would be completely smooth sailing...

Anyway, your high-end hardware isn't what's going to push people towards console gaming, it's how much your average person is going to need to put-out to get a mid-range card that can handle at least the same kind of graphics as a comparable console can, and with the 9600GT going for as little as $120, any graphics-related shift should be moving towards the PC and away from the consoles.

Anyway, have fun with your crapbox 360- watch it explode in your face for all I care (and, yeah, they do explode- they really are just that unstable; I stupidly picked a 2nd gen one up the March after release and w/in 3.5 months it started freezing and a month after that it just blew-up). The grass really is no greener- though it is a tad more charred.

The door is WIDE open for AMD to come back hard with its 4870 and 4870X2

That raises the question, though: what is the 98xx series competing with? The R6xx series. Yes, it looks like the R7xx series will arrive before the 99xx series (aka GT200), but if present reports can be believed that will only be by a couple of months (and if R7xx does prove more potent than R6xx has been, I'm rather sure nVidia will ensure the gap is only 1-2 months at most); nVidia's answer looks like it will be the 99xx series, not the 98xx series.

Look, I love Crysis. The first half of the singleplayer experience offers some of the best FPS action around, with great AI and such an open design that I've played the first mission alone over 15 times already. And, yeah, it looks great and it pushes your gpu. Would I love to be able to play Crysis totally smoothly w/the settings maxed? Sure. However, Crysis is pretty much an anomaly right now. Not until Far Cry 2, Fallout 3, Alan Wake, STALKER: Clear Sky, etc... roll around are we likely to see anything else so demanding until after R7xx and GT200/99xx come-out. Maybe, just maybe, nVidia and ATi know what's coming and when it's coming from the software side and have decided to bring better performance to larger amounts of people (courtesy of the 8800GS, 9600GT, HD 3850, HD 3870, and 8800GT) while they further refine their true high-end entries for when they'll be needed for more than just Crysis. Why the 9800GTX, 9800GX2, and 3870X2? Probably to try and shut-up people like you for a few months, but even the 9800GTX does seem to be geared towards offering better performance to the masses thanks to its relatively low price point (which, if the 8800GTS 512mb, 8800GT, and 9600GT are any indicators, that price point may well drop into the low $200's).

Anyway, superior graphics are but one facet of the benefits offered by PC gaming. Even without current hardware being able to smoothly handle Crysis on Very High settings at the highest of resolutions, graphical superiority still rests with the PC, and that can be achieved even for the relatively paltry price of $120 for a 9600GT. In pretty much every game that is on both consoles and PC's, PC hardware can push higher resolutions, larger textures (note the "Extra" settings in CoD 4), AA, AF, and better shading. Even some of the most console-oriented games, like Guitar Hero 3, benefit from this (I'm running GH3 on my 8800GTS 640mb at 1600x1200 with 16xAF, 2xAA, and all the bells and whistles turned-on, including the most voluminous crowds- granted, all GH3 really needs is to be able to smoothly display the note chart at 60fps and that's all I really care about, but my rig can do that plus more- and the increased resolution does actually do a better job of making notes easier to define earlier even compared to console versions running on HDTV's, from my experience). And, once again, I stress that this is all available on a relative budget- you can snag 2GB of DDR 1066MHz Corsair Dominator RAM for $50, a Q6600 2.4GHz (and highly OCable) quad core processor for under $250, and a 9600GT for $120- the price of a budget gaming PC that can soundly thwack the consoles is only negligibly more than a standard PC which you'll need anyway, and on top of that your PC is then better overall. But, like I said, graphics are only one facet. I have a Wii, and could just as easily have picked-up GH3 on that (they've fixed the audio issues, so that's a semi-non-factor, other than that I have an X-Fi that will be going in my next build and no console can really hope to match that), so why didn't I? Another facet of PC gaming is custom content. If I can find a note chart for it, and I have the song file, I can put it in GH3. 
So instead of plopping down ~$100 for 70 songs, I plopped down only $70 for 70 songs plus pretty much whatever else I want. Granted, custom note charts for GH3 began on the PS2, but even with the fairly cumbersome tools currently available to handle this for GH3 PC it is much faster and more convenient in the PC version to implement custom note charts. But that's really besides the point, because custom content extends far, far beyond GH3. Maps and mods basically ensure that a majority of PC games I purchase entertain me far longer than the console games I purchase. Of course, convenient digital distribution from sources like Steam is another reason I prefer PC gaming- I'm increasingly buying more and more games digitally because I don't need to clutter my workspace with game boxes and installation on a service like Steam is as simple as dling Steam and telling it to dl and install my games. And I have access to those games on any PC. As I'll be going to college in the fall, I won't need to lug games home with me when I come home, as I'll just be able to boot-up Steam, log-in, and enjoy them.

And, hell, price-wise, the PC is the frontrunner. PC games cost AT LEAST $10 less than their console brethren, and thanks to custom content they last a lot longer (and PC games also tend to get DLC for free whereas consoles nickel and dime you for that). Really, it's a simple matter of business strategy for console manufacturers. They make their money on the games, and generally sell the consoles cheap (w/Nintendo being the exception) so they can sucker people into buying them and then milk their customers w/higher game prices and other crap (XBL, anyone?). On the PC, it will cost you more initially, but cheaper games that last longer (thereby causing you to purchase fewer games) will make it cheaper in the end (if you make sensible hardware choices- but the PC also gives you the choice to not make sensible hardware choices and to demand the best performance that money can buy, and you're perfectly ok to do that if you want). Nevertheless, the price of a decent gaming rig is never much more than the price of the kind of PC you'd get if you weren't gaming on it + the price of a gaming console, so add-in the cheaper long-term costs and...

And, well, we just can't forget the mouse, or games like Company of Heroes or Sins of a Solar Empire, which you just can't get on consoles. Or even Crysis.
 
In the end, people buy a console because it's a fad. :D i.e. "omgzors you got a ps3" and "omgodzors that game looks awesum 1111111!"

Honestly, even though I'm a PC gamer, I still have the urge to buy a PS3 because it's got some games I will never be able to play on my PC (not because it can't handle 'em) because of exclusivity. Well, maybe in the future there will be no more borderline between a PC and a console... and we will have the best of both worlds.

And oh, the 9800GTX's performance isn't too appealing; I don't know why nVidia made such a bold decision to even dare call it the 9 series. I'd rather they hadn't bothered, as we had a good enough 8800GT and the new 8800GTS. People were buying 'em like crazy; I did.

I don't know, but maybe rebranding old tech does work... however, I feel like I'll never be able to tell whether it does, 'cuz I'll never have statistical data on nVidia GPU sales during this timeframe (i.e. at the introduction of the 9 series). Who knows, maybe we'll get some... from someone who's more resourceful than me.
 
I bought one today to replace my ailing 7900 GT. The quieter fan and low temps, with possible energy savings, make it a decent choice over the 8800 GTX for me. I usually skip a generation of cards before replacing, milking mine for all it's worth. I thought this was based on DirectX 10.1, though? Guess not. Kinda stinky. I imagine that as long as it kicks ass in StarCraft 2 and WotLK, I won't notice anything else going on for the next year and a half to two years anyway :D
 
In the end, people buy a console because it's a fad. :D i.e. "omgzors you got a ps3" and "omgodzors that game looks awesum 1111111!"

Honestly, even though I'm a PC gamer, I still have the urge to buy a PS3 because it's got some games I will never be able to play on my PC (not because it can't handle 'em) because of exclusivity. Well, maybe in the future there will be no more borderline between a PC and a console... and we will have the best of both worlds.

The only thing I miss are good sports games, but I just picked-up an old Xbox from a friend for like $25 for the 'box, 2 controllers, and ~20 games, then went out and picked-up NHL 2K7 and have been bringing the rosters up-to-date, and it's fine. Nevertheless, EA Sports finally seems to be getting their NHL act together, so hopefully their '09 effort on the PC will be good (Moore, who's heading EA Sports now, mentioned that he wanted to put a lot more emphasis on the PC, so...). Actually, I'm really hoping that EA will take a cue from its own Battlefield: Heroes and go with a fully advertising-oriented model for their PC sports games (provided they brought them up to at least where the next-gen console versions are... but if they went w/such a distribution strategy, they would have to ensure they made the PC version attractive enough to warrant such advertising).

Anyway, as for the rest... my Wii basically takes care of any games my PC doesn't. Between Mario Galaxy, RE4 Wii, Zelda: TP, Smash Bros Brawl, Mario Kart Wii (when it gets here, ofc- Double Dash is still doing ok there though), etc... I have fairly well satiated my appetite for anything the PC doesn't offer. And even something like Guitar Hero I can still get on the PC, and can get a better experience with it as well.
 
I already own 2 x 8800 Ultras and 3 x 8800 GTXs and see no reason to replace them with a 9800 GTX.

Thanks for the review [H]! :)
 
Are there any issues with noise from the card? Seems like it'd have to move quite a bit of air to achieve the mere 9C idle-to-load temperature delta. I'd guess that if it were noisy, something would have been said in the review, but a quick search didn't turn up anything for me. Perhaps I just overlooked it.
 
Are there any issues with noise from the card? Seems like it'd have to move quite a bit of air to achieve the mere 9C idle-to-load temperature delta. I'd guess that if it were noisy, something would have been said in the review, but a quick search didn't turn up anything for me. Perhaps I just overlooked it.

Page 8:
The fan is whisper quiet at all times, even during full-load testing while overclocked. The 9800 GTX fan remained inaudible over the ambient sound in our already quiet office.

Page 9:
You have that in the GeForce 9800 GTX. Not only does it have PureVideo HD technology including HDCP and HDMI (with an adapter, not included), it produces relatively little heat, eats relatively little power, and has one of the quietest fans we've ever had the pleasure of not hearing.
 
Thanks for the article, guys, and I am disappointed.

I see no reason to replace an 8800GTX @ 661|2040 1601 with this. I've had that card since release.

As others have said, if you need to upgrade from something like a 7900GT, an 8800GTS 512MB or 8800GT is a much better price-vs-performance choice.

I'd love to see a card with the better shader muscle nVidia now has, on a 512-bit interface with 1GB of RAM. That could tackle 2560x well.
 
Yet another reason why some people are flocking to console hardware. What do we have now? Yet another card released to entice unknowing customers into buying new hardware. So why is this called a 9800GTX when it's no faster than an 8800GTX? Give it a bigger number to make customers feel they need to "upgrade"? Less memory and a slower bus? WTF was nVidia thinking?

I'm very disappointed in the video card market. :mad: I too am losing faith in PC gaming because of this. Hopefully my new PC will last me until the next console generation, then I will have to think over the options.

I can only think the reason video cards have not progressed in the last year or more is to help reassure those people who spent $700+ on a video card that they made a "lasting" purchase.

Thanks for sharing the info, [H]!
 
This is pretty much right in line with what I expected. Clearly nVidia is focusing more on cost reduction with the G92 than any kind of significant performance improvement. But at the same time, this is exactly what they did with the GeForce 7 series cards – new core, then a fabrication reduction.

I'm a little disappointed in [H]'s game choices. Part of where a high-end card should really be flexing is in AA. Crysis is too intensive for AA, Jericho doesn't support AA (the "edge smoothing" option is not true FSAA), and CoD4 uses the same engine as CoD2, so it's an old engine that any current card can man-handle. In the future I'd like to see a broader variety of intensive games with high levels of AA. I'd even be curious to see how something like Episode 2 handles 8xQAA from the 9800GTX.
 
This is pretty much right in line with what I expected. Clearly nVidia is focusing more on cost reduction with the G92 than any kind of significant performance improvement. But at the same time, this is exactly what they did with the GeForce 7 series cards – new core, then a fabrication reduction.

I'm a little disappointed in [H]'s game choices. Part of where a high-end card should really be flexing is in AA. Crysis is too intensive for AA, Jericho doesn't support AA (the "edge smoothing" option is not true FSAA), and CoD4 uses the same engine as CoD2, so it's an old engine that any current card can man-handle. In the future I'd like to see a broader variety of intensive games with high levels of AA. I'd even be curious to see how something like Episode 2 handles 8xQAA from the 9800GTX.

I do not understand why they are using Jericho as part of the benchmarks; the game got poor reviews overall. They should be benchmarking something more recent... like World in Conflict, Company of Heroes, Medal of Honor: Airborne, or even Frontline: Fuel of War (haven't gotten that game yet...). They need to do their benchmarks on games that are commonly played, not games outside that area.

As for the review, it does support the idea that they are not pushing technology that will drastically improve performance. When building a Vista rig in early 2007 (yes, I know I'm mentioning Vista, but I wanted to build up some experience with it to add to my repertoire), the 8800GTS 640 seemed to be the card that would last longer than the previous generation. The only reason I changed this to the 8800GT was the reduced size and weight of the card; the 8800GTS 640 does take up quite a lot of space. The version of the 8800GT I got was slightly shorter than stock cards, which is nice. It was the minor things they improved that made it worth moving to.
 
I do not understand why they are using Jericho as part of the benchmarks; the game got poor reviews overall. They should be benchmarking something more recent... like World in Conflict, Company of Heroes, or even Sins of a Solar Empire (there are other good examples, but these are the ones that came to mind). They need to do their benchmarks on games that are commonly played, not games outside that area.

[H] already justified their use of Jericho- it's very shader intensive. Whether you play Jericho or not doesn't really matter in terms of why they're using it. They're using it to measure real world shader performance, and it definitely seems to be doing that very well.

And why would you use SoaSE to benchmark a gpu...?
 
I kinda wished [H] used World in Conflict in their tests as well, but no matter. I still love [H] reviews anyways.
 