My take on the 4850 vs 9800GTX+

PoweredBySoy

Yesterday I installed a 9800GTX+ as a possible replacement for my new 4850; my thoughts are below. This is in response to a different conversation, so some of my comments may seem out of context - but I'm sure you'll get the point.


ATI. Nvidia. Fight!

Alrighty, I received the 9800GTX+ in the mail yesterday. Ran some rough benchmarks using FRAPS on the 4850 before taking it out, and then installed the 9800. First things first - the 9800 is huuuuuuuuge. It's a good 1/2 inch longer than the 4850 was - I believe it totals around 10.5 inches. The end of the card literally passes into my hard drive tray. Another problem with that is the 6-pin power connectors are also on the end of the card (as opposed to on the side like the newer 200 series), so you need even more room for the power cords. Anyways, I had to rewire my hard drives in order to make room, but I got it in there (Antec P150/Solo).

Loaded up the latest Nvidia drivers and jumped right into Far Cry 2.

The first thing I noticed was that the Nvidia didn't look any more vivid, or have any more pop, than the ATI did - contrary to what I'd expected. I think my notion that the ATI was flat was my own skewed perception. The second thing I noticed was that the framerates weren't a huge improvement - which of course I wasn't really expecting, as these two cards are direct competitors at the same price point. In fact, I would say that out of their respective boxes the 4850 may, may, have had a slight advantage in raw framerate. But it's very close to dead even. Although I think it's very important to mention that out of the box the 4850 already had a pretty healthy overclock to it, while the 9800 was still at reference speeds.

Then after about an hour of playing I confirmed the biggest improvement: no more framerate dips! This was my main reason for ordering the 9800 in the first place, as the 4850 didn't seem like it had a very stable framerate. Now to be fair, it could just be Far Cry 2, it could just be a driver issue, but still - this game is all I really had to go on, so it's what I used. But the 9800, while having roughly similar frames overall, didn't suffer from those random downward spikes. At that point I was already happier with the Nvidia. And that's when it started to get good. Seeing as the Asus 9800 already came with a Zalman-esque HSF preinstalled (spprrtt!), it was just begging to be overclocked. So I downloaded EVGA Precision to do the dirty work. After a moderate bump in clocks I hopped back into Far Cry 2 and instantly saw a 5 to 10 framerate advantage over the ATI. While that might not sound like much, it was certainly noticeable and the game just played smoother. Happy Soy.

The fan on the Nvidia is louder than the ATI's (also aftermarket), that's for sure. But it's an acceptable kind of loud. It just sounds like fast-moving air - there's no whiny pitch to it or anything, and it only ramps up during gaming, of course.

Despite the requirement for two 6-pin PCIe power connectors, all reviews I've read state that the 9800GTX+ actually draws considerably less power than the 4850. Those ATI 4000 series cards are power hogs.

Under load my 9800 only reached a top temp of 67C - and that's overclocked and at only 48% fan speed. This is compared to the blistering 75+C the ATI was running at. But to be fair, I saw no adverse effects from the ATI running that hot. That's just normal for the card I guess.

After playing around with Far Cry and being convinced the 9800 was my new card, I fired up Fallout 3. Fallout instantly recognized my video card had changed, so it re-auto-adjusted my video settings. I went into the video settings and did some further adjusting of my own. I decided to push it a bit. Increased all my draw distances, threw on 2xAA and 4xAF, upped the 'World Detail' slider, etc...... WOW!! I'm not sure what I did, but Fallout looks much better now than how I was playing it before. I'd have to revise my current discussion with Thwak on how good this game looks. It's still no Far Cry 2, but damn, it sure can be pretty in its own right. And even still I was running at a solid 55-60 frames. Sometimes I'll get minor dips into the 45 range, but they're very infrequent and brief. Now, I'm not saying the 4850 couldn't have made this game look just as good - it was probably just user error on my part for not taking the time to tweak the settings - but I will say it looks twice as good as before and runs very, very solid.

I think both cards are worth the ~$160. Seeing as how we've been getting gouged on video card prices for so long now, it's nice to see some high performance for sub-$200. Overall, apples to apples, overclock to overclock, the 9800GTX+ undeniably has better framerates than the 4850. Not by a lot, mind you, but the advantage is definitely there, and sometimes all it takes is 5 more solid frames to smooth gameplay out. I still think the 4850 is a great card for the price, especially if those framerate dips were a case isolated to Far Cry 2.
 
1. Turn AA/AF up.
2. The 9800 GTX+ has a dual-slot cooler vs. the 4850's single-slot cooler.
 
Then after about an hour of playing I confirmed the biggest improvement: no more framerate dips!

This has been the biggest gripe against ATI. FPS downspikes kill immersion for me. And it's something that doesn't show up in the "average" FPS numbers so many sites spew out. One of the major reasons I like [H]ard is its FPS-vs-time graphs.
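For what it's worth, you can pull the same information out of the per-frame log FRAPS writes, without waiting for graphs. Here's a minimal sketch (assuming a FRAPS "frametimes" CSV whose second column is a running timestamp in milliseconds; the frametimes.csv filename is just a placeholder) that compares the average FPS with the minimum and the 1% lows, which is exactly where those downspikes hide:

```python
# Minimal sketch: compare average FPS to the worst-case dips in a FRAPS
# "frametimes" log. Assumes a one-line header and that the second column
# is a cumulative timestamp in milliseconds - adjust the parsing if your
# FRAPS version writes something different.
import csv

def load_frame_times(path):
    """Return per-frame durations in milliseconds."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        stamps = [float(row[1]) for row in reader if len(row) >= 2]
    # Convert cumulative timestamps into per-frame durations.
    return [b - a for a, b in zip(stamps, stamps[1:])]

def summarize(frame_times_ms):
    fps = sorted(1000.0 / ft for ft in frame_times_ms if ft > 0)
    worst = fps[: max(1, len(fps) // 100)]  # slowest 1% of frames
    return {
        "avg_fps": sum(fps) / len(fps),
        "min_fps": fps[0],
        "1pct_low_fps": sum(worst) / len(worst),
    }

if __name__ == "__main__":
    # "frametimes.csv" is a placeholder path for the log FRAPS saved.
    print(summarize(load_frame_times("frametimes.csv")))
```

A high average with a much lower 1% figure is the on-paper version of the stutter people are describing.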
 
Seeing as the Asus 9800 already came with a Zalman-esque HSF preinstalled (spprrtt!), it was just begging to be overclocked. So I downloaded EVGA Precision to do the dirty work. After a moderate bump in clocks I hopped back into Far Cry 2 and instantly saw a 5 to 10 framerate advantage over the ATI. While that might not sound like much, it was certainly noticeable and the game just played smoother. Happy Soy.

There is no such thing as a 10 FPS gain from overclocking the video card. Sure, 2-3 FPS, but not 10.
 
There's a hint of bias in this, but overall a good comparison. Both are good cards in their own right, although Nvidia might have a slight lead due to better drivers.
 
There is no such thing as a 10 FPS gain from overclocking the video card. Sure, 2-3 FPS, but not 10.

That's not true. There are a lot of instances where cards overclock particularly well, or certain games that respond particularly well to overclocking the video card. You can't flatly say there is no such thing as an overclock that gains 10 FPS - especially when looking at games that are a little older, where a little more power on newer cards is all it takes to get from 150 to 160 FPS. Granted, in Far Cry 2, 10 FPS from video card overclocking is an impressive gain to say the least. But saying there is no such thing as a 10 FPS gain isn't true at all.
 
There is no such thing as a 10 FPS gain from overclocking the video card. Sure, 2-3 FPS, but not 10.

The OP clearly said 5 to 10 fps increase. He didn't say 10 fps across the board...
 
^^ OK, wise people, show me a benchmark with a 5 to 10 fps increase in FC2 from your overclock.

I can agree with you on that increase with older games.
 
Well, I did say it was a stretch, but maybe he's gaming at like 1024x768 or something... Still, at moderate resolutions it's possible.
 
Well, I did say it was a stretch, but maybe he's gaming at like 1024x768 or something... Still, at moderate resolutions it's possible.

Maybe, but I don't expect anybody here on [H] to own cards like those and game at 1024x768 - you guys are much smarter than that.
 
Running 1680

I used FRAPS and my eyes to determine frame rates - nothing scientific. Overclocking the 9800GTX+ gave me a noticeable improvement in the range I specified. I'm very happy with how it performed and I'll let you guys argue over the details.....
 
There is no such thing as a 10 FPS gain from overclocking the video card. Sure, 2-3 FPS, but not 10.

you guys need to read what he wrote again: "....a 5 to 10 framerate advantage over the ATI"

He said 5 to 10 fps over the ATI. This does NOT mean that the card jumped 5 or 10 fps just by overclocking.

The Nvidia card was probably a few frames faster than the ATI to start with. He even says as much earlier: "But the 9800, while having roughly similar frames overall..."

roughly does not mean exactly the same
 
How can you compare "overclock to overclock" when every overclock is entirely different? Frames certainly aren't everything, but I can guarantee that my OC'd 4850 can outperform 99% of GTX+s.
 
Fallout 3 is based on a modified Oblivion (Gamebryo) engine, and since the release of the 8800 series cards Nvidia has had an advantage in that game (Oblivion) all things being equal. Probably why your 9800 GTX+ does well in Fallout 3.

I've read about three different reviews of the 9800 GTX+ and it trades blows with the HD 4850. Both are excellent cards, and the decision to choose one or the other depends on which games are your favorites and whether space inside the PC, heat, power draw, etc. are a concern. I would be happy with either one (if I didn't already have a GTX 260 ;)).

It does confirm something that has always bugged me. I like both companies and have owned a mix of their cards over the past 6 years I've been earnestly gaming. I've liked them all. I hate to see a knee-jerk "get an HD 4850" in forums I visit whenever someone asks what mid-range card to get. It's not the only good choice.

Oh, OP, you may want to fiddle with Digital Vibrance (if it isn't borked in your driver) to make that image "pop" a bit more. I use it on low to up the color saturation on my card. It's one of the reasons I like Nvidia a bit more. ATI seems to have a crisp, clean image, while Nvidia seems to have better color.
 
you guys need to read what he wrote again: "....a 5 to 10 framerate advantage over the ATI"...

I was speaking in a general sense in response to a general statement, not so much in response to the OP.

Anyhow, I've used both the 4870 and GTX 260. I can't really tell a difference in image quality (as far as crispness and colour reproduction are concerned). Granted, I don't have fantastic super high-end monitors, but the difference can't be very big.

Another thing: I haven't noticed a dip in frames on the AMD card. I've looked through some FPS logs in FRAPS and there don't seem to be the huge dips people speak of. I wonder if it is driver- or game-specific, but I don't see it.
 
It seems the framerate dips apply to all the ATI cards. I have a 4670 and it is much worse than the 8600GT it replaced as far as smoothness. Sure, the 4670 has better peak and average framerates, but it feels and plays like crap in most games. In fact, my minimum framerate went down compared to the 8600GT in every single game that I tested the 4670 on. Warhead, for example, was smooth at 20-25 fps on the 8600GT, whereas with the 4670 it played like crap even though the average framerate was 5-7 fps higher. I will be going back to Nvidia for sure.
 
I have gone from a 9800 XT to an X850 XT PE to an X1950 XTX, then to an 8800 GTX, then to 4850s in CrossFire. It may just be me, but I have always thought the ATI cards just made the graphics look better, with deeper colors and crisper details.

But that is just my humble opinion. :cool:
 
I have gone from a 9800 XT to an X850 XT PE to an X1950 XTX, then to an 8800 GTX, then to 4850s in CrossFire. It may just be me, but I have always thought the ATI cards just made the graphics look better, with deeper colors and crisper details.

But that is just my humble opinion. :cool:
I think that's all in your mind when it comes to games. TBH though, the desktop is crisper looking with ATI over my VGA connection on my CRT. I wasn't expecting that, but I kept noticing it while I was testing back and forth.
 
Nice little read, it'd be cool to see some graphs so I can see those fps dips on your system. Glad you are enjoying your card, because at the end of the day that's what it's all about.

edit: wasn't there some driver issue with that game on the ATI side (or am I thinking of another game? my mind says something recent, because I haven't played Far Cry 2) which I think was fixed with a new driver release?
 
Well, if you look through all of the [H]ard card and game reviews, you will see that they mention the "doesn't feel smooth" issue sometimes.

They will lower their "highest playable settings" when the game "doesn't feel smooth". ;)

If you have a comparable setup to theirs, you should be able to play the game without any "framerate dips" using their "highest playable settings". If you can't play the game using their "highest playable settings" on your setup even though you are using the same card, then there must be something in your setup that is holding you back.
 
They will lower their "highest playable settings" when the game "doesn't feel smooth". ;)

If you have a comparable setup to theirs, you should be able to play the game without any "framerate dips" using their "highest playable settings". If you can't play the game using their "highest playable settings" on your setup even though you are using the same card, then there must be something in your setup that is holding you back.
Yes, I know, but I was just pointing out that the issue is real. In some games or with some cards the game just doesn't feel smooth. To me it seems that Nvidia delivers a smoother experience overall in most games.
 
Yes, I know, but I was just pointing out that the issue is real. In some games or with some cards the game just doesn't feel smooth. To me it seems that Nvidia delivers a smoother experience overall in most games.

At a given average fps, maybe, but at a given setting ATi should be smoother, since its highest playable settings are higher according to [H] reviews.
 
edit: wasn't there some driver issue with that game on the ATI side (or am I thinking of another game? my mind says something recent, because I haven't played Far Cry 2) which I think was fixed with a new driver release?

Um... EVERY new game that comes out requires a driver release to enable CF in it. Most games also require hotfixes or driver releases to get decent performance out of single cards.

It has to do with the non-robust architecture that the 4800 series uses.
 
Nice little read, it'd be cool to see some graphs so I can see those fps dips on your system. Glad you are enjoying your card, because at the end of the day that's what it's all about.

edit: wasn't there some driver issue with that game on the ATI side (or am I thinking of another game? my mind says something recent, because I haven't played Far Cry 2) which I think was fixed with a new driver release?

haha. Graphs and charts are way beyond the scope of this post, my friend. I just had a rare opportunity to own two competitive video cards at the same time and I thought I'd just share my thoughts between the two.

I was using the 8.10 Cat drivers for the 4850. I had seen the hotfixes but didn't install them as they didn't seem relevant to me (I was running a single card in DX9). And since I was also reading about issues with the hotfixes I didn't feel like tempting fate.

As I noted in my original post, I certainly can't confirm how widespread the framerate hitching is with the 4850 - but it was definitely present in Far Cry 2. Now, whether it's just a minor driver issue or something that only happens in Far Cry 2, I really can't say. That was the game I was playing at the time I purchased the card, so that's what I had to go on. It wasn't the greatest first impression though, and I wanted to get a solid card in my box before the RMA window closed.
 
Completely understood. I just knew I'd heard about it going on in Far Cry 2; personally I haven't experienced it, which is why I inquired.
 
I went from a PNY 8800GT to a VisionTek HD 4850. This is what I noticed in CoD4.
I play at a low res, 1280x720, with settings maxed and AA/AF all the way up.
This is on a 5000+ 64 X2 @ 2.6.

My framerate went up about 20fps max with the 4850. I had worse frame dips with the 4850 than I did with the 8800GT.
The 4850 looked crisp, almost like the AA/AF worked better. The 8800GT had better colors.
Both are good cards in my eyes.
 
I sold my 4870s because of framerate dips in Clear Sky and just general flaky Crossfire issues. Sometimes they would even lock down and stay at 5-10 fps unless I alt-tabbed out and back in. I sold a 680i setup and went X48 and Crossfire because I so, so, so wanted to like Crossfire and support ATI, but... alas, it was not to be. When it worked, ZOMFGWTFBBQ, but then flake out after flake out... the heat... having to make profiles every freakin' time they let loose a "driver". At the time there were no good programs to OC them except CCC, which sucks. I was nervous about the vendors because eVGA doesn't sell ATI. I could go on and on and on.

Bottom line, framerate dips are a real issue in many games with these cards. It's all over here and AT and XS and PCPer... game after game after game... and they don't fix 'em till the game's been out for 2 months. They did come out with the FC2 hotfix pretty quick, but damn... who didn't know that game was coming out?

Bought a cheapy 9800GT and OC'd the crap out of it (760/1800ish/2000) and now play Clear Sky very smoothly. I can't turn on Enhanced Dynamic Lighting like I could on the 4870, but the game runs so much smoother that it makes up for it.

Something else I noticed with the 4870 and that game is that anything under 60 felt and looked choppy. With the 9800GT, not so.
 
Hmmm, I'd really like to see this framerate dip, and I am by no means some super intelligent bench tester. But if there was a way I could bench to see these dips for myself, I'd love to open my eyes to it - this is a problem I have not experienced. I do have FRAPS, and it has a little benchmarking thing that shows minimums and such, frame by frame.

I guess I'm vexed, as I have not seen this dip that you're talking about, and I'd like to help if I could.
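If the frame-by-frame log is handy, a rough sketch like this could flag the dips automatically rather than eyeballing the minimum column. It works on per-frame durations in milliseconds (the same kind of list pulled out of the FRAPS log in the sketch earlier in the thread), and the 2x-median threshold is just an arbitrary starting point, not anything official:

```python
# Rough dip detector: flag any frame that takes more than `factor` times
# the median frame time, i.e. its instantaneous FPS falls well below the
# typical rate for the run. The 2.0 default is an arbitrary choice.
from statistics import median

def find_dips(frame_times_ms, factor=2.0):
    typical = median(frame_times_ms)
    return [
        (i, 1000.0 / ft)  # (frame index, instantaneous FPS at that frame)
        for i, ft in enumerate(frame_times_ms)
        if ft > factor * typical
    ]

# Example: a steady ~60 fps run with two big hitches in the middle.
if __name__ == "__main__":
    times = [16.7] * 50 + [70.0, 95.0] + [16.7] * 50
    print(find_dips(times))  # -> [(50, ~14.3), (51, ~10.5)]
```

A run with a perfectly decent average can still show a handful of these, and that handful is what you feel.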
 
Hmmm, I'd really like to see this framerate dip, and I am by no means some super intelligent bench tester. But if there was a way I could bench to see these dips for myself, I'd love to open my eyes to it - this is a problem I have not experienced. I do have FRAPS, and it has a little benchmarking thing that shows minimums and such, frame by frame.

I guess I'm vexed, as I have not seen this dip that you're talking about, and I'd like to help if I could.
Well, I could feel that the 4670 wasn't as smooth as the 8600GT was on my PC. I then ran tons of FRAPS benchmarks in several games on each card, and it confirmed that the framerates were all over the place with the 4670 whereas the 8600GT was much more consistent. It's weird that the 8600GT can feel so smooth at low framerates in games like Warhead while the 4670 feels sluggish at higher framerates. Also, the 8600GT responds great to overclocking, whereas the 4670 actually delivered even worse minimum framerates and more sporadic performance when overclocked.
 
I guess that people will get the problem if they tweak the game settings to get a target average fps count in the game (e.g. 40fps). Then they do the same on an nVidia card and find out that at the same average fps count, the game runs smoother on the nVidia card even though they are now using a lower setting.

The problem here is they are comparing two different settings with the same average fps count. I'm pretty sure that if they lower the settings on the ATi card to match the settings on the nVidia card, they will get the same smooth gameplay, if not smoother, with a higher average fps.

That is why sometimes when you read a [H] review, you can see that they are using the exact same highest playable settings on both nVidia and ATi cards even though the ATi card is giving out a higher average fps. However, most of the time in [H] reviews the ATi card can push one or two settings further than the nVidia card and the game will still be playable.

My conclusions:

1) ATi average fps =/= nVidia average fps.
2) Highest playable settings > fps counts.
3) Tweak the settings to get playable gameplay, not to hit a target average fps count.
4) Just use the settings that [H] used in the review to get smooth gameplay.
 
I used to run a factory overclocked 8800GT card but replaced it with a reference 4850 because of the noisy Zalman fan on the 8800GT card.

I've never experienced "framerate dips" with the 4850, even when I used it with a slower Athlon X2 CPU. Sure, the framerate is always noticeably lower when there's a lot going on, like several enemies on the screen etc., but that's expected and happened with both cards. Not saying the issue doesn't exist, but it must be game- or system-specific.

A lot of Nvidia users are claiming that there's a bug in Nvidia's drivers that causes the framerate to continually decline as you play. Supposedly the only way to get rid of the problem and restore the high framerate is to alt+tab out of the game and back again. Again, I never saw that on my system with the Nvidia card.

HardOCP videocard reviews contain graphs of the framerate over time as well as Min framerate so you can check those to see which card works best with the games you want to play.

ATI decided to start the Catalyst program and make improving the quality of their drivers a top priority some time in 2003 (or earlier?), yet people still claim they have inferior drivers. Strange how the poor reputation continues to follow them year after year.
 
I guess that people will get the problem if they tweak the game settings to get a target average fps count in the game (e.g. 40fps). Then they do the same on an nVidia card and find out that at the same average fps count, the game runs smoother on the nVidia card even though they are now using a lower setting.

The problem here is they are comparing two different settings with the same average fps count. I'm pretty sure that if they lower the settings on the ATi card to match the settings on the nVidia card, they will get the same smooth gameplay, if not smoother, with a higher average fps.

That is why sometimes when you read a [H] review, you can see that they are using the exact same highest playable settings on both nVidia and ATi cards even though the ATi card is giving out a higher average fps. However, most of the time in [H] reviews the ATi card can push one or two settings further than the nVidia card and the game will still be playable.

My conclusions:

1) ATi average fps =/= nVidia average fps.
2) Highest playable settings > fps counts.
3) Tweak the settings to get playable gameplay, not to hit a target average fps count.
4) Just use the settings that [H] used in the review to get smooth gameplay.
Well, in my case I was using the exact same settings for both the 4670 and 8600GT. I could turn up the graphics or resolution in some games on the ATI card where I really couldn't on the weaker 8600GT. The problem is that games just aren't smooth feeling even at the lower settings. For some reason the 4670 just seems to bottom out and feel sluggish during games.
 
Gonna do some quick Warhead benchmarking: 60 frames in some battles, maybe like the last one, with a mix of Gamer and Enthusiast settings.
 
How does one add pics to a post? Upload to Photobucket or something?

And benching ain't as easy as I thought - kids keep running in and such... makes me die, then I have to start over.
 
Well, I could feel that the 4670 wasn't as smooth as the 8600GT was on my PC. I then ran tons of FRAPS benchmarks in several games on each card, and it confirmed that the framerates were all over the place with the 4670 whereas the 8600GT was much more consistent. It's weird that the 8600GT can feel so smooth at low framerates in games like Warhead while the 4670 feels sluggish at higher framerates. Also, the 8600GT responds great to overclocking, whereas the 4670 actually delivered even worse minimum framerates and more sporadic performance when overclocked.

Sounds to me like you had a driver conflict or some other issue. The 4670 is a far superior card to the 8600GT in every respect, including minimum FPS. If you overclocked the card and the FPS dropped, then you definitely had a problem elsewhere. Perhaps your PSU couldn't deliver enough power, which would explain why overclocking it (increasing power draw) resulted in worse performance.

If your sig is accurate, that X2 5000 could very well be holding you back as well, especially if it is at stock speeds.
 