8800GT (single) vs 3870X2

manini

Limp Gawd
Joined
Nov 14, 2007
Messages
183
Hey Guys,

This may seem like a stupid question, but I was looking to get some input on it.
At a resolution of 1280x1024, is there any difference at all between the 8800 GT and the 3870 X2? The 3870 X2 basically costs twice as much as an 8800 GT right now, and I actually went ahead and bought the 3870 X2 when they first went on sale at Microcenter for $389. I ended up paying $420 after taxes and shipping. I'm going to be combining it with an E8400, a DFI DK P35-T2RS, and 4 GB of RAM, among other things (my parts should arrive within the next couple of days).

Now my only problem is that my monitor is a 17" Sony LCD that only supports a max of 1280x1024. Although I'm dying for a 24" LCD, I can't really afford one at the moment, and probably couldn't even fit it on the desk I'm using. Plus, my current monitor works great and has no issues, so I don't think I'll be buying one anytime soon... I know the 3870 X2 would be futureproofing things a bit, but overall, am I throwing an extra $200 away by going with the X2 at 1280x1024, or is there some difference between the two cards at this resolution? I've got 30 days to try it and return it if necessary, and I'm kind of torn between the two options...
 
I would return it. If you can't afford the 24" monitor, then chances are the $200 difference is a significant amount of money to you. You won't see any change in performance at that resolution. If you save instead, you can look towards one of the new cards coming out this fall, if you do end up having the money for the 24" monitor and a higher-end graphics card. Trust me, $200 in your computer that isn't giving you a performance increase really hurts when you look up in two months and realize you REALLY need that $200 for something "in real life".
 
Interesting points both of you make. I guess one thing that attracted me to the X2 is the fact that it has HDMI out for both audio and video, so I could hook it up to a 1080p TV every once in a while, and I don't think the 8800 GTs have that... but that MSI 8800 GT for $189 after rebate + a free copy of The Witcher is so damn tempting...
 
There is an MSI 8800 GT on Newegg for $199.99 after rebates that has a much better cooler. That's worth the extra $10 over the reference-cooled cards.
 
The 3870 X2 gets 16k in 3DMark06 (1280x1024); the 8800 GT gets about 11k.

It's true that the 3870 X2 "benchmarks" higher. However, the difference between 90 fps and 60 fps on a 60 Hz monitor is ZERO. The 8800 GT will be able to play anything (other than Crysis) at 1280x1024 with the settings cranked.
 
Playing a 3870 X2 on a 17" is like trying to vapochill a 1.8 GHz C2D. Sure, it'll help to have a 3870 X2 in your PC... but is it really worth it? By the time you get your 24" monitor, the GPU out at that time (that's equivalent in power to the 3870 X2) will probably be around $200.
 
Oh god I'm so tired of reading this crap. That's a huge fallacy.:mad::mad:

If you run @ 60 Hz w/ vsync, then it's not false. Why someone would run @ 60 Hz is another question... That is, unless we're talking about max FPS vs. avg FPS vs. min FPS.
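A minimal sketch of that vsync point, in illustrative Python (the numbers aren't anyone's benchmark, just the cap being described):

# With vsync on, the displayed frame rate is capped at the monitor's
# refresh rate; anything the GPU renders beyond that is never shown.
def displayed_fps(rendered_fps, refresh_hz=60.0):
    return min(rendered_fps, refresh_hz)

print(displayed_fps(90))   # 60.0 -- the extra 30 fps never reaches the screen
print(displayed_fps(45))   # 45.0 -- below the cap, you see every frame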
 
A 3870 is enough to play many games (except Crysis, on XP) at 1920x1080 without AA and still generate some generous frame rates. The new Catalyst 8.2 drivers have greatly improved some of the hitching issues in GoW (XP) as well.
 
That's really a toss-up, whether to keep it or not. My question is: why didn't you ask this before you ordered? You realize you're now out dual shipping... plus the $200 difference. However, I realize this doesn't help you now.

As I said, there is no exact answer at this point, IMO. I'm lazy, so I'd probably just keep it and know that whatever I'm throwing on my 17" is as good as it will ever be. You can always run AA @ 8X.

My 2 cents, and next time ask before you buy?
 

I just had to jump on it without asking questions because it was a special deal... $70 less than MSRP on the day of release :D. Plus, I can return it to the store (hence no shipping costs) without any restocking fee or penalty within 30 days, so it's not really a big deal.

On another note, it just arrived. Nice box :D Waiting on my other parts now. I may just try it for a while and then see whether it's worth keeping or not.
 
At 1280x1024, you really won't see a major difference. Basing claims off an artificial benchmark may be fine in some situations, but not in yours. If you're going to upgrade to a 24" in the near future, stick with the 3870 X2. However, if you're staying at that res for a long time, return the card and pick up an 8800 GT. At that res, the gains won't really be noticeable (I used an 8800 GT at that res before and was able to max out just about everything in any game easily, with the exception of Crysis, and even then I ran Very High with some AA sprinkled in). So basically, it depends on the monitor. What are your plans for that?
 
So all future games will run at 1280x1024 @ max detail on the 8800 GT... :rolleyes:

Keep the X2... I doubt you'll regret it. Also, with the extra fillrate/bandwidth, you could crank up some of the settings in future games compared to the GT.
 
I'd sell the 3870 X2 and get an 8800 GTS 512 (you can find those real cheap now) along with a 22" LCD (Acer 22" models cost as low as $200).

I don't know about you, but I'd definitely enjoy that more than a 3870 X2 on a 17" screen.
 
So all future games will run at 1280x1024 @ max detail on the 8800 GT... :rolleyes:

Keep the X2... I doubt you'll regret it. Also, with the extra fillrate/bandwidth, you could crank up some of the settings in future games compared to the GT.

If you haven't noticed, last year's 7900 GTO isn't so hot anymore...
While a 7900 GT might have dropped like $100 since its release, a GTO dropped $200 or more.
Yeah, sure, if you can enjoy a 24" within the next couple of months, by all means keep the card. But downgrading to an 8800 GT won't hurt performance by a significantly noticeable amount.
 
Long story short, you'll be fine on the 8800 GT for about a year, would be my guess. By fine I mean you'll be able to run all games at nearly max settings. Aside from Crysis, you can run every other game at max settings at that resolution. If you are going to get the 24" soon (less than 6 months), then you should hold onto it. If you're not, the 3870 X2 will give you little to no noticeable difference.

/offtopic If he's running an LCD (which probably 80-90% of the world is these days), then yeah, he's running at 60 Hz. Let's say he runs at 120 fps vs. 60 fps: first off, he'll only see 60 fps. And yes, if he has vsync off, his frames will arrive about 8.3 ms sooner at 120 fps than at 60 fps. Seeing how he's only getting a frame every 16 ms, and the average processing time of an LCD vs. a CRT is 40 ms, you're getting lost in the noise. Human reaction times are normally around 100-200 ms. This all ignores the fact that his game isn't going to run any "smoother". Keep telling yourself that you can tell a difference, though. ^^
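For what it's worth, here is the arithmetic from that post as a quick sketch (the 40 ms display lag and the 100-200 ms reaction figures are the poster's claims, taken at face value, not measurements):

# Back-of-the-envelope latency math; all figures in milliseconds.
# Display lag and reaction time are the poster's claimed numbers.
frame_time_60  = 1000 / 60    # ~16.7 ms between frames at 60 fps
frame_time_120 = 1000 / 120   # ~8.3 ms between frames at 120 fps

display_lag = 40    # claimed average LCD processing time vs. a CRT
reaction    = 150   # middle of the quoted 100-200 ms reaction range

total_60  = frame_time_60  + display_lag + reaction
total_120 = frame_time_120 + display_lag + reaction
print(f"frame-time advantage: {frame_time_60 - frame_time_120:.1f} ms")
print(f"end-to-end at 60 fps:  ~{total_60:.0f} ms")
print(f"end-to-end at 120 fps: ~{total_120:.0f} ms")
# The ~8 ms gain is a few percent of the whole chain -- "lost in the noise".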
 
Return the X2, sell your LCD for like $80, add back your $200, and buy a 22" LCD for $250 or so. You'll be much happier.
 
Long story short, you'll be fine on the 8800 GT for about a year, would be my guess. By fine I mean you'll be able to run all games at nearly max settings. Aside from Crysis, you can run every other game at max settings at that resolution. If you are going to get the 24" soon (less than 6 months), then you should hold onto it. If you're not, the 3870 X2 will give you little to no noticeable difference.

/offtopic If he's running an LCD (which probably 80-90% of the world is these days), then yeah, he's running at 60 Hz. Let's say he runs at 120 fps vs. 60 fps: first off, he'll only see 60 fps. And yes, if he has vsync off, his frames will arrive about 8.3 ms sooner at 120 fps than at 60 fps. Seeing how he's only getting a frame every 16 ms, and the average processing time of an LCD vs. a CRT is 40 ms, you're getting lost in the noise. Human reaction times are normally around 100-200 ms. This all ignores the fact that his game isn't going to run any "smoother". Keep telling yourself that you can tell a difference, though. ^^


No. First of all, you're completely ignoring the fact that games don't run at a constant fps, but have dips, etc.

Reaction time is 100-200 ms? Lul, then how come it's easy for people to tell the difference between LAN-quality ping (<20) and anything over 40 or so? :p
 

Wiki says...

"Simple reaction time is usually defined as the time required for an observer to detect the presence of a stimulus. For example, an observer might be asked to press a button as soon as a light appears. Typical RT to detect the onset of a light flash is approximately 150 to 300 milliseconds."

We're talking about the time it takes you to notice someone on your screen and then move to shoot them. We're not talking about the difference a high ping makes; that causes jumps in movement, which is a different thing.
 

Your post began to fail as soon as you said "Wiki says".
 
The answer is so easy!!
Return the X2, get a GT, and use the spare cash to buy a bigger monitor with 1680 res.
All-round win!!

Sabrewolf got there first in post #22.
 
Oh god I'm so tired of reading this crap. That's a huge fallacy.:mad::mad:

Not really. 3DMark puts a lot of stress on the cards, and most games at 1280x1024 are still somewhat CPU-limited. So between the two cards, the OP really wouldn't notice a real difference.
 
No. First of all, you're completely ignoring the fact that games don't run at a constant fps, but have dips, etc.

Reaction time is 100-200 ms? Lul, then how come it's easy for people to tell the difference between LAN-quality ping (<20) and anything over 40 or so? :p

That's totally illogical and invalid.

You can tell when your ping is higher (albeit I don't think you can tell between the numbers you cite) because when your shots should've killed someone in a game and they don't die, the outcome is different from what you expected.

The (questionable) ability to naturally discern between 20 ms and 40 ms does not let you tell that your latency in a multiplayer environment is high; the outcome of whether you got the kill you know you should've had tells you that.
 
If I do buy a monitor, it's gotta be at least 24", haha... The way I see it, since I don't buy a monitor unless mine breaks, if I get a 24" now, then by the next time around we'll probably be seeing 40" monitors as the standard, haha. So if I buy a 22", it's not as much futureproofing :D

Anyway, all joking aside, I do have another question.

Are there any 8800 GTs besides the Zotac one that have HDMI audio/video out? I'm not really a fan of brands I've never heard of...

I love my 3870 X2 for the fact that it has the HDMI out...

Anyway, I put together my PC minus a hard drive. At least it boots and recognizes my E8400, but I haven't been able to test it with an OS yet. Can't wait for my Raptor HD to show up in the mail. (I got one off eBay but the guy never responded back to me... I really hope he sent it.)
 
This is a tough one.

I'd sell the 3870 X2 and get an 8800 GTS 512 (you can find those real cheap now) along with a 22" LCD (Acer 22" models cost as low as $200).

I don't know about you, but I'd definitely enjoy that more than a 3870 X2 on a 17" screen.

He is absolutely right. Buying arguably the fastest graphics card out there to put it on the second-smallest standard desktop LCD size is like buying a 50 lb sledgehammer to drive small tacks into the wall. Yeah, you've got the best and biggest hammer in town, but if all you're going to do is pound in little tiny tacks with it...

So all future games will run at 1280x1024 @ max detail on the 8800 GT... :rolleyes:

Keep the X2... I doubt you'll regret it. Also, with the extra fillrate/bandwidth, you could crank up some of the settings in future games compared to the GT.

At the same time, I totally agree with commodore, especially if you really want to run Crysis with better graphics. It will just plain run better with higher settings on the X2, even at only 1280x1024, not to mention future games coming out soon with DX10! I have been running an 8800 GT on a really nice (but old) 17" monitor. I love how it can run all my games maxed (and how I spent half as much as you did on your X2 :p). But guess what I'm doing? I just returned the 8800 GT and bought the 8800 GTS from the egg for $259 after a $30 MIR (free shipping), along with a 22" Acer at 1680x1050. I guarantee the experience in almost every single game is going to be way better with that combo than with the X2 and a 17". 22" is such a sweet spot for value right now. Yeah, you can occasionally get a Soyo 24" for $300 on sale, but the Acer (like Samsung, and your Sony) has a 3-year warranty and better specs. I would imagine better build quality, too.

It sounds like you're not going to go cheap when you get a monitor, though, like buying a Soyo. I feel like I could argue either side of this with plenty of positive points for both, so I understand why this is a tough call for you. It sounds like you've already convinced yourself to keep the 3870 X2, though, and I see your point about the HDMI features it has over the 8800s. If you are really, truly going to take advantage of them, then this thread can end, because you have your clear winner. Especially if you're going to get a 24" in the near future, like another poster said.
 
I just had to jump on it without asking questions because it was a special deal... $70 less than MSRP on the day of release :D. Plus, I can return it to the store (hence no shipping costs) without any restocking fee or penalty within 30 days, so it's not really a big deal.

On another note, it just arrived. Nice box :D Waiting on my other parts now. I may just try it for a while and then see whether it's worth keeping or not.


Keep it. Good deal. I already have the machine you are going to build. It will be great. Do you even play Crysis? I played through it once and it sucked.
 

Yeah, I played Crysis... thought it was alright, but I had to play it at 800x600 with low settings for everything, so it looked like total shit and still ran like crap... can't wait to see what the game is supposed to look like maxed out :cool:
 
Looks good, still sucks.

Yeah, I've heard that game is mostly looks; the gameplay is only decent. Sad, but probably worth it for the revolutionary visuals if you've got a GFX system that can show you all the eye candy. I would imagine that experience would be worth it.
 
Oh god I'm so tired of reading this crap. That's a huge fallacy.:mad::mad:
60 frames per second is 60 frames per second. It is true, however, that if the card can consistently put out 90 fps, reduced to 60 by vsync, it's bound to drop below 60 far less often.
 

That's exactly what I'm saying. Every time one of these asses tells someone there is no difference between 60 fps and 90 fps, they do a real disservice to that person. While technically the display will not show more frames, in the real world a game running at an average of 60 fps will probably spend half of its time below 60, whereas a game averaging 90 fps will almost never drop below 60.

If you want to argue that 60 is an acceptable MINIMUM framerate, that's perfectly fine, even a bit picky. But as an average or a max? That leaves a ton of room for the framerate to spend quite a bit of time in the 40s and below. There's a real good reason [H] specifies min, max, and avg frame rates in their video card benchmarks.
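A toy simulation of that avg-vs-min point (the frame-rate variance here is made up purely for illustration; real games have much messier distributions):

import random

random.seed(1)

def frames_below_60(avg_fps, jitter=0.25, n=10_000):
    # Toy model: per-frame fps drawn from a normal distribution around
    # avg_fps with relative std dev `jitter`. Illustrative only.
    dips = sum(1 for _ in range(n) if random.gauss(avg_fps, avg_fps * jitter) < 60)
    return dips / n

print(f"avg 60 fps: {frames_below_60(60):.0%} of frames under 60")  # about half
print(f"avg 90 fps: {frames_below_60(90):.0%} of frames under 60")  # under 10%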
 
If you have vsync disabled @ 60 Hz, there should still be a difference between 60 fps and 120 (min framerate dips ignored), no?

I.e., if you are at a perfect 120 fps, wouldn't you just "tear" every single frame? Each refresh of the LCD would contain info from two frames (the top half being the old frame, the bottom being the new one). I'm assuming that if the frame buffer is updated mid-refresh, the updated second half will be drawn to the monitor?

If this is the case, while the results could be ugly (tearing in every frame!), there could be a perceivable difference, especially with respect to motion. (I.e., how 60 Hz interlaced can appear smoother in fast motion than 30 Hz progressive.)
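Roughly, the arithmetic behind that tearing scenario (a sketch assuming perfectly even frame pacing, which real games never give you):

# With vsync off, a buffer swap during scanout shows up as a tear line.
# At a steady 120 fps on a 60 Hz panel, two rendered frames land inside
# every refresh, so each refresh is stitched from parts of two frames.
refresh_interval_ms = 1000 / 60    # one full scanout at 60 Hz (~16.7 ms)
frame_interval_ms   = 1000 / 120   # time between swaps at 120 fps (~8.3 ms)
frames_per_refresh  = refresh_interval_ms / frame_interval_ms
print(f"~{frames_per_refresh:.0f} frames drawn per refresh -> a tear on every refresh")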

Back to the original question: one thing to keep in mind is that the X2 is CrossFire, meaning driver issues and game support are such that you may not always achieve your theoretical maximum performance. With a single-chip design you're always guaranteed to get its full usage.

Aggies
 
My rig plays Crysis @ a high resolution (1680x1050) with 'High' settings on ALL and max AA...

Or don't you consider 1680x1050 a high resolution?

I admit that when I was running XP 32-bit, the last part of the last stage got a little choppy... but when I went to XP 64-bit: smooth as silk...

I don't have a 1920x1080 screen to test that high, but surely if the 3870 X2 runs so much better, it should handle that resolution with high settings??

I'm not running any massive OC on anything when playing Crysis. In fact, I don't OC my 8800 GTX when playing Crysis at all (unlike my signature clock specs).

I don't have a P35, X38, or X48 either...

Maybe you people have a 32-bit bottleneck ;)
 
It's very true what someone mentioned about driver issues and CrossFire with the X2, so...
Yes, it was a good deal on a good card, but I still don't prefer CF/SLI setups. And at that resolution, wasn't a single 3870 under consideration before?
 
My rig plays Crysis @ a high resolution (1680x1050) with 'High' settings on ALL and max AA...

Or don't you consider 1680x1050 a high resolution?

I admit that when I was running XP 32-bit, the last part of the last stage got a little choppy... but when I went to XP 64-bit: smooth as silk...

I don't have a 1920x1080 screen to test that high, but surely if the 3870 X2 runs so much better, it should handle that resolution with high settings??

I'm not running any massive OC on anything when playing Crysis. In fact, I don't OC my 8800 GTX when playing Crysis at all (unlike my signature clock specs).

I don't have a P35, X38, or X48 either...

Maybe you people have a 32-bit bottleneck ;)

I'm no MS lover, but I agree 100% with their push to 64-bit everything. WAY TO GO, for once. 64-bit is far superior, and now all decent AMD and Intel processors have the necessary architecture. I run 64-bit Vista Ultimate and it sings: it recognizes all 8 GB of system memory, plus my video card's memory, without any bottlenecks. So far everything runs great, even some of my older games and apps. All my hardware works fine (see sig). It's stable like you wouldn't believe.

I would recommend running the X2 (or any graphics card with 512 MB of RAM or more) in a 64-bit environment, drivers permitting. I hope more companies start writing native 64-bit apps, games, drivers, etc. The X2 has 1 GB of video memory; if you have 32-bit Windows, that can significantly reduce the amount of system RAM Windows will see and address. Most 32-bit Windows environments only recognize a max of about 3.5 GB of RAM once everything is running and devices have claimed their respective chunks of address space. If you throw a 1 GB card into the mix, you could reduce the memory available to the system to as little as 2.5 GB. Multitasking and some games are going to be hampered a bit at those levels, especially in Vista, which takes up a lot just running SuperFetch and background processes.
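The 32-bit math being described there, roughly (the reserved amounts vary by chipset and board; these are ballpark figures, not exact values for any specific system):

# 32-bit Windows can address 4 GB total; memory-mapped hardware
# (video RAM aperture, chipset/PCI reservations) is carved out of
# that same space, and whatever is left goes to system RAM.
ADDRESS_SPACE_GB  = 4.0   # total 32-bit addressable space
video_aperture_gb = 1.0   # the X2's 1 GB of VRAM mapped into it
other_reserved_gb = 0.5   # chipset/PCI/MMIO reservations (varies)

usable_ram_gb = ADDRESS_SPACE_GB - video_aperture_gb - other_reserved_gb
print(f"RAM visible to 32-bit Windows: ~{usable_ram_gb:.1f} GB")  # ~2.5 GB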

With the price of DDR2 where it's at right now, 8 GB or even 6 GB makes more sense. 64-bit apps in a 64-bit environment are already yielding 15-30% speed gains over their 32-bit counterparts in testing.

Wouldn't it be glorious? A system with 8GB of system memory, all 100% addressed, PLUS 1GB of dedicated video memory on your X2 or SLI setup? Bye bye memory bottlenecks...
 