BFG 6800 Ultra (OC) or an ATi X800XT PE

BFG 6800 Ultra (OC) or an ATi X800XT PE???

  • BFG 6800 Ultra (OC)
    Votes: 156 (46.0%)
  • ATi X800XT PE
    Votes: 183 (54.0%)
  • Total voters: 339
6800 has better drivers IMO

I will strongly have to disagree with this :rolleyes: The drivers for the 6800 are a heap more buggy than any Cat drivers at the moment, whether you like to admit it or not.

Guru3D ForceWare driver forum

Your fave nVnews driver forum

Hence why I wrote this^ But of course, now wait for the endless threads on PS 3.0 or HDR...

In (MY OPINION) the X800 Pro and X800 XT are better ;) Stable, mature drivers, no question marks over 1600x1200 with max AA and max AF, plus driver updates almost every three weeks, and no questionable cheats for 3DMark03 like nVidia still harbours by the looks of it.

191670 3DMarks? Yeah right :D
But to each his own
 
@trapine said:
I will strongly have to disagree with this :rolleyes: The drivers for the 6800 are a heap more buggy than any Cat drivers at the moment, whether you like to admit it or not.

Guru3D ForceWare driver forum

Your fave nVnews driver forum

Sell that card brother :rolleyes: Hence why I wrote this. But of course, now wait for the endless threads on PS 3.0 or HDR...

In (MY OPINION) the X800 Pro and X800 XT are better ;) Stable, mature drivers, no question marks over 1600x1200 with max AA and max AF, plus driver updates almost every three weeks, and no questionable cheats for 3DMark03 like nVidia still harbours by the looks of it.

191670 3DMarks? Yeah right :D
But to each his own

Yea, and you also have to wait forever to get bugs fixed with the Catalyst drivers. ATI hardly EVER has a beta leak. This means while you're waiting around for a WHQL version to get finalized, I've already had several beta leaks that fixed my issue.

DRen72 said:
I had every intention of going with the ATi X800XT this round, and initially I did, but after playing games with the XT, I became aggravated by the poor drivers and the myriad of bugs in them. That would be ok, except ATi is very slow to get out new drivers or beta drivers when compared to nVidia. Having had enough of the ATi driver crap, I went back to nVidia this round. If only ATi had gotten its drivers corrected faster, I'd still have the XT. Granted certain nVidia drivers have bugs in games (FarCry for example), but we are seeing workarounds and/or upcoming fixes to these issues.

http://www.nvnews.net/vbulletin/showthread.php?p=375647#post375647

The nVidia drivers also have better features such as Application Profiles, Digital Vibrance, CoolBits, FULL Trilinear Filtering, more Anti Aliasing options, and SUPERIOR Linux and OpenGL support.

And what you're talking about with 3DMark03 is NOT a driver cheat. It's a legitimate DEV tool that is locked away in the drivers. You have to enable it through a registry hack. 3DMark03 is also the only area where you'll be able to use it to an advantage, and Futuremark should be patching it soon.

And only nOObs go off of 3DMark scores to start with, so I couldn't care less.

The 6800u is faster than the X800XT PE at 1600x1200. It's faster than the X800XT PE with high levels of AA enabled. And it's just as fast as the PE in the majority of games with high AF enabled. The X800XT PE rarely beats the 6800u until 8xAF or 16xAF is enabled, and that's just because of ATi's "better" Brilinear filtering. In OpenGL it doesn't matter what settings you use, because the 6800u will win.
 
CONSIDERING neither of these cards is attainable for a sane price right now, it really shouldn't be so urgent as to cause all this commotion...

But I go with x800 XT.
So what if it's old? Looks like nVidia's new stuff sucks, cos it sure as hell is on par with ATi's old stuff and not the big leap over it that it should be.
PS 3.0 only benefits FarCry, and that game gets old fairly fast... Visually it does not change with PS 3.0, and the performance increase is not much of an improvement; the X800 XT is still just as playable.

PS 3.0 is mostly an nVidia thing, and not EVERY good game is gonna cater to it. Many are still written with the sufficient 2.0 code, which in nearly all cases works fine visually (cough, FarCry 1.1, cough, no visual diff from 1.2, cough).

nVidia stuffs a lotta Bull into drivers, like IE popup prevention... wtf, they are a GPU company!

X800 still kicks ass... and it will win for HL2, and be on par (I bet) in Doom 3. Even if it lags a little behind the 6800, it will make it up in other games.

Go on, throw your insults at me now.
 
Yea, and creedAMD was as hardcore ATI as it gets.
Oh wait a minute, I will go out and buy a 6800GT because everyone on a forum has one in their sig and they reckon it does 3 FPS faster than a 12-piped X800 Pro :rolleyes:
[edit]Remove my brain here and insert 6800GT drone[/edit] :p

Yeah right! The thing that amazes me the most about all this is that even with the FarCry 1.2 patch and PS 3.0, the 6800U is still only on a par with an X800XT :p Give me the X800 any day, and the Cat drivers ;)
 
@trapine said:
Yeah right! The thing that amazes me the most about all this is that even with the FarCry 1.2 patch and PS 3.0, the 6800U is still only on a par with an X800XT :p Give me the X800 any day, and the Cat drivers ;)

Is that with or without AF enabled? ;)
 
Did you actually have the opportunity to compare the cards and then come to that conclusion?

Have any of you had the chance to compare the 6800GT you own to an X800 Pro? :rolleyes: I think not.
 
krizzle said:
CONSIDERING neither of these cards is attainable for a sane price right now, it really shouldn't be so urgent as to cause all this commotion...

But I go with x800 XT.
So what if it's old? Looks like nVidia's new stuff sucks, cos it sure as hell is on par with ATi's old stuff and not the big leap over it that it should be.
PS 3.0 only benefits FarCry, and that game gets old fairly fast... Visually it does not change with PS 3.0, and the performance increase is not much of an improvement; the X800 XT is still just as playable.

PS 3.0 is mostly an nVidia thing, and not EVERY good game is gonna cater to it. Many are still written with the sufficient 2.0 code, which in nearly all cases works fine visually (cough, FarCry 1.1, cough, no visual diff from 1.2, cough).

nVidia stuffs a lotta Bull into drivers, like IE popup prevention... wtf, they are a GPU company!

X800 still kicks ass... and it will win for HL2, and be on par (I bet) in Doom 3. Even if it lags a little behind the 6800, it will make it up in other games.

Go on, throw your insults at me now.

Are you people dense? lol

You all keep ignoring all the facts I'm presenting and continue to post false, unsupported comments.

The 6800u doesn't suck in any way, shape or form.

The drivers for the 6800s are still in the beta stages. The drivers for the X800s have already been in use for quite some time on the 9800 Pro, because the X800s are faster but not entirely different technology, unlike the NV40 vs the NV30.

The 6800u beats the PE nearly every single time in EVERY game without AF enabled, and at ANY resolution. The PE only starts winning in certain D3D titles when high levels of AF are enabled. And the reason for that, like I've stated several times, is the use of Brilinear filtering. ATI constantly put down nVidia and the NV30s for their use of Brilinear filtering and touted their own use of full Trilinear. Well, as you can see, ATI is NOT using full Trilinear.
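
For anyone not clear on what "Brilinear" actually is: full trilinear blends two mip levels across the whole transition band, while brilinear shrinks that band and runs plain bilinear (one mip level) for most of it. Here's a toy C illustration of the weighting, my own sketch and nothing like either vendor's real hardware logic:

[code]
#include <stdio.h>

/* frac = fractional part of the computed mip level, in [0,1). */
static float trilinear_weight(float frac)
{
    return frac; /* always blend mip N with mip N+1 */
}

/* band = width of the blend window around the mip boundary (0..1).
 * band = 1.0 degenerates to full trilinear; smaller bands mean more
 * of the range is served by cheap single-mip bilinear sampling. */
static float brilinear_weight(float frac, float band)
{
    float lo = 0.5f - band * 0.5f;
    float hi = 0.5f + band * 0.5f;
    if (frac <= lo) return 0.0f;        /* mip N only */
    if (frac >= hi) return 1.0f;        /* mip N+1 only */
    return (frac - lo) / (hi - lo);     /* short blend ramp */
}

int main(void)
{
    float f;
    for (f = 0.0f; f < 1.0f; f += 0.125f)
        printf("frac=%.3f  trilinear=%.3f  brilinear=%.3f\n",
               f, trilinear_weight(f), brilinear_weight(f, 0.25f));
    return 0;
}
[/code]
Outside that narrow blend window you're getting single-mip bilinear, which is where the speed comes from, and why it took screenshot diffing for people to even notice.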

There wasn't SUPPOSED to be a difference in IQ with the 1.2 patch and SM 3.0, so why do people keep going on about it? lol

HDR will be the IQ difference in Far Cry, and it will only be supported by the 6800s.
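
(Side note for anyone wondering what HDR actually buys you: the scene is lit in a range way past 1.0, held in an FP16 buffer that the 6800 can blend into, then tone-mapped back down to what the monitor can show. A toy sketch using a simple Reinhard curve, assumed purely for illustration; this is nothing to do with Crytek's actual code:)

[code]
#include <stdio.h>

/* Reinhard tone mapping: squashes [0, infinity) into [0, 1). */
static float reinhard(float luminance)
{
    return luminance / (1.0f + luminance);
}

int main(void)
{
    /* Values an FP16 framebuffer can hold; an 8-bit buffer would
     * simply clamp everything above 1.0, losing the "glow". */
    float hdr[] = { 0.25f, 1.0f, 4.0f, 16.0f };
    int i;
    for (i = 0; i < 4; i++)
        printf("scene %6.2f -> display %.3f\n", hdr[i], reinhard(hdr[i]));
    return 0;
}
[/code]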

I also don't look for the X800s to do any better in HL2 than the 6800s. In the HL2 beta the 6800u outperforms the PE at 1600x1200, and the PE is only ahead by less than 5 fps with high AF enabled at that resolution. I think it will go the same way as the rest of the D3D games.

OpenGL on the other hand won't be. The 6800s are going to maintain their advantage in OpenGL no matter what the resolution or setting is.

So would you rather have a 6800u that performs as well as the PE in D3D and a lot better in OpenGL, or a PE that only performs a little better in D3D when high levels of AF are enabled because of "Brilinear" filtering?

The 6800s are more futureproof because they fully support SM 3.0. If you're going to throw down $500 on a video card, you might as well get the one with the most features.
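
And on the "most features" point, checking for those features costs a developer almost nothing. A rough D3D9 sketch of the version check (illustration only; every game has its own detection path):

[code]
#include <d3d9.h>
#include <stdio.h>

int main(void)
{
    IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
    D3DCAPS9 caps;

    if (!d3d)
        return 1;

    /* Ask the HAL device on the primary adapter what it supports. */
    if (SUCCEEDED(IDirect3D9_GetDeviceCaps(d3d, D3DADAPTER_DEFAULT,
                                           D3DDEVTYPE_HAL, &caps))) {
        if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0))
            printf("SM 3.0 pixel shaders supported (6800 path)\n");
        else if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
            printf("SM 2.0 fallback path (X800 and friends)\n");
    }

    IDirect3D9_Release(d3d);
    return 0;
}
[/code]
(Link against d3d9.lib; needs the DirectX 9 SDK headers.)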
 
@trapine said:
Have any of you had the chance to compare the 6800GT you own to an X800 Pro? :rolleyes: I think not.

I don't have either card.

The question is directed at you, since you say the X800 Pro is better; I am not making such claims :rolleyes:
Since you haven't had the opportunity to compare the cards, your opinion is senseless, because you are just looking at reviews and coming to your own biased conclusion.

And before you come up with your "you are an nVidia fan" comments, I'll just let you know that I am neutral on the matter: I currently have a 9800 Pro and a 9700 Pro, owned nVidia cards in the past, and will soon buy an nVidia card.
 
@trapine said:
Have any of you had the chance to compare the 6800GT you own to an X800 Pro? :rolleyes: I think not.

Actually, there have been several people right here on HardOCP who have traded in their X800 Pro for a 6800GT lol. :D

Quite a few people who have gone from an X800 Pro to a GT have said the performance was noticeably smoother on the GT. The X800 Pro will also be unable to perform at PE levels because it is limited to 12 pipelines. The GT, on the other hand, can break Ultra speeds and performance levels.
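
The pipeline math backs that up. Quick back-of-the-envelope numbers, using the commonly quoted stock core clocks of the time (treat the exact clocks as approximate; retail boards varied):

[code]
#include <stdio.h>

/* Theoretical pixel fillrate = pixel pipelines x core clock. */
int main(void)
{
    struct { const char *name; int pipes; int mhz; } cards[] = {
        { "X800 Pro  ", 12, 475 },
        { "6800 GT   ", 16, 350 },
        { "6800 Ultra", 16, 400 },
        { "X800 XT PE", 16, 520 },
    };
    int i;
    for (i = 0; i < 4; i++)
        printf("%s  %2d pipes x %3d MHz = %4.1f Gpix/s\n",
               cards[i].name, cards[i].pipes, cards[i].mhz,
               cards[i].pipes * cards[i].mhz / 1000.0);
    return 0;
}
[/code]
So a GT overclocked to Ultra clocks or beyond passes the stock Pro on raw fillrate, while the Pro can never reach PE territory on pipelines alone.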
 
Yeah, there have been several that switched from an X800 Pro to a 6800. I haven't seen anyone do that from the X800XT PE to a 6800 though. I'd get the X800 XT, but I'm an ATi guy.
 
stumpy said:
Yeah, there have been several that switched from an X800 Pro to a 6800. I haven't seen anyone do that from the X800XT PE to a 6800 though. I'd get the X800 XT, but I'm an ATi guy.

I would say that has a lot to do with the fact that both the X800XT PE and 6800u are extremely hard to come by, and they're more expensive. Along with the fact that the PE performs as well as, and sometimes better than, the 6800u in D3D, so there wouldn't be much point until SM 3.0 starts showing some real benefits like HDR in Far Cry.
 
burningrave101 said:
The 6800u doesn't suck in any way, shape or form.

The drivers for the 6800s are still in the beta stages. The drivers for the X800s have already been in use for quite some time on the 9800 Pro, because the X800s are faster but not entirely different technology, unlike the NV40 vs the NV30.

The 6800u beats the PE nearly every single time in EVERY game without AF enabled, and at ANY resolution. The PE only starts winning in certain D3D titles when high levels of AF are enabled. And the reason for that, like I've stated several times, is the use of Brilinear filtering. ATI constantly put down nVidia and the NV30s for their use of Brilinear filtering and touted their own use of full Trilinear. Well, as you can see, ATI is NOT using full Trilinear.

There wasn't SUPPOSED to be a difference in IQ with the 1.2 patch and SM 3.0, so why do people keep going on about it? lol

HDR will be the IQ difference in Far Cry, and it will only be supported by the 6800s.

I also don't look for the X800s to do any better in HL2 than the 6800s. In the HL2 beta the 6800u outperforms the PE at 1600x1200, and the PE is only ahead by less than 5 fps with high AF enabled at that resolution. I think it will go the same way as the rest of the D3D games.

OpenGL on the other hand won't be. The 6800s are going to maintain their advantage in OpenGL no matter what the resolution or setting is.

So would you rather have a 6800u that performs as well as the PE in D3D and a lot better in OpenGL, or a PE that only performs a little better in D3D when high levels of AF are enabled because of "Brilinear" filtering?

The 6800s are more futureproof because they fully support SM 3.0. If you're going to throw down $500 on a video card, you might as well get the one with the most features.

No doubt the 6800s are good cards. Good point about future-proofing. But dude... why don't you look at it objectively... it's not just in the higher levels of AF/AA that ATi starts pulling ahead; it's in almost anything more than 2x/2x. And the thing about how the X800s are 5 frames faster in some games... well, the 6800s don't get a much bigger advantage than 5 frames either. Also, most gamers today DO use high AA/AF settings, and that's where ATi dominates. Face it, it's not as if we will never touch any older games (I still play MOHAA full on, with people running X800 Pros) where ATi will shine with AA/AF.

Yes, nVidia's cards are quite good. They have PS 3.0, but that's only taken advantage of in a FEW games. It's not yet a standard, and it won't be for a while. The ForceWare 61.xx drivers really do not offer too much of an astounding advantage, as per the reviews posted earlier. I know they're still beta, but they won't change much. ATi's drivers are more concise and do all things a bit differently, but who cares that they are bilinear; the IQ is still almost identical. Are you gonna notice, "OH, look! That panel on the wall way in the distance doesn't have the 2 light pixels that make it look more real!" when you have monsters in front of you in Doom III?

ATi still has a very strong hold on the flagship position. It caters to all us old-game players, and to the new games as well. Maybe not so much in DOOM III, but don't try and tell me that it won't look amazing on an X800. A difference of 5 frames when you're in the 60-80 fps range DOESN'T MEAN A THING. Like [H]'s decision to drop the apples-to-apples conclusion.
 
Are you people dense?

Are you so dense and arrogant as to think your view is right and anybody else's is wrong? :confused: The X800 is a damned good card. SM3 ("SmoothMarketing3") is a non-issue in my eyes from seeing the performance it gave in Patch 1.2 = minimal. HDR is personal preference; to me it makes everything glow too much for my liking.
The X800 doesn't get slaughtered in any test. So it's pretty poor from nVidia really, still only able to just better a sub-3-year-old architecture. And from looking around, the Cat drivers come out way more often, they are OFFICIAL, NOT BETA, and are currently more stable than any nVidia driver. And if you need proof, just take a read at nVnews and Guru3D ;)
 
Plus, Ruby kicks that little mermaid's ass. No doubt about that.
ATi: A (relatively) small company pumping out actual movie-like realtime renderings with stunning quality.
nVidia: Amazing-looking mermaid... but it's half fish and doesn't do too much, while Ruby saves the world.

There, that's the most technical advantage.
 
@trapine said:
Are you so dense and arrogant as to think your view is right and anybody else's is wrong? :confused: The X800 is a damned good card. SM3 ("SmoothMarketing3") is a non-issue in my eyes from seeing the performance it gave in Patch 1.2 = minimal. HDR is personal preference; to me it makes everything glow too much for my liking.
The X800 doesn't get slaughtered in any test. So it's pretty poor from nVidia really, still only able to just better a sub-3-year-old architecture. And from looking around, the Cat drivers come out way more often, they are OFFICIAL, NOT BETA, and are currently more stable than any nVidia driver. And if you need proof, just take a read at nVnews and Guru3D ;)

But at the end of the day you're paying the same price for an overall better card, be it features or performance. It features SM 3.0 support and just about everything else burningrave101 and trancendenz have been saying all the way through this thread. Now if you're not happy enough for people to pay the same price for a better card then that's fine, but please use common sense. Sure, the GT/Pro may be on par in D3D/DX, but once again OGL shows the GT clearly better, and once overclocked to Ultra (or easily beyond Ultra) it outperforms it in D3D/DX too.
 
krizzle said:
Plus, Ruby kicks that little mermaid's ass. No doubt about that.
ATi: A (relatively) small company pumping out actual movie-like realtime renderings with stunning quality.
nVidia: Amazing-looking mermaid... but it's half fish and doesn't do too much, while Ruby saves the world.

There, that's the most technical advantage.

And here's me thinking she just got a crystal.
 
I've read a few pages of this thread and I must say that I would buy neither the ATI nor the nVidia top flavors UNTIL I see some real benching in Doom 3. After all, that will set the standard for game engines to come (Unreal 2 to a certain extent). What I have works fine and will not be out of date for a while at least. I'll compare again in 6 months. By that time some of you may have switched sides in this conflict :)

Also, I like top-end stuff and these cards deliver, but I'll wait for prices to slide into the reasonable world... and not out of this world.
 
tornadotsunamilife said:
But at the end of the day you're paying the same price for an overall better card, be it features or performance. It features SM 3.0 support and just about everything else burningrave101 and trancendenz have been saying all the way through this thread. Now if you're not happy enough for people to pay the same price for a better card then that's fine, but please use common sense. Sure, the GT/Pro may be on par in D3D/DX, but once again OGL shows the GT clearly better, and once overclocked to Ultra (or easily beyond Ultra) it outperforms it in D3D/DX too.
Don't start with overclocking. You can overclock an X800 XT as well, you know... it's even partially covered by warranty, and done much more elegantly than CoolBits.
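
(For anyone wondering, CoolBits isn't even a tool; it's just a DWORD in the registry that unhides the clock sliders in the ForceWare panel. A Win32 sketch of flipping it is below, with the key path and value quoted from my memory of the ForceWare 6x.xx era, so treat both as assumptions and back up your registry first:)

[code]
#include <windows.h>
#include <stdio.h>

int main(void)
{
    HKEY key;
    DWORD coolbits = 3; /* commonly cited value to expose the clock sliders */
    LONG rc = RegCreateKeyExA(HKEY_LOCAL_MACHINE,
                              "SOFTWARE\\NVIDIA Corporation\\Global\\NVTweak",
                              0, NULL, 0, KEY_SET_VALUE, NULL, &key, NULL);

    if (rc != ERROR_SUCCESS) {
        printf("could not open key (run as admin?), rc=%ld\n", rc);
        return 1;
    }
    rc = RegSetValueExA(key, "CoolBits", 0, REG_DWORD,
                        (const BYTE *)&coolbits, sizeof(coolbits));
    if (rc == ERROR_SUCCESS)
        printf("CoolBits set; reopen the driver panel\n");
    else
        printf("write failed, rc=%ld\n", rc);

    RegCloseKey(key);
    return 0;
}
[/code]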

Yes, SM 3.0 is new and hip. But it has about 5 frames of advantage as of this moment, and only in one game, which gets old fast.

And nVidia's lead in OpenGL is just about as much as ATi's in D3D. The cards are nearly even.
 
@trapine said:
Have any of you had the chance to compare the 6800GT you own to an X800 Pro? :rolleyes: I think not.

Well since you asked... yes. Any other questions? :rolleyes: Why is it people insist on questioning people's decisions? If it isn't burningrave101 on one side of the fence... it's his alter ego @trapine. I liked the X800, but for the price I stuck with my two 6800GTs. It all came down to price...
 
(or easily beyond ultra) it outperforms it in d3d/dx too

But then you are being totally stupid: an overclocked 16-piped card against a 12-piped stock card :rolleyes: Who needs common sense again?
Now if you flashed that X800 Pro VIVO to an X800 XT and overclocked it, I'm betting it would make for a very interesting battle ;) As for OpenGL, I'm saving my comments until Doom 3 is actually out and both the 6800GT and X800 have been tested; then I will make my comment.

but for the price I stuck with my two 6800GTs. It all came down to price....

Even that comes down to where you live ;) Here in Australia the X800 Pro is a fair bit cheaper than the 6800GT ;)
 
@trapine said:
But then you are being totally stupid: an overclocked 16-piped card against a 12-piped stock card :rolleyes: Who needs common sense again?
Now if you flashed that X800 Pro VIVO to an X800 XT and overclocked it, I'm betting it would make for a very interesting battle ;) As for OpenGL, I'm saving my comments until Doom 3 is actually out and both the 6800GT and X800 have been tested; then I will make my comment.

Precisely, you may have found some of your sense! Wouldn't it be wiser to buy the 16-pipeline card? And it's no good saying "oh, but the 12 is just as good"; who cares? It's the same price.

And also, why save yourself for Doom 3 when Call of Duty shows it now?
 
Guys, I love you all. I love nVidia and I love ATi. It's like making a decision between a Chevy Suburban or a GMC Yukon. Both terrific products, different companies. (Well, they're both GM, but you get my point I hope.)

PS, Ruby's crystal saved the world, cos it could have destroyed it with its bling!
 
I exchanged an X800 Pro for a BFG 6800GT myself two days ago.
Slapped a Zalman cooler on it and couldn't be happier.

Do what makes you happy...I just want you guys all to be happy mmmk? :D
 
krizzle said:
No doubt the 6800s are good cards. Good point about future-proofing. But dude... why don't you look at it objectively... it's not just in the higher levels of AF/AA that ATi starts pulling ahead; it's in almost anything more than 2x/2x. And the thing about how the X800s are 5 frames faster in some games... well, the 6800s don't get a much bigger advantage than 5 frames either. Also, most gamers today DO use high AA/AF settings, and that's where ATi dominates. Face it, it's not as if we will never touch any older games (I still play MOHAA full on, with people running X800 Pros) where ATi will shine with AA/AF.

Why do you keep saying AA AND AF? It's not AA at ALL. The X800s take a much larger performance hit from AA than the 6800s do. The X800s never outperform the 6800s when just 4xAA is enabled at 1600x1200.

And where have you seen benches with just 2xAA + 2xAF?

ATI doesn't dominate in AA + AF. They have a slight advantage with HIGH levels of AF, 8xAF and 16xAF to be precise. ATI won't shine in OpenGL even with AA and AF enabled. They will do better than they would without high AF, but they still won't be able to beat a 6800.

Have you just not seen any OpenGL benches? Even at 4xAA and 16xAF at 1600x1200 the 6800u is out ahead of the X800XT PE in every OpenGL game and usually by a fair margin. I was posting benchmarks earlier showing the 6800nu beating the X800XT PE in Call of Duty and Neverwinter Nights at 1600x1200 lol.

Yes, nVidia's cards are quite good. They have PS 3.0, but that's only taken advantage of in a FEW games. It's not yet a standard, and it won't be for a while. The ForceWare 61.xx drivers really do not offer too much of an astounding advantage, as per the reviews posted earlier. I know they're still beta, but they won't change much. ATi's drivers are more concise and do all things a bit differently, but who cares that they are bilinear; the IQ is still almost identical. Are you gonna notice, "OH, look! That panel on the wall way in the distance doesn't have the 2 light pixels that make it look more real!" when you have monsters in front of you in Doom III?

Over a dozen titles have announced SM 3.0 support already for just this year. That's about twice the number of games that get used in the majority of GPU reviews.

Here are some of the titles with pics:

http://www.nvnews.net/vbulletin/showthread.php?t=30736

How are ATI's drivers more concise? Does poor OpenGL and Linux support somehow make them better? The nVidia drivers give more features to the card, like I mentioned earlier. Refresh Rate Overrides and Digital Vibrance are both among the excellent features.

And it's "Brilinear" filtering, not Bilinear.

ATi still has a very strong hold on the flagship position. It caters to all us old-game players, and to the new games as well. Maybe not so much in DOOM III, but don't try and tell me that it won't look amazing on an X800. A difference of 5 frames when you're in the 60-80 fps range DOESN'T MEAN A THING. Like [H]'s decision to drop the apples-to-apples conclusion.

The X800s hold an advantage in AF. I've yet to see a review where the X800XT PE was able to hold a consistent lead without 8xAF or 16xAF enabled, at ANY resolution or with AA enabled. And that's because of their AF optimizations.

nVidia is just getting started on their NV40 drivers, while ATI has just been building on where the 9800 Pro left off. If nVidia gets some higher-performance AF optimizations for the 6800u later on, the X800s will have no advantage at all in D3D.

krizzle said:
Plus, Ruby kicks that little mermaid's ass. No doubt about that.
ATi: A (relatively) small company pumping out actual movie-like realtime renderings with stunning quality.
nVidia: Amazing-looking mermaid... but it's half fish and doesn't do too much, while Ruby saves the world.

There, that's the most technical advantage.

The Bulldog on my XFX 6800GT would have Ruby and the Mermaid both for lunch :p.

http://www.hwhell.com/articles/xfx_geforce_6800_ultra/
 
And it's "Brilinear" filtering, not Bilinear.

HAHAHA okay, BRI linear and TRI linear... very nice. Take a good look at the age-old Q3 set-up menu... bRilinear? HAHAHAH, don't think you're always right on that...
You musta felt really smart, eh?

Other than that, okay, you win. I lose. Wait... what did I try saying? That they are both good cards? Right.

It's all good. So, nVidia happens to catch up and nudge ahead by a few FPS and PS3.0 (which still shows no true advantages) this round... ATi had the last 3 years. Big deal.

Both cards kick ass, but I would go with the X800 XT-PE cos I like it more.
Peace.

P.S., if bulldogs eating women is your deal, let it be. I say hot animated female characters attract more customers than grizzly bulldogs.
 
krizzle said:
HAHAHA okay, BRI linear and TRI linear... very nice. Take a good look at the age-old Q3 set-up menu... bRilinear? HAHAHAH, don't think you're always right on that...
You musta felt really smart, eh?

Other than that, okay, you win. I lose. Wait... what did I try saying? That they are both good cards? Right.

Oh, please stop posting now.
 
There is a pretty good X800XT PE vs 6800u thread going on over at nV News.

It's definitely not a one-sided argument either. There are a lot of ATI users on nV News as well.

This last post by Ice Nine seems to sum up everything that I've been trying to get across to some of you.

I gotta disagree. I'm not huge on all these terms, but how would a developer implement FP32 on an ATI card? PS3.0 is supposed to be there to make things easier on the hardware and on the developer - a more streamlined approach to doing the same thing, but giving you the opportunity to do more of it... More efficiency means more eye candy, and isn't that the goal of these cards?

The way I see it, the more PS3.0 compatible cards there are out there in our machines, the more likely they will be to code for it - or at least not have to code an alternate way to do it for either card. Particularly on the Pro/GT level when the GT has 16 pipelines to the x800 pro's 12. If you're at the $399 tier, I simply can't see ANY good reason to opt for an X800 Pro.

On the ultra level, the same PS3.0 and FP32 thing - only now you can have dual DVI if you want to run two beloved 2001FP's in digital mode. Can't do that on the X800XT PE, at least, I haven't seen any that do dual DVI. Granted, this is a niche deal, but hey, if you're willing to plink down this much coin on a vidcard, then you might consider this a plus for the 6800.

Granted that *TODAY* you can do everything in PS2.0 that you can in PS3.0 - after all, no one's really had time to leverage anything that PS3.0 brings to the table. But if the scores are so darned close, and one card offers something the other doesn't have, why deprive yourself of it if the price is identical?

The benchmarks you see today are on incredibly immature drivers for the 6800 platform. You KNOW these are going to improve. The X800XT PE is a tried and true core with a very mature driver set. It's no doubt the reason it's "faster" today. But I think that's going to change to where both cards even out to within less than a percent of each other. I strongly feel that nvidia has a lot more "wiggle room" in the driver than ATI does at this point.

Again, not that either card is inherently bad, I just don't see a compelling reason to go ATI this time around with prices being equal.

(oh, and before I get bashed, my last 4 video cards were all ATI.)

No there aren't. Their bias ALL swings to whoever pays for their next Christmas party.

If you want actual benchmarks and opinions that have the highest content of fact, look no further than message boards. Granted, you have to weed out the Fanboy factor, but there is no bigger fanboy than a reviewer who's getting free gear and/or money from a manufacturer to pitch their products. This is true in EVERY industry, not just performance PC stuff.

Use these "reviews" as a reference, but get the real skinny from real people who spend real time with these products.

http://www.nvnews.net/vbulletin/showthread.php?t=32417&page=4&pp=15

I couldn't agree more about the reviewers. Actual end users on message boards have nothing to lose. Reviewers on hardware sites DO. If they give enough poor reviews to products, most vendors will stop choosing them to evaluate their merchandise.

I mean, just look at it from a realistic point of view. If someone offered me a Dodge Viper and another offered me a Ford Mustang, I would have an extremely hard time not being a little biased lol.

Now I'm not saying reviewers are getting cars from nVidia and ATI, but when millions of dollars are at stake, and reviews from hardware sites and magazines play a HUGE role in who buys what, you can bet there is a lot of money changing hands behind the scenes.
 
Okay. Good. I choose the 6800GT over the X800 Pro.
BUT WE ARE TALKING ABOUT the 6800 Ultra and X800 XT here. Don't bring up the tier below. Whatever man, I just gave my 2 cents. Don't bash me for it. Opinions are good, and controversy is inevitable.
 
Hm, while the choice between a GT and a Pro is rather cut and dried, I'm not so sure myself on the Ultra/XT PE. Guess we'll have to wait it out until D3/HL2 to see how things play out.
 
krizzle said:
Okay. Good. I choose the 6800GT over the X800 Pro.
BUT WE ARE TALKING ABOUT the 6800 Ultra and X800 XT here. Don't bring up the tier below. Whatever man, I just gave my 2 cents. Don't bash me for it. Opinions are good, and controversy is inevitable.

But the only difference with the 6800 Ultra is the clock speeds; what I've said about the 6800GT's features is also the same for the Ultra.
 
OKAY GODDAMNIT, you are right. I lose. Damn, do you kick everyone when they're down? BRUTAL man, brutal.
 
krizzle said:
OKAY GODDAMNIT, you are right. I lose. Damn, do you kick everyone when they're down? BRUTAL man, brutal.

No one is trying to kick someone when they're down. This isn't anything personal. It's the X800XT PE vs the 6800u. People from both sides are going to argue their points and say which they think is better. If you're going to try and say one is better than the other, then you're going to get challenged.
 
Yes, but if I agreed that the 6800 was better, why still bash me over the time when I was convinced otherwise?
 