Someone got a 6800U today!!!!

MrHappyGoLucky

http://www.nvnews.net/vbulletin/showthread.php?t=29518&page=1&pp=15

Make sure you notice a few things about this piece of hardware:

1. He is using the latest (and hacked) 61.11 drivers
2. It's very obvious he's not too thrilled (he traded up from a 9800 Pro for this)
3. Far Cry bugs o'plenty
4. Notice the major performance hits with AA and AF
5. He went from 1600x1200 with no AA or AF to 1600x1200 with 2xAA/8xAF. Was it worth it?
6. His wimpy 3DMark score is no better than an X800 Pro's
7. Did I say that the 61.11 drivers are not FM-approved?
8. All I can say is WOW

Man, for $580 big ones (employee purchase price), you would expect a whole lot more!
 
It was his choice to buy the card, no need to flame him. Sure, $580 is steep, but he's got the bragging rights by having one when few people do, and if that's his thing, more power to him.
 
MemoryInAGarden said:
It was his choice to buy the card, no need to flame him. Sure, $580 is steep, but he's got the bragging rights by having one when few people do, and if that's his thing, more power to him.

Agreed. Though you gotta admit, even the most hardcore of enthusiasts usually have a little more patience; he probably could have gotten the card a whole lot cheaper later this month.
 
Bad_Boy said:
Agreed. Though you gotta admit, even the most hardcore of enthusiasts usually have a little more patience; he probably could have gotten the card a whole lot cheaper later this month.

I agree but he also could have gotten a whole lot more card for less money. :p
 
Please explain "I agree but he also could have gotten a whole lot more card for less money."

I can't understand why you think that he could have gotten "more card" for less money. The 6800 Ultra is a bada$$ card. It gives the X800 XT a really good fight. I haven't seen any of them for sale yet, and I am pretty sure the prices are gonna be right about the same at first.
 
botreaper10 said:
Please explain "I agree but he also could have gotten a whole lot more card for less money."

I can't understand why you think that he could have gotten "more card" for less money. The 6800 Ultra is a bada$$ card. It gives the X800 XT a really good fight. I haven't seen any of them for sale yet, and I am pretty sure the prices are gonna be right about the same at first.

What he's saying is that for less money you can get an equal or faster card, the X800 Pro for example. Our results back that statement up: http://www.hardocp.com/article.html?art=NjEx
 
I'm still going to wait for those drivers NVIDIA is "supposedly" putting out. I don't wanna take the chance yet.
 
MrHappyGoLucky said:
I agree but he also could have gotten a whole lot more card for less money. :p

Considering that the GeForce 6 has built-in features that ATi products won't have for a while, I'd say that the GeForce 6 is a "whole lot more card."
 
MrHappyGoLucky said:
http://www.nvnews.net/vbulletin/showthread.php?t=29518&page=1&pp=15

Make sure you notice a few things about this piece of hardware:

1. He is using the latest (and hacked) 61.11 drivers
2. It's very obvious he's not too thrilled (he traded up from a 9800 Pro for this)
3. Far Cry bugs o'plenty
4. Notice the major performance hits with AA and AF
5. He went from 1600x1200 with no AA or AF to 1600x1200 with 2xAA/8xAF. Was it worth it?
6. His wimpy 3DMark score is no better than an X800 Pro's
7. Did I say that the 61.11 drivers are not FM-approved?
8. All I can say is WOW

Man, for $580 big ones (employee purchase price), you would expect a whole lot more!

1. Drivers that aren't WHQL'd and aren't intended for the public
2. I read the entire thread and did not find one post where he was complaining about his purchase
3. It's already been acknowledged that there are bugs in Far Cry that have to do with Crytek's patch (are people so dim that they can't grasp this concept!?)
5. If you read his posts, he actually states that he prefers the smoothness offered by the 6800 Ultra
6. Really? I don't recall any X800 Pro scoring 13,770
7. Does anyone really still give a flying fuck about what Futuremark has to say?
8. All I can say is: try again, troll
 
DanK said:
Considering that the GeForce 6 has built-in features that ATi products won't have for a while, I'd say that the GeForce 6 is a "whole lot more card."

Right now those features are doing nothing, and there has yet to be any proof of image quality differences between the two cards.
 
For $580, I personally would not accept any kind of imperfect image quality, speed be damned. I wouldn't tolerate flashing roofs and shadows disappearing and reappearing, and I would definitely expect to be able to jack it up to 6xAA, not just 2xAA. I can do 2xAA at 1280x1024 with my current card without a hitch; for almost twice as much money and one resolution step higher, I'd better not see anything less than amazement. As a consumer, I am getting tired of NVIDIA's "oh, the next driver will fix it" and "we have more features than the competitor" bullshit. For the kind of money they're asking, I should get out-of-the-box improvement.

But that's just me.
 
The X800 might be twice as fast as the 6800 Ultra; I still wouldn't buy it. What good is speed if you can't play half the games you want to play?
 
requiem99 said:
The X800 might be twice as fast as the 6800 Ultra; I still wouldn't buy it. What good is speed if you can't play half the games you want to play?

Where did you get this from? If you truly believe this, you're pretty damned stupid. The X800 can run any game the 6800 line can... are you serious? Whoa.

~Adam
 
CleanSlate said:
The X800 can run any game the 6800 line can.

This is true for the time being. However, what happens when the next-gen games come out? If you went with the X800, you have a $500 video card that can't play those games. If you went with the GeForce, the same $500 would be able to play the highest-tech games for a much longer time; hence more value.
 
defiant said:
1. Drivers that aren't WHQL'd and aren't intended for the public
2. I read the entire thread and did not find one post where he was complaining about his purchase
3. It's already been acknowledged that there are bugs in Far Cry that have to do with Crytek's patch (are people so dim that they can't grasp this concept!?)
5. If you read his posts, he actually states that he prefers the smoothness offered by the 6800 Ultra
6. Really? I don't recall any X800 Pro scoring 13,770
7. Does anyone really still give a flying fuck about what Futuremark has to say?
8. All I can say is: try again, troll

haha owned!
 
Number 1 applies to the 6800U: 60.72 and 61.11 are beta drivers, and there are still no WHQL drivers for the 6800U.

The X800 Pro and X800 XT have Catalyst 4.5, which is WHQL'd and has been on ATI's site since the middle of last month (5/12/2004).
 
CleanSlate said:
Where did you get this from? If you truly believe this, you're pretty damned stupid. The X800 can run any game the 6800 line can... are you serious? Whoa.

~Adam


I'm thinking that he got the X800 and 6800 switched around. I hope.
 
DanK said:
This is true for the time being. However, what happens when the next-gen games come out? If you went with the X800, you have a $500 video card that can't play those games. If you went with the GeForce, the same $500 would be able to play the highest-tech games for a much longer time; hence more value.

Don't believe the NVIDIA marketing hype. The X800 Pro plays all the games any 6800U could, and even faster in some cases. Now take into consideration that the 6800U can't even run at full PS2.0 (it dumbs down to PS1.1 or 1.4); why would you think you will see a full PS3.0 implementation?

Didn't anyone notice this guy is only finding the 6800U playable at 1280x1024 w/ 2xAA and 8xAF? The X800 Pro does that without breaking a sweat.
 
defiant said:
1. Drivers that aren't WHQL'd and aren't intended for the public

What will the next excuse be? Funny how ATI never has this problem.

defiant said:
2. I read the entire thread and did not find one post where he was complaining about his purchase

Did he come out and say how happy he is? I didn't see that either. Given this guy is the first member of the public to have one, I would have expected a large boner. What I got was that the card is playable at 1280x1024 w/ 2xAA and 8xAF. Not groundbreaking at all.

defiant said:
3. It's already been acknowledged that there are bugs in Far Cry that have to do with Crytek's patch (are people so dim that they can't grasp this concept!?)

What you really meant to say is: why can't the card run at full PS2.0 when requested by the game? This "developer" problem is BS. This is a TWIMTBP title, to make matters worse. If NVIDIA was rendering what the game requests (as ATI does without these problems), it wouldn't be an issue.

defiant said:
5. If you read his posts, he actually states that he prefers the smoothness offered by the 6800 Ultra

Again, at 1280x1024 w/ 2xAA and 8xAF. Impressed by the $580.00 card yet?

defiant said:
6. Really? I don't recall any X800 Pro scoring 13,770

Take a look at FM's Hall of Fame... it's full of X800 cards! Why is it, again, that we don't see any NV40 scores allowed to upload?

defiant said:
7. Does anyone really still give a flying fuck about what Futuremark has to say?

If people didn't care, we wouldn't see them posting their scores, right?

defiant said:
8. All I can say is: try again, troll

I call it clearing up misconceptions. :eek:
 
It's already been proven as well that NVIDIA cheated on 3DMark03 with their FX line; why anybody believes their 3DMark scores now is beyond me. The only way I'd be able to believe them is if we start seeing some AquaMark scores.

DanK said:
This is true for the time being. However, what happens when the next-gen games come out? If you went with the X800, you have a $500 video card that can't play those games. If you went with the GeForce, the same $500 would be able to play the highest-tech games for a much longer time; hence more value.


Next generation of games? We're barely getting the current crop of next-generation games now. By the time the next-next generation comes out, we'll already have another round of cards.
 
MrHappyGoLucky said:
What will the next excuse be? Funny how ATI never has this problem.

Right..... ATI's beta Catalysts that break more features than they fix.


Did he come out and say how happy he is? I didn't see that either. Given this guy is the first member of the public to have one, I would have expected a large boner. What I got was that the card is playable at 1280x1024 w/ 2xAA and 8xAF. Not groundbreaking at all.

Are you claiming to be a mind reader? Because unless you are, I don't think anyone here really gives a shit how you think he should be feeling. And as to the settings he's running his card at, he had the following to say:

"4x/8x compared to 2x/4x the hit is pretty large.. sometimes as much as 30fps, and I really can't see much of an iq difference between the 2. They are both playable, but I prefer the ultra smoothness So for now 1600x1200 2xaa/4xaf suits me fine."

Learn to fuckin' read.

What you really meant to say is: why can't the card run at full PS2.0 when requested by the game? This "developer" problem is BS. This is a TWIMTBP title, to make matters worse. If NVIDIA was rendering what the game requests (as ATI does without these problems), it wouldn't be an issue.

Again, it has to do with Crytek's software..... and btw, this is one fuckin' game. Can you point to other DX9 titles that suffered the same problems?

Again, at 1280x1024 w/ 2xAA and 8xAF. Impressed by the $580.00 card yet?

Again, learn to fuckin' read.

Take a look at FM's Hall of Fame... it's full of X800 cards! Why is it again we don't see any NV40 scores able to upload?

Probably because only a handful of people have actually been able to get their hands on the 6800, whereas the X800 has been much easier to obtain.

I call it clearing up misconceptions. :eek:

I call it a troll with too much time on their hands. :rolleyes: You sure that you're not part of ATI's PR department?
 
HRslammR said:
It's already been proven as well that NVIDIA cheated on 3DMark03 with their FX line; why anybody believes their 3DMark scores now is beyond me. The only way I'd be able to believe them is if we start seeing some AquaMark scores.


Next generation of games? We're barely getting the current crop of next-generation games now. By the time the next-next generation comes out, we'll already have another round of cards.

And it's already been proven that ATI cheated on Quake, I mean Quack! So why believe any of THEIR benchmarks for any game?

If you use your logic for one side of your argument, it has to be used for the other side as well.

NVIDIA hasn't released WHQL'd drivers yet; give them a chance to, then criticize the performance. Also, Futuremark should approve their WHQL'd drivers, but they need time to evaluate them, and those drivers are supposedly right around the corner.

If I buy a $400-500 card I expect it to last for AT LEAST 12 months. So people say there won't be games based on SM3.0 for 8-12 months (which is an ASSUMPTION!), so the X800 is a better buy. WRONG. What happens in 12 months? Buy another $400-500 ATI card that does support SM3.0? lol

I'm waiting for the 6800 GTs to be released at a reasonable price (and I'll overclock it): full 16 pipes, SM3.0.
 
chrisf6969 said:
If I buy a $400-500 card I expect it to last for AT LEAST 12 months. So people say there won't be games based on SM3.0 for 8-12 months (which is an ASSUMPTION!), so the X800 is a better buy. WRONG. What happens in 12 months? Buy another $400-500 ATI card that does support SM3.0? lol

Can you tell me exactly what benefits PS3.0 games will have over PS2.0 games in 12 months?

Will the games suddenly not run on ATI hardware? Will developers produce PS3.0-only games in 12 months that would lock out 85% of their target market?

Almost two years after PS2.0 cards were released, exactly how many PS2.0 games are there right now?
 
Games with SM3.0 will run faster because of a higher level of programmability, such as conditional branching (and longer shader programs that work more efficiently).

As a simple example, imagine 2.0 has to run a "for i = 1 to 100" loop for each texture it generates. However, 3.0 can use a conditional "while C = V do" loop that only runs long enough to generate the visible portion. I'm oversimplifying, but with 3.0 hundreds of passes could be saved by smart branching, or by reusing texture code across different textures and only running parts of it for each one; see the sketch below.
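
To make that concrete, here's a rough sketch of the idea in plain C (the loop bound, names, and data are made up for illustration; real shaders don't look like this):

Code:
/* PS2.0 style: fixed loop count, no data-dependent exit.
   All 100 iterations run even when only a few contribute. */
float shade_fixed(const float contrib[100], int visible)
{
    float sum = 0.0f;
    int i;
    for (i = 0; i < 100; i++) {
        /* work happens (or is masked out) on every iteration */
        sum += (i < visible) ? contrib[i] : 0.0f;
    }
    return sum;
}

/* PS3.0 style: dynamic branching allows an early exit,
   so only the visible portion is actually computed. */
float shade_branching(const float contrib[100], int visible)
{
    float sum = 0.0f;
    int i;
    for (i = 0; i < visible; i++) {
        sum += contrib[i];
    }
    return sum;
}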

I'm sure games will be coded for PS1.x, 2.0, 3.0, etc., but they will look worse with 1.x (like Max Payne does now between 1.x and 2.0).

2.0 and 3.0 will probably look similar, but performance will be better on 3.0. And 3.0 could offer a better image if they coded a more complex path for it, like soft shadows, which might not be efficient enough with 2.0.

Look at Splinter Cell: Pandora Tomorrow; it won't even run on anything without pixel shaders, i.e. people with GF4 MXs or lower (like integrated graphics) are out of luck, and OEMs sell the f*ck out of low-end graphics!!
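
That kind of lock-out is just a capability check at startup. Here's a minimal sketch in plain C (the caps struct and values are hypothetical; a real game would query them from the driver through DirectX):

Code:
#include <stdio.h>

/* Hypothetical capability report; a real game fills this in by
   querying the driver (e.g. DirectX device caps). */
struct gpu_caps {
    int ps_major;  /* highest pixel shader version: 0, 1, 2, 3... */
};

/* Fall back one shader model at a time; return NULL if the
   hardware has no pixel shaders at all. */
static const char *pick_render_path(struct gpu_caps c)
{
    if (c.ps_major >= 3) return "PS3.0 path";
    if (c.ps_major == 2) return "PS2.0 path";
    if (c.ps_major == 1) return "PS1.x path";
    return NULL;  /* GF4 MX / integrated graphics: refuse to run */
}

int main(void)
{
    struct gpu_caps gf4mx = { 0 };  /* no pixel shaders */
    const char *path = pick_render_path(gf4mx);
    printf("%s\n", path ? path : "no pixel shaders, game won't start");
    return 0;
}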
 
jakUP is obviously pleased with his purchase. His money, his decision, STFU. He has owned other "top-line" cards in the past and has been honest in his opinions as I recall.

I just don't get it. Why do some people in this thread feel the need to spin situations to favor one IHV over another? Really, do they get paid for this? A nickel for every anti-nVidia or anti-ATi post?

I'm just amazed at the fervent stupidity of the average fanboi.
 
Well, given the relationship between publishers and developers, and the state of PC gaming in general, I think the chances of large amounts of PS3.0 code in 90% of the games released in the next 12-18 months are very small.

PC development in general simply cannot afford to devote that kind of resources to features that 85% of the market's hardware cannot support.

That being said, NVIDIA will have its titles with PS3.0 support that it will use to market its cards' advantages, which only makes sound business sense. Will those games run poorly on ATI hardware? Well... that remains to be seen.

My personal opinion, based on history, is that by the time we see enough PS3.0 games that use the instructions to their full effect, the current generation of hardware (NV40, R420) will be on the mid/lower end of the video card spectrum.

Given Unreal Engine 3.0's target hardware specifications, I'm not exactly sweating any investment made in video cards this round... they'll be trash by that time.

Again, according to your logic, that FX5900 you purchased less than 12 months ago will be useless since it doesn't support PS3.0.

Though I'm sure you didn't pay nearly $400 for it, so the investment doesn't sting as badly.
 
LabRat said:
jakUP is obviously pleased with his purchase. His money, his decision, STFU. He has owned other "top-line" cards in the past and has been honest in his opinions as I recall.

Too true.. who really gives a rat's ass in the end what card someone buys? It's not their money. In the end they all play the same games.


I'm just amazed at the fervent stupidity of the average fanboi.

LOL! Nothing amazes me anymore.. I've seen threads several pages long flaming back and forth over motherboard chipsets.
 
Man, you guys need to chill, especially defiant. What's up with the name-calling and vulgar language? You guys are fighting over hardware that you don't have, and probably haven't even seen in person.
 
TheGameguru said:
Again, according to your logic, that FX5900 you purchased less than 12 months ago will be useless since it doesn't support PS3.0.

Though I'm sure you didn't pay nearly $400 for it, so the investment doesn't sting as badly.

Well, it was a $230 card and I've had it 8 months (Oct 2003 to now), and I only plan on keeping it another 2-4 months, depending on 6800 GT prices over the next few months. However, if I had paid $400-500 for it (or $320 like I did with my Ti4600), I'd definitely be keeping it longer. My Ti4600 lasted me about 19 months, almost two years, and I was very happy with it (Mar 2002 - Oct 2003).

So I will get rid of it long before SM3.0 games hit the street. But it COULD have lasted almost two years (from the time I bought it) before 3.0 games come out.

I figured my video cards have cost me about $20/month at the rate I've been keeping them: ($230 + $320) / 27 months comes to about $20/month!
Not counting the rest of the components of my system!!
If I paid $450 for a 6800 GT now, I would expect it to last approximately two years, which it probably would. But I'm probably going to buy one in a few months for under $400.
 
bobmanfoo said:
Man, you guys need to chill, especially defiant. What's up with the name-calling and vulgar language? You guys are fighting over hardware that you don't have, and probably haven't even seen in person.

I'm glad I'm not the only one not shitting a brick over nothing.
 
Yeah, man, you guys need to chill. jakUP is a good guy; he will give you the straight-up story. I do think he is disappointed in the Far Cry performance, as we all are, but hopefully they will fix that soon. Just stop with the name-calling and shit; it makes you look like children. Some people are trying to troll; just ignore them.
 
LabRat said:
jakUP is obviously pleased with his purchase. His money, his decision, STFU. He has owned other "top-line" cards in the past and has been honest in his opinions as I recall.

I just don't get it. Why do some people in this thread feel the need to spin situations to favor one IHV over another? Really, do they get paid for this? A nickel for every anti-nVidia or anti-ATi post?

I'm just amazed at the fervent stupidity of the average fanboi.


My friend, that is like pulling teeth from an alligator. Each side has to defend its $400+ investment... be it "useless features" or "slightly higher frames", WTF ever.... Lately, video card threads have been no fun, since you cannot post a damn thing praising one card (the 6800 lately, really) and say you are impressed with it without somebody jumping out of the woodwork to tell you your purchase sucks for "x" reason.
Right now I'm pretty sure that once NV gets off their ass and puts out more mature drivers, the 6800 is gonna fly..... The 420 could get a boost of speed as well... but others disagree with that.... We simply won't know till both cards are out in force and the drivers are mature enough to really make a difference... Until then... I'm hanging on to my BFG 5900nu.
 
Wow, all-out war between "who is right" and "who is right", lol.

My beef is...

Why even start a thread like this? Somebody gets a piece of hardware and you automatically respond by making a thread to flame him? Why don't you go over to their forums and flame him if you don't like his purchase? Jesus. Grow up.

That's like if I got an Intel CPU, and an AMD fanboy made a thread on another forum saying "OMFG BADBOY GOT AN INTEL CPU TODAY!!!!!!!" with a ridiculous list about whether I'm satisfied or not.

.... It's pathetic.
 
NVIDIA 6800 Ultra = immature.

ATI X800 = mature.

Why? Architecture reasons! Remember, ATI still has to figure out half the things NVIDIA already implemented in its card. That will be interesting to see, and then we'll see who goes crying and throwing in the white towel.

Caliche
 