First "Ibiza" leaks? R520 scores 9700 in 3DM05

Can't you just be happy? If ATI beats nVidia it inspires nVidia to do better, and vice versa! If there was only one company we'd still be playing on cards like 9600s.
 
Apple740 said:
I remember the time the 7800 didn't exist yet; 3DM05 scores weren't important. Then the 7800 came out and suddenly each 7800 owner showed his score with pride.
And now we're back at the point where 3DM05 means nothing. :D

You never saw me holler anything about 3DMark scores. They're still meaningless. If you open up Control Panel > Add/Remove Programs, is this what you see?


[image: foy3fg.jpg]
 
Rollo said:
Hmmm, check out what Mr. E PM'd me:



My "nVidia butt buddies"? WTF is that? LOL- yeah- our choice of video cards has somehow made made us prefer an*l sex, with each other!

Like I said, Mr. E takes this LAUGHABLY seriously- "Oooh yeah! There are some imaginary clubs of the nV disciples and the ATI disciples, and when we can run our HL2 3 fps faster than them, we'll be OWNING them and sleeping with a smile at night!"

:rolleyes:

Could this be any sadder?

BTW- 5150- is that the leaf bag attachment we've heard such good things about for the R300 v.4?


LOL @ the PM. Yep the leaf bag attachment is top-class stuff, the Chanel bag of the video card leaf blower industry to be precise. :D
 
Wally said:
True, but then take into account nVidia's crippling of AF with the 68/78 series of cards just to gain fps, and it's looking good for R520 both in terms of power and visuals.
:rolleyes: ATI does it too.
 
yevaud said:
All I can say is:

If those specs/numbers are real....CONGRATULATIONS ATI !!
Now let's see how it does against a 7800GTX that's not clocked at 430 :)


You mean there are such things???

Like the XFX, Asus, EVGA KO? ;)
 
I will say that the AF pattern ATI is using now seems to have returned them to the top with regard to AF quality. Sure to be a hot topic in the coming weeks. :D

For those who missed it:

[slide: image25cs.jpg]


The other thing is that they're claiming to be able to do HDR and AA/AF as an exclusive advantage. Hope this doesn't mean patents that prevent Nvidia from doing the same.
 
W1zzard said:

I noticed one thing in those slides:
The X1300 Radeon features a Shader Model 3.0 architecture done right.
Only possible on 90nm the RADEON packs more features than ever before.

1. NVIDIA's SM3.0 engine isn't flawed :rolleyes:
2. NVIDIA doesn't have any 90nm SM3.0 hardware, but still has SM3.0 :rolleyes:

Terra - I hate PR that distorts facts :(
 
Terra said:
I noticed one thing in those slides:

Terra said:
The X1300 Radeon features a Shader Model 3.0 architecture done right.
Only possible on 90nm the RADEON packs more features than ever before.

1. NVIDIA's SM3.0 engine isn't flawed :rolleyes:
2. NVIDIA doesn't have any 90nm SM3.0 hardware, but still has SM3.0 :rolleyes:

Terra - I hate PR that distorts facts :(

Perhaps they can do more with their way of doing SM3. Who knows? Certainly nobody who can say just yet. Sure, PR can go overboard at times, but until we get some facts, we can't dispute anything right now. I don't believe all the PR either, but at this point, we have nothing else to go on.

Speaking of distorted PR... It doesn't say, "Only possible on 90nm the RADEON" as you said, it says "Only possible with 90nm technology". It then goes on to say the new Radeon tech gives more features per square inch than ever.
 
Terra said:
I noticed one thing in those slides:


1. NVIDIA's SM3.0 engine isn't flawed :rolleyes:
2. NVIDIA doesn't have any 90nm SM3.0 hardware, but still has SM3.0 :rolleyes:

Terra - I hate PR that distorts facts :(

Yeah I must say I found that pretty funny...we've all been saying for over a year now how SM3.0 would suddenly be the cool thing to do once they made it...it's actually kinda funny to see it in print over a year later...

These slides are the standard kinda thing you see at a product launch, so just about everything written on them you can take with a grain of salt...though the part about HDR+FSAA was kinda interesting...

I don't think there's much to be learned by comparing the x1300 to the 6600...which was released what...a year ago? I mean, I would hope they could beat year old technology...

EDIT: Actually...I take alllllll that back...the x1300pro, the one in the benchmarks, is $149 which pits it against the 6600GT...which is probably faster...for some reason they compared it to the vanilla 6600 = lame...

looking at the 3dmark05 score for the pro...it looks decisively slower than a 6600gt...and that's with a 100 MHz clock advantage...that seems weird...I wonder if these slides are messed up...
 
^eMpTy^ said:
EDIT: Actually...I take alllllll that back...the x1300pro, the one in the benchmarks, is $149 which pits it against the 6600GT...which is probably faster...for some reason they compared it to the vanilla 6600 = lame...

Is the MSRP of the 6600GT $149? They are comparing MSRP vs. MSRP, from the looks of it.
 
fallguy said:
Is the MSRP of the 6600GT $149? They are comparing MSRP vs. MSRP, from the looks of it.

right...which is complete bullshit...the 6600gt has been selling for less than $149 all summer...I hope they plan on dropping prices really fast...cuz nvidia's cards have been selling way below MSRP for quite some time...
 
yevaud said:
I will say that the AF pattern ATI is using now seems to have returned them to the top with regard to AF quality. Sure to be a hot topic in the coming weeks. :D

For those who missed it:

[slide: image25cs.jpg]


The other thing is that they're claiming to be able to do HDR and AA/AF as an exclusive advantage. Hope this doesn't mean patents that prevent Nvidia from doing the same.

Holy **** that is the first I have heard about ATi getting rid of angle-dependent AF. Now I am really excited about the R520. Having had to live with the shimmering issues on my 6800 and 7800, I am thrilled to hear that non-angle-dependent AF is back. If this is true I am definitely getting an R520 as soon as it becomes available.
 
Particleman said:
Holy **** that is the first I have heard about ATi getting rid of angle-dependent AF. Now I am really excited about the R520. Having had to live with the shimmering issues on my 6800 and 7800, I am thrilled to hear that non-angle-dependent AF is back. If this is true I am definitely getting an R520 as soon as it becomes available.

I'm gonna go out on a limb and say that the 520 generation may be the first that allows the AF pattern to be changed in the driver. No proof, just a hunch based on the fact that ATI is calling it "HQ" 16x AF.
 
I like my 7800gt but the screen quality is just not as good as it was on my x800. The shimmering really gets under my skin, especially in Guild Wars. Looks like my 7800gt may be in the for sale/trade forum here in a little bit.
 
performance doesn't really look all that impressive to me...only 10% faster than a reference 7800gtx in half life 2? with a lot of retail cards clocked 15% higher than that, and new drivers on the way...I don't think ATi's performance advantage per these slides will last for long...

on the other hand...non-angle-dependent AF and HDR+AA...dude...this is gonna be a sweet card...
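
Rough math on the clock argument, just a sketch assuming HL2 framerate scales roughly linearly with core clock (it doesn't exactly, so treat it as ballpark only; the 430 MHz reference clock, 15% factory overclock, and 10% lead are the figures mentioned above):

```python
# Ballpark sketch: does a ~15% factory overclock cover a 10% deficit?
# Assumption (not from the slides): HL2 fps scales roughly linearly with core clock.
reference_clock_mhz = 430            # reference 7800 GTX core clock
retail_clock_mhz = 430 * 1.15        # ~495 MHz, typical factory-overclocked card
r520_vs_reference = 1.10             # "10% faster" per the leaked slides

retail_vs_reference = retail_clock_mhz / reference_clock_mhz  # ~1.15 under this assumption

print(f"R520 vs reference GTX:   {r520_vs_reference:.2f}x")
print(f"Retail GTX vs reference: {retail_vs_reference:.2f}x (clock-scaled estimate)")
# Under these assumptions the overclocked retail cards would already close the gap.
```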
 
mrgimble said:
And in the end it's only the game benchmarks that matter. Cus 3dmark has little replay value. :D

I think 3dmark has been replayed more than any other game on the planet.
 
^eMpTy^ said:
performance doesn't really look all that impressive to me...only 10% faster than a reference 7800gtx in half life 2? with a lot of retail cards clocked 15% higher than that, and new drivers on the way...I don't think ATi's performance advantage per these slides will last for long...

on the other hand...non-angle-dependent AF and HDR+AA...dude...this is gonna be a sweet card...

Yeah but who cares about HDR+AA if it's not usable? Gotta see some hard numbers to determine if having this feature is even worth it yet.
 
5150Joker said:
Yeah but who cares about HDR+AA if it's not usable? Gotta see some hard numbers to determine if having this feature is even worth it yet.

yeah I can't wait to see the full reviews...
 
5150Joker said:
Yeah but who cares about HDR+AA if it's not usable? Gotta see some hard numbers to determine if having this feature is even worth it yet.

Good point. A pretty slide show would not be very impressive.
 
millerpa17 said:
I think the average 3dmark 05 benchmark for a 7800 gtx is around 8000...so ATI's card is looking like it's gonna waste the 7800.

See here:
http://www.hardwareanalysis.com/content/topic/45517

How does 3DMark tell you anything about in-game performance? Last time I checked, ATI always used to get higher points in 3DMark05 anyway.

Btw, who the heck is really going to use LN2 cooling to overclock their cards? Also, the warranty for this card is 1 year.

And remember, the X1800 XT is coming in November, not October. And the Crossfire board is coming in late November.
 
plywood99 said:
I like my 7800gt but the screen quality is just not as good as it was on my x800. The shimmering really gets under my skin, especially in Guild Wars. Looks like my 7800gt may be in the for sale/trade forum here in a little bit.

Have you tried the 78.03 drivers?
 
nVidia's shimmering fix is not complete IMO, as you have to go to HQ to eliminate shimmering. Under certain games/conditions the performance hit going from Quality to HQ can be massive (if you load up Fraps and walk around in UT2004 ONS-Primeval, the fps hit is 30%+). Plus, in the even newer beta drivers nVidia has shown no inclination towards fixing their "Quality" mode. On my 7800 GTX everything is still playable even with the performance hit; the same doesn't go for my old 6800 GT, which never really delivered the advertised performance because I was stuck in HQ for the entire time I owned the card. Anyway, even on HQ, angle-dependent AF isn't disabled. I'm confident nVidia will respond with the non-angle-dependent AF of the 5900/4600 days, if not on the 6800/7800 (should it be hardwired) then on their next-gen card, but I am tired of all the crazy AF optimizations degrading IQ.
 
I want to see people OC these cards to 1 GHz core speed, now that'd be sweet!
 
Here is one example of why 3D-shittymark is useless:
3DMark03 Performance Factors

350 MHz PII R9700 Pro: 3373 3DMarks
1.4 GHz Celeron R9700 Pro: 4325 3DMarks
2.8 GHz P4 R9600: 3443 3DMarks
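
Quick math on those three scores (just a sketch using only the numbers listed above, nothing new):

```python
# How much the 3DMark03 score swings with the CPU vs. the GPU,
# using only the three scores listed above.
pii_9700pro = 3373   # 350 MHz PII + Radeon 9700 Pro
cel_9700pro = 4325   # 1.4 GHz Celeron + Radeon 9700 Pro
p4_9600     = 3443   # 2.8 GHz P4 + Radeon 9600

def pct(a, b):
    """Percent change from score a to score b."""
    return (b - a) / a * 100

# Same GPU, faster CPU: the score jumps ~28%.
print(f"CPU swap, same 9700 Pro:     {pct(pii_9700pro, cel_9700pro):+.1f}%")
# A much weaker GPU on a fast CPU still edges out the 9700 Pro on a slow CPU.
print(f"P4 + 9600 vs PII + 9700 Pro: {pct(pii_9700pro, p4_9600):+.1f}%")
```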

Terra - Now you call that a good comparison tool? :rolleyes:
 
Well, one good thing is definitely coming from this: if ATI does do angle-independent AF, it's going to force nVidia to respond by doing the same. No more cheating or shimmering!
 
Btw, who the heck is really going to use LN2 cooling to overclock their cards? Also, the warranty for this card is 1 year.

The 1-year warranty is only for cards bought from ATI. Sapphire and others may have longer warranties for the X1800 XTs.
 
^eMpTy^ said:
right...which is complete bullshit...the 6600gt has been selling for less than $149 all summer...I hope they plan on dropping prices really fast...cuz nvidia's cards have been selling way below MSRP for quite some time...

No, it's not BS. Who cares if they sell lower than MSRP? Most cards do. MSRP vs. MSRP is how they should be reviewed, as is the case with most everything. The PCI-E X850XT has a much lower street price (about $100 lower) than the PCI-E 6800U. Yet they are compared to each other, as it should be. The XT's street price is closer to the GT's than the Ultra's.

The Pro may be down to $120-ish soon after launch. How do you want it compared then? It would be lower than the GT's street price. Would you want reviews edited to compare it to another card? No. MSRP vs. MSRP (or as close as you can get) is the only way to go.
 
fallguy said:
No, it's not BS. Who cares if they sell lower than MSRP? Most cards do. MSRP vs. MSRP is how they should be reviewed, as is the case with most everything. The PCI-E X850XT has a much lower street price (about $100 lower) than the PCI-E 6800U. Yet they are compared to each other, as it should be. The XT's street price is closer to the GT's than the Ultra's.

The Pro may be down to $120-ish soon after launch. How do you want it compared then? It would be lower than the GT's street price. Would you want reviews edited to compare it to another card? No. MSRP vs. MSRP (or as close as you can get) is the only way to go.

Yep, the reviewers should use MSRP. The unfortunate consumers who happen to be on a budget should, on the day they want to spend their money, determine which card gives them the best performance/features for their budget.
No one who wants to buy a card gives half a care about what the MSRP of a card is; they want the best value for their dollar. If two cards in a comparison group both MSRP for $499, and one can be bought for $390 while the other can only be found for $799, the choice is clear (at least to the consumer with any sense or budgetary constraints).
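
To put numbers on that last example (a rough sketch: the $499 MSRP and the $390/$799 street prices come from the post above, the card names are hypothetical, and performance is assumed equal just for illustration):

```python
# Hypothetical value comparison for two cards with the same $499 MSRP,
# using the street prices from the example above and assuming equal performance.
msrp = 499
street_prices = {"card_a": 390, "card_b": 799}          # hypothetical names
relative_performance = {"card_a": 1.0, "card_b": 1.0}   # assumed equal for this example

for card, price in street_prices.items():
    value = relative_performance[card] / price * 1000   # performance per $1000 spent
    print(f"{card}: street ${price} (MSRP ${msrp}), value index {value:.2f}")
# The $390 card delivers roughly twice the performance per dollar at the register,
# even though both look identical in an MSRP-based review.
```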
 
Rollo said:
Have you tried the 78.03 drivers?


Yup, it's not as bad, but still there. I've tried just about every new driver out there. Thing is, it doesn't bother me in other games, but in Guild Wars it practically screams at you: on the walls, the ground, hills, everywhere, shimmer shimmer shimmer.

This did not happen on my x800, so I know it is a problem with Nvidia. I really hope they fix it; I just got this card and would like to keep it, but not at the cost of image quality...
 