ATI’s R520, RV515, RV530 Processors Ready for Production

Russ said:
wtf are you talking about? If your X800XT runs games well at low res with max AA/AF etc., then how is ANY card going to beat it soundly? Your logic is flawed because you don't consider a card to be much better just because at low res they perform similarly. Onboard graphics can run Pong well, and so can your X800XT. Do you then not consider the X800XT to "cripple" integrated graphics?

Who are you to say what "mainstream gaming resolution" is?

I'm rereading your post, and you don't seem as adamant against these cards as I first thought you were. But I don't feel like deleting, so my post stays.

The thing about SLI'd 6800 Ultras is that you could have gotten that setup 18 months ago, not 2 months ago (at the 7800GTX launch). Yeah, a 7800GTX spanks the pants off a 9800 Pro, but when the 9800 Pro first came out, would you have wanted to wait for the 7800GTX? Hell no. There's always something better coming, but you have to upgrade sometime.

And I would hold onto an X800 or 6800 series card until the next cards come out, since they are supposedly close. 512MB, 90nm, etc. will probably spank the pants off a 7800GTX in UT2K7 and such. That's what I'd buy if I had an X800 or 6800+.

You're absolutely right, no card is going to beat mine soundly, as there are no games yet that tax mine at 1280x720 max detail. When future games start to struggle at my preferred resolution with max settings, then I will look for/wait for a card that will cream my X800XT. That's all I was saying.

I had a GeForce 4 Ti 4200, and when that started struggling in games I upgraded to a 9800XT. When that struggled in games like Doom 3, Far Cry, etc. at 1280x1024 max detail, I upgraded to an X800XT, and when this one struggles at my current preferred gaming resolution I will upgrade again. I wouldn't upgrade to an SLI setup for my gaming though, as you know, since it is not needed/cost effective.

I just thought: some people on these boards only have 19" LCDs with a max res of 1280x1024, which is why it baffles me that they bought two 6800 Ultras a year ago when one would have run all the games like butter at max detail, even at present!! Which is why I say they are idiots, and if they say it's for future-proofing, then why didn't they just buy one 6800 Ultra a year ago and buy another one at the end of this year when taxing games come out, and save themselves a hell of a lot of cash!!

I assumed the mainstream gaming resolution is 1280x1024, as 17" to 19" LCDs are the most popular at the moment; bigger LCDs are out of most people's price range for now. I apologise if I'm wrong!
 
razor1 said:
Well, if they use the G80 as a refresh against the R580, which isn't unified, then it won't be unified pipes, that's pretty certain; I'm pretty sure they won't be using the G80 against the R580. There's no need for a new architecture for this refresh. The G70 core has a lot of room left, and if they start producing it on the 90nm process that will just give them more clocks to play with (which they are already doing with the G70, aka the RSX). We already know the shader performance of the G70 line is excellent, so adding more pipes and increasing clocks would be all that is necessary for a refresh, unless ATi's R520 kills the G70, which is highly unlikely, though it may perform marginally better. And next June/July is when Longhorn is slated for, so that would be the best time for unified chips to come out, which is also traditionally when nV (and ATi for that matter) release a new type of core. It doesn't seem to me that nV is going to be behind in innovation anymore, the way things are going. They are being aggressive on all fronts and not giving ATi any room.

They really had no reason to increase per-clock shader performance for complex shaders this round, but they did it, and they did it in a big way: certain shaders like global illumination run close to 150% faster on the GF7, and no game is going to be using that type of lighting system anytime soon.

Alright, we're going in a circle then. nVidia's lead developer is stating there is plenty of life left in the current core architecture, not the G70 specifically but more broadly the way the core operates. That was only a month ago. Basically, this is what I get from his comments:

-Unified cores are much more difficult to design so that they operate correctly and, most importantly, efficiently.

-The current way cores operate is quite sufficient for a while to come.

-Next year is not the best time just because of Longhorn; this seems to be your only reasoning. nVidia can support unified shaders whenever they want in the coming years, it doesn't have to be next year, especially since they would only do a unified core (again, more difficult to produce) at 90nm if they thought it would be faster than anything they could do on current tech. I think they would have to have no doubt it would be faster to do it across all cores, high to low. From every indication from Mr. Kirk, they do not have plans for this so soon. You're saying one thing; their lead developer is making very strong comments to the opposite. And your only reasoning is Longhorn. If nVidia finds no benefit, or even loses valuable R&D time or performance, they won't touch it, and you know this as well as I do. However, I pose you this theory.

What if they make their higher-end cards non-unified, continuing with the current way cores work (which has changed a lot regardless)?

However, they could have a select few mid-to-low-range cards that operate unified, sort of like how the 6600GT was the first 110nm part. I think this would make much more sense given how nVidia is acting. They would show they are not behind at all in technology, and that current cores are more than satisfactory to go up against unified parts for that time. Just because the R600 is unified doesn't mean a massive performance leap. On top of that, they wouldn't burn valuable time trying to get a high-performance core out with both a die shrink and unified tech, which will take time. This also allows them, again, to release earlier than ATI unless the R600 is ahead of schedule.

This seems more like how the nVidia of today would act.
 
^eMpTy^ said:
This drives me crazy...why don't you let these people speak for themselves?

Anyone who spends $600 on a videocard doesn't care about burning money THAT much, or they wouldn't have bought it...and everyone knows that prices drop a few months after a card is launched...it happens with just about every card as long as supply is there...

So yeah...let's see a show of hands instead of just assuming that there is some nebulous group of unnamed consumers out there that are angry about prices dropping on the 7800GTX...

Prices dropping is due to manufacturers competing with one another...nvidia doesn't control it...while we're at it...let's see a show of hands of all the people that think prices dropping are a GOOD THING...

Haha. Someone is justifying their purchase! :) I think prices dropping is a good thing. However, tell me they didn't release the 7800GTX knowing the R520 was coming out in a few months and figuring they could just slash prices to compete. Good business or not, I'd still feel a little gypped as a consumer since it happened so fast.
 
Rash said:
You're absolutely right, no card is going to beat mine soundly, as there are no games yet that tax mine at 1280x720 max detail. When future games start to struggle at my preferred resolution with max settings, then I will look for/wait for a card that will cream my X800XT. That's all I was saying.

I had a GeForce 4 Ti 4200, and when that started struggling in games I upgraded to a 9800XT. When that struggled in games like Doom 3, Far Cry, etc. at 1280x1024 max detail, I upgraded to an X800XT, and when this one struggles at my current preferred gaming resolution I will upgrade again. I wouldn't upgrade to an SLI setup for my gaming though, as you know, since it is not needed/cost effective.

I just thought: some people on these boards only have 19" LCDs with a max res of 1280x1024, which is why it baffles me that they bought two 6800 Ultras a year ago when one would have run all the games like butter at max detail, even at present!! Which is why I say they are idiots, and if they say it's for future-proofing, then why didn't they just buy one 6800 Ultra a year ago and buy another one at the end of this year when taxing games come out, and save themselves a hell of a lot of cash!!

I assumed the mainstream gaming resolution is 1280x1024, as 17" to 19" LCDs are the most popular at the moment; bigger LCDs are out of most people's price range for now. I apologise if I'm wrong!


Crank up the settings in EQ2...
 
dnavarro said:
I don't see how ATI is not being reactive. What is Crossfire???? Reactive to NVIDIA's SLI. What is the R520 (and Shader Model 3.0)??? Reactive to NVIDIA's 7800GTX (performance-wise) and the 6800 series (Shader Model 3.0). What is the X800GT?? Reactive to the NVIDIA 6600GT (still with no Shader Model 3.0 support, is my guess).

Think before you speak. The Xbox 360 chip ATI has created has nothing to do with PC GPUs at the moment. ATI has said that their chip for the Xbox 360 is a radical new architecture, and its performance and features will not be seen in ATI's PC GPUs until the chips after the R520, in 2006.


http://www.theinquirer.net/?article=25197

"The guy in charge of the demo told us that the graphic part is much more powerful than even the upcoming R5XX, series and that ATI's desktop unit will match Xbox 360 graphics with a next generation scheduled for next year."

To me that speaks volumes about the R520. It is nothing new or radical and is just REACTIVE. I mean, the PS3 is using a slightly stronger derivative (the RSX) of the 7800GTX.

Obviously we will not know until benchmarks come out for the R520, but I think it will be bested by NVIDIA's ace card, some Ultra derivative of the 7800GTX.

Also, if ATI hard-launches their entire line (to save face they have to hard-launch something), that may be good in the short term, but think about this: NVIDIA can just tweak their entire line (midrange and low-end especially) to surpass everything ATI offers. I think that is why NVIDIA is waiting.

Also, I was at Fry's yesterday (SoCal) and saw 7800GTs (eVGA) for sale. So NVIDIA has now hard-launched two cards to ATI's zero. In my opinion, these are horrible omens for ATI. They had better hope the Xbox 360 sells well, because they will lose ground in the PC market this year.

D


Oh man, you have to be joking, right?

You say ATi is reactive. I'll believe it when I see it. Is Crossfire out yet, cockbite? And what about the X800GT, HOW LATE IS THAT?!?! And we're really seeing the R520 today, aren't we... NOT.

How about you think before you speak, dumbass.

Btw, it's The Enquirer, and WHO GIVES A SHIT?!

Don't get me wrong, I love both nVidia and ATi cards, but your statement is just plain idiotic.
 