ATI’s R520, RV515, RV530 Processors Ready for Production

tornadotsunamilife said:
That's what we're saying. They consider the R500 (Xenos chip) as a first try-out of new features and architecture. ATI will be able to expand on this for the R600 (though that will obviously be a new architecture), i.e. they already have a next-generation advantage and this generation's card isn't even out.
What?
 
neubspeed said:
Very few are waiting for the ATi card, and fewer will pay the premium at launch over a GTX, which will be approx. $120+ cheaper.

Where the hell are you pulling these prices from? You have no idea how much these cards will cost; ATI could sell at a loss if they were desperate, or they could charge more for a card that is a lot faster, in either case.
 
Cali3350 said:
Seeing as how ATI has openly taken unified shaders as the future, and in fact already has a part out with them and is moving all of their future products to such an architecture, I'd say them being reactive to NV is... stupid?

tornadotsunamilife said:
That's what we're saying. They consider the R500 (Xenos chip) as a first try-out of new features and architecture. ATI will be able to expand on this for the R600 (though that will obviously be a new architecture), i.e. they already have a next-generation advantage and this generation's card isn't even out.

They themselves weren't 100% responsible for that; Microsoft is also part of the equation, and the Xbox is Microsoft. nV never stated that unified shaders weren't the future, they said it's not necessary right now for desktops. When Longhorn is ready it will be a necessity. But we already know that the G80 is in fact most likely unified, so that doesn't really give ATi much of an advantage here, since their desktop variant isn't; at least that is the general consensus until the R600 is ready, which won't be for a while. If ATi does in fact release a unified shader chip right now (if the R520 is unified; not saying it is, just saying if it is) and performance is agreeable compared to traditional architectures, then kudos to them, that's what I expect from them. If they come out with an R520 that's traditional, hard to produce, and only 0-10% faster than the G70, that's not good for them.
 
tornadotsunamilife said:
Where the hell are you pulling these prices from? You have no idea how much these cards will cost; ATI could sell at a loss if they were desperate, or they could charge more for a card that is a lot faster, in either case.

Well, you have to assume ATI's card will be at least $599 launch week, just like the GTX. And right now you can get a 7800 for about $480. Do you seriously think, within reason, that the top R520 will be sold for any less than that? (Not even talking about MSRP, but the price on the street.)
 
dnavarro said:
The R600 may be a step in the right direction, but all they have shown so far is reactive. The R600 won't come to desktop until, what, middle to end of 2006? At that point what do you think NVIDIA might have? This is all WAY up in the air and nobody knows what will happen. ATI could pull an NVIDIA "FX" release blunder for all we know, if this technology isn't all that (as NVIDIA has said of unified shaders). The software must take advantage.

All we have to go on is SLI-Crossfire, 6600GT-X800GT, and of course them following NVIDIA's lead in PS 3.0. So how could anyone in their right mind say, with what we know so far, that ATI has not been reactive? Looking at this recent history, it is quite clear.

Any other speculation is just that until products get released.
They already have a part based on the technology out: it's in the Xbox 360. Any NV30-esque mistakes can be taken care of.
 
tornadotsunamilife said:
The R600 will be WGF 2.0/DX10 compliant. The R500 is WGF 2.0/DX10 semi-compliant.
How will that give ATi an advantage in the next generation of cards? I assume that you are implying an advantage over nVidia's next-generation cards.
 
Cali, that is all speculation. Just because something works in the console arena doesn't necessarily mean it will work for PCs. That will all play out with the R600, which is way too early to talk about. We are talking about ATI and the last year's history... all REACTIVE. Nothing that pushes the industry. Follow the leader, if you will.
 
mike0219116 said:
How will that give ATi an advantage in the next generation of cards? I assume that you are implying an advantage over nVidia's next-generation cards.

No. I'm implying that they will have an advantage when designing and creating the core (although it may be hard to transition from console to desktop).
 
dnavarro said:
Cali, that is all speculation. Just because something works in the console arena doesn't necessarily mean it will work for PCs. That will all play out with the R600, which is way too early to talk about. We are talking about ATI and the last year's history... all REACTIVE. Nothing that pushes the industry. Follow the leader, if you will.
They can look at the hardware, see where the weaknesses lie, and improve it. The NV30 was so horrible because of its terrible shader support. If Xenos has such problems, they still have some time to address them. They already have a core taped out on 90nm using this technology. No one's saying they have a surefire hit, but they sure as hell have an advantage.
 
Terra said:
Why is that?
It's great news for the consumers when a product is hard-launched.
ATI would look inferior if they didn't try a hard launch.
The bad publicity would be huge... so kudos to NVIDIA for setting a (hopefully) new trend ;)

Terra...

Nah, at this point a paper launch with a confirmed release a month out wouldn't harm ATi in the least. It would probably garner more sales for them.
 
^eMpTy^ said:
This drives me crazy...why don't you let these people speak for themselves?

Anyone who spends $600 on a videocard doesn't care about burning money THAT much, or they wouldn't have bought it...and everyone knows that prices drop a few months after a card is launched...it happens with just about every card as long as supply is there...

So yeah...let's see a show of hands instead of just assuming that there is some nebulous group of unnamed consumers out there that are angry about prices dropping on the 7800GTX...

Prices dropping is due to manufacturers competing with one another...nvidia doesn't control it...while we're at it...let's see a show of hands of all the people that think prices dropping are a GOOD THING...

I feel burned! I paid $647 shipped for mine, only to see prices drop over $100 a week later, which kills the money I would make back by selling this card. So yes, I feel cheated.
 
Cali3350 said:
They can look at the hardware, see where the weaknesses lie, and improve it. The NV30 was so horrible because of its terrible shader support. If Xenos has such problems, they still have some time to address them. They already have a core taped out on 90nm using this technology. No one's saying they have a surefire hit, but they sure as hell have an advantage.


Not exactly. If the FX was the only SM 2.0 card out, what was there to compare it to? Also, code written for the Xbox 360 will be tailored to it, so weaknesses might be circumvented without ATi knowing about it. And the Xenos chip has a buttload of bandwidth because of the eDRAM, which will not be in desktop chips anytime soon.

Some weaknesses might show up, but not all.
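The eDRAM point lends itself to some back-of-envelope arithmetic. The sketch below is purely illustrative (the overdraw, read-modify-write factor, and per-sample sizes are my assumptions, not measurements of any real chip), but it shows why antialiased framebuffer traffic is exactly the kind of load you would want on a wide on-die eDRAM bus rather than external memory:

```python
# Back-of-envelope (all numbers are illustrative assumptions, not measured):
# estimate raw color+Z framebuffer traffic for a 1280x720 frame with 4x
# multisampling, to see why keeping the framebuffer in eDRAM helps.

def framebuffer_traffic_gb_s(width=1280, height=720, msaa=4,
                             bytes_color=4, bytes_z=4,
                             overdraw=3.0, fps=60,
                             rmw_factor=2.0):
    """Rough GB/s of color+Z traffic.

    rmw_factor ~2 models read-modify-write: blending and Z-testing
    read the buffer as well as write it.
    """
    samples = width * height * msaa
    bytes_per_pass = samples * (bytes_color + bytes_z)
    bytes_per_frame = bytes_per_pass * overdraw * rmw_factor
    return bytes_per_frame * fps / 1e9

print(f"~{framebuffer_traffic_gb_s():.1f} GB/s just for color+Z")  # → ~10.6 GB/s
```

Even with these modest made-up assumptions, the framebuffer alone wants on the order of 10 GB/s, and the traffic scales linearly with the MSAA level; that is the load the Xenos eDRAM takes off the external bus.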
 
tornadotsunamilife said:
The R600 will be WGF 2.0/DX10 compliant. The R500 is WGF 2.0/DX 10 semi-compliant.

There's no such thing as "semi-compliant". It's either compliant or it's not. If it includes some DX10 features but not all of them, then it's not DX10 compliant; it's as simple as that.
 
5150Joker said:
I feel burned! I paid $647 shipped for mine, only to see prices drop over $100 a week later, which kills the money I would make back by selling this card. So yes, I feel cheated.
Cheated? Having the newest thing right when it comes out carries a premium beyond the worth of the product. If anything, you cheated yourself.
 
fallguy said:
NV only launched one card so far. If ATi were to launch multiple cards at once (and actually have them available), that is why he said (I think) it would smear what little NV did. Little as in, they only launched one card. Very few people buy cards for $600, or even $500; many more buy in the $200 price range.

It would not smear anything, because ATI originally claimed you would see the R520 just a little bit after the 7800 launch...

Oh yeah - you will be waiting a long time for DirectX 10, because it isn't coming

Microsoft is going a different route this time
 
Zinn said:
Cheated? Having the newest thing right when it comes out carries a premium beyond the worth of the product. If anything, you cheated yourself.

I didn't buy the card right when it came out; I waited about 2 weeks. Still, 1 week later the price dropped dramatically, which left a bad taste. Of course I made a conscious decision to spend that much, but I wasn't expecting such a dramatic price drop so soon either.
 
Zinn said:
Cheated? Having the newest thing right when it comes out carries a premium beyond the worth of the product. If anything, you cheated yourself.


That price dropped ridiculously fast. Don't be a jerk to these people. Neither company pulled a stunt like that until nVidia; I'd be quite disappointed to see ATI follow suit. It's a sleazy tactic. And don't blame the resellers; again, this is the first time this was done. nVidia's entrance MSRP was $599, not chosen by the resellers like previous price gouging. I highly doubt they still have that listed, but every reseller simply decided to drop the price 20%+.

Here's the product, but you can't have it at the actual long-term MSRP for another month :D

dnavarro said:
The R600 may be a step in the right direction, but all they have shown so far is reactive. The R600 won't come to desktop until, what, middle to end of 2006? At that point what do you think NVIDIA might have? This is all WAY up in the air and nobody knows what will happen. ATI could pull an NVIDIA "FX" release blunder for all we know, if this technology isn't all that (as NVIDIA has said of unified shaders). The software must take advantage.

All we have to go on is SLI-Crossfire, 6600GT-X800GT, and of course them following NVIDIA's lead in PS 3.0. So how could anyone in their right mind say, with what we know so far, that ATI has not been reactive? Looking at this recent history, it is quite clear.

Any other speculation is just that until products get released.

I like how ATI and NV30 keep getting linked. How about nVidia pulling an NV30 with the G80? Yep, I do believe that's just as possible since, heh, they set that benchmark in video card history.

mrhemmy said:
It would not smear anything, because ATI originally claimed you would see the R520 just a little bit after the 7800 launch...

Oh yeah - you will be waiting a long time for DirectX 10, because it isn't coming

Microsoft is going a different route this time

I know your 7800 is broken so you're pretty mad, but what?

DarkBahamut said:
There's no such thing as "semi-compliant". It's either compliant or it's not. If it includes some DX10 features but not all of them, then it's not DX10 compliant; it's as simple as that.

Sure there is. It doesn't fully support everything WGF 2.0 will; however, its base, being unified, is very similar, therefore it's "semi-compliant".


{NG}Fidel said:
And how do you know this?

It's in their future, but nothing is set in stone. Honestly, I think they'll do whatever they like. ATI does seem very set on going forward with it:

NVIDIA's Chief Architect David Kirk said in a recent interview that they will do a unified architecture in hardware when it makes sense and when it is possible to make the hardware work faster unified. It will be easier to build in the future, but for the meantime, there's plenty of mileage left in the G70 architecture.

I sure as hell wouldn't place money on the G80 being unified given their stance on it. Their stance is more or less more programmable pipelines, and every question about unified gets a deflection. Perhaps 2006 will be very interesting indeed.
 
Where are the benches?
Until I see them, the R520 is still unreleased, and not a factor I reckon with in my buying decisions...

Hell, there are not even any faked "leaked" 3D-shittymark scores... when they start appearing we will know the cards are 1-2 months away...

Terra - No matter what any PR-guy says...
 
Sigh. This must be the 50th coming soon thread.

When Newegg has R520 based boards in stock for less than $499 & it beats the 7800GTX by more than 15% & it includes Crossfire, FP16 HDR, SM3.0, H.264..... it will be news worthy.
 
Terra said:
Where are the benches?
Until I see them, the R520 is still unreleased, and not a factor I reckon with in my buying decisions...

Hell, there are not even any faked "leaked" 3D-shittymark scores... when they start appearing we will know the cards are 1-2 months away...

Terra - No matter what any PR-guy says...
I concur.

ATi has fallen behind delivering high-end graphics cards; they're doing just fine in the mid-range / low-end.
 
Shifra said:
I sure as hell wouldn't place money on the G80 being unified given their stance on it. Their stance is more or less more programmable pipelines, and every question about unified gets a deflection. Perhaps 2006 will be very interesting indeed.


The G80 is the NV50, which is meant for Windows Longhorn; it was known for a very long time that it was going to have unified shaders. Also, you don't really need WGF 2.0 to utilize a unified shader structure; how a chip schedules the data given to it doesn't have anything to do with the API.
 
Why on earth would someone buy a $600 gaming card for back to school?!? :confused: It'd be best to launch titles and hardware at the beginning of summer.
 
razor1 said:
The G80 is the NV50, which is meant for Windows Longhorn; it was known for a very long time that it was going to have unified shaders. Also, you don't really need WGF 2.0 to utilize a unified shader structure; how a chip schedules the data given to it doesn't have anything to do with the API.


Nope, the exact specs of the G80 are still unknown, including its pipeline build; the same stands for the R600. It's just easier to guess with accuracy that the R600 will be unified considering their past comments. Unless you have exact proof from an nVidia executive not downplaying but confirming unified shaders, there is every chance they'll work with some form of advanced pipeline. WGF can use either, so it's not like it's going to matter; it doesn't need unified shaders to be WGF 2.0 compliant. The only places that have boldly stated that the G80 is using unified shaders are just about as accurate as the Inq. I'm sure VR-Zone got tons of leaked info. This is nothing more than a rumor that may or may not bite these places in the ass. Unified shaders are well known because it's something WGF 2.0 supports completely, so it's pretty easy to see how a G80 = unified connection can start. Fact is, nVidia has been strongly against it. If it gives no benefit and they do it anyway, that would be against what their people have been saying.

PRIME1 said:
Sigh. This must be the 50th coming soon thread.

When Newegg has R520 based boards in stock for less than $499 & it beats the 7800GTX by more than 15% & it includes Crossfire, FP16 HDR, SM3.0, H.264..... it will be news worthy.

Everyone knows your opinion, Prime; not sure anyone cares. If you don't like these threads, stay out.
 
Shifra said:
Nope, the exact specs of the G80 are still unknown, including its pipeline build; the same stands for the R600. It's just easier to guess with accuracy that the R600 will be unified considering their past comments. Unless you have exact proof from an nVidia executive not downplaying but confirming unified shaders, there is every chance they'll work with some form of advanced pipeline. WGF can use either, so it's not like it's going to matter; it doesn't need unified shaders to be WGF 2.0 compliant.

Well, it's not a coincidence that the NV50/G80 has been delayed every time Windows Longhorn has been delayed.
 
razor1 said:
Well, it's not a coincidence that the NV50/G80 has been delayed every time Windows Longhorn has been delayed.


Why not? Cores get delayed all the time. And as you said, unified shaders don't need WGF 2.0, so why would nVidia delay it if it was ready now or a year ago? They like to strut new technology. Longhorn has been delayed so many times over the last 2 years that I've lost count, so I don't even see how a pattern can be linked.

I just think it's folly to automatically assume the company is making a unified part for the Vista launch when they've spoken so strongly against it for the last few years. If programmable pipes are enough, and I see no reason why they wouldn't be, there is no reason not to assume they'll continue work on the G70 refinement at 90nm.
 
Shifra said:
Why not? Cores get delayed all the time. And as you said, unified shaders don't need WGF 2.0, so why would nVidia delay it if it was ready now or a year ago? They like to strut new technology. Longhorn has been delayed so many times over the last 2 years that I've lost count, so I don't even see how a pattern can be linked.

I just think it's folly to automatically assume the company is making a unified part for the Vista launch when they've spoken so strongly against it for the last few years. If programmable pipes are enough, and I see no reason why they wouldn't be, there is no reason not to assume they'll continue work on the G70 refinement at 90nm.


Not true. Nvidia and ATi have numerous chips in development. Just saying that they don't want to use unified shaders doesn't mean they aren't testing it out already. And if you go over to nV News, there is a guy there who worked at nVidia, and he does know some things ;)

http://www.nvnews.net/vbulletin/showthread.php?t=54104
 
razor1 said:
Not true. Nvidia and ATi have numerous chips in development. Just saying that they don't want to use unified shaders doesn't mean they aren't testing it out already. And if you go over to nV News, there is a guy there who worked at nVidia, and he does know some things ;)

http://www.nvnews.net/vbulletin/showthread.php?t=54104


Don't misunderstand me, I'm not saying nVidia is never going unified; I just don't think they're gonna fight it, fight it, fight it, then at the Vista launch go "whoops, you know, that looks good".

And I prefer David Kirk :).

Debating unified against separate shader architecture is not really the important question. The strategy is simply to make the vertex and pixel pipelines go fast. The tactic is how you build an architecture to execute that strategy. We're just trying to work out what is the most efficient way.

It's far harder to design a unified processor - it has to do, by design, twice as much. Another word for 'unified' is 'shared', and another word for 'shared' is 'competing'. It's a challenge to create a chip that does load balancing and performance prediction. It's extremely important, especially in a console architecture, for the performance to be predictable. With all that balancing, it's difficult to make the performance predictable. I've even heard that some developers dislike the unified pipe, and will be handling vertex pipeline calculations on the Xbox 360's triple-core CPU.

We will do a unified architecture in hardware when it makes sense. When it's possible to make the hardware work faster unified, then of course we will. It will be easier to build in the future, but for the meantime, there's plenty of mileage left in this architecture.

Those are not comments for the short term. Nvidia is not stupid, so I strongly believe that if they feel programmable separate pipelines are easier and more efficient, they sure will stick with them while continuing to experiment with and refine a unified part. Don't forget they'll be doing their own 90nm process jump as well; it might just put them in a bad place to attempt a complete die shrink on a complex core like a unified one.
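Kirk's "shared means competing" point can be made concrete with a toy throughput model. This is purely illustrative (the unit counts and workload numbers are made up, and real load balancing is nowhere near this ideal), but it shows both why a unified pool helps on lopsided workloads and why a fixed split is fine when the workload matches it:

```python
# Toy model (purely illustrative, not any real GPU): compare a fixed split
# of shader units (8 vertex + 16 pixel) against a unified pool of 24 units
# that load-balances perfectly. Work is in abstract "units"; lower is better.

def fixed_split_time(vertex_work, pixel_work, n_vertex=8, n_pixel=16):
    """Frame time when each unit type can only run its own kind of work."""
    # The frame finishes when the slower of the two stages finishes.
    return max(vertex_work / n_vertex, pixel_work / n_pixel)

def unified_time(vertex_work, pixel_work, n_units=24):
    """Frame time when any unit can run either kind of work (ideal balancing)."""
    return (vertex_work + pixel_work) / n_units

# A pixel-heavy frame matches the 8/16 split exactly...
pixel_heavy = (80, 160)
# ...but a vertex-heavy frame (lots of geometry, few pixels) starves it.
vertex_heavy = (200, 40)

for name, (v, p) in [("pixel-heavy", pixel_heavy), ("vertex-heavy", vertex_heavy)]:
    print(f"{name}: fixed={fixed_split_time(v, p):.1f}  unified={unified_time(v, p):.1f}")
```

The interesting case is the vertex-heavy frame: the fixed split is 2.5x slower because the 16 pixel units sit mostly idle, while the unified pool stays flat. The cost, as Kirk says, is the load-balancing and performance-prediction hardware needed to get anywhere near that ideal.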
 
LOL, good opinions everyone, but I won't listen to anyone but myself. Fuck the Nvidia 7 series and fuck the R500 series if it doesn't cream my X800XT by a huge margin. I just hope ATI releases their top-end card that wastes the 7800GTX and wipes the shitty smile right off your faces. My opinion is these new top-end cards are not even needed yet unless you're after mega high-end resolution gaming; I'm happy with 1280x720 on my 32" TV, full detail, running like butter on all current games. Let's see the games cripple the last-gen cards at mainstream gaming resolutions, then it should be time for the next-gen cards. In my eyes you've got to be fuckin dumb to spend $500 plus on a graphics card; NVIDIA must be laughing their bollocks off at you, and so am I!! I'm still laughing at all those people that bought two 6800 Ultras when now one 7800GTX creams them, lol, silly bastards. YOU HAVE ALL BEEN HAD!! Just my opinion on the situation, lol. My money's going towards a PlayStation 3; PC gaming's had it for a while, fuckin bored with this shit. Expensive hobbies, these gaming PCs... I have had too much Red Bull!!! :D :eek:
 
Shifra said:
Don't misunderstand me, I'm not saying nVidia is never going unified; I just don't think they're gonna fight it, fight it, fight it, then at the Vista launch go "whoops, you know, that looks good".

And I prefer David Kirk :).



Those are not comments for the short term. Nvidia is not stupid, so I strongly believe that if they feel programmable separate pipelines are easier and more efficient, they sure will stick with them while continuing to experiment with and refine a unified part. Don't forget they'll be doing their own 90nm process jump as well; it might just put them in a bad place to attempt a complete die shrink on a complex core like a unified one.


No one is saying the G80 is coming out as a refresh ;) but in one year things are going to change quite a bit.

Separate vertex shader units, you mean, not pipelines. When did David Kirk state that, though? It's been quite a while, and:

Originally Posted by DaveBaumann
I asked DK about unified architectures when I was given my very initial NV40 briefing; he argued quite vehemently against them then. He has pretty much been consistently on that path since then, up until fairly recently, where he has been making more conciliatory noises about it. Given that DK is working one or two architectures down, those types of thoughts are probably about the things he's actually working on. Given the recent noises I think it's almost certain they will go the unified route at some point, but I personally don't expect it for the G80, given the design of this thing is probably in its final stages (i.e. the high-level "architecture" choices were set down a long time ago), but possibly for the architecture after; this type of timing would also fit a lot better with the timing for WGF 2.0.

If you take DK's words as true, the G80 is pretty much ready to go ;). The thing is, it's quite possible it is unified and nV is not saying anything about it. They have used this tactic quite a few times. I have talked with guys at nV too; they think ATi is very close to releasing a unified part for the desktop, which I don't completely agree with, but we will have to wait and see.

Code names don't really matter. If nV feels that ATi is about to release a unified part with good performance, they aren't going to wait too long to get a unified part out; they'll just shuffle which chips they are working on now. What really matters is what people will be thinking marketing-wise. nV seems to be doing a pretty good job at hyping features they have that ATi lacks, pushing the envelope; it would actually be advantageous for them to release a unified part before ATi for desktops, and it's not uncommon to create a smoke screen like this.
 
Those comments/interview quotations of his aren't old at all. July 11th this year.

http://www.bit-tech.net/bits/2005/07/11/nvidia_rsx_interview/4.html

The biggest thing I think people are assuming automatically is unified = higher performance. That may not be true. The R600 is quite possibly going to be unified, and if the G80 isn't and beats it in real-world apps, it's definitely going to start some fires, so to speak. As I said, if it works out this way, 2006 could be very interesting indeed.

Code names don't really matter, but what does matter is what people will be thinking marketing-wise. nV seems to be doing a pretty good job at hyping features they have that ATi lacks, pushing the envelope; it would actually be advantageous for them to release a unified part before ATi for desktops.

Yep, I think that too. If nVidia saw a benefit in unified they'd absolutely seize the chance and upstage ATI. But when their lead developer makes comments like that recently, it makes VR-Zone's G80 unified info seem like total nonsense, which wouldn't be a first. While I'm sure they have a unified core they're working on, I just don't see it coming next year, especially with the prospect of a die shrink, which may or may not carry its own obstacles.
 
Shifra said:
Those comments/interview quotations of his aren't old at all. July 11th this year.

http://www.bit-tech.net/bits/2005/07/11/nvidia_rsx_interview/4.html

The biggest thing I think people are assuming automatically is unified = higher performance. That may not be true. The R600 is quite possibly going to be unified, and if the G80 isn't and beats it in real-world apps, it's definitely going to start some fires, so to speak. As I said, if it works out this way, 2006 could be very interesting indeed.

Yep, I think that too. If nVidia saw a benefit in unified they'd absolutely seize the chance and upstage ATI. But when their lead developer makes comments like that recently, it makes VR-Zone's G80 unified info seem like total nonsense, which wouldn't be a first. While I'm sure they have a unified core they're working on, I just don't see it coming next year, especially with the prospect of a die shrink, which may or may not carry its own obstacles.


I see what you're saying, but when rumors from inside nV say they think ATi is coming with a unified part soon, and DK's recent remarks have shifted a bit, something is up. Either nV is seriously considering using unified pipes sooner rather than later, or they are just trying to get ATi to show their hand; I don't think it's the latter, since they are already doing that.
 
5150Joker said:
I feel burned! I paid $647 shipped for mine, only to see prices drop over $100 a week later, which kills the money I would make back by selling this card. So yes, I feel cheated.

Ummmm, at the time you bought the card you thought it was worth $647; nobody held a gun to your head. A little refresher on basic economics would alleviate a lot of your misdirected frustration :)
 
razor1 said:
I see what you're saying, but when rumors from inside nV say they think ATi is coming with a unified part soon, and DK's recent remarks have shifted a bit, something is up. Either nV is seriously considering using unified pipes sooner rather than later, or they are just trying to get ATi to show their hand; I don't think it's the latter, since they are already doing that.

I expect NV50/G80 to be a significant departure from NV30/40 tech since NV47/G70 was ready a very long time ago - maybe even around the x850 refresh timeline but Nvidia probably realized that there was no need for a refresh last generation so they saved it for now. But I agree with the others - David Kirk's recent comments are very contrary to the notion of G80 being unified - unless G80 is their Longhorn product and they will be pushing NV40 architecture until then - which would mean that their next-next-generation part still won't be unified.
 
trinibwoy said:
I expect NV50/G80 to be a significant departure from NV30/40 tech since NV47/G70 was ready a very long time ago - maybe even around the x850 refresh timeline but Nvidia probably realized that there was no need for a refresh last generation so they saved it for now. But I agree with the others - David Kirk's recent comments are very contrary to the notion of G80 being unified - unless G80 is their Longhorn product and they will be pushing NV40 architecture until then - which would mean that their next-next-generation part still won't be unified.


Well, if they use the G80 as a refresh against the R580, which isn't unified, then it won't have unified pipes, that's pretty certain; but I'm pretty sure they won't be using the G80 against the R580. No need for a new architecture for this refresh. The G70 core has a lot of room to play with, and if they start producing it on the .09 process that will just give them more clocks to play with (which they are already doing with the G70, aka RSX). We already know the shader performance of the G70 line is excellent, so adding more pipes and increasing clocks would be all that is necessary for a refresh, unless ATi's R520 kills the G70, which is highly unlikely; it possibly performs marginally better, though. And next year around June/July is when Longhorn is slated, so that would be the best time for unified chips to come out, which is also traditionally when nV (and ATi, for that matter) release a new type of core. It doesn't seem to me nV is going to be behind in innovation anymore the way things are going. They are being aggressive on all fronts and not giving ATi any room.

They really had no reason to increase per-clock shader performance for complex shaders this round, but they did it, and in a big way. Certain shaders, like global illumination, run close to 150% faster on the GF7, and no game is going to be using that type of lighting system anytime soon.
 
Rash said:
LOL, good opinions everyone, but I won't listen to anyone but myself. Fuck the Nvidia 7 series and fuck the R500 series if it doesn't cream my X800XT by a huge margin. I just hope ATI releases their top-end card that wastes the 7800GTX and wipes the shitty smile right off your faces. My opinion is these new top-end cards are not even needed yet unless you're after mega high-end resolution gaming; I'm happy with 1280x720 on my 32" TV, full detail, running like butter on all current games. Let's see the games cripple the last-gen cards at mainstream gaming resolutions, then it should be time for the next-gen cards. In my eyes you've got to be fuckin dumb to spend $500 plus on a graphics card; NVIDIA must be laughing their bollocks off at you, and so am I!! I'm still laughing at all those people that bought two 6800 Ultras when now one 7800GTX creams them, lol, silly bastards. YOU HAVE ALL BEEN HAD!! Just my opinion on the situation, lol. My money's going towards a PlayStation 3; PC gaming's had it for a while, fuckin bored with this shit. Expensive hobbies, these gaming PCs... I have had too much Red Bull!!! :D :eek:

wtf are you talking about? If your X800XT runs games well at low res with max AA/AF etc., then how is ANY card going to beat it soundly? Your logic is flawed because you don't consider a card to be much better just because at low res they are similar. Onboard graphics can run Pong well, and so can your X800XT. Do you then not consider the X800XT to "cripple" integrated graphics?

Who are you to say what "mainstream gaming resolution" is?

I'm rereading your post, and you don't seem so adamant against cards as I first thought you were. But I don't feel like deleting, so my post stays.

The thing about SLI 6800Us is that you could have gotten that 18 months ago, not 2 (the 7800GTX launch). Yeah, a 7800GTX spanks the pants off a 9800 Pro, but when the 9800 Pro first came out, would you have wanted to wait for the 7800GTX? Hell no. There's always something better coming, but you have to upgrade sometime.

And I would hold onto an X800 or 6800 series card until the next cards come out, since they are supposedly close. 512MB, 90nm, etc. will probably spank the pants off a 7800GTX in UT2K7 and such. That's what I'd buy if I had an X800 or 6800+.
 
"R520 GL" - is that going to be the pro line of cards, i.e. the FireGL replacement? Would be sweet if ATI released their CAD/CAM cards at the same time - hopefully top to bottom. The only Quadro based on the GTX core is the FX 4500 AFAIK (= WAY too expensive :eek:)
 
DarkBahamut said:
There's no such thing as "semi-compliant". It's either compliant or it's not. If it includes some DX10 features but not all of them, then it's not DX10 compliant; it's as simple as that.

That's why I said it's semi-compliant. It meets most of DX10's requirements, but not all. I couldn't think of a better word to use at the time.
 