All quiet on the ATI front

Cali3350 said:
Everyone forgets ATI is in both the Xbox 360 and the Rev.

If they can get a 90nm, 332-million-transistor chip that runs at 500MHz and cool enough for a console out the door, then I'm rather confident they can do the same with R520.

And you're forgetting that all ATI did was design the chip; it's up to Microsoft to manufacture it.
 
forcefed said:
And you're forgetting that all ATI did was design the chip; it's up to Microsoft to manufacture it.


Yeah, that's all they did. I mean, they totally did basically nothing. They only made it.
 
DejaWiz said:
I can remember when nVidia put the hurt on with the TNT2 and again with the GeForce 256. Those sold for about $299 upon introduction. Here we are 6 years later and video card prices have now doubled.

Yeah, and how much faster is a GTX compared to a TNT2? You expect to pay the same thing for much better hardware forever? Any tech that is pushing boundaries has a very high cost - do you complain that a 32" plasma HDTV is about 5 times more expensive than your 32" CRT?
 
forcefed said:
And you're forgetting that all ATI did was design the chip; it's up to Microsoft to manufacture it.

Rofl. How silly of me. Obviously that's the difficult part.
 
trinibwoy said:
Yeah, and how much faster is a GTX compared to a TNT2? You expect to pay the same thing for much better hardware forever? Any tech that is pushing boundaries has a very high cost - do you complain that a 32" plasma HDTV is about 5 times more expensive than your 32" CRT?

No, but I do complain when a company has a relative monopoly on a market and makes users pay far more than the stuff is worth. You're a fool if you don't think Nvidia is currently making money hand over fist with this price scheme. When designing hardware you take price into consideration.
 
Cali3350 said:
No, but I do complain when a company has a relative monopoly on a market and makes users pay far more than the stuff is worth. You're a fool if you don't think Nvidia is currently making money hand over fist with this price scheme. When designing hardware you take price into consideration.

Well considering I paid $500 for my GTX and the XT PE is still $475 I have no idea what you're carrying on about. The only fools would be the people still buying Ultras and PE's today..... ;)
 
trinibwoy said:
Yeah, and how much faster is a GTX compared to a TNT2? You expect to pay the same thing for much better hardware forever? Any tech that is pushing boundaries has a very high cost - do you complain that a 32" plasma HDTV is about 5 times more expensive than your 32" CRT?

Actually a good 32" CRT would be much more expensive than a 32" HDTV. Also, the CRT would look 10x better. (We are talking about a PC CRT right?)

There is no way this is putting ATI out of business. Sales of the PCI 5200s and 9250s dwarf sales of the 7800GTXs and X855OMQXTPE Uber edition cards. ATI should be much more afraid of the 6600GT, and Nvidia should be afraid of the X300. ATI has the low-end cards; Nvidia has the mid-range cards. Those are the cards people buy - millions of them are sold. There might be a thousand 7800GTXs sold so far. There just aren't that many people willing to spend $600 every time a slightly faster card comes out. Most people go into a Wal-Mart and buy the first card that will fit into their machine.
 
Not only will ATi be collecting royalties from two consoles, they've already made an almost-WGF 2.0-compliant part in the Xbox 360. You can bet that when unified parts come out ATi will have a very large advantage, since they have already made a first-generation part.
 
Obi_Kwiet said:
Actually a good 32" CRT would be much more expensive than a 32" HDTV. Also, the CRT would look 10x better. (We are talking about a PC CRT right?)

Nah man, I'm talking about TVs - hence HDTV in my post :p
 
"trinibwoy: Well considering I paid $500 for my GTX and the XT PE is still $475 I have no idea what you're carrying on about. The only fools would be the people still buying Ultras and PE's today..... "

Really?

The XT PE costs $475??

http://www.newegg.com/Product/Product.asp?Item=N82E16814102518

http://www.monarchcomputer.com/Merc...e=M&Product_Code=190668&Category_Code=ALL-ATI

http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=1476861&sku=C261-3025 R

Not to mention it is the fastest card for AGP. I love how people make stuff up just to fit their own argument. *sigh* The last of those three cards is about $150 cheaper than the cheapest GTXs you can find.
 
By the time ATI releases R520 (looks like November now), NV will have had ample time to create a refresh part... The fact that X850 Crossfire hasn't been released yet but is still slated for release is a bad sign for R520.
 
furocious said:
Not to mention it is the fastest card for AGP. I love how people make stuff up just to fit their own argument. *sigh* The last of those three cards is about $150 cheaper than the cheapest GTXs you can find.

Not my fault they didn't tell pricewatch or pricegrabber :eek: ....still doesn't refute the fact that the GTX is available now for less than the PE was at launch, so the bitching is unfounded. It's kinda silly to see people complain about pricing when the 7800GT is sure to be faster than last generation's 6800 Ultra.
 
tranCendenZ said:
By the time ATI releases R520 (looks like November now), NV will have had ample time to create a refresh part... The fact that X850 Crossfire hasn't been released yet but is still slated for release is a bad sign for R520.

These companies are usually ahead of themselves anyway. The following cores are in the physical works.

R520->R580->R600
G70 -> XXX ->G80

R520 and the future ones should have native CrossFire support, so they won't have to make master/slave cards for them, just the X line. And from people getting Xpress 200 test boards (including a person on this forum), there is every reason to believe the products are finalized and in production. As far as the R520, who the hell cares when it's released? It's been beaten to death. If it's fantastic, a lot of people will be eating their words; if it's a flop, they'll spit that R580 out as fast as they can; if it's the same performance, we'll be at another interesting war between the companies, defined by drivers and separated by how the cores handle certain games better than each other.
 
Shifra said:
These companies are usually ahead of themselves anyway. The following cores are in the physical works.

R520->R580->R600
G70 -> XXX ->G80

R520 and the future ones should have native CrossFire support, so they won't have to make master/slave cards for them, just the X line. And from people getting Xpress 200 test boards (including a person on this forum), there is every reason to believe the products are finalized and in production. As far as the R520, who the hell cares when it's released? It's been beaten to death. If it's fantastic, a lot of people will be eating their words; if it's a flop, they'll spit that R580 out as fast as they can; if it's the same performance, we'll be at another interesting war between the companies, defined by drivers and separated by how the cores handle certain games better than each other.

Who the hell cares when it's released? ATI, I hope. Who is going to spend the money to upgrade to R520 if most who'd spend $500+ on a vidcard have bought a 7800GTX? You might have another NV30 on your hands, where the product is delayed so much that the competitor's refresh comes out at the same time, putting your card in a bad spot.

Also, X850 Crossfire is truly a mystery to me this late in the game. Who in the heck is going to spend $700 to add a second X850XTPE and an ATI mobo when they could have sold their first X850XTPE and got a much faster and fully featured 7800GTX (which can be SLI'd) instead? They should just dump X850 Crossfire and release R520 as fast as possible, unless again R520 is having such problems that it won't be out anytime soon.
 
Cali3350 said:
Not only will ATi be collecting royalties from two consoles, they've already made an almost-WGF 2.0-compliant part in the Xbox 360. You can bet that when unified parts come out ATi will have a very large advantage, since they have already made a first-generation part.


The unified structure might be a large disadvantage too; it's still unproven. WGF 2.0 can utilize a traditional pipeline structure as well, so there is no advantage or disadvantage as of now. When it's released, it will come down to how efficient ATi's unified structure is with scheduling and shader performance.

Royalties won't be collected for the next-gen consoles for at least 2 more quarters.

BTW, there were at least 4 respins for the Xbox Xenos chip; ATi's chips seem to be extremely hard to produce, both their Xbox tech and R520.
 
razor1 said:
The unified structure might be a large disadvantage too; it's still unproven. WGF 2.0 can utilize a traditional pipeline structure as well, so there is no advantage or disadvantage as of now. When it's released, it will come down to how efficient ATi's unified structure is with scheduling and shader performance.

Royalties won't be collected for the next-gen consoles for at least 2 more quarters.

BTW, there were at least 4 respins for the Xbox Xenos chip; ATi's chips seem to be extremely hard to produce, both their Xbox tech and R520.

Nvidia stated they were moving to a unified architecture as well.
 
tornadotsunamilife said:
I was under the impression that the R600 and G80 would both be using a unified architecture

That's what VR-Zone said, but based on recent interviews with David Kirk it doesn't seem like Nvidia will be moving to a unified architecture with their next-generation part, considering its development has to be well underway right now.
 
Labrador said:
Sounds like nVidia already won this round before it started? I mean, let's say the R520 is kick-ass; once it comes out at retail, nVidia will see how it really performs, and a month later the 7800 Ultra will be out at like 550MHz core / 1400 memory, which should be enough to beat the R520?


I have heard that nVidia already has their card all ready, code named the "G80"... So when ATI gets Crossfire/R520 out, nVidia will bring out the G80. That's what I have been hearing, at least. ;) And a 7800 Ultra wouldn't be too bad either, hahah :rolleyes:
 
If that is the case and ATI manage to do it next gen, then surely they will be at an advantage? Yet we are seeing the exact opposite right now. It seems that the graphics card world is just swings and roundabouts ;)

However, ATI are doing another 're-spin' of the Crossfire chipset. All this effort in re-tape-outs and chipset spins could mean a few things: they are getting poor performance; they are making sure they don't pull an NV3x; or they're trying to get the best they can out of their hardware.

It would be great to see ATI pull it off with the R5x0 and shut up all the naysayers, but we'll have to wait a little longer to find out.
 
I'm starting to think ATI's success was a flash in the pan; they've been milking one basic core design (R300) for 3 years now, and they can't come up with anything to beat nVidia. I guess they don't realize that making a lightning-fast card, which R520 is reported to be, does not matter if you can't get it to market.
 
Cali3350 said:
Nvidia stated they were moving to a unified architecture as well.


Yes, but their chip will also be dependent on those terms, so we just have to wait and see what happens. Let's say nV's scheduling breaks down, or ATi's scheduling isn't that good and it still has shader performance comparable to the x800 vs GF 6's. Too many variables to say one will be better than the other.
 
trinibwoy said:
Yeah, and how much faster is a GTX compared to a TNT2? You expect to pay the same thing for much better hardware forever? Any tech that is pushing boundaries has a very high cost - do you complain that a 32" plasma HDTV is about 5 times more expensive than your 32" CRT?
A damn sight faster. That wasn't my point. Also keep in mind that when companies are mass producing something on a large scale, they develop process improvements to keep costs down. For example, look at the auto industry. When Henry Ford set up the first assembly line, it was a heck of a process improvement for both speeding up production and making it cheaper. Today, the same general idea is used, but there have been leaps and bounds in process improvements to speed production up even more and keep costs down. Can you imagine how much it would cost AM General to produce a Humvee using Mr. Ford's old assembly line methods? I'm sure the same can be said for manufacturing a 7800GTX using the methods used for making a TNT2.

Anyway, as I stated, I realize that ATI is having yield problems with their new manufacturing and fabrication process, and it's costing them both money and delays to market. But comparatively speaking, it's probably cheaper to make an R520 chip with their new process than with the Radeon 256 process from years ago.
 
razor1 said:
Yes, but their chip will also be dependent on those terms, so we just have to wait and see what happens. Let's say nV's scheduling breaks down, or ATi's scheduling isn't that good and it still has shader performance comparable to the x800 vs GF 6's. Too many variables to say one will be better than the other.

True, but ATI already has one full attempt to look at, and can only improve upon it.
 
As an average consumer, I can easily say that Nvidia has been at the top of the ranks ever since the release of the 6800GT and 6800 Ultra. For mid-range, they've been in charge as well with the 6600GT. At the low end it's the X300 by far, but that isn't a very compelling investment.

Now Nvidia has released the G70, which is putting them on top again - disappointing for ATi.
 
Cali3350 said:
True, but ATI already has one full attempt to look at, and can only improve upon it.


Well, not exactly; the Xbox chip isn't fully WGF 2.0 compatible - it's somewhere in the middle between DX9 and WGF 2.0. And games being designed for it will be programmed specifically for it, so issues might not show up as quickly as they would with a desktop counterpart.
 
There's something else: if ATI gets the bugs worked out of 90nm, it's likely they would get 2 generations, maybe 3, out of the same process. nVidia is going to have to go 90nm themselves at some point, and it's not going to be smooth either, because EVERYONE who has made a 90nm part has had problems.

IBM, AMD, Intel..... 90nm was a struggle for all of them. Now it's ATI/TSMC's turn. At some point, it's going to be nVidia's turn. 110nm is not going to take them as far.

ATi is losing big now, but going ahead with 90nm is likely going to pay off down the road. My guess is that about 1.5 years from now, ATi will be back on top. How far back on top nobody knows, but they have the forward-looking tech on their side.
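For what it's worth, the appeal of the shrink is easy to sketch with ideal-scaling arithmetic. This is a back-of-the-envelope sketch only; real processes never shrink this cleanly, and yield (the whole problem here) dominates the economics:

```python
# Idealized die-shrink arithmetic: die area scales with the square of the
# feature-size ratio. Ignores real-world design rules and yield effects.
def area_scale(old_nm: float, new_nm: float) -> float:
    """Relative die area after an ideal linear shrink."""
    return (new_nm / old_nm) ** 2

shrink = area_scale(110, 90)  # 110nm -> 90nm
print(f"Die area: {shrink:.0%} of the 110nm die")   # ~67%
print(f"Ideal dies per wafer: ~{1 / shrink:.2f}x")  # ~1.49x
```

In other words, the same design ideally yields roughly 1.5x as many candidate dies per wafer at 90nm, which is why eating the transition pain early can pay off later.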
 
Selecter said:
There's something else: if ATI gets the bugs worked out of 90nm, it's likely they would get 2 generations, maybe 3, out of the same process. nVidia is going to have to go 90nm themselves at some point, and it's not going to be smooth either, because EVERYONE who has made a 90nm part has had problems.

IBM, AMD, Intel..... 90nm was a struggle for all of them. Now it's ATI/TSMC's turn. At some point, it's going to be nVidia's turn. 110nm is not going to take them as far.

ATi is losing big now, but going ahead with 90nm is likely going to pay off down the road. My guess is that about 1.5 years from now, ATi will be back on top. How far back on top nobody knows, but they have the forward-looking tech on their side.


nV's mid-range chips for the GF 7 line are going to be on .09 micron, but it will probably be quite different from ATi/TSMC; nV might go with Sony's fabs instead, where they are already producing the PS3 chip on .09. Looking at the yields ATi has been having for the past generation, I don't think all their problems are from the fab process; some also come from the design of their chips and boards (the same issues that plagued the NV30). The problems multiply with the new fab process, though.
 
pr0nasaurus rex said:
I'm starting to think ATI's success was a flash in the pan; they've been milking one basic core design (R300) for 3 years now, and they can't come up with anything to beat nVidia. I guess they don't realize that making a lightning-fast card, which R520 is reported to be, does not matter if you can't get it to market.


What does it matter what it's based on if it works well? Are you saying the R400 cores were all trash? Perhaps I should apply your logic to CPUs as well and say that Intel using the basis of the P3 to power their best processor in years, and to lead them off the P4 next year, is a joke and they should start from scratch to look better.

Here's a fact that I'm willing to bet you won't like. The NV40 had to be new - do you know why? Because the NV30 WAS total garbage. ATI did not have this problem, so they carried their core over and overhauled the chip, making it use less wattage and giving it twice as much power as a 9800XT. Yes, some could say they were lazy in not including full support for DX9.0c; however, they said SM3.0 would not be needed for a while. Well, here we are one year later - is it needed? Hardly. Pure basic image quality between the two cards is identical, and framerate scores give leads to ATI on the high end, however small. So please, stop with these "riding tech" comments; they had a luxury that nVidia did not have with the NV30 but did have with the NV40. If you have any serious performance complaints about the R400 series, I'm sure they're totally unfounded.

Now we're at the cycle point again: the G70 (also known as the NV47) is, ta-da, an overhauled NV40. The R520 can safely be called a completely new core, so we're switching again, except ATI took on a fab jump, so they couldn't spit out a card nearly as fast. I personally have read absolutely nothing confirmed by ATI about this core - have you? So please stop second-guessing. If you have nothing nice to say, then don't say it.
 
R520 will be the basis for technology for years to come; an extra 4-6 months to wait for it won't kill you, now stop whining. In 3 years you'll be whining for a new core because they keep rehashing it, as they should to save money.

~Adam
 
CleanSlate said:
R520 will be the basis for technology for years to come; an extra 4-6 months to wait for it won't kill you, now stop whining. In 3 years you'll be whining for a new core because they keep rehashing it, as they should to save money.

~Adam

R600 is nothing like R520 :) They are moving it over to unified shaders.

What I heard was that R520 is, essentially, still an R300. They just packed in the features.
 
Do remember that 'unified shaders' is a REQUIREMENT of Longho...errr....'Vista'. Or, rather, DX10 (which they are no longer calling WGF 2.0 - it's DirectX again)

So, it's a safe bet that G80 and r600 *will* both be unified.
 
CleanSlate said:
R520 will be the basis for technology for years to come; an extra 4-6 months to wait for it won't kill you, now stop whining. In 3 years you'll be whining for a new core because they keep rehashing it, as they should to save money.

~Adam

Unless R520 is a unified architecture you're wrong. 4-6 months is a lifetime in 3D.
 
dderidex said:
Do remember that 'unified shaders' is a REQUIREMENT of Longho...errr....'Vista'. Or, rather, DX10 (which they are no longer calling WGF 2.0 - it's DirectX again)

So, it's a safe bet that G80 and r600 *will* both be unified.

No that's just the API. The hardware itself need not be unified.
 
I'm waiting for a few different technologies before I upgrade again... most notably Socket M2 and the R520 (or 7800 Ultra, whichever is faster).
 
DejaWiz said:
But comparitively speaking, it's probably cheaper to make an R520 chip with their new peocess than with the Radeon 256 process from years ago.

Manufacturing cost is one aspect. There are also R&D and operating costs to be recouped.
 
trinibwoy said:
No that's just the API. The hardware itself need not be unified.
I was making the leap of logic that a unified API will talk better with unified hardware (i.e., fewer driver issues, perhaps better performance) than having to translate back to separate pixel and vertex shader logic.
 
dderidex said:
I was making the leap of logic that a unified API will talk better with unified hardware (i.e., fewer driver issues, perhaps better performance) than having to translate back to separate pixel and vertex shader logic.

Nvidia doesn't seem to think so but time will tell.
 
trinibwoy said:
Manufacturing cost is one aspect. There are also R&D and operating costs to be recouped.
I did mention R&D costs in my previous posts. And I know it isn't free for ATI to run a business.... ;)

R&D is probably what costs a technology company the most money and time as a whole. Sure, it's not cheap to push R&D for new products, refine a fabrication process, and pay operating costs, but once it's all done and mass production gets underway, the cost is usually recouped fairly quickly - at a price to consumers, though.
 
Shifra said:
Here's a fact that I'm willing to bet you won't like. The NV40 had to be new - do you know why? Because the NV30 WAS total garbage. ATI did not have this problem, so they carried their core over and overhauled the chip, making it use less wattage and giving it twice as much power as a 9800XT. Yes, some could say they were lazy in not including full support for DX9.0c; however, they said SM3.0 would not be needed for a while. Well, here we are one year later - is it needed? Hardly. Pure basic image quality between the two cards is identical, and framerate scores give leads to ATI on the high end, however small. So please, stop with these "riding tech" comments; they had a luxury that nVidia did not have with the NV30 but did have with the NV40. If you have any serious performance complaints about the R400 series, I'm sure they're totally unfounded.

Now we're at the cycle point again: the G70 (also known as the NV47) is, ta-da, an overhauled NV40. The R520 can safely be called a completely new core, so we're switching again, except ATI took on a fab jump, so they couldn't spit out a card nearly as fast. I personally have read absolutely nothing confirmed by ATI about this core - have you? So please stop second-guessing. If you have nothing nice to say, then don't say it.


ATI didn't overhaul the R300 chip for the R420 to be more efficient and use less wattage. They dropped in low-k, which cut the power consumption, and added twice the pipes with minor modifications.

nV overhauled the FX core to come up with the NV40; the NV40 is directly related to the NV30.

ATi is sticking to old technology that, at this point, is obviously failing to produce proper yields. I am very surprised they are having yield issues with the R520, because I thought they would have taken the design/construction of the chip into account after the R420. Obviously they didn't, as it's worse now.

The R420s weren't "trash", which I concur with you on, but they are at the limits of the R300 core tech; that's plainly obvious from their yield issues and per-clock shader performance.

ATi, instead of innovating on a timely basis, sat back to rake in the profits. But those profits never showed up, because they had nothing to sell; they pushed the tech too far. If the R520 is a 16-pipe chip, as has been rumored, I don't see a whole lot of difference from the R420 performance-wise per clock. This means that when ATi does release the R520, they will have similar delivery problems, since they probably have to push the clocks very high to compete with nV. I'm pretty sure the R520 has 16 ROPs, so either it has more shader performance per clock than the GTX, or ATi is in deep trouble until at least the refresh.
 