When to expect DX10 cards?

Kahnvex

2[H]4U
Joined
Sep 5, 2001
Messages
3,297
So, when are we supposedly supposed to be seeing the DX10 cards and such? I'm going to be building a new PC from scratch this year, and I was thinking about waiting for those cards before I did. If the answer is December, well, I'll just get the latest and greatest whatever this summer, I s'pose.

Have any hard facts on DX10 come to light yet?
 
I've read that DX10 won't be available till the release of Vista. I also hear Vista won't be out till the end of the year or early 2007.

I could be wrong, but that's what I heard. I wouldn't expect DX10 cards to come out till November/December this year.
 
As for consumer availability: at Vista launch (whenever that actually is). That's according to Microsoft and the DX team, along with the developer relations people.

We've seen the Crytek demo, but I'm guessing (no knowledge) that that is a software render as opposed to hardware, played back. HLSL 4.0 is all I hear MS evangelizing: one shader to rule them all, so to speak.

I do have a fair amount of knowledge that FSX (shown at CES in alpha form, running DX9) will have both engines, DX9 and 10. It appears that MS will be using this as a showcase app initially.
 
Well, that's sort of a downer, but I suppose the card revisions that become available over the course of this year will be awesome cards.

I plan on plunking down the cash and going big this round, so I was hoping to get in on DX10. If nothing is going to be using DX10 until 2007 or later, then there really isn't any point in waiting, is there?

Thanks all!
 
The G80 (spring? summer?) won't be DX10. The R600 will be (a year from the R520).
 
johnnq said:
The G80 (spring? summer?) won't be DX10. The R600 will be (a year from the R520).

No, the G80 is supposedly going to be DirectX 10. I think you are thinking of the G71, due out in March.

IMO, waiting for DirectX 10 is pointless. Just go for a high-end Nvidia or ATI card. Games are not going to use DirectX 10 shaders for a while, and if by some chance they do support DirectX 10 shaders, I'm sure there are going to be DirectX 9.0 shaders in the game anyway. I think the only game coming out right away to use DirectX 10 shaders is UT2007.
 
I've heard no information relating to UT2007 and SM4.0 shaders. Any links?
 
phide said:
I've heard no information relating to UT2007 and SM4.0 shaders. Any links?

Well, I did take this challenge, and couldn't find anything. So I went to unrealtechnology.com, and it says DirectX 9.0 shaders. So maybe I was wrong, but I swear when the first screenshots came out, it said they were using Microsoft's new DirectX 10 shaders. Oh well, the information has changed since then. I take back that old statement until I can find some proof, which is looking like it can't be found. I was wrong, I'll admit it :D
 
The only engine that I know of that may use DX10 shaders right now is CryEngine 2. No idea when a game based on it will be out, though; I'm sure there will be more engines to follow. CryEngine 1 was an early adopter of DX9 shaders.
 
I'm looking at this the opposite way.
Once DX10 cards come out, they'll be so powerful that playing games like F.E.A.R. will be a walk in the park, even at ultra high settings.

That's all I care about, because it will take a long time for a GREAT DX10 game to be released (like what F.E.A.R. did for 9.0).
 
There really is no point in waiting, unless there is about to be an imminent release of next-gen technology.

I just purchased the ASUS X1800 XT from Newegg a few days ago, and in a few months Nvidia is coming out with its new card. Every six months something bigger and better is going to come along and blow everything else away. That's the way of this industry. If you want to wait for that next big thing, you will keep on waiting.

Just buy the best now and be happy.
 
I actually think this is a time to wait. Games may not make much use of DX10 functionality at first, but that's not the point. DX10 is a new API; i.e., it is not built "on top of" DX9, as previous versions basically have been. It thus runs with lower system overhead, and can achieve DX9 results in fewer API calls.

Furthermore, DX10 is also based on the existence of unified shaders, which R600/G80 are bringing to the table. The potential performance improvements with respect to overall efficiency and bottleneck removal are huge. This generation should also bring DDR4 memory, supposedly starting at about 2.6 GHz. This extra bandwidth will make HDR, soft shadows, etc. much more viable.

R580 and G71 are the conclusion of "this gen", kicked off with the 6 series and, to a degree, the X8 series. They are the fastest non-unified cards that will be made. In themselves they are incredibly powerful, and definitely desirable, but they do not represent the future for GPUs. They have already made the first steps towards unified architectures (ATI with "decoupled" ALUs, nV with "decoupled" ROPs), but PS and VS are still separate components. It will remain this way at least until Nov 22 for ATI, when their one-year exclusive Xenos agreement with Microsoft ends. I expect around the same time for nV also.

Thus you can see why I think sticking out this gen might be worthwhile. The GPU shift we are about to see is incredible.
 
While everything you say IS possible, there is no way of knowing until we get these cards into the market. Your whole post could also have been applied to the emergence of PCI-E about a year ago, and I do not see a whole big performance difference between PCI-E and AGP.

The bottom line is DX10 cards will not be available for at LEAST 6 months. And during the entire time DX9 was out, a game like FEAR (which is the culmination of DX9 technology in my opinion, well, minus SM 3.0) was not out until recently.

So if the OP wants to upgrade now, he should. If he waits six months for the new thing, six months after that there will be a new thing coming out, and so on, so what's the point, really?

EDIT: And I also want to point out that SM 3.0, which is a DX9 feature, was absent from most games and still is. Even once DX10 is released, it will be a long time before games are produced with SM 4.0 features.
 
Please reread my post and realise that:

*Unified Shader Architecture is coming with R600/G80
*DDR4 is coming with R600/G80

These alone will render the next step in VGA evolution quite impressive, to say the least. Only the first paragraph of my post was completely dedicated to DX10. My point is that the "next-gen" is not only a DirectX upgrade, but a complete architecture/memory evolution as well, and that is the reason I am holding off upgrading till then.

Edit: Also, a change in memory/architecture is not analogous to a simple bus change (AGP to PCI-E). As for PCI-E making little difference: in single-card situations, yes, but SLI or CrossFire, anyone? You need a duplex bus for those.
 
solideliquid said:
There really is no point in waiting, unless there is about to be an immenent release of next-gen technology.
And in a few months Nvidia is coming out with its new card. Every six months something bigger and better is going to come along and blow everything else away. Thats the way of this industry. If you want to wait for that next big thing, you will keep on waiting.
I'm not waiting for the best to get better; I was thinking of waiting for something compatible with a totally new tech tree. That's a little different than waiting for the clock-speed refresh every spring and winter. Trust me, I am more than aware of the 6-month product cycle, but I refuse to participate. That's why, when I plan on dropping a couple grand here this summer, I'd like to go big, so I don't have to mess with it for a while. My system now is just starting to show its age, and with what I'm planning this year, I should be good for a while. ;)
 
I can visualize the excitement and anticipation of the months before the first DX10 card comes out. People will crap their pants at the sight of every PCB picture and fuzzy cellphone spycam shot.
 
ManicOne said:
Please reread my post and realise that:

*Unified Shader Architecture is coming with R600/G80
*DDR4 is coming with R600/G80

These alone will render the next step in VGA evolution quite impressive, to say the least.
You're being a bit overawed by buzzwords. The first GDDR4 cards will be incrementally better than GDDR3, just as it was over DDR2. It's evolutionary, not revolutionary, and all its benefits won't be seen in the first year or two. The same is true for unified shaders.

G80/R600 will be large steps forward, but not much larger than average for a new GPU introduction. If you're looking for a card that'll last you for years, you're going to be sadly disappointed.

As for PCI-E making little difference; in single card situations yes, but SLI or CrossFire anyone?...
Neither of which was available when PCI-E was first introduced. Another example of new technology taking time to reap its benefits.
 
Masher, one thing I have is a passion for 3D, and a good grasp of the technology involved. "Buzzwords" only exist to characterise feature-sets; in some cases overblown, but in others justified. You may think of "unified architecture" and "DDR4" as gimmicky terms used to sell more VGA cards, but in this case it's not true. The potential shift in GPU power resulting from far greater bandwidth, combined with a completely flexible shader architecture, is, for want of a better term, fuckin' huge.

Furthermore I do not want a card that will last me for years, just until my next upgrade. What I want is technology I can appreciate, basically hardware done right.

If you want to understand more, just head on over to beyond3d.com; There is a stack you can learn there, enough to hurt your brain!
 
My 9800 Pro came out in what, '01, '02? The only thing it couldn't handle well was FEAR. I used it up till last Christmas, when I upgraded to an X850 Pro. It will be the last video upgrade I make, and probably the last upgrade, period: I just ordered 2 GB of RAM. I'm going to decommission it when Vista comes out and keep it as my DX9 gaming rig. Now, I do appreciate the jump in Doom 3 from 35 fps on high settings with 2x AA to 59 fps on ultra with 4x AA. I simply am not going to need any more power until Vista and DX10. My system lasted two years without any upgrades, and didn't even take much of a hit when Far Cry came out. There is no need to upgrade every six months. I didn't even build an extreme high-end system for the day; the whole thing, including a $160 case, was less than 1000 dollars! I upgraded to a gig of RAM about six months ago from 512, but that's it. Actually, I'm almost done with my third year, this March. I'd say an upgrade every 1 1/2 years and a new comp every 3 years would easily suffice.

Although... I loaded up a Newegg cart with everything I want for my new comp, minus RAM, motherboard, and CPU, and it came out to $1600. That's with 2 of the new Raptors, a Lian Li case, a modular 600W power supply, a sound card, a RAID controller, and a few other odds and ends. I could build an entire 939 system for less than $300.
 
ManicOne said:
You may think of "unified architecture" and "DDR4" as gimmicky terms used to sell more vga cards
I think you failed to _comprehend_ my post. They're both very important enhancements...but neither is going to totally invalidate prior hardware, nor result in cards light-years ahead of today's.

Furthermore I do not want a card that will last me for years, just until my next upgrade.
We're speaking about the OP's upgrade, not yours. And the cards sold today will be viable and last a person as long as those sold 9 months from now, when you factor in 9 months of additional use.

What I want is technology I can appreciate, basically hardware done right.
So current cards are "hardware done wrong"? :rolleyes:

The potential shift in gpu power resulting from far greater bandwidth..is for want of a better term, fuckin' huge.

GDDR4 chips are being sampled at 2.8-2.9 GHz. That works out to ~92 GB/s peak bandwidth, or a bit less than double today's best cards, with slightly _higher_ latency.
Or, in other words, pretty much identical to GDDR3 over GDDR2, and GDDR over VRAM. And the resultant performance boost was substantial, but not, to use your colorful English, "fuckin' huge".
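(For anyone wanting to check the math: the ~92 GB/s figure follows from the standard peak-bandwidth formula, assuming a 256-bit memory bus like the high-end cards of this generation use. A quick sketch in Python; the function name and the 256-bit assumption are mine, not from any spec sheet:)

```python
def peak_bandwidth_gb_s(effective_clock_ghz: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth: effective transfers/sec times bytes moved per transfer."""
    bytes_per_transfer = bus_width_bits / 8           # 256-bit bus -> 32 bytes per transfer
    return effective_clock_ghz * bytes_per_transfer   # GT/s * bytes = GB/s

# GDDR4 sampling at 2.9 GHz effective, on an assumed 256-bit bus
print(peak_bandwidth_gb_s(2.9, 256))  # -> 92.8, i.e. the "~92 GB/s" above
```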

The same goes for Unified Shaders. It's analogous to Hyperthreading, an architectural enhancement designed to use more of the processor's functional blocks simultaneously. Share the pipeline, and use pixel and vertex shaders interchangeably. But, just like HT, it will cause a slight performance drop in some current games, while boosting others. It won't be until games are developed and tuned for US that we see its _full_ benefit.

For the past decade, each new GPU from ATI and NVidia raised the performance bar some 30-100%. This cycle will be no different...on the high side of that range, but nothing to justify the orgasmic content of your last post.
 
Speechless. You can read my posts however you want. My posts have less "orgasmic content" than your subtle vitriol. You win. Whatever. I was trying to be nice, and I more than stand up for what I said. Anyone interested in this thread, read the posts for their intellectual content and not the personal attacks. Please.

Masher to me: "If you're looking for a card that'll last you for years, you're going to be sadly disapointed."

My response: "Furthermore I do not want a card that will last me for years, just until my next upgrade."

Masher: "We're speaking about the OP's upgrade, not yours."


And for anyone interested go here: http://www.hardforum.com/showthread.php?t=1008825
 
ManicOne said:
My posts have less "orgasmic content" than your subtle vitriol.
My vitriol is rarely subtle :p

In any case, you're still not addressing the primary point-- that USA and DDR4 are both, while worthwhile enhancements, not going to result in increases substantially above past GPU introductions. Waiting six months for a new card is forever in the graphics industry; unless you're scrounging pennies, there's no real reason to wait to upgrade.
 
ManicOne said:
As before, I disagree, but we will have to see, won't we?
If you want to lay a wager, I'll put a couple hundred bucks where my mouth is. We can let the [H] staff hold the funds till the cards come out ;)
 
Hmmm, I think it's a little early to start thinking about DX10-capable PCs, and not because DX10 might or might not be a long way off, but because of the additional wait for DX10-supported effects to actually be used in the mainstream game market once it's available.

Let's face it: Shader 2.0 and 3.0 effects (while improving scene quality greatly) did not start to appear until a while after the hardware was available, much less after DX supported them.

If waiting until maybe Q3/Q4 is going to be a stretch, simply to get a DX10-capable video card, I'd probably have to advise against it; trends have shown it's going to take a long while before those effects are used mainstream.

I'm not saying don't get these cards when they're first available, but specifically prolonging a potentially much-needed upgrade for up to what could be a year is probably not a smart move.

I'll probably get hardware that supports DX10 as soon as it's available, but then I upgrade my PC fairly often, and it will be an appropriate time by then.
 
Must. Have. Last. Word........

You can bet one of the mods will have the last word if that's all there is left to say in this thread. This constitutes your friendly "stay on topic" reminder. - DougLite
 