Xbit Labs CS Source Benches

^eMpTy^ said:
And furthermore...the fact that the CS: Source benchmarks back up precisely what the beta benchmarks showed should completely end all discussion on the subject...not to mention the fact that the video cards used in both instances of benchmarking were not the same...

The 5900 still does horribly in DX9...nothing has changed...the two sets of benchmarks done by valve and later done on the beta completely agree with one another...

I see what you're trying to say, but basically the only evidence you have for your entire argument is the fact that YOU saw missing textures on YOUR computer...and I don't think that really means dick in the big picture...


Source benchmarks? Oh yeah, right, you mean some guy wandering about an unpopulated map, that's so indicative of real gameplay. Take note, HardOCP, for future benchmarks: get the developers to make a function that removes all enemy characters so you can roam about, and that gives you your gameplay numbers.

:rolleyes: :rolleyes: :rolleyes:

Now that my pissbreak's over I'm off to bed again, have fun replying to this with what you saw in your magical beta that no one else saw.
 
Chris_B said:
Source benchmarks? Oh yeah, right, you mean some guy wandering about an unpopulated map, that's so indicative of real gameplay. Take note, HardOCP, for future benchmarks: get the developers to make a function that removes all enemy characters so you can roam about, and that gives you your gameplay numbers.

:rolleyes: :rolleyes: :rolleyes:

Now that my pissbreak's over I'm off to bed again, have fun replying to this with what you saw in your magical beta that no one else saw.

Nobody else saw it? I don't see anyone jumping to your defense Chris...maybe you should start a poll?

So you tell me Chris...how should this be looked at? To hell with the VST...to hell with the beta HL2 benchmarks...to hell with the CS: Source benchmarks...ATi is going to OWN in HL2 no matter what? I mean seriously...take a deep breath and engage your brain....if you have one
 
^eMpTy^ said:
Nobody else saw it? I don't see anyone jumping to your defense Chris...maybe you should start a poll?

So you tell me Chris...how should this be looked at? To hell with the VST...to hell with the beta HL2 benchmarks...to hell with the CS: Source benchmarks...ATi is going to OWN in HL2 no matter what? I mean seriously...take a deep breath and engage your brain....if you have one


VST = video stress test, not a gaming benchmark; it means "dick all" for game performance, as you would say.

Beta HL2 = inadmissible, non-working horseshit; any site that benchmarks leaked "beta" software is pretty lame anyway.

CS: Source = one guy wandering about empty maps is not indicative of gameplay; any tool knows this, even you should know this. If you don't, well, that speaks volumes about you then, doesn't it.

Until Valve releases a proper GAME BENCHMARKING UTILITY you can quote numbers all you want, because they really mean sweet f**k all to me; the only legit BENCHMARK (not video stress test) that has been done by Valve had ATI on top.

Night night kid.
 
Chris_B said:
the only legit BENCHMARK (not video stress test) that has been done by Valve had ATI on top.

so...all the new benchmarks on current hardware are meaningless...and the old benchmarks, which were run on an older version of the engine...on LAST YEAR'S HARDWARE, are the ones that matter?

You should apply for a position at ATi's PR department...you're even more full of shit than they are...
 
^eMpTy^ said:
so...all the new benchmarks on current hardware are meaningless...and the old benchmarks, which were run on an older version of the engine...on LAST YEAR'S HARDWARE, are the ones that matter?

You should apply for a position at ATi's PR department...you're even more full of shit than they are...

Heh, so true. Wonder if ATI covers brain damage in its medical plan?
 
Needs more HL2 final benchmarks :rolleyes:
From the screenshots I've seen, CS:S doesn't even look all that good.
But go ahead, think that nvidia is going to maintain this lead, given that it's an ATI game.
Delude yourself that nvidia is king shit of fuck mountain.
 
Well, not to get in the middle of yet another ATI vs. Nvidia jihad, but as for shimmering with the 6800 GT, all I did was use RivaTuner to set the LOD bias to 0.3, and it looks just as good as, if not better than, my 9700 Pro did *shrug*

Still hate the inability to force triple buffering; that really annoys the piss out of me.
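
For anyone wondering what those two settings actually correspond to under the hood, here's a minimal Direct3D 9 sketch of my own (the function and variable names are made up for illustration, and this is not what RivaTuner itself does internally):

```cpp
// Rough sketch of the two settings discussed above, expressed as plain
// Direct3D 9 calls. Names are illustrative only.
#include <d3d9.h>

void ApplyTweaks(IDirect3DDevice9* device, D3DPRESENT_PARAMETERS* pp)
{
    // A positive mipmap LOD bias (here +0.3, the value mentioned above) makes
    // the card pick slightly less detailed mip levels sooner, which is what
    // hides the texture shimmering, at a small sharpness cost. The float is
    // passed as its raw DWORD bits, per the D3D9 convention.
    float lodBias = 0.3f;
    device->SetSamplerState(0, D3DSAMP_MIPMAPLODBIAS,
                            *reinterpret_cast<DWORD*>(&lodBias));

    // "Triple buffering" in D3D9 is just two back buffers plus the front
    // buffer. An application sets it in its present parameters before
    // creating or resetting the device; there is no control-panel switch to
    // force it from outside the app, which is exactly the complaint above.
    pp->BackBufferCount = 2;
}
```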
 
MFZ said:
It's the Fry's in Sacramento, feel free to call. It was listed as $599. If they tell you $499, tell me the guy's name so I can rush there and get it.


I have had a BFG 6800 Ultra OC on order at Fry's for over a month now and they STILL don't know when it will be in.

And the price IS $499. HERE

Fry's = www.outpost.com
 
Moloch said:
Needs more HL2 final benchmarks :rolleyes:
From the screenshots I've seen, CS:S doesn't even look all that good.
But go ahead, think that nvidia is going to maintain this lead, given that it's an ATI game.
Delude yourself that nvidia is king shit of fuck mountain.

The screenshots don't do it justice, trust me ;) (of course it still isn't any Doom 3, but it's definitely up there...somewhere...)
 
DemonDiablo said:
The screenshots don't do it justice, trust me ;) (of course it still isn't any Doom 3, but it's definitely up there...somewhere...)

HL2 is behind D3 in looking good..IMO!!!!
Shit, we all know Halo 2 will kick all ass and so will Unreal 3 :D so screw HL2. Halo and Unreal will own it, along with Quake 4.....hehehe. And CS sucks, play a better game. :D
 
Anyone have a mirror of the article? I can't get most of the pages.
 
gordon151 said:
And how would that explain why I get a much lower minimum framerate than a person with a less powerful CPU than mine but a much better graphics card?

Edit: Either way, like I said, graphics power is a factor in dogfight situations; the CPU isn't the only limiting factor.


Then you have issues with your network. There is a lot more to it than just the graphics card, and the graphics card is not the bottleneck ;)
 
^eMpTy^ said:
so...all the new benchmarks on current hardware are meaningless...and the old benchmarks, which were run on an older version of the engine...on LAST YEAR'S HARDWARE, are the ones that matter?

You should apply for a position at ATi's PR department...you're even more full of shit than they are...


You really are deluded. The benchmarks Valve ran were proper benchmarks; they were not from some half-assed leak that barely worked. Their benchmarks had the AI working, NPCs, enemies, etc. The leak did not. Is that so hard to understand? I'm really having trouble figuring out if you can work out that much.

A VST does not constitute a benchmark of in-game performance, more like pixel shaders, and that's about your lot.
 
Chris_B said:
You really are deluded. The benchmarks Valve ran were proper benchmarks; they were not from some half-assed leak that barely worked. Their benchmarks had the AI working, NPCs, enemies, etc. The leak did not. Is that so hard to understand? I'm really having trouble figuring out if you can work out that much.

A VST does not constitute a benchmark of in-game performance, more like pixel shaders, and that's about your lot.


Valve = ATI's whore.
Anything that Valve benchmarks will be in ATI's favor. :eek:

I think you need to get over yourself; no one is jumping in trying to help defend your so-called facts. My friend, you seem to be the deluded one here.
 
First off, ATI is in the lead so far in the VST, and since it's not a true game test, it should be taken with a grain of salt. But it is interesting why Valve released the VST when they did, right after Doom 3, even though HL2 wasn't ready yet. All Valve had to do was give out their results on different cards to quell the public's "are our cards fast enough to run HL2?" worries; they didn't need to release the HL2 VST.
 
I hate to sound like an NV !!!!!!, but as it is, with this information I can fairly declare Nvidia has this graphics battle won.

Think back a while: Nvidia wins with Doom 3, but then ATI card owners are hyped that the imminent release of HL2 will make them kings. However, thanks to some driver optimizations by Nvidia, the GeForces have come back in the 9th inning. By the way, where are those ATI OpenGL optimizations? Nvidia made changes for Direct3D. It is, in my opinion, ATi's time to back down and rethink their strategy.
 
[RIP]Zeus said:
Valve = ATI's whore.
Anything that Valve benchmarks will be in ATI's favor. :eek:

I think you need to get over yourself; no one is jumping in trying to help defend your so-called facts. My friend, you seem to be the deluded one here.


To be blunt, I don't need anyone to help "defend me"; my opinions are based on what I experienced with the HL2 leak. Empty and his little running buddy BurningGrave can cry all they want, but the fact remains: using leaked, buggy software as a benchmark is total BS. If it was so legitimate, why didn't HardOCP use it? Far as I know, one website used the HL2 leak as a benchmark, and these two are running around preaching that shit like it's gospel.
 
ElementK said:
I hate to sound like an NV !!!!!!, but as it is, with this information I can fairly declare Nvidia has this graphics battle won.

Think back a while: Nvidia wins with Doom 3, but then ATI card owners are hyped that the imminent release of HL2 will make them kings. However, thanks to some driver optimizations by Nvidia, the GeForces have come back in the 9th inning. By the way, where are those ATI OpenGL optimizations? Nvidia made changes for Direct3D. It is, in my opinion, ATi's time to back down and rethink their strategy.

Actually, the reason you don't see as many huge advancements in ATI's driver releases is the architecture.

When ATI designs a card, it is very smart at the transistor level. The drivers are just there to implement the features and to work in the OS. You can't really optimize that much.

When Nvidia designs a card, it's dumb at the transistor level. The drivers are what improve performance, and each release can be optimized further to keep improving performance.

*Note, this is just from my brother-in-law's attempts to write drivers for ATI and Nvidia cards in Linux.
 
Tim said:
Actually, the reason you don't see as many huge advancements in ATI's driver releases is the architecture.

When ATI designs a card, it is very smart at the transistor level. The drivers are just there to implement the features and to work in the OS. You can't really optimize that much.

When Nvidia designs a card, it's dumb at the transistor level. The drivers are what improve performance, and each release can be optimized further to keep improving performance.

*Note, this is just from my brother-in-law's attempts to write drivers for ATI and Nvidia cards in Linux.


Well, actually, take a look at the drivers from the first R300 releases until now.

There has been a fairly large improvement in the performance of the R300 line.

Transistors can't do any work without being told what to do; being smart at the transistor level means very little if the drivers can't execute the paths correctly or efficiently.

If you look from the first drivers for the 60 series until now, NV's core temperature has been dropping. The drivers are more efficient, and this translates to better core performance with less work. This is true of ATI's cards as well, but you really won't see a change since the drivers are already mature.
 
rancor said:
Well, actually, take a look at the drivers from the first R300 releases until now.

There has been a fairly large improvement in the performance of the R300 line.

Yes, but there have also been newer chips released in that timeframe.
 
Tim said:
Yes, but there have also been newer chips released in that timeframe.


The architecture hasn't changed that much. Take a ForceWare 50 series driver and use it on the GF 6 and it will perform like crap; take an early ATI Catalyst 4 series driver and use it on the X800 and it will still perform very close to what it's doing now.
 
[RIP]Zeus said:
HL2 is behind D3 in looking good..IMO!!!!
Shit, we all know Halo 2 will kick all ass and so will Unreal 3 :D so screw HL2. Halo and Unreal will own it, along with Quake 4.....hehehe. And CS sucks, play a better game. :D

....

You are just the lamest, hatenest poster I've ever seen. (Yes, I invent words.) :)

I've never seen anyone hate a subject more than the Klan!
You act like HL2 and ATi killed your cat!.......Together!
[RIP]Zeus: I hate cats!

And graphics-wise, Halo 2 doesn't look that much better than Halo 1. It's made to run on the Xbox.

And supporting Chris_B: he's right, doing benches on the HL2 beta (or any game beta) is worthless; the code isn't fully debugged and optimized. That's why they are betas.

And supporting both arguments: CS:S without people (real or simply game people) is a pretty silly benchmark. But it's still a glimpse of what MIGHT show up in real-world experience.
 
rancor said:
The architecture hasn't changed that much. Take a ForceWare 50 series driver and use it on the GF 6 and it will perform like crap; take an early ATI Catalyst 4 series driver and use it on the X800 and it will still perform very close to what it's doing now.

Wait, doesn't that prove my point?
 
Will be interesting to see some scores with the SM3 patch involved, unless they are only releasing a patch for Half-Life 2 and not CS:S.
 
Tim said:
Wait, doesn't that prove my point?


Not really; take an ATI Catalyst 3 series driver and use it on the X800 and you will see it gets crap performance too.

Since NV's architecture is completely different from its last lines, what your brother-in-law said about smart transistors doesn't apply here.

Take the GeForce 3 to GeForce 4 lines, for example: drivers improved the performance very little for the GeForce 4 line. And that's the same thing you are seeing with the R300s to the X800s.

Also, if you look at it, NV's pixel shader performance per clock is higher than ATI's by about 25%, and vertex shader performance is lower by about that amount too.
 
rancor said:
Not really; take an ATI Catalyst 3 series driver and use it on the X800 and you will see it gets crap performance too.

Since NV's architecture is completely different from its last lines, what your brother-in-law said about smart transistors doesn't apply here.

Take the GeForce 3 to GeForce 4 lines, for example: drivers improved the performance very little for the GeForce 4 line. And that's the same thing you are seeing with the R300s to the X800s.

Also, if you look at it, NV's pixel shader performance per clock is higher than ATI's by about 25%, and vertex shader performance is lower by about that amount too.

Yes, but the 3 series driver set was never meant to run R420 hardware, and probably isn't even going to recognize half of the features.

As for Nvidia's hardware, I think the vast performance improvements from the 62s and 63s to now really prove my point about Nvidia hardware needing more driver optimizations versus ATI's, whose performance has not varied much since the release of the X800 series.
 
Tim said:
Yes, but the 3 series driver set was never meant to run R420 hardware, and probably isn't even going to recognize half of the features.

As for Nvidia's hardware, I think the vast performance improvements from the 62s and 63s to now really prove my point about Nvidia hardware needing more driver optimizations versus ATI's, whose performance has not varied much since the release of the X800 series.


There was a big improvement from the 3 series to the 4 series for ATI as well. The 4.6s were the last drivers for the 9800s if I'm not mistaken, and the 2.0s were the start of the R300 line. From 2.0 to 4.7 there was a good increase in performance for the 9700 and 9800 line.

Building drivers on a completely new architecture takes time before they mature. The R300 came out, it had a lot of bugs, and it was better than the GF4s, but with a few tweaks/driver revisions (from 2.0 to 3.0) the 9700 got there and its performance was shown to be better than the 5900's.

So the 3 series of Catalysts were made for ATI's R300 line, and those will work just fine with the R400s; the only main feature they don't have is DX9b.
 
Tim said:
Yes, but the 3 series driver set was never meant to run R420 hardware, and probably isn't even going to recognize half of the features.

As for Nvidia's hardware, I think the vast performance improvements from the 62s and 63s to now really prove my point about Nvidia hardware needing more driver optimizations versus ATI's, whose performance has not varied much since the release of the X800 series.

If you look back to the earlier 6x.xx series drivers like the 61.77s and compare performance with trilinear and AF optimizations enabled and disabled, you'll see that for the most part the nVidia cards lost very little performance. The ATI cards, however, lose a lot more performance when their optimizations are disabled. The early ForceWare driver sets for the NV40 series had very little in the way of optimizations and the cards were pretty much running off raw horsepower, vs the X800s, which were already highly optimized, coming off the highly optimized R300 cores.

Really, "optimization" is just a loose term thrown around for lower image quality in exchange for higher fps. You can't really optimize something unless you lower its quality or change it in some fashion. ATI, for example, uses a ton of optimizations in the area of anisotropic filtering. They use different tricks to apply AF in the most noticeable areas and reduce it in others to keep performance high. It's why you see very little fps drop even at 16xAF for the X800s. The NV40s were losing fps more quickly with AF enabled because they were applying AF more heavily and not as adaptively to the situation. nVidia's AF quality is also a little better than ATI's. nVidia has since switched to an adaptive algorithm like ATI uses for AF filtering in the newer ForceWare beta drivers. It's the reason why the NV40s have now taken the lead in several D3D games that ATI once led at high resolutions with AF.

I look for nVidia's performance levels to continue to rise steadily while ATI's begins to top out. nVidia has a lot more to work with in these new NV40 cores, while ATI is still working off advanced R300 technology.

If you compare a 6800U to an X800XT PE without AA/AF enabled, the 6800U wins nearly every time no matter what the resolution is. It will come down to filtering optimizations now as to who reigns supreme in D3D. OpenGL and Linux, however, are completely dominated by nVidia and will continue to stay that way.
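
To make the "adaptive" idea a bit more concrete, here's a conceptual sketch of how an anisotropy degree can be chosen per pixel, roughly following the public EXT_texture_filter_anisotropic math. It is only an illustration with made-up names, not ATI's or nVidia's actual angle-dependent hardware algorithm:

```cpp
// Conceptual sketch: pick how many anisotropic samples a pixel deserves from
// the shape of its footprint in texture space. Names are illustrative only.
#include <algorithm>
#include <cmath>

// dudx, dvdx, dudy, dvdy: texture-coordinate derivatives across the pixel.
int ChooseAnisoDegree(float dudx, float dvdx, float dudy, float dvdy,
                      int maxAniso)        // e.g. 16 for "16xAF"
{
    // Footprint axis lengths in texel space.
    const float lenX  = std::sqrt(dudx * dudx + dvdx * dvdx);
    const float lenY  = std::sqrt(dudy * dudy + dvdy * dvdy);
    const float major = std::max(lenX, lenY);
    const float minor = std::max(std::min(lenX, lenY), 1e-6f);

    // Face-on surfaces have a ratio near 1 and fall back to cheap bi/trilinear
    // filtering; steeply angled surfaces get up to the full sample budget.
    const float ratio  = major / minor;
    const int   degree = static_cast<int>(std::ceil(ratio));
    return std::min(maxAniso, std::max(1, degree));
}
```

Roughly speaking, the vendor "optimizations" being argued about are extra heuristics layered on top of a selection like this, trimming the degree further at certain surface angles or mip transitions to save fillrate.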
 
A guy a few rooms down has an X800 Pro and I have a 6800GT. The X800 Pro runs higher framerates at times than my 6800GT, but my card is very consistent and in the end puts up a better overall score. Just my experience.

Concerning cards, the 6xxx series just continues the trend I have seen that the "slower"-clocked card is better: the 9700/9800 series was clocked slower than the 5800/5900 series and was overall faster, and now the 6800 series is overall barely faster than the "faster"-clocked X800 series. I do like Nvidia drivers more overall, though. If I ever had a problem with a game, it was with an ATI card, not Nvidia.
 
avatar_of_might said:
A guy a few rooms down has an X800 Pro and I have a 6800GT. The X800 Pro runs higher framerates at times than my 6800GT, but my card is very consistent and in the end puts up a better overall score. Just my experience.

Concerning cards, the 6xxx series just continues the trend I have seen that the "slower"-clocked card is better: the 9700/9800 series was clocked slower than the 5800/5900 series and was overall faster, and now the 6800 series is overall barely faster than the "faster"-clocked X800 series. I do like Nvidia drivers more overall, though. If I ever had a problem with a game, it was with an ATI card, not Nvidia.


Hey Avatar, long time no see, where have ya been?
 
rancor said:
Valve is probably forcing them to run Dx 8 since they will get killed in Dx 9
Yup, that is what they are doing. I believe they are forcing the DX8.1 path on all FX cards. I think it was AnandTech or so that I read it on. It makes sense though, cuz those FX cards would get raped by DX9.

Let me tell you all something that maybe you didn't know: Valve has made a secret Blood Pact with S3. The whole idea is to stir all you goofballs up into thinking your "new" purchases would lead you to superior performance, buuuuut nooo, it's going to run like crap unless you are running an "S3 ChromeSE, Silver Boxed Gold Plated Platinum Edition XT Ultra Premium Supreme Banshee Destroyer 5000 w/ 2.5TB onboard RAM**"...muahaha, I have mine on pre-order..watch as I blaze past you at [20 to the 5th power] FPS....you will all be pwnerated...ahahaha


**(fine print) only available with 64bit memory interface

[/Exit alternate reality]



On-topic: lately I have noticed ATI (disregard the Doom 3 drivers) has been a bit irregular in their releases, but their OGL performance has improved. I'm wondering if they don't have an uber driver in the works that they are going to drop when HL2 is released. We have been seeing a lot of technology come out from ATI in regard to their drivers, technology which is a great stepping stone to build greater driver performance upon. I'm thinking mostly of the AI....not to mention, do we have any numbers on the difference 3Dc or instancing makes compared to not using them?

But beyond that, isn't it wonderful that Nvidia can squeeze so much from their GPUs with a simple driver update? What's going on with that? Is it some marketing ploy? Maybe some padding this time around to make sure they don't get humped like they did with the FX series? Is it simply that ATI doesn't have the same experience with, umm, "optimisations".. or is it a lack on the driver team's side? Granted, last round ATI didn't have many performance issues needing to be fixed; it was more so bug fixes and watching NV try to catch up. The shoe's on the other foot this time around and I'm not really sure what to think.
 
It's not really a lack on ATI's side.

In any software production on a new architecture, you start off with what you know will work, then optimize. That's why drivers are generally slow when they are first released for a new core. If you don't have drivers that work to begin with, it's kinda like a chicken with its head cut off: you don't have a base, and without a base you have no idea where to go, how to get there, or when. Also, based on the architecture, there is only so much you can optimize. Example: when I made the first version of my engine's current render core, it was doing around 300k polys at 30 fps on a GF3 with everything going; two versions later I'm at 600k or so at 30 fps with everything going. I'm pretty much at the theoretical performance limits of today's GPUs. I know I can get 40% more performance out of it by using twice the VRAM, but there is no need to go that far, at least not yet; I would rather have the VRAM saved for other things.

This is the same with the Unreal engine: if you look at the original Unreal engine, you will see they were at the pinnacle of performance when it came out, and comparing that to the Unreal 3 engine, the Unreal 3 engine isn't faster, it just has more advanced features.

If you take a look at the earlier GF6 drivers, they were more CPU dependent than ATI's drivers; with the later revisions it has turned around, and ATI's drivers are now more CPU dependent.
 
The BFG 6800U at Fry's for $599 is the water-cooled one that comes with the waterblock on it. There are a couple of them in the Burbank Fry's too.

-MrD
 
Chris_B said:
To be blunt, I don't need anyone to help "defend me"; my opinions are based on what I experienced with the HL2 leak. Empty and his little running buddy BurningGrave can cry all they want, but the fact remains: using leaked, buggy software as a benchmark is total BS. If it was so legitimate, why didn't HardOCP use it? Far as I know, one website used the HL2 leak as a benchmark, and these two are running around preaching that shit like it's gospel.

lol...preaching like it's gospel? dude, you're unbelievable...

You've already stated that the holy grail of HL2 benchmarks was the one run by Valve last year...so let's list why that opinion makes you ignorant shall we?

1. They were run by Valve who was pushing a deal with vouchers going out with ATi cards at the time...therefore they were biased
2. The cards benchmarked were a completely different architecture and the benchmarks hardly matter anymore.
3. CS: Source benchmarks and alpha benchmarks of HL2 both show the 9800xt whipping the 5950U by a margin consistent with the original Valve benchmarks, thus these benchmarks seem to at least scale the same way as the official Valve benchmarks. So when those same tests show the 6800U keeping up with the x800xtpe, from a scaling perspective, there is no reason to believe that the benchmarks are, in any way, inaccurate.

I would obviously prefer to see benchmarks of the final game...but the fact of the matter is that with currently available information, there is more than ample evidence to support the argument that the 6800s will keep pace with the x800s in HL2...

On the other hand, there is absolutely ZERO evidence that the x800s will beat the 6800s by a wide margin.

So, by your own logic, you can hold one of two positions: the outcome is either unknown, or they are dead even.
 
Netrat33 said:
....

You are just the lamest, hatenest poster I've ever seen. (Yes, I invent words.) :)

I've never seen anyone hate a subject more than the Klan!
You act like HL2 and ATi killed your cat!.......Together!
[RIP]Zeus: I hate cats!

And graphics-wise, Halo 2 doesn't look that much better than Halo 1. It's made to run on the Xbox.

And supporting Chris_B: he's right, doing benches on the HL2 beta (or any game beta) is worthless; the code isn't fully debugged and optimized. That's why they are betas.

And supporting both arguments: CS:S without people (real or simply game people) is a pretty silly benchmark. But it's still a glimpse of what MIGHT show up in real-world experience.

Actually, Halo 2 has a completely redone engine that uses a lot of different mapping techniques. Of course it's still gonna resemble Halo...it's Halo..TWO (keyword: two). IMO, Halo 2 is much more impressive than HL2 considering what it's running on. HL2 just uses a lot of high-resolution textures on flat environments. Plus, that new leaked Halo 2 trailer got me hyped up x100. HL2 can lick on my nuts for all I care; it's all 'bout Halo 2.
 
rancor said:
There was a big improvement from the 3 series to the 4 series for ATI as well. The 4.6s were the last drivers for the 9800s if I'm not mistaken, and the 2.0s were the start of the R300 line. From 2.0 to 4.7 there was a good increase in performance for the 9700 and 9800 line.

Building drivers on a completely new architecture takes time before they mature. The R300 came out, it had a lot of bugs, and it was better than the GF4s, but with a few tweaks/driver revisions (from 2.0 to 3.0) the 9700 got there and its performance was shown to be better than the 5900's.

So the 3 series of Catalysts were made for ATI's R300 line, and those will work just fine with the R400s; the only main feature they don't have is DX9b.

I really wonder sometimes if you know what you're talking about. What exactly are you referring to with comments like "the 3 series of Catalysts were made for ATI's R300 line, and those will work just fine with the R400s"? Are you suggesting that the designs are so similar that they can share the same driver base?
 
^eMpTy^ said:
lol...preaching like it's gospel? dude, you're unbelievable...

You've already stated that the holy grail of HL2 benchmarks was the one run by Valve last year...so let's list why that opinion makes you ignorant shall we?

1. They were run by Valve who was pushing a deal with vouchers going out with ATi cards at the time...therefore they were biased
2. The cards benchmarked were a completely different architecture and the benchmarks hardly matter anymore.
3. CS: Source benchmarks and alpha benchmarks of HL2 both show the 9800xt whipping the 5950U by a margin consistent with the original Valve benchmarks, thus these benchmarks seem to at least scale the same way as the official Valve benchmarks. So when those same tests show the 6800U keeping up with the x800xtpe, from a scaling perspective, there is no reason to believe that the benchmarks are, in any way, inaccurate.

I would obviously prefer to see benchmarks of the final game...but the fact of the matter is that with currently available information, there is more than ample evidence to support the argument that the 6800s will keep pace with the x800s in HL2...

On the other hand, there is absolutely ZERO evidence that the x800s will beat the 6800s by a wide margin.

So, by your own logic, you can hold one of two positions: the outcome is either unknown, or they are dead even.

You know, the funny thing is that when the actual benchmarks are out, if the X800 XT PE does command a good lead, some people will fall back on the CS:S and leaked HL2 code benchmarks and ask why those results show a different scenario than the actual game benchmarks themselves (totally ignorant of the fact that the CS:S benchmarks are of a different game, and the leaked alpha benchmarks are from a point in the game's development when it wasn't even close to being halfway finished).
 
gordon151 said:
I really wonder sometimes if you know what you're talking about. What exactly are you referring to with comments like "the 3 series of Catalysts were made for ATI's R300 line, and those will work just fine with the R400s"? Are you suggesting that the designs are so similar that they can share the same driver base?


The Cat 3.4s were the first for the 9800s if I remember correctly, Gordon. The 4.5s were also the first drivers used for the X800s, but if you test the cards on previous drivers you will see that they still perform well. There isn't a huge, monumental leap of different technology here which would force a complete rewrite of the drivers.
 