XP vs Vista: Redux @ [H]

Actually, on a 2-core / 4-thread CPU, while those threads are supposed to have their priority reduced, in my testbed the performance counters indicated that Vista was indexing, defragging, and doing all kinds of other background tasks without properly throttling. I've actually watched Enemy Territory's thread go from normal to low priority hundreds of times a minute as Vista can't make up its mind what it wants to do. These tests didn't suffer as much on a single-core, single-thread platform.
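
For anyone who wants to poke at this themselves, here's a minimal sketch — my illustration, not the actual counter setup used above — that polls a single thread's priority via the Win32 API and logs every change. It assumes you've grabbed the game's thread ID from something like Process Explorer:

```cpp
// priowatch.cpp - poll one thread's priority and log every change.
// Illustrative only; a real test rig would log timestamps too.
#include <windows.h>
#include <cstdio>
#include <cstdlib>

int main(int argc, char** argv) {
    if (argc < 2) {
        std::fprintf(stderr, "usage: priowatch <thread-id>\n");
        return 1;
    }
    DWORD tid = std::strtoul(argv[1], NULL, 10);
    HANDLE thread = OpenThread(THREAD_QUERY_INFORMATION, FALSE, tid);
    if (!thread) {
        std::fprintf(stderr, "OpenThread failed: %lu\n", GetLastError());
        return 1;
    }
    int last = THREAD_PRIORITY_ERROR_RETURN;
    for (;;) {
        int prio = GetThreadPriority(thread);  // 0 = normal, negative = lowered
        if (prio != last) {
            std::printf("priority now %d (was %d)\n", prio, last);
            last = prio;
        }
        Sleep(10);  // poll fast enough to catch "hundreds of times a minute"
    }
}
```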

Did you try running the tests with those background tasks disabled? If so, did gaming performance increase? If it did, I guess I have to eat my words. I had assumed Vista would have all that sorted out; hopefully future updates will solve that.
 
First, thanks for doing this test. It matters immensely to the community that you took the time to listen to our feedback and run more tests. It's why I continue to read your site, why I continue to click on your page ads, and why I continue to buy things from your sponsors (mostly Newegg).

This confirms what I already knew. NVIDIA's drivers sucked at launch, and they suck (less) now. ATI's drivers are closer to their XP counterparts, though CCC still sucks.

Ironically, one of my friends now works at NVIDIA on their Vista drivers. Even he admits that they suck.

Even more ironically, while I had an ATI card when you published the last article (x1300XT/x1600), now I have an NVIDIA card (8600GT). I have been running Vista since the day of the launch, and running betas since 2003.

Longhorn has essentially been done since mid-2006. There is no excuse for NVIDIA being so far behind.

WDDM (Vista's display driver model) is entirely new. The 3D pipeline is used for everything now, from video acceleration to desktop composition, WPF composition, and of course games. WDDM allows the pipeline to be shared (try running two 3D apps in XP at the same time), which means it can handle things like running WoW in a window and using Media Center at the same time, all in a 3D desktop. It's also more stable, because most of the display driver stays out of the kernel. Ever overclock your card too much and cause XP to bluescreen? More often than not, Vista will restart the driver and keep on chugging.

Unfortunately, with this increased stability and functionality comes additional overhead. Fortunately, it's not much. We're seeing about a 5-10% reduction in performance in Vista, which absolutely no one is going to care about in 9 months when ATI and NVIDIA release new hardware that's twice as fast.

No one is to blame here except NVIDIA. WDDM works really, really well, considering how new and different it is. ATI has drivers that are solid, and each release gets them a bit closer to XP in performance.

NVIDIA got caught with their pants down. Their drivers sucked in beta. At launch, they didn't even have features like LCD aspect ratio control. And they still aren't close to their XP counterparts in features or performance.

Vista is good enough that I use it as my primary OS every day on every computer I own. It's solid today, and SP1 will only make it better. Whether you like it or not, the world will switch to Vista in the next few years. You can run XP for now if you prefer, and there are good reasons to do so. But running 98 today makes you look like a luddite. Running XP in 5 years will be no different.
 
DeschutesCore - regarding your question about the DRM pipeline affecting gaming performance, there is the following Q&A from January:

http://tinyurl.com/2psb77

Will the video and audio content protection mechanisms affect gaming on the PC?

The Windows Vista content protection features were designed for commercial audiovisual content and are typically not used in game applications. A game author would have to specifically request these features for them to impact game performance.

Sounds very promising, but...

Does this complicate the process of writing graphics drivers?

Adding new functionality usually introduces new complexity. In this case, additional complexity is added to the graphics driver, but that complexity comes with the direct consumer benefit of new scenarios such as HD-DVD or Blu-Ray playback.

Increased complexity is nearly always accompanied by a tradeoff in performance. That said, I'm not going to cry over a dropped frame here or there. I've already made the transition from XP to Vista and don't plan on going back.
 
The "it's the drivers" excuse is old now.

Vista...it's like polishing a turd.
 
The "it's the drivers" excuse is old now.

Vista...it's like polishing a turd.

Say whatever you want, but Vista is only about 5 months old, and with ATI at least performance is close to XP.

Cry me a river about a 5% framerate difference. At the end of the day, Vista has graphics that multitask better and are more robust. If 5% difference is the cost for that, it's well worth it.
 
DeschutesCore - regarding your question about the DRM pipeline affecting gaming performance, there is the following Q&A from January:

http://tinyurl.com/2psb77

Sounds very promising, but...

Increased complexity is nearly always accompanied by a tradeoff in performance. That said, I'm not going to cry over a dropped frame here or there. I've already made the transition from XP to Vista and don't plan on going back.

A modern CPU can process more than two billion instructions per second. A couple of thousand instructions to check some DRM flags aren't going to amount to a hill of beans.
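
A back-of-the-envelope version of that claim (the 2,000-instruction and 60 fps figures are illustrative assumptions, not measurements):

```cpp
#include <cstdio>

int main() {
    // Assumed figures, per the post above: ~2 billion instructions/sec,
    // a couple thousand instructions of DRM flag checks per frame.
    const double ips = 2e9;
    const double drm_per_frame = 2000.0;
    const double fps = 60.0;

    double fraction = (drm_per_frame * fps) / ips;
    std::printf("DRM checks eat %.4f%% of the CPU\n", fraction * 100.0);
    // -> 0.0060% : a rounding error, not a framerate killer.
    return 0;
}
```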

PUMA was already in XP; it was just called "Secure Audio Path". WM-DRM was also in XP (and 98/ME/2000, if you have the WM format runtime installed). XP even supports the image constraint token and HDCP with newer graphics drivers.

So, what's left? Protected Video Path. It's the boogie man for all Vista DRM FUD. But, unless you're using applications that invoke PVP, it doesn't do a hell of a lot of anything. It doesn't even work if you're running unsigned video drivers, which many of us are (e.g. on my notebook).

Read my lips: Vista's lower 3D performance comes down to two reasons:
  • Overhead from moving the drivers out of kernelspace and changing certain things in the driver framework (WDDM)
  • Relatively immature graphics drivers because of said new driver framework

It is NOT because of any of the following reasons:
  • Aero Glass. The DWM (the composition engine that draws the graphics effects) is disabled when you start a full-screen Direct3D application. There is no overhead because it's not doing anything. Disabling Aero Glass will do nothing for fullscreen games, though it may make games faster in windowed mode (see the sketch after this list).
  • Background processes. Unless you're low on memory, these don't do much of anything. SearchIndexer is the worst culprit, but it (and all of the other background tasks, like the defragger) is more disk-intensive than anything else. You may see load times increase, but just like a Raptor doesn't make your games get higher framerates, a little disk activity isn't going to make them significantly slower. Unless, of course, you're low on memory - in which case performance is going to suck anyway.
  • DRM. None of the DRM systems in Vista take up a significant number of cycles. PVP doesn't affect your framerates unless your game happens to use it. Which none do.
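
On the Aero point above, you don't have to take anyone's word for it. Here's a minimal sketch using the DWM API that shipped with Vista; it reports whether composition is currently running, and composition drops out system-wide when a fullscreen exclusive Direct3D app starts:

```cpp
// dwmcheck.cpp - report whether the desktop is currently being composed.
#include <windows.h>
#include <dwmapi.h>
#include <cstdio>
#pragma comment(lib, "dwmapi.lib")

int main() {
    BOOL composing = FALSE;
    HRESULT hr = DwmIsCompositionEnabled(&composing);
    if (FAILED(hr)) {
        std::printf("DWM query failed (pre-Vista?): 0x%08lX\n", (unsigned long)hr);
        return 1;
    }
    std::printf("Desktop composition (Aero) is %s\n", composing ? "ON" : "OFF");
    return 0;
}
```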

Vista isn't going to eat your children, it's not going to kill PC gaming, and it's not the second coming of Jesus. Vista is just the new version of Windows. Like XP, games run slightly slower. Like XP, it requires more memory. And, like XP, it has more background tasks.
 
I recall playing around with NT4.5 and thinking I wished they'd made it more DOS-compatible, because it was so much more stable than Win98 at the time. Then Win2k came around, and IMO it's still the best operating system from MS. WinXP was the bloated POS update we didn't need, frankly (IMO).
My point is that Vista is a bit like NT4.5: at the time there was no point getting it, but the sequel became an excellent piece of software that had me dump Win98 in a flash regardless of the limited DOS support. It is possible that by SP2 there will be no point in hanging on with XP, but who knows.

* edited. Thanks.
 
First off, thanks to [H] for revisiting this. The added content does paint a better picture. I think a lot of you are missing what is going on here, though. Vista has changed the way it uses video. The way XP does video is vastly different than the way Vista does. I am almost 100% sure the video drivers for Vista were either a complete rewrite, or at least mostly rewritten, for vendors. One fact is that Microsoft has changed the way and the standard of video driver support in Vista.

This is my speculation, but I fully believe that Vista will come along as driver support matures, for not only video but everything. Remember, Microsoft sets the rules and the way drivers work with the OS; it's up to the vendors to follow that.
 
bsoft said:

Great post!

Cry me a river about a 5% framerate difference. At the end of the day, Vista has graphics that multitask better and are more robust. If 5% difference is the cost for that, it's well worth it.
Emphasis on the "if" there, I might add. Some published test results don't necessarily belie the fact that more than just a few people report a great gaming experience under Vista. And the results aren't necessarily applicable right across the board, on all hardware and in all games. We're presented with a suggestion that 'worse' performance in games has been measured and shown, but how does that explain the following:

I'm currently giving F.E.A.R. a re-run, because I've fitted newer hardware which allows me to up the detail levels in that game. I've run it under both XP and Vista on that hardware. In Vista I'm actually able to tweak the detail levels a weensy, tiny tad higher than I can under XP. Does that somehow 'prove' that Vista is a better performance OS? Not at all. Too restricted a test.

But hey? At least it's a more meaningful approach than the "I can measure a difference so the gaming must suck more!" one. If somebody wants to convince me that Vista doesn't provide a 'real world' gaming experience which is as good (because of framerate differences), then please show me where those measured differences have had an impact. Have I had to tweak detail settings downward to stop my machine from shitting itself? Can I no longer use 16x AA, for example, or perhaps I've had to alter a 'shadows' setting to make it 'Medium' rather than 'High' detail? And pffft! to those small average framerate differences. What's been the impact on my minimum framerates? That's where the real impact on playability comes into it.

If those things aren't being demonstrated then the conclusions can only ever be drawn in relation to benchmarking rather than to 'gaming'. They're not 'real world' gaming tests and results unless they show me 'real world' impacts on my gaming activity.

I'm not dissin' Jason or [H]ard specifically here. This is something very common in the review/hardware scene. But it's still misleading and wrong. If the testing really only relates to competitive benchmarking rather than to the gaming experience then the comments, conclusions and intimations should be restricted to the competitive benchmarking realm, rather than extended to realms which haven't really meaningfully been assessed!



Sorry for the revisiting of this, by the way. I mentioned earlier that I was sorta 'done' with this, and have reneged. But I've privately received criticism that I'm somehow belabouring the point that Vista will get 'better' in the future, and that's absolutely NOT what I'm saying at all. Instead, I'm arguing that there's not really anything 'wrong' with framerates right now. I'm suggesting that the minor discrepancies don't really impact on gaming. For actual gaming, Vista is as good right now as it really needs to be. (With respect to framerates and their impact, anyway.)

Similar issues and concerns were raised by me in the previous thread. They weren't addressed then. Only the "You didn't test ATi gear!" complaints were really responded to.

I'd dearly love to see response to the points made, though ;)
 
The "it's the drivers" excuse is old now.

Vista...it's like polishing a turd.
So let's use a legacy OS instead. Reminds me so much of "OMG, I can't run 16-bit code anymore, so I'll just keep using Windows 98!" Not that I agree with you one bit; however, the comparison you give isn't very good.

XP = old, Vista = new. XP = Legacy.
 
Did you try running the tests with those background tasks disabled? If so, did gaming performance increase? If it did, I guess I have to eat my words. I had assumed Vista would have all that sorted out; hopefully future updates will solve that.

As the tests stand now, this is with defrag scheduled to run at 1 A.M. every night, yet it pops up at any time of the day anyway. I've read a fair amount of the available documentation on the defrag process, and have a pretty good understanding of how the defrag heuristic works, having written a defragmenter for internal use on Windows 98-2000. I know it's drastically different now, with no ETA and no visual feedback on its status.

The defrag UI is a running process and is using cycles AND performing disk access. The entire point of this test protocol is to test networked 3D rendering for server farms. The test monitors all the active performance counters, and XP by far demolishes Vista in network access and thread prioritization. Granted, there are MANY more processes in Vista than XP by default.

Vista has its high points. Photo import and sorting have improved. The breadcrumb has become my preferred method of navigation. But its performance misses the mark for me in many ways.
 
But hey? At least it's a more meaningful approach than the "I can measure a difference so the gaming must suck more!" one. If somebody wants to convince me that Vista doesn't provide a 'real world' gaming experience which is as good (because of framerate differences), then please show me where those measured differences have had an impact. Have I had to tweak detail settings downward to stop my machine from shitting itself? Can I no longer use 16x AA, for example, or perhaps I've had to alter a 'shadows' setting to make it 'Medium' rather than 'High' detail? And pffft! to those small average framerate differences. What's been the impact on my minimum framerates? That's where the real impact on playability comes into it.

If those things aren't being demonstrated then the conclusions can only ever be drawn in relation to benchmarking rather than to 'gaming'. They're not 'real world' gaming tests and results unless they show me 'real world' impacts on my gaming activity.

I'm not dissin' Jason or [H]ard specifically here. This is something very common in the review/hardware scene. But it's still misleading and wrong. If the testing really only relates to competitive benchmarking rather than to the gaming experience then the comments, conclusions and intimations should be restricted to the competitive benchmarking realm, rather than extended to realms which haven't really meaningfully been assessed!

Thanks for your thoughts.

I think we have different definitions of what "real-world" gameplay is.

I'll tell you what other publications do - they set their games to run on a time-demo. They run 3DMark. And then they make their decision on how good the machine is. That's what we call "canned" benchmarks. What we do is that a person (a real, live person) sits down at a computer and actually plays the games. He goes for the sweet spot on the graphics settings so that we get an optimal gaming experience, then we record our framerate while actually playing the game - not a time demo or the like.

Now, what you're saying absolutely has merit. Will a person notice a difference between 55 and 59fps as a net effect on their gameplay experience? Not a chance. That said, we were trying to run a scientific experiment here. At some point, every game will be playable on a given hardware configuration. Determining what that point is happens to be very important, but it's also arbitrary and very subjective. In our example of 55 vs. 59fps, the graphics settings probably will not change, which I imagine is the point you're trying to make.

My point is that we had to have a "control" set in our experiment. If you're looking for what the playable settings were for Vista in this article - that's easy - look at the screenshots. All of the games were tested on Vista first to determine what the playable settings were. Our control set was the Vista settings. We mirrored the settings in XP and ran the same framerate capture. That was the "experimental" set. The test was done blind and I didn't know the results until I started putting the article together.

Frankly, it would be a pretty crappy experiment if we kept changing the variables that we were testing under by messing with the graphics settings. The only way to be scientific was to associate numbers with our experience. In some cases, they turned out to be more statistically significant than others.
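
As an illustration of what "statistically significant" can mean for framerate captures, here's a minimal sketch — with made-up sample numbers, not the actual captures — computing Welch's t statistic for two runs:

```cpp
// welch.cpp - Welch's t statistic for two framerate captures.
// The samples below are invented for illustration.
#include <cmath>
#include <cstdio>
#include <vector>

static double mean(const std::vector<double>& x) {
    double s = 0.0;
    for (size_t i = 0; i < x.size(); ++i) s += x[i];
    return s / x.size();
}

static double sampleVar(const std::vector<double>& x, double m) {
    double s = 0.0;
    for (size_t i = 0; i < x.size(); ++i) s += (x[i] - m) * (x[i] - m);
    return s / (x.size() - 1);
}

int main() {
    double xpData[]    = {61, 59, 63, 58, 60, 62};
    double vistaData[] = {55, 54, 57, 53, 56, 55};
    std::vector<double> xp(xpData, xpData + 6);
    std::vector<double> vista(vistaData, vistaData + 6);

    double mx = mean(xp), mv = mean(vista);
    double t = (mx - mv) / std::sqrt(sampleVar(xp, mx) / xp.size() +
                                     sampleVar(vista, mv) / vista.size());
    // The bigger |t| is, the less likely the gap is just run-to-run noise.
    std::printf("XP avg %.1f, Vista avg %.1f, Welch t = %.2f\n", mx, mv, t);
    return 0;
}
```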

Also, when you start getting into graphics settings, it becomes EXTREMELY subjective and individual-specific as you start to place values on which graphics settings are more important. Do you prefer higher resolution, more AA, higher terrain textures, etc.? I think the article would have been of little value to anyone if we sat niggling back and forth about whether we should have bumped the AA down to 2X so that we could get to 12x10 from 10x7.

For some of the games, I think the net effect on the gameplay experience was well exemplified in the gross framerate discrepancy. While you may not be able to tell the difference between 55 and 59fps, I'd wager you could pick out 35 from 55fps. That's a tangible real-world effect.

If you think that looking at the minimum framerate is more indicative of the gaming experience than an average framerate over several minutes of gameplay, then so be it. Personally, I think picking out a single data point (1 second) over several minutes in order to determine the quality of gameplay is a bit ludicrous. I'll tell you what, though - I still have alllllll the framerate data. I'll see if I can put together a table for you.
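
For anyone who wants to grind captures the same way, a minimal sketch — assuming a plain-text log with one fps sample per line, which is an assumed format rather than what our capture tool necessarily writes:

```cpp
// fpstats.cpp - average and minimum framerate from a capture log.
#include <algorithm>
#include <fstream>
#include <iostream>
#include <vector>

int main(int argc, char** argv) {
    std::ifstream in(argc > 1 ? argv[1] : "capture.txt");
    std::vector<double> fps;
    double sample;
    while (in >> sample) fps.push_back(sample);
    if (fps.empty()) {
        std::cerr << "no samples read\n";
        return 1;
    }
    double sum = 0.0;
    for (size_t i = 0; i < fps.size(); ++i) sum += fps[i];
    double minFps = *std::min_element(fps.begin(), fps.end());
    std::cout << fps.size() << " samples, avg " << sum / fps.size()
              << " fps, min " << minFps << " fps\n";
    return 0;
}
```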
 
Quick and dirty rundown of the minimum framerates for all of the captures. Some of them aren't there because they weren't retested with the 158's. I'm not sure what this tells you that it doesn't tell me, but here you go.

 
People keep saying that XP is old, that XP is legacy, and that nothing is wrong with Vista and we all should be using it, but the question that I ask is: why? People moved from 98 to XP (or at least I did) simply because of stability. The Blue Screen of Death was the reason for the move. In XP, the only time you get those is with driver problems, and if you are complaining that your overclocked hardware is causing your XP to bluescreen, then maybe you need to back down a little.
The biggest 'feature' of Vista that people seem to herald is the capability to play HD content. It's not that XP can't play a movie at that resolution or that it can't pass a high-quality signal to your sound card; it's because of Hollywood. In order to gain the ability to play such 'content', Vista forces your programs to give up direct access to hardware. Even on Creative's own message boards, they have stated that DirectSound does not exist in Vista. Instead it must be emulated and passed through a Windows Audio Session, and because of that, EAX effects are no longer possible. The ramifications of this could range from those effects not being played to preventing the whole game or program from running, according to Creative. The same also applies to the echo cancellation used in VOIP applications, since applications can no longer get the signal going out to your speakers to filter it out of the incoming signal coming from your mic.

The only other big 'feature' reserved for Vista is DirectX 10; however, there is already a group working on getting that to work on XP, and they already have an alpha version that is capable of running the DX10 demos included with the developer SDK. They are planning on having a full-featured version ready in time for the Halo 2 PC release.

So, with all the features that you lose, for features that you could already get on XP, why should you put up with decreased performance just to appease Hollywood?
 
I guess you're too young to have been around for the early days of XP, then, when the same sort of nonsense was filling page space all over the place? When the titillation of the benchmark-obsessed was being pandered to whilst the people who simply made the change and got on with things were happily enjoying their gaming?


Same deal all over again, only this time it's Vista being decried and XP getting carried on about as if it's all of a sudden miraculously divine. Last time around Windows 98 was the Holy Grail, and XP the plaything of the devil.

Right now:

  • If benchmarks are more important to you than gaming (ie if your main concern is the framerate readout rather than the gameplay experience) then stick with XP.
  • If 3D audio in games is important to you then stick with XP.
  • If neither of the above is relevant to your gaming activity, then the differences are so slight that you're not even going to have to turn your detail levels down!

Lot of hot air being expended about nothing! But I've yet to see an article like this'n admit the fact!

Hell, wish I was too young to have been around! :) Actually, at the time I went from 98 to XP and was grateful for it; I never had anything but headaches with 98. My point is that after all this time and all the upgrades, you'd think they'd learn from past mistakes. After all, they have millions of our hard-earned dollars to pay for R&D.
 
It's all very well to say Vista is getting faster, due to better drivers, but it is still slower than XP.

Also, when can you find a definitive list of DX10-supporting video cards? So far all I know of are the 8800 and ATI's new hot & expensive card.

I emailed the GFW website but they couldn't be bothered to reply. :mad:
 
Maybe what is needed is to look away from ATI and NVIDIA for a moment. We could test some other video cards with WDDM for Vista on some games like World of Warcraft and other less demanding games (or just laugh at the 800x600 resolution). Say Intel's X3000, X3100, and GMA950 video cards, and the S3 Chrome S27. Maybe even the S3 Chrome S27 in SLI? Then we could see if Intel and S3 also have framerate issues, or if it's just an AMD/NVIDIA issue.
 
It's all very well to say Vista is getting faster, due to better drivers, but it is still slower than XP.

Also, when can you find a definitive list of DX10-supporting video cards? So far all I know of are the 8800 and ATI's new hot & expensive card.

I emailed the GFW website but they couldn't be bothered to reply. :mad:

Congrats, you've made a definitive list.

Vista actually only requires DX9 capable cards, but the DX10 games, to run in DX10 mode, will obviously require DX10 cards.
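
If you'd rather probe a card than hunt for a list, here's a minimal sketch (assuming the DirectX SDK headers are installed) that simply tries to create a hardware Direct3D 10 device:

```cpp
// dx10check.cpp - does this card/driver combo expose Direct3D 10?
#include <d3d10.h>
#include <cstdio>
#pragma comment(lib, "d3d10.lib")

int main() {
    ID3D10Device* device = NULL;
    HRESULT hr = D3D10CreateDevice(NULL,  // default adapter
                                   D3D10_DRIVER_TYPE_HARDWARE,
                                   NULL, 0, D3D10_SDK_VERSION,
                                   &device);
    if (SUCCEEDED(hr)) {
        std::printf("DX10 hardware device created - this card makes the list.\n");
        device->Release();
    } else {
        std::printf("No DX10 hardware device (hr=0x%08lX).\n", (unsigned long)hr);
    }
    return 0;
}
```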
 
Absolutely appreciate and applaud you for the time taken to make those comments, Jason, and for the concessions made there. You're quite correct. What I'm getting at is whether or not the settings will need to change, and the fact that to the end-user it's all gonna be somewhat of an irrelevance if they don't.

I'd like to disagree with your comments about whether or not it'd be 'scientific'.

Sure, it'd be 'bad science' to chop and change about all over the place when testing, comparing unlike data sets. But it's every bit as much 'bad science' to take a very restricted set of measurements and run with conclusions drawn from them. It's just 'bad science' in a different way, is all. And it's perfectly possible to conduct a valid and sound study regarding the impact on playability and the need (or lack thereof) to adjust detail settings. The only 'subjective' component of such a study would be the need to settle upon baseline figures to represent 'acceptable framerate' levels within the study. That wouldn't be difficult, because there seems to me to be broad general consensus that a game becomes 'unplayable' when it dips down below a 45fps-or-thereabouts average, or suffers minimums of under about 25fps during those intense moments. Sure, there are gonna be disagreements from particular individuals, but in general, if figures around there or perhaps a wee tad higher were settled upon as 'baselines', it'd be acceptable enough to the majority of discerning folk.
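
To make that baseline idea concrete, a tiny sketch using the 45/25 figures floated above (the sample numbers are invented for illustration):

```cpp
// playable.cpp - classify a capture against an agreed playability baseline.
#include <cstdio>

struct Capture {
    double avgFps;
    double minFps;
};

// Baselines per the rough consensus suggested above: ~45 fps average,
// ~25 fps minimum during the intense moments.
static bool playable(const Capture& c) {
    return c.avgFps >= 45.0 && c.minFps >= 25.0;
}

int main() {
    Capture xp    = {59.0, 31.0};  // invented numbers for illustration
    Capture vista = {55.0, 28.0};
    std::printf("XP: %s, Vista: %s\n",
                playable(xp) ? "playable" : "not playable",
                playable(vista) ? "playable" : "not playable");
    return 0;
}
```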


A decent study of that type is gonna require a helluva lot more measurements to be taken, though. It's a bigger commitment. It's not necessarily what a writer is gonna want to do, have time to do, or get paid enough to do. That's why I haven't been suggesting that it's what a writer should do. Instead, I've merely been arguing for acknowledgement of the constraints and of their import toward conclusions. But hey? Such a study would be of little benefit to anybody? Wow! To the end-user sitting there at the PC, wanting to actually play the games and wondering how far the detail setting can be cranked up on the hardware he has, it's about the most important consideration of all, isn't it?


Thanks for the table of minimums too, by the way. They tell me a couple of things:

  • Firstly, that nothing alarming gets introduced when the data set presented is expanded to include that measurement also. Acceptably playable framerates are being maintained, and major problems in the way of stutters shouldn't be getting introduced.
  • Secondly, that I'm damned glad I'm not a NFS:C fan! That particular game seems to buck the trend.
heh heh....


Gengar said:
People moved from 98 to XP (or at least I did) simply because of stability.
That they did. AFTER all this same "Woe is me, it's slower!" stuff died down. Took near on 12 months before the migration became a vast wave. It's OK to look back now and suggest the people rushed to adopt XP because of stability, but that isn't really what was happening during the comparable period in its lifespan ;)
 
meanmodda[H];1031148655 said:
I recall playing around with NT5 and thinking I wished they'd made it more DOS-compatible, because it was so much more stable than Win98 at the time. Then Win2k came around, and IMO it's still the best operating system from MS. WinXP was the bloated POS update we didn't need, frankly (IMO).
My point is that Vista is a bit like NT5: at the time there was no point getting it, but the sequel became an excellent piece of software that had me dump Win98 in a flash regardless of the limited DOS support. It is possible that by SP2 there will be no point in hanging on with XP, but who knows.
Say what? Windows 2000 *is* NT 5.0
 
Absolutely appreciate and applaud you for the time taken to make those comments, Jason, and for the concessions made there.

I'd like to disagree with your comments about whether or not it'd be 'scientific'.

Sure, it'd be 'bad science' to chop and change about all over the place when testing, comparing unlike data sets. But it's every bit as much 'bad science' to take a very restricted set of measurements and run with conclusions drawn from them. It's just 'bad science' in a different way, is all. And it's perfectly possible to conduct a valid and sound study regarding the impact on playability and the need (or lack thereof) to adjust detail settings. The only 'subjective' component of such a study would be the need to settle upon baseline figures to represent 'acceptable framerate' levels within the study. That wouldn't be difficult, because there seems to me to be broad general consensus that a game becomes 'unplayable' when it dips down below a 45fps-or-thereabouts average, or suffers minimums of under about 25fps during those intense moments. Sure, there are gonna be disagreements from particular individuals, but in general, if figures around there or perhaps a wee tad higher were settled upon as 'baselines', it'd be acceptable enough to the majority of discerning folk.

No problem - happy to clear things up.

What conclusions did we make that you don't agree with? I mentioned that we jumped the gun a bit in the first article, but in the second article, we deliberately stepped away from saying anything concrete. The only "conclusion" that we came to, of sorts, is that we said that there is a very strong overall trend in the data that shows Vista consistently underperforming given the same hardware and test scenarios. I also said that the previous NVIDIA driver set was very disappointing in its performance and we saw a modest increase in performance, generally across the board, with the 158's. Did I make some other grandiose claim that I'm not aware of?
 
Say what? Windows 2000 *is* NT 5.0

It is and it isn't. Way way back (1997, maybe) PC Magazine did a hands-on with Windows NT 5.0 (way before the Windows 2000 moniker was bandied about), and it was significantly different in feature set and day-to-day usage from what eventually became known as Windows 2000. If I can dig up that article I will link it here. Microsoft decided to give Windows 2000 the 5.0 revision number to make the public version number remain sequential (unlike what they did with NT 3.51). However, Windows 2000 != NT 5.0 so far as what was promised and what was actually delivered. Now that I think about it, it's very similar to the features originally promised for Longhorn and what was actually released as Windows Vista.

Found some discussions that illuminate some of the issues:

http://billslater.com/teach/nt/nt5intro/
http://news.com.com/2100-1001-217134.html
http://www.winsupersite.com/reviews/nt5_ws.asp
http://www.wired.com/science/discoveries/news/1998/12/16974

Pay attention to the dates as things happen and changes occur.
 
My point is that we had to have a "control" set in our experiment.
...
Frankly, it would be a pretty crappy experiment if we kept changing the variables that we were testing under by messing with the graphics settings. The only way to be scientific was to associate numbers with our experience. In some cases, they turned out to be more statistically significant than others.
...
Also, when you start getting into graphics settings, it becomes EXTREMELY subjective and individual-specific as you start to place values on which graphics settings are more important. Do you prefer higher resolution, more AA, higher terrain textures, etc.? I think the article would have been of little value to anyone if we sat niggling back and forth about whether we should have bumped the AA down to 2X so that we could get to 12x10 from 10x7.
I don't see the difference between testing GPUs, CPUs or Operating Systems, so why was your normal method of testing GPUs and CPUs (I believe real-world gameplay equating to maximum playable settings) not applicable to this test? Did you not just do the "canned benchmark" here?
 
I would really like to see some testing done of gaming with Vista 64 and 4GB of memory. Vista loves memory and can make good use of it.
 
I would really like to see some testing done of gaming with Vista 64 and 4GB of memory. Vista loves memory and can make good use of it.

I'd love to see this too, since my Gigabyte 590 SLI board with 4GB of memory can neither properly install nor run Vista with all 4GB installed in it (even after applying the hotfixes and other tweaks).
 
....but in the second article, we deliberately stepped away from saying anything concrete...

Saw that and appreciated it for what it was, Jason. In the original article the Introduction and Conclusion sections, together with the page header, made explicit comment which suggested that 'Gaming' itself suffered:
... gaming in Vista. Some folks are staying away from the new OS simply because they feel it doesn’t game well. We set out to put some hard numbers on those claims.
This time around the intimation is more in the Introduction than anywhere else. Wordings have been more cautious, but that Introduction provides the frame of reference for meaning, and it clearly associates itself with what has gone before. Comments there indicate that you're wanting to scientifically test claims that Vista is a 'poor gaming Operating System'. You're retesting to check on different hardware, re-using the games which 'perform poorly' and thus expanding the data set. As a reader I feel short-changed. As a person critiquing the article insofar as its adherence to scientific principles, I'm left feeling that I have to be somewhat condemnatory. I've hoped for something which overcomes the shortcomings I'm seeing in articles elsewhere, and my hopes haven't been met.

As said, though, that's not so much a criticism of [H]ard or of yourself and your article as it is a criticism of the hardware reporting/review scene in general. Not all the 'other sites' simply run Canned benchmarks. Matter of fact there are plenty of sites which run more exhaustive testing than has been done here, producing voluminous reports of up to 30 pages or thereabouts in length. But regardless of each site's approach, the specific tests they've chosen, and the length and detail of the reports they've produced, it's almost ubiquitous to find that same, simplistic assumption underlying it all. "If I can measure a framerate drop then the 'gaming' must be worse!" None of it encompasses the reality that you've acknowledged in your comment earlier:

"Will a person notice a difference between 55 and 59fps as a net effect on their gameplay experience? Not a chance."


It's all fair enough for the technically competent, truly 'Hardcore' person who understands these matters, but who just can't overcome the fact that the knowledge of the small discrepancies is going to impact on their enjoyment, even though the actual playability hasn't really been affected. That's what we're really talking about here, if we're conceding that the small discrepancies don't actually impact on the playability to any noticeable extent. The person who is playing the game, but for whom it's unbearable that a higher framerate readout could have been achieved but isn't.

But just how much of the audience is really making such discernments? And how is it to be countered that for every competent 'hardcore' reader there are probably a dozen or more less capable readers, who are skimming the graphs and swallowing the intimations hook, line and sinker? Not just here, in this article. We simply see something here which is repeated yet again, on yet another site.

I keep waiting to see the "You know, these numbers don't show that your games won't run, or that you'll need to turn the settings down to get them running. It's just that if you get irritable about numbers not being as big as they might've been, you'll prolly get irritable!" The comment that locates the tests and results in the context of meaning.

Or, to be fairer about it, I keep waiting to see the tests run and reports written which have the scope to portray such considerations. Seems to me that far too many sites are simply throwing out the assumption that 'bigger numbers mean better gaming' and then throwing some hastily collected data up alongside it to demonstrate bigger numbers.


I'll start getting more excited when I see studies conducted which begin by setting that 'acceptably playable' baseline for average and minimum, be it 45/25 or some other such combination in the vicinity, and then testing to see what impact on that baseline the change of OS actually has. That's when I'll be seeing 'Gaming' results rather than 'Benchmarking' results. That's when it'll be truly 'Real World' testing, far as I'm concerned.


Bigger numbers mean absolutely jack when you've just spent $xx on a game and want to know if it'll run well enough. If we're comparing Operating Systems then that's the context in which the comparison is meaningful for 'Gaming'.
 
I'd love to see this too, since my Gigabyte 590 SLI board with 4GB of memory can neither properly install nor run Vista with all 4GB installed in it (even after applying the hotfixes and other tweaks).

NVIDIA has lots of work to do with the chipset drivers as well as the video card drivers. NVIDIA and Creative seem to be far behind others in preparing for a launch they had years to get ready for. They have mostly caught up with the video cards at least.

The 158.45 drivers make my Vista control panel disappear. To get it back, I have to disable and restart some services at every boot-up. Take the drivers off and the problem is magically gone. I realize the drivers are beta, but come on.

I also get a system freeze every so often, and again, it's these drivers. Revert to the older WHQL driver and all is better.

The great thing about this driver set is that for gaming it's the best so far with regard to framerates and IQ. For Vista 64 it's the first good driver set, and combined with the latest Creative drivers, most of my gaming problems are solved.
 
So let's use a legacy OS instead. Reminds me so much of "OMG, I can't run 16-bit code anymore, so I'll just keep using Windows 98!" Not that I agree with you one bit; however, the comparison you give isn't very good.

XP = old, Vista = new. XP = Legacy.

Actually, I was referring to NV's driver issue. ATI seems to have their act together in that respect. I should have been more specific.

My OS is nothing more than an application that I use to launch programs that I'm interested in. My OS must be free of useless applications and glittery crap that gets in my way. Sorry, but I spend some hard-earned coinage on hardware and I don't want to bung up the works with an inferior OS.

I'll stay with "Legacy" until something better comes along. New isn't always better. In the case of Vista, new is different.
 
Yeah, and XP versus Vista can't be compared to W98 versus XP, because W98 was not nearly as stable as XP. Moving to W2k or XP gave a marked improvement in system reliability in addition to the new functionality. XP is bloated, though. But at least there was a real reason to migrate.

With XP and Vista it's reversed. Vista is currently nowhere near as stable as XP due to an abundance of driver/hardware compatibility problems and Vista-related bugs. So what does Vista bring to the table? Desktop eye candy and the artificially forced DX10 support, which seems to be marketing's last straw for convincing people Vista is worth it.

If they'd built DX10 into XP (entirely possible despite some FUD around), most gamers would never adopt Vista. Some won't even as it is. Not like there are actually working DX10-capable drivers or games available.. :D
 
Vista is currently nowhere near as stable as XP...
I've been using Vista (64-bit) for a few months now. In that time I have had zero unexplained crashes. And no, I'm not just picking my ass web-browsing all day... The crashes I have had turned out to be from these causes:

  • bluescreen when booting with 4 GB (fixed with patch)
  • bluescreen when trying to come out of hibernation (needed BIOS update for my 680i board)

Even when the crappiest early releases of the 8800 GTX drivers would shit their pants, Vista would just restart the driver, instead of grabbing its ankles....

As for the lower framerates, my attitude is that I don't really care if FPS are worse by a small constant factor. What NVIDIA really needs to work on is the games where there's a larger discrepancy, where the difference really starts to impact your gaming experience. Still, it could be that even once NVIDIA has better optimized their drivers for WDDM, there will still be a general trend of slightly worse performance in Vista. Personally, I'm willing to accept a small performance hit like that in exchange for some added stability.
 
I've been using Vista (64-bit) for a few months now. In that time I have had zero unexplained crashes. And no, I'm not just picking my ass web-browsing all day... The crashes I have had turned out to be from these causes:

  • bluescreen when booting with 4 GB (fixed with patch)
  • bluescreen when trying to come out of hibernation (needed BIOS update for my 680i board)

Even when the crappiest early releases of the 8800 GTX drivers would shit their pants, Vista would just restart the driver, instead of grabbing its ankles....

As for the lower framerates, my attitude is that I don't really care if FPS are worse by a small constant factor. What NVIDIA really needs to work on is the games where there's a larger discrepancy, where the difference really starts to impact your gaming experience. Still, it could be that even once NVIDIA has better optimized their drivers for WDDM, there will still be a general trend of slightly worse performance in Vista. Personally, I'm willing to accept a small performance hit like that in exchange for some added stability.

Yeah so what you're saying in essence is that Vista is perfectly stable if you overlook all the times it faulted and that it's slower than XP. Heck I'll run into a store now to buy one. :D
 
If they'd built DX10 into XP (entirely possible despite some FUD around), most gamers would never adopt Vista. Some won't even as it is. Not like there are actually working DX10-capable drivers or games available.. :D


There are the 158.45 NVIDIA drivers combined with the DX10 patch for Company of Heroes. It's not polished, not optimized, and you take a very large performance hit, but there is at least one DX10 game now. Hopefully in the next few driver releases they can work the kinks out.
 
Yeah so what you're saying in essence is that Vista is perfectly stable if you overlook all the times it faulted and that it's slower than XP. Heck I'll run into a store now to buy one. :D
I'm actually not interested in what you do. I just don't think you know what you are talking about when you say Vista is "nowhere near as stable as XP".
 
Just a few points to add:

Does a few fps drop really matter? In reality, no, but in certain cases, looking at the test, it's a bit too much (at least at the moment), especially considering today's widescreen LCDs and GPU hogs like STALKER or Gothic 3. For the time being, at least.

As for Vista - the level of bloat and overall bullshit (a years-old Halo 2 as a Vista exclusive, DX10 as Vista-only) that Microsoft has reached is quite stunning to me. But then, they need leverage to make people move to Vista. Personally, if I wanted bloated, I'd take Linux with the Beryl WM, to which Aero doesn't even compare, IMO. But I still live in a [real!] decent shell.

As for added complexity, I haven't seen anyone posting this link: http://www.cs.auckland.ac.nz/~pgut001/pubs/vista_cost.html. It's a good read, but be sure to actually read it, not skip through it quickly. After that, the current state of Vista support, drivers or not, should be less of a mystery.

My 2 cents.
 
I notice that Prey is the only game that performed better in Vista than XP (albeit by a small margin).

Is it possible that this is because it was the only one run in administrator mode with XP compatibility turned on? If you run the other games in this same way, does it impact the framerates of those games in any way?

I'm wondering if running the game like that bypasses some of the DRM slowdown present in Vista....

Friedmud
 
I know I'm a little late with this comment, but what about other factors when it comes to gaming in Vista, like sound quality? Does 5.1 sound work okay?
 
I notice that Prey is the only game that performed better in Vista than XP (albeit by a small margin).

Is it possible that this is because it was the only one run in administrator mode with XP compatibility turned on? If you run the other games in this same way, does it impact the framerates of those games in any way?

Carbon was run in admin mode and it wasn't even close. No, I don't believe that was the cause. Prey only outperformed XP by 0.9fps. It was a higher deficit on the NVIDIA hardware.
 