Watch Dogs Image Quality Preview [H]

You don't need to use Process Explorer to see the amount of committed memory. I just did it; it was 2.2 GB before and 7.2 GB after.
Thanks for that.

You don't mention what settings you use but, to put it into perspective, the 'memory in use' part of it only shows 2.1 GB of overall memory being used by the application. The rest of the committed total would be the 2 GB of video memory you have, plus whatever video memory there isn't space for on the GPU sitting in non-local (system) memory. That's roughly 900 MB of video memory spilling into system RAM.
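If anyone wants to sanity-check those numbers on their own machine, here is a rough Python sketch using psutil; the process-name match is only a guess, so swap in whatever executable name your Task Manager shows:

```python
# Rough sketch: compare system-wide memory with what the game process itself
# has resident vs. committed. On Windows, psutil reports rss as the working
# set and vms as the process's commit (pagefile) charge.
import psutil

vm = psutil.virtual_memory()
print(f"Physical RAM: {vm.used / 2**30:.1f} / {vm.total / 2**30:.1f} GiB in use")

for p in psutil.process_iter(["name", "memory_info"]):
    name = p.info["name"] or ""
    mi = p.info["memory_info"]
    if mi and "watch" in name.lower():  # hypothetical match for the game's exe
        print(f"{name}: working set {mi.rss / 2**30:.2f} GiB, "
              f"committed {mi.vms / 2**30:.2f} GiB")
```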
 
Wow, just wow. And the best part is that these textures are very poor. We are entering a new era of lazy ports from consoles with 8GB of unified memory.
Is it really going to be a problem over the length of the generation?
A year from now, 16 GB RAM and 4 GB VRAM will be standard. Who cares.

You'll look back at this thread a few years from now and laugh.
 
Oh hey, look: Titans with 6 GB of VRAM are finally showing their usefulness vs. the other 3 GB cards. Too bad it took a year to get here, but I'm betting a lot more of these next-gen games will do the same.
 
Is it really going to be a problem over the length of the generation?
A year from now, 16 GB RAM and 4 GB VRAM will be standard. Who cares.

You'll look back at this thread a few years from now and laugh.

The problem is that this game doesn't really look any better than any of the other games we've had that use far less VRAM/System RAM and run far better.

If I am going to be upgrading to 16+ GB of RAM and 4+ GB VRAM for a game, I want the game to actually reflect that in terms of image quality and performance.
 
The problem is that this game doesn't really look any better than any of the other games we've had that use far less VRAM/System RAM and run far better.

If I am going to be upgrading to 16+ GB of RAM and 4+ GB VRAM for a game, I want the game to actually reflect that in terms of image quality and performance.
^ THIS
 
Wow, just wow. And the best part is that these textures are very poor. We are entering a new era of lazy ports from consoles with 8GB of unified memory.

I honestly don't think it's laziness. I think MegaTexture is amazing in principle, but it is also hampered by download caps and hard drive sizes. People tend to knock the 'database sizes' of id Tech 5 games when the bigger issue is simply the shipping media; both Rage and Wolfenstein would look incredible if 1 TB download sizes were acceptable.

The biggest issue Wolfenstein faces (on PC) is people trying to run it with uncompressed textures. There really is no point to it, because both settings look nearly identical while the uncompressed option costs almost twice the VRAM, if not more. It's not so much laziness on the developer side as people having unrealistic expectations.
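For rough numbers on the compression point: block-compressed DXT1/BC1 is 0.5 bytes per texel and DXT5/BC3 is 1 byte, versus 4 bytes for uncompressed RGBA8. The sketch below is only back-of-the-envelope arithmetic, and the 128k virtual-texture size is an illustrative assumption rather than id's actual figure:

```python
# Back-of-the-envelope texture memory. A full mip chain adds roughly a third
# on top of the base level, hence the 4/3 factor.
def texture_mib(width, height, bytes_per_texel, mips=True):
    base = width * height * bytes_per_texel
    total = base * 4 / 3 if mips else base
    return total / 2**20

for label, bpt in [("RGBA8 uncompressed", 4), ("BC3/DXT5", 1), ("BC1/DXT1", 0.5)]:
    print(f"2048x2048 {label}: {texture_mib(2048, 2048, bpt):.1f} MiB")

# One layer of a hypothetical 128k x 128k unique-texel virtual texture:
print(f"131072^2 RGBA8 layer: {texture_mib(131072, 131072, 4) / 1024:.0f} GiB")
```

Multiply that single uncompressed layer by several material channels and multiple world areas and a figure in the hundreds of gigabytes or more stops sounding crazy, which is also why the shipping data ends up so heavily recompressed.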
 
If they are identical, why even give that option?
As for 1 TB of textures: what? There are games with great textures that weigh less than 40 GB, so if id Tech needs an insane 1 TB for its textures to shine, then it's badly written.
 
I just checked my RAM usage and it is way different than it was earlier this week. I distinctly remember seeing WD using over 8 GB of system RAM at several points when I was playing Tuesday night and being shocked at how much memory it was taking up. I just ran through the city for about 15 minutes and it topped out at around 2.5 GB. More surprising to me is the VRAM usage: 1.1 GB used. This is at Ultra textures, ultra level of detail, high reflections, medium shadows, MHBAO, high water, medium shaders, no AA @ 3840x2160. Since people had mentioned it, I did notice some stutter when the game first loaded, but after about a minute in game it went away and moved into "console smooth" 30 fps :p

[attached screenshot: jm2ySuy.jpg]
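(If anyone wants to log this the same way instead of watching an overlay, a rough sketch that polls nvidia-smi every couple of seconds works on NVIDIA cards:)

```python
# Rough VRAM logger: poll nvidia-smi and print used/total every 2 seconds.
# Assumes a single NVIDIA GPU; with more than one, pick the line you want.
import subprocess, time

while True:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True,
    ).stdout.strip().splitlines()[0]
    used, total = out.split(", ")
    print(f"{time.strftime('%H:%M:%S')}  VRAM: {used} / {total} MiB")
    time.sleep(2)
```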
 
What I want to know is how a game that looks somewhat mediocre compared to what's out there now, and even more so compared to the original 2012 E3 demo, can use so many resources RAM-wise.

Something is not computing, because it isn't earth-shatteringly fabulous at High or Ultra (which does look better).
 
I had a look at my usage on a Titan; running through the thick part of the main city I'm sitting at 3.5 GB of VRAM used. This is 1080p with everything at max (no motion blur, no DOF) and 4x TXAA. I don't think that is bad considering it is an open world game.

Weird tip: I disabled my pagefile and all of my stuttering went away. I can now run 4x TXAA at 1080p. I may try 4K tomorrow and see what I can crank it up to.
 
I had a look at my usage on a Titan; running through the thick part of the main city I'm sitting at 3.5 GB of VRAM used. This is 1080p with everything at max (no motion blur, no DOF) and 4x TXAA. I don't think that is bad considering it is an open world game.

Weird tip: I disabled my pagefile and all of my stuttering went away. I can now run 4x TXAA at 1080p. I may try 4K tomorrow and see what I can crank it up to.
Why use TXAA when you can use temporal SMAA and get much better performance and no blurriness?
 
Why use TXAA when you can use temporal SMAA and get much better performance and no blurriness?

I guess there are trade-offs to be had between TXAA and SMAA. I may try both and see which I personally like best. I like the fact that I personally don't feel the need to use either at 3840x2160 @ 185 ppi.
 
Why use TXAA when you can use temporal SMAA and get much better performance and no blurriness?

The TXAA implementation in this game seems better than in other games. It's not ridiculously blurry like it is in Arkham Origins, and it completely removes all aliasing. SMAA leaves a lot of aliasing and it looks kind of ugly.
 
The TXAA implementation in this game seems better than in other games. It's not ridiculously blurry like it is in Arkham Origins, and it completely removes all aliasing. SMAA leaves a lot of aliasing and it looks kind of ugly.
It looks really blurry to me in the screenshots, even on Nvidia's site. Plus temporal SMAA has hardly any performance hit, needs hardly any additional VRAM over no AA, and looks as good or better. I just don't understand the point of TXAA; it seems to have no pros at all to me.
 
TXAA also misses anti-aliasing some extreme angles that SMAA catches; note the first AA comparison screenshot in our preview.
 
TXAA also misses anti-aliasing some extreme angles that SMAA catches; note the first AA comparison screenshot in our preview.

THIS. SMAA just looks way better than TXAA in all of the [H]ard screenshots, so I can't believe anyone would say that TXAA looks better unless they are simply trying to defend having that option for some reason. :confused:
 
For me TXAA looks better during camera movement - less shimmering, flickering, and pixel crawling.

I'm sitting at 3.5 GB of VRAM used. This is 1080p with everything at max (no motion blur, no DOF) and 4x TXAA. I don't think that is bad considering it is an open world game.
It's "not bad" compared to what? Do you know any other open world game that requires more memory?

I guess that when you have a Titan, any usage below 6 GB is "not bad" :)
 
I'm not sure how people are claiming the game is poorly optimized because it uses higher resolution textures. This isn't anything new. Download 4K texture packs for Skyrim, turn up the AA and see what happens. People have been buying 4GB cards for modded Skyrim since the NVIDIA 600 series, so it's not like this should be a surprise to anyone that open world games with large textures need a big frame buffer.

High resolution textures use a lot more VRAM (memory scales with the square of the resolution, so each doubling of texture resolution quadruples the size). Anti-aliasing, particularly MSAA, also uses a lot more VRAM.

If you want to run Ultra textures and high levels of AA you are going to need more VRAM. The only "optimization" you can do is reducing the size of the textures and/or using greater compression, which would negate the entire point of using higher-res textures in the first place. Ubisoft already provides you with options to fix the problem: set the textures to High instead of Ultra, or upgrade your PC. The only other thing they could do would be to use a streaming texture system like MegaTexture, which has its own set of technical problems that IMO are worse (texture pop-in).
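To put rough numbers on the AA part (a minimal sketch assuming plain RGBA8 color and 32-bit depth targets plus a resolved copy; a real deferred renderer carries several more buffers than this):

```python
# MSAA multiplies the color and depth buffers by the sample count, so the
# render-target cost alone roughly triples to quadruples from no AA to 4x.
def render_targets_mib(width, height, samples):
    color = width * height * 4 * samples   # RGBA8, 4 bytes/pixel/sample
    depth = width * height * 4 * samples   # 32-bit depth/stencil
    resolve = width * height * 4           # resolved single-sample color
    return (color + depth + resolve) / 2**20

for w, h in [(1920, 1080), (3840, 2160)]:
    for s in (1, 4):
        print(f"{w}x{h} {s}x MSAA: {render_targets_mib(w, h, s):.0f} MiB")
```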
 
The only problem is that these "Ultra" textures that require absurd amounts of VRAM are not in fact of ultra quality. They are good, but not incredible, yet they require an incredible amount of memory. Also, Ubi and Nvidia mislead people by saying that 3 GB is enough for Ultra, when in fact the game is unplayable because of constant hitching and stuttering.
 
The only problem is that these "Ultra" textures that require absurd amounts of VRAM are not in fact of ultra quality. They are good, but not incredible, yet they require an incredible amount of memory. Also, Ubi and Nvidia mislead people by saying that 3 GB is enough for Ultra, when in fact the game is unplayable because of constant hitching and stuttering.
That is certainly your opinion, but I haven't seen any screenshots of Ultra where I thought that the textures were of particularly low quality. You can't objectively measure whether the textures are good or bad, but for the most part the texture quality from what I have seen seems good. I haven't bought the game yet, so I don't have first-hand experience, but I'd welcome any screenshots of textures you find to be lacking. Not every single texture is going to be incredibly high resolution, but the Ultra textures in the comparison scenes in this preview do show an obvious improvement over High quality.
 
I never said they were low quality. I said they were good but not exceptional (as the name Ultra suggests). And on the so-called "High" setting the quality is low by 2014 standards. Curiously, there is no "Low" setting. This is psychological manipulation: the Medium texture setting should be called Low, High should be called Medium, and Ultra should be called High. And they should stop lying that 3 GB is enough for the highest quality textures.
 
I never said they were low quality. I said they were good but not exceptional (as the name Ultra suggests). And on the so-called "High" setting the quality is low by 2014 standards. Curiously, there is no "Low" setting. This is psychological manipulation: the Medium texture setting should be called Low, High should be called Medium, and Ultra should be called High. And they should stop lying that 3 GB is enough for the highest quality textures.
Most likely they just named them according to whatever their internal metric is (512x512, 1024x1024, 2048x2048, etc). I think you are blowing this a bit out of proportion. The texture quality isn't the biggest issue with this game by far.
 
The only problem is that these "Ultra" textures that require absurd amounts of VRAM are not in fact of ultra quality. They are good, but not incredible, yet they require an incredible amount of memory. Also, Ubi and Nvidia mislead people by saying that 3 GB is enough for Ultra, when in fact the game is unplayable because of constant hitching and stuttering.

Exactly, the texture quality on Ultra doesn't justify the high VRAM usage. If the texture quality were on par with, say, Crysis 3 or Metro: Last Light then it would be understandable. The texture res isn't 4K, probably not even 2K; Sleeping Dogs looks better than Watch Dogs and runs butter smooth. /shrug
 
Considering that when I play at 1080p on Ultra (for everything, including 4x TXAA) I only use 3.5 GB of VRAM, I imagine you could probably play with 3 GB of VRAM if you lower or turn off AA. One thing that greatly reduced my stuttering, even at 3840x2160, was turning off my pagefile.
 
Exactly, the texture quality on Ultra doesn't justify the high VRAM usage. If the texture quality were on par with, say, Crysis 3 or Metro: Last Light then it would be understandable. The texture res isn't 4K, probably not even 2K; Sleeping Dogs looks better than Watch Dogs and runs butter smooth. /shrug


Sleeping Dogs had its share of low-res textures, but the game looked great overall and you could play at 1080p60 on a 2 GB 670. Judging by this metric, Watch Dogs should look twice as good as Sleeping Dogs, and it clearly doesn't.

Considering that when I play at 1080p on Ultra (for everything, including 4x TXAA) I only use 3.5 GB of VRAM, I imagine you could probably play with 3 GB of VRAM if you lower or turn off AA. One thing that greatly reduced my stuttering, even at 3840x2160, was turning off my pagefile.
What does having to manually disable the page file tell you about the quality of this port? I would be surprised if turning off AA helped, just as the new drivers from Nvidia and the day-one patch from Ubi did not help. Besides, if I have to disable AA to play, then something is seriously wrong here. Not to mention the game would be horribly aliased and therefore ugly.

Most likely they just named them according to whatever their internal metric is (512x512, 1024x1024, 2048x2048, etc). I think you are blowing this a bit out of proportion. The texture quality isn't the biggest issue with this game by far.
This is the first time in recorded history that a game requires 4 GB of VRAM for its best textures, and these textures don't even look THAT good. That is a big deal; unprecedented is the fitting word here.
 
Skyrim with uber-crazy texture packs? Then again, that would look better than this game...

Yeah, but EVERY other game I know either offers comparable texture quality and requires less memory, or requires the same amount of memory but offers vastly superior textures. Watch Dogs' texture/memory management is extremely poor.
 
Considering that when I play at 1080p on Ultra (for everything, including 4x TXAA) I only use 3.5 GB of VRAM, I imagine you could probably play with 3 GB of VRAM if you lower or turn off AA. One thing that greatly reduced my stuttering, even at 3840x2160, was turning off my pagefile.

The fact that turning off the page file improves the game's performance just screams that it's a poorly ported console game. It was designed around certain memory-usage behaviors that fit what the consoles provide, with lower resolution textures (most likely). Turning off the page file forces the game to use the RAM differently.

I am glad they gave the option to turn up quality for PC users, but there's something amiss when disabling a page file improves performance.
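One crude way to test the paging theory without nuking the pagefile: watch the game process's page-fault counter while playing and see whether it spikes when the stutter hits. A rough sketch with psutil on Windows (the counter lumps soft and hard faults together, so treat jumps as a hint rather than proof, and the process-name match is just a guess):

```python
# Print the page faults the game process accumulates every 2 seconds.
# Big jumps that line up with stutter suggest paging is involved.
import psutil, time

proc = next(p for p in psutil.process_iter(["name"])
            if "watch" in (p.info["name"] or "").lower())  # hypothetical match
last = proc.memory_info().num_page_faults
while True:
    time.sleep(2)
    now = proc.memory_info().num_page_faults
    print(f"page faults in the last 2 s: {now - last}")
    last = now
```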

From what friends have told me about this game as far as the gameplay goes, they were disappointed and said GTA was more fun. This game was mainly hype. Save your money, wait for a Steam sale, or buy it on console.
 
To all those who try to defend Ubi about the VRAM requirements: Ubi themselves acknowledge this is an issue and are investigating it:
http://steamcommunity.com/app/243470/discussions/0/540743032548687853/ said:
- PC - Frame rate stutter on high end GPU

Game stutters on ultra textures and other lower settings on certain hardware.

[Status] Issue reported - updates to follow
 
Read that carefully: that doesn't say it's an "issue", it just says the issue from feedback has been reported. "Updates to follow" could just mean a better explanation of how it works. It doesn't mean it's an issue they can fix right away. I await the updates to follow.
 
Read that carefully: that doesn't say it's an "issue", it just says the issue from feedback has been reported.
I am not a native speaker; what is the distinction? The name of the topic is "Watch Dogs Known Issues".

"Updates to follow" could just mean better explanation of how it works. It doesn't mean it is an issue they can fix right out. I await the updates to follow.

What's there to explain? In the game options it says that 3 GB is required for Ultra textures, yet the game is unplayable on a 3 GB card with Ultra textures. In Nvidia's performance guide they also recommend Ultra textures for the 780 Ti, yet it is unplayable because of the stutter. Maybe they will fix it by changing 3 GB to 4 GB in the text.

At least they acknowledged that they are aware of the problem.
 
I am not a native speaker; what is the distinction? The name of the topic is "Watch Dogs Known Issues".



What's there to explain? In the game options it says that 3 GB is required for Ultra textures, yet the game is unplayable on a 3 GB card with Ultra textures. In Nvidia's performance guide they also recommend Ultra textures for the 780 Ti, yet it is unplayable because of the stutter. Maybe they will fix it by changing 3 GB to 4 GB in the text.

At least they acknowledged that they are aware of the problem.

All they acknowledged is that an "issue" has been reported by others. They haven't come out and said it IS an issue.
 
Anyone have to modify their typical GPU overclock(s) for Watch Dogs? I have a couple of OCs that I use; most games work fine with the max OC and a few need the lower OC to be stable. Watch Dogs didn't play nice with the lower OC, so it looks like I might have to make an even more conservative OC for it.

The 780 HOF still handles it nicely at manufacturer speeds. The GPU will be showing heavy usage (70-99%) and the game will still run smoothly. I think I dropped AO a couple of notches, use TXAA 2x instead of 4x, and maybe dropped shadows a notch, with everything else at max. The primary goal was to keep the more prominent visual stuff at Ultra and then chisel away at the minor details.

As far as stutter goes, I get an occasional hitch but it's mostly fluid. Maybe that stutter is the legit version having trouble phoning home because the servers are being hammered. ;) I kid.
 