Wide gamut, sRGB and ATI GFX cards

That sounds sweet. I'm gonna try a 48x0 and see what it can do with my wide gamut NEC LCD2690. This looks very promising.

The color space conversion would come in 100% handy, because that is something that cannot be done without a color aware application like Photoshop.

Interesting news.

Thanks for the update Tamlin.

Regards,

10e
 
Nice 10e! Please report back. I'm curious about the color space conversion and the 30-bit dithering they advertise as possible on a 24-bit screen. :D
 
Anyone else with a 4850 and wide gamut monitor? I will definitely be picking up a 4850 if it does gfx lvl color correction.

hey 10e: do you already have a 4850 or are you gonna order one?
 
Too bad I'm an nVidia fan, guess I have to wait for nVidia to "strike back" :)
 
So I did pick one up on the way home last night. I didn't post due to "housely" duties.

Outside of the cooling the card is pretty impressive for what it costs ($199.00). It is faster than my 8800GTS 512MB and single slot but that's another story.

The CCC from Catalyst 8.6 didn't show me anything special concerning gamuts or anything, just the typical ATI Avivo color controls which are useful because they allow saturation to be lowered a bit, great for the LCD2690. You can do this "per monitor" which is useful.

It seems this is available through an API. Maybe it's placebo effect, but I ALWAYS see less banding with ATI cards than nVidia cards (especially after calibration). I'll have to take some up close comparison shots.

My computer is in need of a restore, so I will do so tonight or tomorrow (ie. from image) and try some other stuff with my calibration software etc...

Regards,


10e
 

There are supposed to be some hotfix drivers for the 48xx series. Perhaps they have put the new color management tools in there? Vista is required for the 30-bit (10-bit per RGB) dithering, though, since XP only supports up to 8-bit color depth per RGB channel (24-bit).
 
Update:

Either I'm blind or an idiot (or both), but I didn't notice until now that as soon as the ATI/AMD Catalyst 8.7 drivers came out, a lot of previously oversaturated images (in Vista only) came out less saturated on the LCD2690. Only in Vista, though. In XP it's still a requirement to use the saturation/AVIVO color controls, which is fine.

Sitting next to my standard gamut monitor, I would say 95%-plus of the test images I've tried show no saturation difference, including some with strong cyans, greens, and reds, which usually show the most difference, so there is definitely something happening here.

FYI
 

How can this help when you have already calibrated (?) the internal 10-bit LUT of the NEC 2690?
 
This is due to colourspaces, not calibration. If the software used to display an image is not colour aware (i.e. takes into account monitor profiles as well as the colour space of the image), colours will be displayed incorrectly unless the colourspace of the image is identical to that of the monitor.

The bigger the difference between the two, the more noticeable the colour shift is. The 2690 is a wide gamut screen, and as the standard colourspace is sRGB, which has a much smaller gamut, colours will appear oversaturated. In different colourspaces the same colour numbers stand for different colours: R230 G110 B55 in sRGB is a different colour to R230 G110 B55 in AdobeRGB, for example.

Calibration just ensures that with colour aware software the colour displayed will be correct.
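To put numbers on the R230 G110 B55 example, here's a rough back-of-the-envelope sketch in Python. The matrices are the standard published D65 RGB-to-XYZ matrices for each space, and AdobeRGB's transfer curve is the usual 563/256 power; the point is just that the same 8-bit triple decodes to different absolute colours depending on which colourspace you interpret it in.

```python
# Decode the same 8-bit triple in two colourspaces and compare the
# resulting absolute (CIE XYZ) colour. Matrices are the standard D65
# RGB->XYZ matrices for sRGB and AdobeRGB (1998).

def srgb_to_linear(c):
    # sRGB piecewise transfer function (IEC 61966-2-1)
    c /= 255.0
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def adobergb_to_linear(c):
    # AdobeRGB (1998) uses a pure power curve (gamma 563/256 ~ 2.2)
    return (c / 255.0) ** (563 / 256)

SRGB_M = [(0.4124, 0.3576, 0.1805),
          (0.2126, 0.7152, 0.0722),
          (0.0193, 0.1192, 0.9505)]
ARGB_M = [(0.5767, 0.1856, 0.1882),
          (0.2973, 0.6274, 0.0753),
          (0.0270, 0.0707, 0.9911)]

def to_xyz(rgb, decode, matrix):
    lin = [decode(c) for c in rgb]
    return tuple(round(sum(m * l for m, l in zip(row, lin)), 4) for row in matrix)

pixel = (230, 110, 55)
print("as sRGB:    ", to_xyz(pixel, srgb_to_linear, SRGB_M))
print("as AdobeRGB:", to_xyz(pixel, adobergb_to_linear, ARGB_M))
```

Same numbers, two noticeably different colours, which is exactly the oversaturation you see when sRGB content is displayed unmanaged on a wide gamut panel.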
 

I see, this is very interesting, and I can see I still have a lot of basic stuff to learn. :)
 
Yes, this is true. I mistakenly mentioned calibrating to sRGB in the past as a factor in getting the monitor closer to standard gamut (visually), but it turns out that this is not the case. Calibrating the monitor to sRGB in Windows XP shows no difference in saturation, though obviously the colors are (seemingly) more accurate. This is because I am calibrating to sRGB gamma, NOT gamut.

I thought originally it was Firefox 3 doing the work (which it is, naturally, as it is color aware), but in Windows XP the same images look different between the two monitors, while in Vista (with the ATI 4850/70) they are very close. Naturally there are some differences, but they are far smaller now than ever.

In this case, I am commenting on Tamlin's earlier post concerning ATI's advertised gamut conversion that apparently is supposed to work in Vista. It may also be that Vista is properly color managed, but I'll have to grab my nVidia card to make sure that it is the ATI Cat 8.7 drivers (which I am also using on XP), because the same images in the cheapo image viewers on both Vista and XP look obviously different to me.

I can provide an image that (I think/hope) will show the difference between the two monitors in XP, and virtually no difference in Vista.

I'd hate to think what this would do to the workflow of "color managed" professionals, which, based on my ignorance, it's obvious that I'm not :)

 
I have Vista and the desktop seems to be colour managed. The built in picture viewer is colour managed. I have a 2690 and dual boot with XP so I can see the difference in saturation easily between the two operating systems.
 
Same here with Vista 64 SP1 and XP 32 SP3.

So then it must be super advanced Windows Vista doing this. Same situation here. I even copy my TGT and calibration files from one SV install to the other to ensure consistency (I believe). Hail to Vista ;)

It finally has A use :)

 
What about 3rd party apps? irfanview? Games? Movies?

I'm also very interested to know this. Is it a Vista thing where the inbuilt apps are aware of it or is other software also positively impacted?

I never even took colorspaces into account when shopping for a wide-gamut LCD. No time to follow everything anymore; thank God for forums. :)
 
Movies/videos through WMP or Media Player classic look very close between the LCD2690WUXI and FP241VW. The difference is subtle and may be chalked up to IPS vs MVA panel differences. I'll try out a media player like Cyberlink PowerDVD and see what I find.

Games I'll let you know about, because I'm reinstalling some into Beasta. IrfanView I'll also check out with the same images I use that are packed with reds and greens, which show the most in terms of saturation. What I'm more curious about with the FP241VW is what games will be like and whether the color profile will get completely wiped out like it does in XP. The FP241VW has very obvious changes when calibrated, whereas the NEC is more subtle, so I look for saturation differences.

 
Strange, I am fairly certain Media Player Classic is not color managed. I wonder how the OS would do color management for non-managed applications; how does it know which ones are color managed? Otherwise something like Photoshop would get doubly toned down...

Something doesn't add up here.
 

Vista doesn't make non colour managed 3rd party apps colour managed. If an application isn't colour managed then colours will look off. WMP and the other video players I have used are not colour managed. I have tweaked the settings in my Nvidia control panel so video (whatever the player) looks better to me - turned down the saturation a bit. I am not sure if IrfanView is fully colour managed; I don't use it so haven't found out.

I use Firefox 3 as my default web browser now as it has full colour management (when turned on).

Far Cry definitely isn't colour managed. I like the highly saturated effect though. Most games aren't. Half Life 2 has a colour correction setting (on or off) but I don't know exactly what it does.
 

Hmmm, so this supposed color management at the GFX level doesn't work on games and movies? Kinda dull if it doesn't affect games and movies. :(
 
Well, I have an Nvidia graphics card, so ATI's thing definitely won't work for me.
 
If it's done at the GFX level instead of through the colour management tools of the OS, then the application used shouldn't have any effect on the availability of colour space conversion.
 
I'm not sure the term "colour management" is properly understood by some in this thread.

A colour managed application only acts differently than a non-colour managed application if there is an ICC profile embedded in the content material. If there is no tag and/or profile embedded in the content, there's nothing to manage.

For instance, untagged JPEGs are generally processed as sRGB, because that is USUALLY what they are. When I generate JPEGs from my RAW format photos I specifically tag all of them sRGB. If I export something in AdobeRGB, it must be tagged with the aRGB profile. If I export a JPEG specifically for printing at a print house, I will convert it to the ICC profile of the specific printer and include that ICC profile in the JPEG (options when saving from Photoshop).

A non-colour-aware picture viewer would treat both photos as sRGB, and the aRGB photo would look way off (pale and dim, generally). A colour-aware picture viewer would read the aRGB tag, translate into the currently configured display's profile, and display the picture with proper colour. There are four ways (rendering intents) to translate out-of-gamut colour information, and one of those methods must be chosen.

So, is any video content actually tagged with colour space information? HDTV, BR and content like that should all be in a standard colour space. The only alternative colour space I'm aware of that's anywhere close to the consumer market is x.v.Color. In that case, x.v.Color content would need to be tagged as such, and a colour-aware player would have to translate from x.v.Color to the colour space of your TV (if it isn't x.v.Color).

I don't think x.v.Color is a factor for most people using Media Player Classic.

Now, calibrating a monitor so that it has proper grey scale, gamma and sRGB response (because sRGB is the computer "standard") will make your monitor as correct as possible. Applications such as games which expect the sRGB colour space don't need to be colour managed to benefit from this. If the game expects sRGB and that isn't what your uncalibrated/unprofiled monitor displays, calibrating it will help regardless of whether the app knows anything about colour. An application only needs to be colour managed if different colour spaces come into play (such as sRGB, aRGB, or ProPhoto RGB with photos).
 

Except you can't calibrate a non-sRGB monitor to sRGB. You make it sound like calibrating your monitor will correct gamut differences for all applications. This is clearly not the case. They are different color spaces, and you can only get proper color by using an output profile. Applications that don't use the profile to translate to the output color space will not be managed and will produce incorrect color (looking oversaturated when sRGB signals are sent to wide gamut monitors).

Calibration:
If you have a non-sRGB monitor, you calibrate its gamma curve just like you do for any monitor; this does NOTHING to change the inherent color differences. It just ensures that when you give it red 185 in, you get the appropriate level of red output to match the gamma curve you selected. Nothing at this step adjusts or reacts to color space.

Profiling:
The next step is profiling your monitor. This measures the actual color space of your monitor. The profile is used by color managed applications to do input-to-output color space translations. VERY few applications do this.

Color Management:
Some applications do input color management, in that they will read the color space of a JPEG and translate, but they always assume an sRGB output. So they are not doing the output color management needed for different-gamut monitors. IrfanView is one such application: it can (if enabled) translate your aRGB tagged files to sRGB for viewing, but it doesn't know about converting to the color space of the monitor and always assumes sRGB (like most applications). AFAIK, outside of FF3 and a few graphics applications (and the Vista desktop, newest Office?), nothing else uses the profile, and color space will not be translated.

Operating System/Video card fix?:
What people are hoping for is something that will handle all the old non-managed applications (99%+ of software). I am doubtful.

Video card:
I don't see a video card solution unless you treat everything as sRGB and use the profile to translate, effectively remapping ALL output to sRGB. There are issues with this. Actual profile-aware color managed applications will be wrong, because they will get double profiled. The second issue is that you have completely remapped your output device into an sRGB device (which is still likely inferior to a real sRGB one). So what is the point of having a wider gamut at all in this case? The remapping likely impacts display step size and image quality (more likely to have banding etc.), and now you can't get the wider gamut at all.

Operating System:
The OS could probably pull this off, because you could tag which applications get treated as sRGB and which are profile aware and passed through, but this is not there yet, and who knows if it ever will be. There may be too big a performance hit to do color space translation for gaming.

I will probably be out of the monitor market for the next 4 years. I wonder if there will be a good solution for legacy applications even then?
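To make the calibration-vs-profiling split above concrete, here's a toy Python sketch of the two steps. The gamma numbers and the "desaturation" matrix are made up purely for illustration (a real conversion also works on linearised values, which this skips): calibration is a LUT that everything passes through, while the profile conversion only happens inside colour-managed apps.

```python
# Toy model of the two separate steps: "calibration" is a per-channel
# 1D LUT applied to all output (the video card gamma ramp), while the
# profile-based conversion is done per application, and only by
# colour-managed software.

def build_gamma_lut(target_gamma=2.2, native_gamma=2.4, size=256):
    # correction curve so a hypothetical native-2.4 panel behaves like 2.2
    return [round(((i / (size - 1)) ** (target_gamma / native_gamma)) * (size - 1))
            for i in range(size)]

def apply_calibration(rgb, lut):
    # step 1: the gamma ramp -- EVERY application's output passes through this
    return tuple(lut[c] for c in rgb)

# Made-up matrix standing in for a real image-space -> monitor-space
# conversion (rows sum to 1 so white stays white)
DESAT = [(0.85, 0.10, 0.05),
         (0.05, 0.90, 0.05),
         (0.05, 0.10, 0.85)]

def apply_profile_matrix(rgb, matrix):
    # step 2: profile conversion -- only colour-managed apps do this
    return tuple(min(255, max(0, round(sum(m * c for m, c in zip(row, rgb)))))
                 for row in matrix)

lut = build_gamma_lut()
pixel = (230, 110, 55)
print("non-managed app:", apply_calibration(pixel, lut))
print("managed app:    ", apply_calibration(apply_profile_matrix(pixel, DESAT), lut))
```

The non-managed path gets only the gamma correction, so its saturation is whatever the panel natively produces; only the managed path gets remapped toward the intended colours.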
 
Well it turns out that I was only partially right, or mostly wrong :)

In Firefox 3 on Beasta with color management disabled, the saturation differences showed up between monitors. I then realized I never enabled color management in Firefox 3 on XP, so I went ahead and did that, and things turned out better. The only problem is that my standard gamut FP241VW shows less saturation, because Firefox can only use one ICM profile in its color management setting, which in this case is the LCD2690WUXI profile. Using Firefox 3 on the NEC and Internet Exploder on the BenQ, things look identical on test images with strong reds and greens, so I guess I will load Firefox on the NEC and Internet Exploder on the BenQ for consistent browsing colors :)

Additionally, videos show the same thing in Media Player Classic. Skin tones aren't badly affected, but strong greens and reds show up as oversaturated. Games, the same thing. The reds on the Gears of War menu screens are very strong on the NEC, and not so much on the BenQ.

It turns out that the Windows Photo Gallery built into Vista IS color managed. The desktop backgrounds showed up with saturation differences on Beasta as well, just like they did on XP.

Either way, it's not a huge deal for me, as I still absolutely love the NEC (I also have a very good, uniform panel), and with console games the added saturation is not as bad, because I am used to it (I turn up saturation on the BenQ through the OSD).

I plan on loading up Photoshop and seeing how accurate the color is. As I also have (what I believe is) a wide gamut printer in the Epson R1800, I'll print out some test images and see what I find.

But it does seem that while this "gamut translation" is built into the ATI drivers, it is not specifically enabled. Tamlin mentioned that it is only supposed to work in Beasta, so I'll see what I can find concerning this, as I do prefer my ATI 4850/4870 to my 8800GTS (or the 9800GTX+ dedicated 24/7 to folding right now), and I don't plan to remove the Radeons for a while. Also, the 10-bit color depth is an added bonus, though I seriously doubt it is doing much.

Regards.
 

Beasta? :confused:

10 bit is almost certainly doing nothing unless you can convince me you have a 10 bit data path to your monitor, and your monitor is 10 bit aware, accepts 10 bit signals, and does the proper thing with them. IOW, no, it won't do anything with current LCDs.
 
The explanation that I heard is that it uses a form of dithering to "approximate" it. Yay more dithering!

Apparently the Dell 2709W does this as well to give supposedly expanded color space of 1.04B colors. Though I don't know how this is done if it's getting 8-bit input (????)

Beasta = Vista = big RAM-eating OS with lotsa nice bloatware to enhance efficiency, performance and productivity. ;)

 
I really don't want my graphics card adding dithering. Thanks, but no thanks.

It couldn't be FRC, because they don't have the bandwidth to do that. So it would have to be static, which seems like a complete and silly waste. Besides, what applications will support this "10bit" output?

Basically this is a bunch of marketing smoke and mirrors. It would likely do more harm than good if enabled.
 
Yes, out-of-control marketing with little to no useful real world results. Welcome to the new monitor landscape, where TV features and marketing continue to pervade.

I will try out software calibration of sRGB mode on the 26" display and see how that works out. Just for fun after I take some super macro photos of text with different sharpness levels.

Long weekend crash testing at its best.
 
The downside
DisplayPort output support

* 24- and 30-bit displays at all resolutions up to 2560x1600

No mention of 30bit support on HDMI :(
 
Using spatial/temporal dithering to simulate 10 bit colour accuracy in the graphics card's output wouldn't require any additional bandwidth, because images would still be sent to the monitor at 8 bit accuracy.
That would be exactly the same thing as how TN monitors claim 8 bit colours even though TN LCD panels are only 6 bit accurate... and as I recall, you're the one defending that as a perfectly working system.

And while actual 10 bit accuracy from source to output device (with a device capable of it) would be better, this could still help avoid some of the colour banding problems when adjusting colours/gamma from the graphics card's settings.
Logically, as a potential downside, this already-present FRC might cause some kind of interference with the monitor's own processing... especially with the FRC in TNs.
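For what it's worth, the banding point is easy to illustrate with a toy LUT (the gamma tweak and level counts here are illustrative, not measured from any card): pushing 256 input levels through an adjustment with only 8-bit output collapses some adjacent levels into the same code, while a 10-bit output keeps them all distinct.

```python
# Toy illustration: a mild gamma adjustment applied through an
# 8-bit-output LUT merges some input codes (-> banding), whereas a
# 10-bit-output LUT preserves all 256 distinct input levels.

def gamma_lut(gamma, out_levels):
    # map 256 input codes through a power curve into out_levels codes
    return [round(((i / 255) ** gamma) * (out_levels - 1)) for i in range(256)]

lut8 = gamma_lut(1.2, out_levels=256)    # 8-bit output path
lut10 = gamma_lut(1.2, out_levels=1024)  # 10-bit output path

print("distinct codes, 8-bit out: ", len(set(lut8)))    # fewer than 256
print("distinct codes, 10-bit out:", len(set(lut10)))   # all 256 survive
```

The merged codes in the 8-bit case are exactly the flat steps you see as banding in gradients after a gamma/colour adjustment.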
 

FRC/temporal dithering is done by rapidly varying the signal; you would need a lot more bandwidth to do this from the graphics card, as you would have to cycle between images faster than the current refresh rate. At minimum you would need to send the signal at double the current 60Hz refresh. So yes, it does need more bandwidth (double at minimum), not to mention that how well this works is tied intimately to the monitor involved (can it even cycle fast enough?).

All you can offer is static/spatial dithering, which is completely pointless. If there were a better shade to use in the graphics in the first place, you would already be using it.
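For concreteness, here's a toy model of the temporal (FRC-style) scheme being debated: alternate between the two nearest 8-bit codes so that the average over a few frames lands on the intended 10-bit level. Whether a real GPU/link/panel combination can actually cycle like this without artifacts is exactly the point in dispute above.

```python
# Toy FRC: approximate a 10-bit level on an 8-bit link by emitting the
# two nearest 8-bit codes in the right proportion over successive frames.

def frc_frames(value_10bit, n_frames=4):
    # each 8-bit code covers four 10-bit steps; `frac` of every 4 frames
    # get the next code up so the temporal average hits the 10-bit level
    base, frac = divmod(value_10bit, 4)
    return [min(base + (1 if i % 4 < frac else 0), 255) for i in range(n_frames)]

frames = frc_frames(513)          # 10-bit level 513 = 8-bit 128.25
print(frames)                     # [129, 128, 128, 128]
print(sum(frames) / len(frames))  # 128.25, i.e. 513 / 4
```

Each frame is still plain 8-bit data, which is the "no extra bandwidth" side of the argument; the "needs double the refresh" side is about how fast those frames must alternate to be invisible.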
 

There are LCDs with 10bit colour (not cheap yet).
http://www.displayblog.com/2008/06/10/hp-dreamcolor-lp2480zx-24-lcd-monitor/

My TV has x.v.Color, so it can display up to 16 bits per colour.
Some Blu-ray players handle Deep Color or x.v.Color, and there is media that uses them.
 
Yep, OK, it may work if you have $3000+ for a serious pro monitor, but for the monitors most people actually buy it is still the case. Do you know which software/OS will take advantage of 10 bit output?

My 42" plasma TV is half that price and does 48bit colour, so there are some affordable solutions.

HDR will (and needs to) be able to use the larger colour space.
I'm not sure if current HDR implementations can use it though.
Art packages and video editing software benefit too.

I don't know of any software available now that can utilise more than a 24bit display.
Looks good for the future though :)
 

Sorry, it sounds nice, but no, Vista does not properly manage colors (8800 GT here). The best way to see this is to put an sRGB image on your desktop and then view the same image in PS. Same with IE. Vista didn't change a thing except for the Picture Viewer (not IE, the desktop, or other apps and OS operations).

Now if there is a trick I am missing, hit me with it.

So is there a setting in the card that enables this feature or what? This is the sort of thing I doubt would be in a video card review but if it works that would be exciting stuff.
 
Yep, I posted more findings later and rectified my original thoughts.

I then proceeded to do the following:

1) Warm up the NEC for an hour
2) Put it in sRGB mode and wait for a while (half hour or so)
3) Calibrate with BCC 4.1.8 in XP (crashes in Vista 64)
4) Transfer the color profile to Vista from XP and install

Long story short, the colors are, as expected, far closer to sRGB than before. Greens are a bit too saturated, and reds look a bit maroon. It is similar to using ATI CCC to lower saturation by 20%, but not as fast (as the monitor needs time to "adjust" to sRGB). Additionally, if the ATI CCC Avivo approach works for games, I think I'd prefer that right now.

And you are right, none of the included applets other than Windows Gallery/Photo Viewer are color managed in Vista.



 

Solution?
Load a color profile into the graphics card's LUT. Yes, as you mentioned, that would defeat the whole purpose of a wide-gamut display, but it's getting increasingly harder to get a non-wide-gamut display anyway. Also, you could easily swap that profile out when doing other work.

This seems to be possible on ATi cards:
http://www.driverheaven.net/vista-r...2-ati-color-control-custom-color-profile.html

Wouldn't that work to *convert* a wide-gamut display to an sRGB display (no, it won't be as good as a native sRGB display)? And when you don't want it (like when working in PS), you just turn it off or switch profiles.

Sounds too good to be true, so I assume I've missed something; I hope someone points it out :)
 