Something's been bugging me about one of the 2490s I bought. Basically, its white level and brightness are VERY inconsistent. It looks much "colder" at the edges than in the middle. I have two monitors, one with the SpectraView bundle, so I was able to measure the white point and brightness uniformity of both screens.
Calibrated to 220cd/m2 and with ColorComp off, screen 1 has a maximum deviation of 489K and 27cd/m2 between the center and the upper-left corner.
Screen 2 has a deviation of 1,015K and 43cd/m2 between the center and the upper-left corner.
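For anyone wondering how those Kelvin figures fall out of the probe readings, here's a minimal sketch of how you could compute the CCT delta yourself from CIE xy chromaticity using McCamy's approximation. The xy values below are made up for illustration (roughly D65 at center, slightly blue-shifted at a corner), not my actual measurements, and SpectraView may well compute CCT differently internally.

```python
# Minimal sketch: correlated color temperature (CCT) from CIE 1931 xy
# chromaticity via McCamy's approximation. The xy readings below are
# hypothetical examples, NOT my actual probe measurements.

def cct_mccamy(x, y):
    """Approximate CCT in Kelvin from CIE 1931 xy (McCamy's formula)."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

center = (0.3127, 0.3290)  # hypothetical: roughly D65 at screen center
corner = (0.3050, 0.3220)  # hypothetical: blue-shifted ("colder") corner

print(f"center: {cct_mccamy(*center):.0f} K")  # ~6500 K
print(f"corner: {cct_mccamy(*corner):.0f} K")  # ~7000 K
print(f"delta:  {cct_mccamy(*corner) - cct_mccamy(*center):+.0f} K")
```

A corner shift of that size in xy works out to roughly a 500K delta, which is in the same ballpark as what I measured on screen 1.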
Setting the ColorComp feature to "3" did reduce this discrepancy, but at the cost of raised black levels, lower measured contrast, and (less importantly, since the panel is so bright already) lower maximum brightness.
Is this a "standard" level of deviation for these screens? With over 1,000K of color difference, it was horribly obvious that something was off, while the other screen, with less than half the deviation, was much harder to fault (and can be brought nearly to perfect at a lower ColorComp setting).