Sony OLED PVM 2541A TRIMASTER EL

zzcool

I don't know if there is a thread on this particular monitor yet, but I am very, very, VERY interested in this one.

I have asked Sony myself and they confirmed that the TRIMASTER EL brand is the true successor to Trinitron; that alone made me want it.

I also asked them if this would be good for gaming, movies, and general use, and they said yes.

So my question is: are there any negative sides to this one, or is it a monitor worthy of replacing the Sony GDM-FW900 24-inch widescreen Trinitron monitor?
 
I read on here that it was going to be years before they knew how to make an OLED computer monitor.
Is this truly an OLED monitor, or are they playing word games?
Of course there is no mention of the cost; most likely it's way out of the question to buy one.
 
It's meant for professional purposes, so it has some more exotic inputs and certainly comes with a big price tag (~$5,000 IIRC), so it's no surprise that there hasn't been much enthusiasm about getting one. I'm sure the picture is great for what it was made for, but I have my doubts about whether the early OLED panel it uses is quite right for consumer use.
 
The Sony PVM-2451 was being discussed as far back as 2011.
http://hardforum.com/showthread.php?t=1586235&page=7

The problem is that this is NOT intended for use as a desktop computer monitor, but as a professional VIDEO production monitor for TV/movies.

Video is a very averaged/dynamic source, so burn-in wear will be fairly even. Even so, they recommend frequent calibration, IIRC.

Desktop monitors, OTOH, keep a lot of static images on screen, and until burn-in is significantly improved (which might not happen for a decade or more), OLEDs won't really be suitable for desktop monitors.

For now OLED desktops are a pipe dream.
 
Indeed...the idea of paying 2.5 to 3 times what the FW900 cost and then having it burn in would be beyond my emotional and fiscal tolerance...

Very much hope it's not 10 years though....
 

So there's nothing to replace the FW900 yet then; oh well, that's that dream shot down.

The monitor costs about $6,000. It's an insane amount of money, but it's affordable if I pay in installments, which is something I was considering for next year if it really was better than the FW900 in a flat format. But if it has burn-in issues, then it's not meant for normal use.

I wanted a flatscreen with colors as good and clear as the FW900 or better, with perfect gaming capabilities and perfect black.

Finding an FW900 right now is impossible, and it's getting tiresome owning CRTs; my P1130 has been acting up on me, so I am now on a 19-inch Sony Multiscan G420.

I can't keep this up; I want a flatscreen good enough to replace the FW900 :(
 
You make it sound as if the FW900 isn't a flatscreen. Perhaps you meant to say "thinscreen" :p

And my understanding is that even those fancy PVM and BVM TRIMASTER OLEDs only support 1080p and have a fixed native resolution, so if you're looking for quality, keep your eyes out for another FW900 or a good 21-inch Trinitron.
 
Interesting point. Suppose it were impulse-driven - would that allow you to display a still image for the amount of time it takes for the impulse to decay? When I program visual stimuli in Psychtoolbox for MATLAB, I can only get the stimulus to display for a single frame, which on our new vpixx display is 8.33 ms. How would an impulse-driven display interact with this? I'm assuming the temporal width of the impulse function is on the order of microseconds, or perhaps 1-2 ms.

I'm trying to wrap my head around all these ideas (and I figure with all your LightBoost testing you have a solid grasp on these concepts!).
 
Maybe it's time for the population to wake up and stop accepting crap items. The OLED dream is the result of the current technologies: current technologies are so bad that people long for much more. But guess what, if we don't MAKE them come up with proper quality, they won't give it to us. Where are the days of "let's make this monitor good enough that people will want to change"? Why do people buy items that aren't as good as the top-of-the-line versions of current tech? Do you really want to get yourself into a new "LCD"-like era and wait 10 more years to get a decent panel? I really wish people would wake up and not buy an item just because it's new, but instead buy it ONLY IF IT'S BETTER !!!!
 
(Most) People tend to be sheep. Always have, always will. There's a reason the marketing engine works...
 
I've gotten confirmation from Sony that their OLED monitors do not suffer from burn-in problems.

Thanks for been a loyal Sony user for so many years, the PVM-2541 is a great monitor, and I will say that’s better compared to a CRT even our prof. HD CRT monitors, OLED don’t suffer with burn in problems.
 
I've gotten confirmation from Sony that their OLED monitors do not suffer from burn-in problems.

All you found was an idiot salesman who doesn't know his own product:

From the 2541 manual:

http://pro.sony.com/bbsc/assetDownl...$SEL-asset-299223$original&dimension=original

On Burn-in

Due to the characteristics of the material used in the OLED panel for its high-precision images, permanent burn-in may occur if still images are displayed in the same position on the screen continuously, or repeatedly over extended periods.

Images that may cause burn-in
• Masked images with aspect ratios other than 16:9
• Color bars or images that remain static for a long time
• Character or message displays that indicate settings or the operating state
• On-screen displays such as center markers or area markers

To reduce the risk of burn-in
• Turn off the character and marker displays. Press the MENU button to turn off the character displays. To turn off the character or marker displays of the connected equipment, operate the connected equipment accordingly. For details, refer to the operation manual of the connected equipment.
• Turn off the power when not in use. Turn off the power if the viewfinder is not to be used for a prolonged period of time.

All these warnings are aimed at Video production usage. Essentially anything other than viewing 16:9 video may cause burn in.

As I stated before, this is definitely not suitable for use as a desktop computer monitor.
 
Interesting point. Suppose it were impulse-driven - would that allow you to display a still image for the amount of time it takes for the impulse to decay?
Correct: a static frame flashed for a specific impulse length. It can be a full-screen strobe or a sequential strobe (pixel-at-a-time, or scanline); the visual effect is essentially the same to the human eye -- what matters is the length of the flash per pixel per refresh. Basically, for fps=Hz motion (120fps@120Hz), eye tracking along the vector of object motion, with milliseconds rounded to the nearest 1ms for mathematical simplicity:

sample-and-hold (8ms @ 120Hz) -- baseline
50% frame impulse (4ms flash) -- 50% less motion blur
25% frame impulse (2ms flash) -- 75% less motion blur
12.5% frame impulse (1ms flash) -- 87.5% less motion blur

This is directly comparable to staying sample-and-hold but instead adding more refreshes instead of black periods between refreshes:

sample-and-hold (8ms, 120fps@120Hz) -- baseline
sample-and-hold (4ms, 240fps@240Hz) -- 50% less motion blur
sample-and-hold (2ms, 480fps@480Hz) -- 75% less motion blur
sample-and-hold (1ms, 960fps@960Hz) -- 87.5% less motion blur

Obviously, you see diminishing returns here, but 50% versus 87.5% is still significant. This becomes even more dramatic when using 1/60sec as the baseline: 1/120sec impulses have 50% less motion blur than 1/60sec, while 1/960sec impulses have 93.75% less motion blur than 1/60sec.

One example of a stroboscopic display is LightBoost. In its optimized setting (LightBoost=10%), it uses 1.4ms (1/700sec) stroboscopic flashes, so when you play a videogame fully synchronized (VSYNC ON, 120fps@120Hz), LightBoost has the exact same mathematical motion blur as a 700fps@700Hz display. One way to visualize this: it is mathematically equivalent to 700fps with lots of black frames inserted in between (120 visible rendered frames, spaced 1/120sec apart, shown as 1.4ms flashes with 8.3ms - 1.4ms = 6.9ms of blackness between refreshes). PixPerAn motion tests confirm this equivalence, too -- during PixPerAn motion, the blur trail measures 6x longer in non-LightBoost mode on the very same display. This corresponds exactly to the 8.33ms:1.4ms ratio, which confirms the math is accurate. (This is also where my oft-quoted "12x less motion blur than 60Hz" number comes from -- the 1.4ms:16.7ms ratio -- definitively confirmed by motion blur trail measurements in PixPerAn and the upcoming BlurBusters Motion Tests.) Other scientific papers (Science & References) have already touched on this eye-tracking-based motion blur equivalence between the increased-Hz method and the black-insertion method (flicker displays like CRT) -- the stroboscopic way of eliminating motion blur without needing the GPU power for insane frame rates / refresh rates.
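
To make the arithmetic above concrete, here is a minimal Python sketch (illustrative helper names, not from any real library; it assumes "clean" strobes and fps matching Hz):

Code:
# Minimal sketch of the strobe-length / motion-blur equivalence described above.
# Assumes clean strobes (instant on/off) and fps = Hz; names are illustrative.

def blur_reduction(impulse_ms, baseline_frame_ms):
    """Fraction of baseline eye-tracking motion blur eliminated by strobing."""
    return 1.0 - impulse_ms / baseline_frame_ms

def equivalent_hz(impulse_ms):
    """Sample-and-hold refresh rate with the same motion blur as this strobe length."""
    return 1000.0 / impulse_ms

print(equivalent_hz(1.4))                 # ~714 -> the "700Hz-equivalent" figure
print(blur_reduction(1.4, 1000.0 / 120))  # ~0.83 -> vs a 120Hz sample-and-hold baseline
print(blur_reduction(1.4, 1000.0 / 60))   # ~0.92 -> the "12x less than 60Hz" ratio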

The motion blur math above is pretty simple if you have clean strobes (full bright quickly, full dark quickly), such as flicker-driven OLED's or LightBoost displays. There are other factors that can fudge the numbers, such as phosphor decay. CRT has a noticeably measurable phosphor decay, so the math is not as simple; the usual baseline measurement is the time for the phosphor to decay to 90% black (a medium-persistence phosphor CRT typically takes 1-2ms to lose 90% of its light after being excited by the electron gun beam -- it takes only microseconds to light up, but the excitation time does NOT predict eye-tracking-based motion blur; the length of the impulse does).

If one has difficulty understanding this, it's worth studying camera photography -- a shutter speed twice as fast results in half the motion blur -- and likewise, flash photography bypasses shutter speed limitations: the amount of motion blur in the photograph is directly proportional to the length of illumination from the flash (if there's no external light source). Eye-tracking-based motion blur has a surprisingly close equivalence on stroboscopic displays and becomes equally easy to predict (especially when "clean" strobes are used -- on immediately, static image for a certain amount of time, then off immediately). Obviously, the faster the shutter speed, the faster the motion needs to be in order to create visible motion blur. The same holds true for stroboscopic displays. Eventually, eye-tracking-based motion blur is so completely eliminated that motion would need to be too fast for the human eye to reliably track before it could create visible blur. For close viewing distances and sharp-resolution material (e.g. videogames), the sweet spot is approximately a 1ms frame length (e.g. a 1ms strobe flash). This can vary quite a lot from person to person, but this number covers the majority of humans.

Barring that, if you've got clean strobes of a static frame (on-then-off), the motion blur mathematics is extremely simple: eye-tracking-based motion blur is directly proportional to the impulse length. The blur trail is easy to calculate: the impulse length as a percentage is the percentage of the original motion blur trail that remains. Basically, if you've got 10mm of moving-edge blur (caused by eye-tracking motion) during constant motion of an object on-screen, shortening the impulse to 50% of its length automatically shortens the motion blur trail by 50% (e.g. 5mm of moving-edge blur). I'm of course excluding other variables such as pixel persistence, eye tracking inaccuracy/limitations (which occur when moving objects become too fast to track), phosphor decay, and other factors that fudge this math. Fortunately, LightBoost displays have proven remarkably efficient (TFTCentral said LightBoost outperforms all scanning backlights they have ever tested, and LightBoost completely bypasses pixel persistence), and the mathematical equation holds up amazingly well on a LightBoost display. This math should also hold up very well on flicker-driven OLED displays (OLED pixels turn on and off nearly instantly!), as long as they're full-screen-impulse driven or sequential-impulse driven (edge-to-edge scan, such as top to bottom). I am rooting for flicker-driven modes on computer-based OLED displays, so we can get the stroboscopic motion-blur-eliminating effect that CRT's have long had. Not everyone likes flicker, but it should be easy to enable/disable such a mode.
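
That proportionality can be written as a one-liner. A rough sketch with made-up numbers (again assuming clean strobes, accurate eye tracking, and fps=Hz):

Code:
# Sketch of the proportionality above: eye-tracking blur trail width is roughly
# (motion speed) x (impulse length). Numbers below are purely illustrative.

def blur_trail_px(speed_px_per_sec, impulse_ms):
    return speed_px_per_sec * impulse_ms / 1000.0

speed = 960  # pixels per second of on-screen motion
print(blur_trail_px(speed, 8.33))  # ~8.0 px : 120Hz sample-and-hold
print(blur_trail_px(speed, 4.17))  # ~4.0 px : halve the impulse, halve the trail
print(blur_trail_px(speed, 1.4))   # ~1.3 px : LightBoost-length strobe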

When I program visual stimuli in Psychtoolbox for MATLAB, I can only get the stimulus to display for a single frame, which on our new vpixx display is 8.33 ms.
Are you using scanning backlight mode on your Viewpixx? If you turn on the scanning backlight mode, its response time is only 1ms (according to the manufacturer), but if this is a sequential scanning backlight, it would take about 8ms to flash the LED's sequentially from the top edge to the bottom edge.

Motion blur will be proportional to the illumination length of a single point of the display, so I'd expect the Viewpixx display has a motion blur trail of approximately 1/8th the frame step (1ms / 8.33ms) -- basically 8 times less motion blur in scanning-backlight mode than in non-scanning-backlight mode (unless the pixel persistence is streaking excessively between refreshes).

How would an impulse-driven display interact with this? I'm assuming the temporal width of the impulse function is on the order of microseconds, or perhaps 1-2 ms.
Although OLED can turn on nearly instantly (microsecond league), turning the OLED off almost immediately afterward would lead to a very dark picture because of the long black period between refreshes (since you need to stick to one strobe per frame/refresh for proper motion blur elimination). So we have a tradeoff between impulse length and brightness: with shorter impulses the picture is too dark; with longer impulses there can be more motion blur. So this is a technological challenge.

There are diminishing returns for shorter impulses. Eventually it becomes uneconomical (you need an insane amount of brightness for ultra-short flashes to prevent a dim image). The technological sweet spot is near 1ms; this represents the point where diminishing returns end for most of the human population. Even a portion of gamers are unable to see the benefits of LightBoost (1.4ms = 1/700sec stroboscopic flashes of refreshes), while others see a minor improvement (not as big as 60Hz-vs-120Hz), while yet others see a stunningly major improvement (far bigger than the difference between 60Hz and 120Hz; see the testimonials).

That means either a stroboscopic display with 1ms impulses (easy to do with a CRT, an OLED, or a LightBoost-like display), or the same 1ms frame length on a sample-and-hold display capable of 1000fps@1000Hz (difficult to do with present technology). This 1ms sweet spot is pretty common for CRT's, but very rare for LCD's (except a few high-end HDTV's). Unfortunately, even Panasonic plasma with 2500 Hz Focused Field Drive is hamstrung by plasma phosphor decay limitations (~5ms for the red/green phosphor). Fortunately, OLED's should have no problem reaching an impulse length of about 1ms, provided there's enough brightness in the impulses to compensate for the long black periods between refreshes (a 1ms:8.33ms ratio at 120Hz, or a 1ms:16.7ms ratio at 60Hz). OLED pixels react nearly instantly, so they are very impulse-friendly. I'm personally looking forward to the day that OLED panels achieve enough brightness to make short impulses possible.
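
As a rough sketch of that brightness tradeoff (hypothetical function, illustrative numbers, not actual OLED specs):

Code:
# To keep the same average (perceived) luminance, peak luminance must scale
# inversely with the strobe duty cycle. Purely illustrative values.

def required_peak_nits(target_avg_nits, impulse_ms, frame_ms):
    duty_cycle = impulse_ms / frame_ms
    return target_avg_nits / duty_cycle

frame_120hz = 1000.0 / 120  # ~8.33 ms per refresh
print(required_peak_nits(120, 8.33, frame_120hz))  # ~120 nits : sample-and-hold
print(required_peak_nits(120, 2.0,  frame_120hz))  # ~500 nits : 2ms impulses
print(required_peak_nits(120, 1.0,  frame_120hz))  # ~1000 nits: 1ms impulses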

At the very end of the day, gaining CRT-quality motion on a flicker-free sample-and-hold display would require ~1000fps@1000Hz (to gain equivalence to a CRT phosphor with ~1ms decay) in order to eliminate flicker AND interpolation AND motion blur simultaneously. And to take advantage of that requires a GPU capable of 1000fps. Neither the display (native 1000Hz refresh) nor the GPU (1000fps capable) is possible today; it's currently only achievable via interpolation, as today's GPU's are not able to pull that off natively. So it will take several decades' worth of technological progress to finally merge the benefits of CRT-quality motion clarity with the benefits of a completely flicker-free display at completely native refresh rates. As a compromise, I'll take flicker, as I'm motion-blur sensitive but not very flicker-sensitive (like the CRT-using population, the target audience of LightBoost, which uses flicker to eliminate motion blur). For the next few decades, flicker-sensitive/motion-blur-sensitive people will have to put up with flicker OR interpolation OR motion blur (or more than one of the above). That said, OLED is probably the prime candidate technology for the world's first commercialized 1000fps@1000Hz native-refresh-rate display (albeit probably not this decade).

P.S. Are you aware of Strobemaster's work in converting LightBoost displays into tachistoscopes? He does that by hooking an external circuit to the LightBoost strobe circuit. This allows full external control of the impulse length of the LightBoost strobes.
P.P.S. OLED displays that come out in the future that are active-shutter-3D friendly will probably be adequately good motion-blur-reducing displays, although they would probably use a longer impulse length to compensate for OLED's traditional brightness difficulty. As a compromise, OLED's can also technically use dynamic impulse lengths (e.g. shorter impulses for dark pixels, longer impulses for bright pixels), which would lead to less motion blur during dark scenes and more motion blur in bright scenes. (This bright-ghosting effect occurs on CRT as well; bright images have a longer phosphor decay.)

EDIT: There may be other technologies (e.g. blue phase LCD's) that may end up being able to do ultrahigh refresh rates more easily (e.g. 1000fps@1000Hz), but realistically, I don't see 1000fps@1000Hz displays becoming practical in the near future.
 
Thanks for the fantastic and readable post.

Yes, we'll be using the scanning backlight mode (my supervisor just finished calibrating the display today, so hopefully I'll be able to run my experiments on it very soon). I'm assuming a scanning backlight implies that it's a sequential strobe (emulating a CRT scanline)? If so, it's interesting that the manufacturer claims a 1ms response time given the sequential strobing. I'm pretty sure he'll be at the Vision Sciences Society conference (starting in about a week and a half), so perhaps I can bring it up with him there.

I wasn't aware of Marc Repnow's work - this is very interesting. For my current area of study, being able to control the onset and offset of stimuli with sub-10 ms timescales would be useful (notwithstanding Bloch's Law).

In case you're curious, the colors vary widely on the vpixx display depending on viewing angle. Also, based on a conversation I had today with my supervisor, it seems that the contrast ratio (not ANSI) is about 1000:1 - he measured full brightness at 99 cd/m^2 and full black at 0.1 cd/m^2
 
Thanks for the fantastic and readable post.
Yes, we'll be using the scanning backlight mode (my supervisor just finished calibrating the display today, so hopefully I'll be able to run my experiments on it very soon). I'm assuming a scanning backlight implies that it's a sequential strobe (emulating a CRT scanline)?
Yes, albeit likely in a coarse manner (a full row of LED's illuminating a cross-section of the LCD). Though I've seen some manufacturers interchangeably call scanning backlights impulse backlights, and vice versa, so it would be nice to have real confirmation.

If possible, get an inexpensive high-speed camera (such as a $300 Casio Exilim EX-FC200S or EX-ZR200; search eBay for imports from Japan) and point it at the Viewpixx in both the 480fps and 1000fps recording modes. Even a cheap high-speed camera should be able to determine your Viewpixx's approximate scanning backlight sequence, which makes it a very worthy verification check. I used this camera for my high-speed video of LightBoost.

I wasn't aware of Marc Repnow's work - this is very interesting. For my current area of study, being able to control the onset and offset of stimuli with sub-10 ms timescales would be useful (notwithstanding Bloch's Law).
Yes, Bloch's Law makes sense. Once a strobe flash is short enough, strobes look exactly the same to the eye: a 1ms strobe flash versus a 1-microsecond strobe flash that is 1000 times brighter looks the same -- it's the same number of photons hitting the eye. That said, this is for static impulses.
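
A toy illustration of that photon-count equivalence (purely illustrative numbers and units):

Code:
# For very short flashes, perceived intensity tracks the total light delivered
# (intensity x duration). "Nit-milliseconds" stands in for photon count here.

def delivered_light(luminance_nits, duration_ms):
    return luminance_nits * duration_ms

print(delivered_light(100, 1.0))        # 1 ms flash at 100 nits      -> 100.0
print(delivered_light(100_000, 0.001))  # 1 us flash, 1000x brighter  -> 100.0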

When we're talking about tracking motion, things get more interesting. Take the scenario of a flying bullet going through an apple -- the famous high-speed photographs. A bullet becomes motion-blurred under a millisecond strobe light, but you get perfect strobe captures with a microsecond flash. Even naked human eyes can glimpse a speeding bullet floating in mid-air, if the strobe lasts a microsecond and is timed exactly at the moment the bullet zooms directly in front of the eyes -- just like high-speed flash photography can capture a speeding bullet if the flash is fast enough. A 1-millisecond flash is not fast enough, but a 1-microsecond flash can be fast enough for naked human eyes to see a speeding bullet. In this situation, what the human eye sees is the scene stroboscopically flashing briefly. The 1ms flash looks the same length to human eyes as the 1-microsecond flash (if the 1-microsecond flash is 1000 times brighter, due to Bloch's law), but the motion (the speeding bullet) is more frozen by the shorter strobe, and thus the bullet is seen by the human eye as a momentary flash of a scene containing a bullet floating in mid-air, even though it was speeding along supersonically. This applies to naked human eyes staring at the scene, not just photographic film. Likewise, shortening the stroboscopic flash lengths on a strobe display (at one short strobe per refresh/frame) has a similar effect of making motion clearer while the eyes are tracking -- in this case, instead of reducing photographic blur caused by external motion, you're reducing eye-tracking-caused motion blur.

For strobe displays, for maximum motion clarity, it is very important to have the frame rate exactly match the refresh rate, for the most perfect possible motion. This is because stutters are essentially repeat refreshes, which create a longer sample (sample-and-hold motion blur) and also cause discontinuities in eye-tracking synchronization with the moving object on the screen. Any deviation from the ideal line is perceived as judder/stutter/motion blur. At very high framerates, ultra-high-speed repeat refreshes during fast motion often simply blend into what looks like motion blur; e.g. 60fps@120Hz looks like a double-edge motion blur. In fact, stutter-created motion blur is still visible to the human eye even in the neighborhood of 120fps@240Hz, precisely because of this discontinuity, given sufficiently fast motion (at least ~240 pixels per second) and the ability to see 1 pixel of motion blur in high-resolution moving objects.

[Image: fps-vs-hz-small.png]


Mathematically, this is the reason why fps=Hz benefits strobe displays far more than sample-and-hold displays. Going from 30fps@60Hz to 60fps@60Hz on an LCD only reduces motion blur by 50%. However, going from 30fps@60Hz to 60fps@60Hz on a CRT has a far more dramatic effect (60fps@60Hz on a CRT looks many times clearer than 30fps@60Hz). Likewise for LightBoost: 60fps@120Hz (so-so) versus 120fps@120Hz (dramatic). (LightBoost is hardware-restricted to only function at 100-120Hz.) The eyes stay more perfectly in sync with the moving object, and it becomes possible to eliminate measurable eye-tracking-based motion blur. For strobe displays, fps=Hz is a dramatic improvement for people who have good eye-tracking ability.
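
To put that fps-vs-Hz point in sketch form (made-up function, assumes clean strobes and perfect eye tracking):

Code:
# When fps < Hz on a strobed display, each unique frame is flashed Hz/fps times,
# and the tracking eye sees multiple offset copies instead of one sharp image.

def strobe_copies(speed_px_per_sec, fps, hz):
    copies = hz // fps                  # flashes per unique frame
    separation = speed_px_per_sec / hz  # offset between copies, in pixels
    return copies, separation           # separation only matters when copies > 1

print(strobe_copies(960, 120, 120))  # (1, 8.0): fps=Hz, single sharp image
print(strobe_copies(960, 60, 120))   # (2, 8.0): the "double-edge" blur described above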

Likewise, an impulse-driven OLED should behave the same way -- a dramatic improvement in motion clarity -- assuming the impulse lengths are made sufficiently short (~2ms).

In case you're curious, the colors vary widely on the vpixx display depending on viewing angle. Also, based on a conversation I had today with my supervisor, it seems that the contrast ratio (not ANSI) is about 1000:1 - he measured full brightness at 99 cd/m^2 and full black at 0.1 cd/m^2
Interesting to know. I wonder what technology the Viewpixx LCD is.
 
That bullet example is an excellent thought experiment to illustrate these concepts!

I'll look into the high speed camera - it's something I've thought of before, and I could use it for my athletic pursuits also (there is critical information involved in tennis strokes that is revealed at those timescales).

Let me know if you have any specific questions about the Viewpixx display. I can bring a list of them to ask him (I'm flying out on the 9th of May).
 
All you found was an idiot salesman who doesn't know his own product:

From the 2541 manual:

http://pro.sony.com/bbsc/assetDownl...$SEL-asset-299223$original&dimension=original



All these warnings are aimed at Video production usage. Essentially anything other than viewing 16:9 video may cause burn in.

As I stated before, this is definitely not suitable for use as a desktop computer monitor.

If you want to be anal about user manuals, then IPS LCD is also unsuitable for desktop computer monitor usage. Here's a section from HP's IPS monitor user guide:

http://h10032.www1.hp.com/ctg/Manual/c03592306.pdf

Code:
CAUTION: Burn-in image damage may occur on monitors that display the same static image on
screen for a prolonged period of time.* To avoid burn-in image damage on the monitor screen, you
should always activate a screen saver application or turn off the monitor when it is not in use for a
prolonged period of time. Image retention is a condition that may occur on all LCD screens. Monitors
with a “burned-in image” are not covered under the HP warranty.
* A prolonged period of time is 12 consecutive hours of non-use.

Code:
HP Watermark and Image Retention Policy
The IPS monitor models are designed with IPS (In-Plane Switching) display technology which provides
ultra-wide viewing angles and advanced image quality. IPS monitors are suitable for a wide variety of
advanced image quality applications. This panel technology, however, is not suitable for applications
that exhibit static, stationary or fixed images for long periods of time without the use of screen savers.
These types of applications may include camera surveillance, video games, marketing logos, and
templates that are displayed on the screen for a prolonged period of time. Static images may cause
image retention damage that could look like stains or watermarks on the monitor's screen.
Monitors in use for 24 hours per day that result in image retention damage are not covered under the
HP warranty. To avoid image retention damage, always turn off the monitor when it is not in use or use
the power management setting, if supported on your system, to turn off the display when the system is
idle.

At our office we have about a dozen old (4+years) IPS monitors that all show signs of permanent retention. CRT's were also theoretically susceptible to burn-in but very few people saw it in practice. There are a few people out there who use the smaller Sony 11" and LG 15" OLEDs released a few years ago for gaming and reported no signs of burn-in. We also know that Samsung's AMOLED burns in easily as seen from numerous videos/screenshots of Galaxy phones on the net.

It remains to be seen if the new larger OLEDs are easy or difficult to burn-in.
 
If you want to be anal about user manuals, then IPS LCD is also unsuitable for desktop computer monitor usage.

I am not being anal. I am warning someone who is considering wasting $5000+ to use this as a computer monitor, when it would certainly burn in and he wouldn't be covered by the warranty.

There are a few people out there who use the smaller Sony 11" and LG 15" OLEDs released a few years ago for gaming and reported no signs of burn-in. We also know that Samsung's AMOLED burns in easily as seen from numerous videos/screenshots of Galaxy phones on the net.

It remains to be seen if the new larger OLEDs are easy or difficult to burn-in.

There is nothing different about larger screen sizes and OLED technology. As you have noted, even phones are burning in easily, and consider how phones work: they dim seconds after you stop touching them.

It would be ten times worse using an OLED as a desktop monitor. The only reason you don't hear about burn-in on the small LG/Sony TVs is that they barely sold any at all. They were expensive novelty items. Even the few people who have them likely don't use them much, certainly not as their everyday monitor. They are just too expensive/small/low-res.
 

My point was that you don't really know it will burn in easily unless someone has done long-term testing on this specific model. The tech is moving forward all the time; I'm sure they must be making some advancements in burn-in prevention if they intend for OLED to replace LCD as the next TV standard.

I have not seen any reports of burn-in on these monitors, and they have sold something like 20K+ units according to a PR I saw a few weeks ago. Even in video production, people leave static images on the screen all the time and run aspect ratios that don't match 16:9. Disclaimers in user manuals are there for legal reasons, as in the HP IPS example. Yet it's not easy to burn in an IPS monitor over a short period of time. These OLEDs may be similar if used with the proper precautions.
 
That bullet example is an excellent thought experiment to illustrate these concepts!

I'll look into the high speed camera - it's something I've thought of before, and I could use it for my athletic pursuits also (there is critical information involved in tennis strokes that is revealed at those timescales).

Let me know if you have any specific questions about the Viewpixx display. I can bring a list of them to ask him (I'm flying out on the 9th of May).
Thanks for the compliment! Presently I'm most interested in seeing how the display looks under a high speed camera -- to find out its scanning backlight pattern, and how it meets the manufacturer's 1ms claim.
 
My point was that you don't really know it will burn in easily unless someone has done long-term testing on this specific model.

That isn't a logically consistent argument to make. You don't assume the status quo has changed and keep requiring proof that it hasn't.

We assume the status quo until proven otherwise. You are free to retest along the way, but the prevailing assumption logically holds until it is proven faulty.

It is also likely that a technology breakthrough affecting burn in will be highly advertised, then that can be tested.

Until testing proves otherwise, it is safe to assume the pervasive OLED burn-in issue remains.
 
Thanks for the compliment! Presently I'm most interested in seeing how the display looks under a high speed camera -- to find out its scanning backlight pattern, and how it meets the manufacturer's 1ms claim.

Yeah, I think the 1ms is BS. I spoke to my supervisor yesterday and it is indeed a sequential strobe, so it does take 8ms to complete the scan. I'm curious to speak to the manufacturer in person about his 1ms claim.

On the other hand, the monitor has some pretty advanced capabilities -- you have a lot of control over the bit depth, and all the video processing is done in a separate video processing box. Also, the monitor can go up to 250 cd/m^2 if I'm not mistaken.

I agree, a high speed camera would be great.
 
In the absence of evidence otherwise, I would agree that Sony needs to be taken at its word (i.e., it will burn in). I've also read accounts of it having an aggressive screen saver, further suggesting such a concern...
 
Until testing proves otherwise, it is safe to assume the pervasive OLED burn-in issue remains.

Can you cite some examples of this pervasive OLED burn-in issue? I'm only aware of the one on the AMOLED phones. Those are ~$50 screens with likely inferior materials to those in a $5K monitor. They are also subject to higher brightness requirements to deal with outdoor lighting.

I guess we are all free to speculate. I prefer not to jump to conclusions about a new product without actual testing. Fact is that there is not a single burn-in report I've been able to find on the net for the BVM/PVM OLED's. On the flip side, there are several reports from owners claiming no burn-in after months of 8+ hours/day usage.
 
Can you cite some examples of this pervasive OLED burn-in issue? I'm only aware of the one on the AMOLED phones. Those are ~$50 screens with likely inferior materials to those in a $5K monitor. They are also subject to higher brightness requirements to deal with outdoor lighting.

It is the same kind of materials with the same kind of issues. The price difference is yield (much worse on big devices) times area.

If you understand how OLED works you would understand that burn in is inevitable with this kind of technology. It is the nature of the beast. The best you can hope for is that it will take longer, rather than shorter to happen.

It is pervasive in that it happens everywhere consumers use OLEDs and report on it. There are a few tablet OLEDs out there, and the reports of burn-in on those are even worse than for OLED phones. Phones are lightweights for screen abuse compared to computer monitors. Have you seen many LCD phone screens with burn-in? I don't think so.

On the flip side, there are several reports from owners claiming no burn-in after months of 8+ hours/day usage.

Please share these reports. If they are displaying video (shocking for a video production monitor), one wouldn't expect burn-in. That is just about the easiest thing you can do for a screen with a propensity to burn in: show video on it. Video is essentially a random signal and will create even wear.

This is why video is the next step for OLED screens, and not computer monitors. Computer monitors are the worst case for screen abuse. OLED is NOT ready for this level of abuse.

The original poster should stay far, far away from this screen for computer usage. But wizziwig, feel free to put your money where your mouth is, and jump right in.
 
I think it's safe to say that OLED has only gotten off the ground for mass production in the last few years. It's going to take more time to develop and mature before it's ready to be shoved into larger sizes and ready for the critical eye of a desktop enthusiast.

The TV market will make OLED mature faster because of its naturally deep contrast, which is what most people look at when they see "numbers" being thrown at them left, right, and center.

So for now just be patient.
 
So for now just be patient.
Agreed. My feeling, though, is that OLED isn't going to really overtake LCD for many years. Probably not till the 2020's. It took many years for SSDs to become popular, and even now SSDs have not completely replaced HDDs.

OLED checks off a lot of boxes, but I already know there are a lot of problems standing between OLED and being ready to replace LCD.
 

So you're citing the same burn-in example I did - Samsung AMOLED displays used in phones and tablets. Their design and choice of materials are specifically for low-cost, low-power, low size/weight, high brightness, low-usage-hours. Their requirements force them to make compromises that a manufacturer making monitors and TVs may not need to make.

Large screen Sony and LG OLEDs use their own designs and materials chosen for their respective target markets. You've given zero evidence that either of these designs have the same rapid burn-in as the Samsung version. You're just making assumptions.

Sure, burn-in and retention are inevitable - no disagreement there. That is why they have disclaimers in the manuals. The same was true for CRT and some IPS LCDs. But in real-world usage, the burn-in happened at a rate that still allowed these technologies to be used for monitors and televisions for many years. There is not enough data on these newer non-Samsung OLEDs to say if their burn-in risk is manageable or not. It might be like a CRT or as bad as plasma.

As we all know, posting on internet forums like this one is dominated by people with problems, since they are posting to find solutions. For every positive post/review, you find 10x or more posts about issues/defects. People happy with their displays tend to post less. The fact that there are a handful of positive reviews and zero negative reviews of these OLEDs is some evidence that their performance and defect rate might be very good. A poster on another forum came to a similar conclusion and posted some links:

http://www.overclock.net/t/1379893/...r-prototypes-shipping-in-may/70#post_19711329

You're also mistaken about how the broadcast monitors are used in the field. They frequently show hours of non-native aspect content with letterbox bars. They leave paused images, logos, time-counters, status displays, etc. on the screen all the time.

I'm not going to buy one of these monitors because they are unsuitable for desktop use for many other reasons besides burn-in risk - too expensive, too small, low resolution, lack of inputs, etc. But I'm not going to discourage others who can afford these displays from buying them strictly on speculation and assumptions.

You can have the last word. Since neither of us owns this display, there is no point in continuing this discussion. We need a real owner to post their experience.
 

I use these examples because these are the only products that have made any consumer penetration. Any time consumers meet OLED, we see burn-in.

There was also more evidence with the Sony XEL-1, which was tested by DisplaySearch and found to have only a 5,000-hour life when displaying white and 17,000 hours when displaying video, against an advertised 30,000-hour life. It also had an extremely aggressive dimming circuit that kept turning down the brightness -- so aggressive it made testing the screen difficult. The short life is indeed evidence of rapid burn-in at Sony as well as Samsung.

All manufacturers have the same issues with OLED durability. It isn't like Samsung is the poor stepchild of OLED; if anything, Samsung is the industry leader in OLED. If anyone had better materials, they would be competing with them. They aren't.

Your assertion that better OLED materials are available, but manufacturers are using inferior ones in phones/tablets, is completely ABSURD conjecture with no basis in reality.

You also have the math on phone/TV screen economics backwards. The coming OLED TVs have 100 times the area and would require 100 times the OLED material to build a screen, IF they both had 100% yield. Given realistic yields, we can expect OLED material usage to be several hundred times higher on a big-screen TV.

$50 Cell screen * ~300 times material cost = $15000 for TV screen.

So if anything, they would need to use cheaper materials on the TV to keep costs from running away.

Realistically they use the same material across the board. Everyone is fighting the same physics on OLED burn-in. There is zero evidence to the contrary.


There is not enough data on these newer non-Samsung OLEDs to say if their burn-in risk is manageable or not. It might be like a CRT or as bad as Plasma.
...
I'm not going to buy one of these monitors...
....
We need a real owner to post their experience

If anything, so far OLED has a way to go before it is even as good as Plasma.

I don't think anyone should recommend OLED for desktop usage unless they are ready to be the first guinea pig, and I note you aren't.
 
I am once again considering this monitor.

Is it really that bad for desktop use?
 
I am once again considering this monitor.

Is it really that bad for desktop use?
Well, it has the OLED burn-in risk when used as a monitor. It's got great color, deep blacks, and good-looking motion; the motion blur is not zero, but it's pretty 'clean' (no noisy blur like on plasmas). The OLED gamut is a bit different and a matter of personal preference. OLED does burn in very easily at the moment (far more easily than plasma), so bear this in mind as a consideration. It would be a good HDTV video monitor...

Is this for computer use, not video use? If so, have you also considered the Eizo FDF2405W professional monitor? It's marketed for satellite-map motion blur reduction, but if you're going to pay huge bucks for a Sony OLED monitor, the same budget can also be aimed at the FDF2405W: no burn-in risk, a bit less motion blur than the Sony OLED, and a VA panel refreshed at 240Hz (no interpolation) and strobed at 120Hz with a LightBoost-style strobe backlight (strobing is mentioned on page 15 of the FDF2405W manual), with a 5000:1 native contrast ratio. Not the inky blacks of OLED, but no burn-in risk -- OLED is still very sensitive to that problem.

So if you really do have a huge budget, then...
For video -- consider trying out the Trimaster
For computer -- consider trying out other high-end alternatives
 
Sometimes I wonder, if LCD tech didn't exist, how CRTs would be today: most likely lower power consumption, thinner, and higher res?

IMO, LCDs are a curse for PCs. I look at my old CRT TV in awe at how lag/ghost-free images in motion are compared to any LCD I have ever seen.
 
IMO, LCDs are a curse for PCs. I look at my old CRT TV in awe at how lag/ghost-free images in motion are compared to any LCD I have ever seen.
Obviously, you haven't seen the "It's like a CRT" and the "Less motion blur than my Sony FW900 CRT" testimonials of LightBoost LCD's...
(It really does reduce LCD motion blur by a full order of magnitude)
 
If CRT computer monitors had continued to develop, I'd imagine we'd see an even more beautiful and tighter-pitched successor to the GDM-FW900, among other wonders. (Sony had a 0.15 pitch prototype successor to its 4:3 line, for example.)

LCDs have been both a blessing and a curse. However, for raw image quality for PC enthusiasts, I think more a curse... but we're just a niche... alas...

It will be some form of OLED, it appears, that will eventually get us back to what used to just be considered the fundamentals in terms of color, black level, dynamic range, and motion, in one screen...

In the meantime, one can go multi-screen or such, to try to combine displays with the particular strengths you seek, e.g., an LCD and a plasma...

(Would love to try the Sony OLED. However, I would not be able to live with the risk that I was one computer crash, or one fallen-asleep session, away from damaging a $5,500 investment. Or maybe even at risk just from my regular computer usage...)
 
I am yet again considering it. I have been on and off about it constantly for the past year, but I want it.

The price is manageable for me (paying it off in installments over a long time).

I am by no means a professional user; I am just a 20-year-old kid who wants the best for gaming, movies, and desktop use.

I don't know proper calibration; I just want something to beat my FW900 in absolutely everything, and I am willing to pay everything I have to get that.

I repeat, I do not want this for professional use, none whatsoever, purely entertainment. I want to be blown away by the perfect black level; I want to be blown away by strong, bright colors, which is something my FW900 does -- it is able to show strong, bright colors.

As mentioned, I do not know calibration; I just know what I like. I have my FW900 set to 9300K color temperature. Unfortunately it is dying; after this one I am down to my last FW900, and that FW900 doesn't show as good a picture as this one did. I bought it several years ago but botched the process of removing the antiglare, leaving glue behind. I finally managed to get rid of the glue, and it's been working just like my former FW900 did, but sadly the picture occasionally blinks, which is the first sign of death.

I have yet again contacted a different store selling this monitor, as they sell it cheaper. They contacted Sony and got the same response I got:

they do NOT suffer from burn-in issues worse than CRT or plasma.

I want this monitor really badly, and I have been on this for over a year now, but I am worried that in the end I will be disappointed because it is not meant for entertainment use.

What worries me is that there may be things bad or missing that professionals wouldn't care about, as they wouldn't matter for professional use.

For example, dull colors,

or low brightness.

I want something that beats my FW900 in absolutely everything. What I imagine when I think about this monitor is stronger colors, stronger brightness, and a sharper picture. I want the difference to be clear enough to say I don't need my FW900 anymore for anything, that it's worthless to me now (of course I will still always love it). My expectations are as high as they can be.
 
Well, SED monitors were basically CRT monitors on steroids; alas, the technology died with Canon :(

[Image: Sed_vs_CRT.jpg]


The SED monitor utilises the collision of electrons with a phosphor-coated screen to produce light, as do CRTs. What makes the SED monitor unique is the incorporation of a very narrow gap, several nanometers wide, between two electric poles. When 10 volts are applied, electrons are emitted from one side of the slit. Some of these electrons spread to the other side of the slit, causing light to radiate when they collide with the phosphor-coated glass.

As the SED monitor works on the same light-production principle as CRT monitors, it can provide sharper, more dynamic color than LCDs and plasma displays. SEDs also have a faster video response time. As the SED monitor does not require electron-beam deflection, it is possible to make screens that are only a few centimeters thick.

Another major benefit of the SED monitor is low power consumption. The SED uses only two thirds of the power needed to run a plasma screen. It also has lower power consumption than LCDs and traditional CRTs. The SED monitor will not only transform the way we view television and films, but because of its low power use, it will be earth-friendly too.
[Image: sedcrop-578-80.jpg]
 
Sometimes I wonder, if LCD tech didn't exist, how CRTs would be today: most likely lower power consumption, thinner, and higher res?

IMO, LCDs are a curse for PCs. I look at my old CRT TV in awe at how lag/ghost-free images in motion are compared to any LCD I have ever seen.
What I will never understand is how it was widely believed that LCD had better picture quality... in 2005.

LCD will be the death of CRT, SED, plasma, and just maybe OLED :(
 
FED? Everyone knows the next big thing is 8K LCD :p

The thing I love most about plasma is the uniformity of motion: instant on, instant off, the whole screen moving as one from edge to edge. OLED promises the same thing; it just feels like it's going to be a long time until everything comes together.
 