OLED video monitor burn-in notice.

Whoisthisreally

Here's a notice from the manual of a BVM OLED monitor:

On Burn-in


Due to the characteristics of the material used in the OLED panel for its high-precision images, permanent burn-in may occur if still images are displayed in the same position on the screen continuously, or repeatedly over extended periods.

Images that may cause burn-in

• Masked images with aspect ratios other than 16:9
• Color bars or images that remain static for a long time
• Character or message displays that indicate settings or the operating state
• On-screen displays such as center markers or area markers

To reduce the risk of burn-in

• Turn off the character and marker displays
Press the MENU button to turn off the character displays. To turn off the character or marker displays of the connected equipment, operate the connected equipment accordingly. For details, refer to the operation manual of the connected equipment.
• Turn off the power when not in use
Turn off the power if the viewfinder is not to be used for a prolonged period of time.

Screen saver


This product has a built-in screen saver function to reduce burn-in. When an almost still image is displayed for more than 10 minutes, the screen saver starts automatically and the brightness of the screen decreases.

On a Long Period of Use

Due to an OLED panel's structure and the characteristics of the materials in its design, displaying static images for extended periods, or using the unit repeatedly in high temperature/high humidity environments, may cause image smearing, burn-in, areas whose brightness is permanently changed, lines, or a decrease in overall brightness.

In particular, continued display of an image smaller than the monitor screen, such as one in a different aspect ratio, may shorten the life of the unit. Avoid displaying a still image for an extended period, or using the unit repeatedly in a high temperature/high humidity environment, such as an airtight room or the area around the outlet of an air conditioner.

To prevent any of the above issues, we recommend reducing the brightness slightly and turning off the power whenever the unit is not in use.

Of course, the meanings of "extended period" and "long time" are left perfectly impenetrable. OLED recommendations remain the same: use it only in an air-conditioned room, at lower brightness.
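
The manual never says how the monitor decides an image is "almost still", either. Here's a minimal sketch of how that kind of saver might work; the frame-difference test, the threshold, the dim level, and the set_brightness callback are all my guesses, not Sony's implementation:

Code:
import numpy as np

STILL_THRESHOLD = 2.0      # assumed: mean per-pixel delta below this counts as "almost still"
TRIGGER_SECONDS = 10 * 60  # the manual's 10-minute trigger
DIM_LEVEL = 0.5            # assumed: brightness fraction while the saver is active

def screensaver_step(prev_frame, frame, still_since, now, set_brightness):
    """One pass of a dim-on-static-image loop. Frames are 2D integer luma arrays."""
    delta = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16)).mean()
    if delta > STILL_THRESHOLD:
        set_brightness(1.0)  # picture is moving again: restore full brightness
        return now           # reset the "still since" timestamp
    if now - still_since >= TRIGGER_SECONDS:
        set_brightness(DIM_LEVEL)
    return still_since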

Discuss.
 
I have a mouse with an OLED display that shows the current DPI setting. After a year of use, the pixels that have been "on" the most are very much dimmer than those that were not in use.
[Attached images: sentinel_1.jpg, sentinel_2.jpg]
 
Needs more R & D. What else? So far, it's the closest thing to an "ideal" display tech. Plasma had these issues as well, but they've gotten better, and it's still around. Hopefully, within 2-3 years they'll get this stuff sorted out.
 
Yes. This is the major fly in the ointment for anyone dreaming of an OLED computer monitor.

TV/video is less harsh, as your signal tends to be always changing and is an average over time, so you just have to watch out for static logos or too much programming with black bars.

But on a computer monitor there will be too many static elements to avoid.

This is why when people ask about waiting for OLED computer monitors, I usually point out how hopeless that is.

Some early adopters will re-purpose an OLED TV/Video monitor as a computer display, but they will be very expensive and they will burn in, and it won't be covered by warranty.

I really only have an interest in an OLED TV at this point. This is the one area where absolute blacks really stand out (watching movies in a darkened room), and with a video signal the OLED will incur little burn-in if you are careful. But for a monitor, I value resilience more than absolute black. In my well-lit room, there isn't much difference between my monitor's bezel and black on its screen. Having no worries about burn-in is a huge bonus.
 
Needs more R & D. What else? So far, it's the closest thing to an "ideal" display tech. Plasma had these issues as well, but they've gotten better, and it's still around. Hopefully, within 2-3 years they'll get this stuff sorted out.
Plasma STILL has those issues though and is still a tech that is unsuitable for computer monitors. That's why I believe there is a real possibility the same will be true for OLED. Although they will get better, burn in will likely always remain a problem. Just look at the AMOLED screens used in phones. You can get uneven pixel "wear" after only a couple of months of regular usage.
 
This is a "feature" and will never get fully fixed. Think of the profits to be made from new sales.
 
Here's a notice from the manual of a BVM OLED monitor:

Of course, the meanings of "extended period" and "long time" are left perfectly impenetrable. OLED recommendations remain the same: Use it only in an air conditioned room, on lower brightness.

Discuss.

The devil is always in the detail.
From personal experience even LCDs suffer from burn-in:

1) Industrial control systems, of which I've seen many, often display a given graphic for hours at a time. On some systems the ghosts of these screens are clearly visible.

2) About 5 months ago I fell asleep leaving a game called Children of the Nile running for 8 hours, on top of however long I had played that day. The ghost of the game UI was clearly visible for about 3 months, and even now I can see some remaining outlines if I care to look closely. HP 2475w monitor, IPS.

Of course, if you don't believe me you can always google 'lcd screen burn' and draw your own conclusions but of course, you should never believe anything you read on the internet ;)

Whether a result of panel type or high contrast graphics or 'who knows?', it ultimately comes down to the meanings of "extended period" and "long time". This applies to all screen technology.
 
The devil is always in the detail.
From personal experience even LCDs suffer from burn-in:

1) Industrial control systems, of which I've seen many, often display a given graphic for hours at a time. On some systems the ghosts of these screens are clearly visible.

2) About 5 months ago I fell asleep leaving a game called Children of the Nile running for 8 hours, on top of however long I had played that day. The ghost of the game UI was clearly visible for about 3 months, and even now I can see some remaining outlines if I care to look closely. HP 2475w monitor, IPS.

Of course, if you don't believe me you can always google 'lcd screen burn' and draw your own conclusions but of course, you should never believe anything you read on the internet ;)

Whether a result of panel type or high contrast graphics or 'who knows?', it ultimately comes down to the meanings of "extended period" and "long time". This applies to all screen technology.

Yes, well, while an "extended period" on LCD may be an office work day, and may be reversible, who knows how this translates to OLED? 45 minutes cooking on a hot day? Six hours? Merely saying a cuss word in its presence? :D
 
There have been some reported incidents of LCD image retention, but they are more the exception than the rule. You really need extreme industrial abuse to get permanent image retention on an LCD (like running 24/7, at max brightness (more heat), with static patterns, for years).

The reverse is true for OLED: they need to be babied to an almost extreme degree.

I have >12,000 hours on my monitor with never the slightest hint of retention. I don't even use a screen saver or auto-sleep the monitor. I often fall asleep forgetting to power it off, leaving static images on the screen for hours.

Hard use and abuse, and not even a hint of LCD image retention. I have never seen any personal LCD screen with IR in real life either, and I know people running 10-year-old 15" LCD monitors....

OLED OTOH is, and will likely remain, extremely fragile. Pretty much all emitter technologies wear out, and the organic nature of OLEDs makes them break down faster than any other emitter technology we use. Faster than CRT/plasma, and this will likely always be the case.

Bottom line: Bringing LCD image retention into this, like some kind of disclaimer, is a red herring.

Getting LCD "burn-in" is more like the chance of getting struck by lightning; getting OLED burn-in is more like the chance of getting struck by rain in Florida.
 
OLED OTOH is, and will likely remain, extremely fragile. Pretty much all emitter technologies wear out, and the organic nature of OLEDs makes them break down faster than any other emitter technology we use. Faster than CRT/plasma, and this will likely always be the case.

There is a lot of complexity to this. Encapsulation so that it is sealed from external moisture, stability of the electrodes, phosphorescence and quantum efficiency, charge transport, and light extraction will all affect the longevity of these devices. I don't believe burn-in for OLED is unaddressable.

Degradation is really caused by applying a current to the device, and the operating voltage is influenced by pretty much all of the above.
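
One way to quantify that current/lifetime link (my addition, a commonly quoted empirical rule rather than anything from this thread): OLED lifetime is usually modeled as LT_2 ≈ LT_1 × (L_1 / L_2)^n, with the acceleration exponent n somewhere around 1.5 to 2. Halving the drive luminance can therefore roughly triple the time to a given level of degradation, which is consistent with the "run it dimmer" advice in the manual quoted earlier.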
 
I don't believe burn in for OLED is unaddressable.

You can believe what you want, but OLED burn in will remain a fact. The only thing they can hope to do is slow the rate.

For the foreseeable future, it seems very likely to remain more fragile than Plasma in this respect.
 
You can believe what you want, but OLED burn in will remain a fact. The only thing they can hope to do is slow the rate.

For the foreseeable future, it seems very likely to remain more fragile than Plasma in this respect.

I'm not in the business of believing, which is why I used 'addressable' instead of 'fixable'.
 
Do you think it has been addressed for Plasma?

I don't know. How is that related to efficiency improvements in OLED?

I think you are erroneously presupposing that I enthusiastically believe that OLED will become inured to luminance degradation. I agree with what you have written in this thread.

However, you have made me regret the choice of 'addressable'. If you want to find words to hang me with, try:

There are strategies for increasing the operating lifetime and reducing luminance degradation in OLED.
 
OK. Don't want to harp on minutiae.

OLED has a major challenge migrating into the role of desktop monitor, beyond the major challenge of just making it to TV displays. As such, I don't think we will see too many of these on people's desktops before 2020...
 
Do you think it has been addressed for Plasma?
Let's not play word games.

Yes it has been measurably improved for plasma, but you knew this already, or should have.

I personally foresee the same for OLED; that's unless manufacturers begin to favor another tech for whatever reasons.
 
I recently bought a Wacom Intuos4 graphics tablet. It has little OLED displays that can be customized, so you can remap the button functions and know what they are mapped to. The manual for the tablet also states that the pixels will get dimmer over time; if you never change the label for a button, those particular pixels will degrade faster than the rest, so if you eventually do change the label you will get uneven brightness like the example posted above.

But didn't CRTs have this same issue as well? And clearly those improved to the point where burn-in was not a significant issue for usage in a typical consumer environment. It's probably most accurate to say that burn-in with OLEDs will never be "solved", since it is a property inherent to the concept where the light is produced at each pixel location instead of uniformly via a backlight. However, I think given time, burn-in on OLEDs will be reduced such that it will not have significant practical implications on typical usage. When this will be, I don't know.

Ruahrc
 
OLED is in another league of screen burn-in compared to even plasma, so it's currently a deal breaker even for TV purposes. It will be sorted out eventually by some trick, like storing how much each subpixel has been used and adjusting the voltage accordingly; that would work for OLED TVs... (something like the sketch below)
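
A minimal sketch of that bookkeeping idea in Python; the linear wear model, the decay constant, and the efficiency floor are made-up numbers for illustration, not anything a real panel is known to use:

Code:
import numpy as np

DECAY_PER_UNIT_WEAR = 1e-7  # assumed: fractional efficiency lost per unit of accumulated drive

class WearCompensator:
    """Track accumulated drive per subpixel and boost aged ones to keep output uniform."""

    def __init__(self, height, width):
        self.wear = np.zeros((height, width))

    def compensate(self, frame):
        # frame: requested drive levels in [0.0, 1.0]
        efficiency = np.maximum(1.0 - DECAY_PER_UNIT_WEAR * self.wear, 0.5)
        driven = np.clip(frame / efficiency, 0.0, 1.0)  # worn subpixels get driven harder
        self.wear += driven                             # and the boost itself adds wear
        return driven

The catch is visible right in the loop: compensating a worn subpixel means driving it harder, which ages it faster, and once it hits full drive there is no headroom left. So this evens out the decay rather than stopping it.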

But there is also QLED, a similar technology with probably more durable pixels. Probably more expensive too, but in monitors durability is the no. 1 priority, so it will be used instead of OLED.
 
I wonder how the PlayStation Vita will handle this. A year from now we will get a lot of input from PSV owners.
 
QLED sounds better, since LEDs, as far as I know, don't have problems like OLED. But how long before we even see a QLED display at a trade show or out in the market? I'm thinking it'll be 5+ years. Everything is slow as hell in the flat panel industry.

For now it sounds like it'll be used in phones; not holding my breath.
 
Printed OLED factories are starting to be built, making large displays much MUCH cheaper to build.

LCD monitors are much cheaper to make than CRTs ever were, because there are fewer parts; it's simple. Eventually OLED will be dirt cheap, as there are even fewer parts in this production process. Affordable will happen between 2015-2020. Dirt cheap will happen between 2020-2025.

That feels really weird to type; I've seen too many '80s movies about the future.
 
QLED sounds better, since LEDs, as far as I know, don't have problems like OLED. But how long before we even see a QLED display at a trade show or out in the market? I'm thinking it'll be 5+ years. Everything is slow as hell in the flat panel industry.

The last I read, QLED's major problem was an even shorter lifespan than OLED...
http://www.technologyreview.com/computing/26831/page2/
But currently, the best QLEDs have a lifetime of 10,000 hours—not long enough for a large display.

Low lifespan in emitter technology = burn in.
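
For scale (my arithmetic, not the article's): 10,000 hours is about 14 months of running 24/7, or roughly 3.4 years at 8 hours a day, however "lifetime" is defined in that figure.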
 
I bet display makers are licking their chops at the prospect of building a product with a guaranteed limited service life. No more LCDs, they want you to buy a new OLED TV every year because the old one looks like crap.
 
I bet display makers are licking their chops at the prospect of building a product with a guaranteed limited service life. No more LCDs, they want you to buy a new OLED TV every year because the old one looks like crap.

With how potentially cheap these can become to make, it may be the new norm.
 
I'm really mildly agitated to say the least. If they had spent the last 10-15 years taking the tech that we had (CRT) and making it thin (FED), we would have had magnificent displays right now and 99% of the world would be satisfied... and that much better of a place. You could come home, sit down on your couch and say "hot damn, I have an awesome TV, that's for sure. Look at that, 20000:1 ANSI contrast. Effin brilliant." They could spend the next 20 years working on something better for all I'd care, because I'd have a display that was good and worked perfectly right now.

Unfortunately, that wouldn't be very profitable, would it?
 
Too bad about the burn-in... PQ, black level, viewing angles, and color depth are STELLAR on my Samsung Galaxy S II.
 
With how potentially cheap these can become to make, it may be the new norm.

Which is why I expect 90% of their effort and research to be put toward lowering production costs, and 10% toward improving panel quality and service life. It's a race to the bottom and every display maker wins. Consumers? Well, not so much.
 
Lifespan, burn-in, this thread is depressing. My F520 CRT will turn 3 years old in terms of usage this November. I wonder how long it will need to last...

Maybe a more finely localized form of local dimming will find its way into an LED-backlit LCD monitor at some point. Or something....
 
Too bad about the burn-in... PQ, black level, viewing angles, and color depth are STELLAR on my Samsung Galaxy S II.

Only issue that I have on my S II is noticeable burn-in on the notification bar. If I load a white page (like Google) and go into landscape mode, that part of the screen has a very obvious blue tint.

I wish that Samsung would stop screwing with "Super" AMOLED on their phones and go to the PLS panels that they're using in many of their tablets now.
 
I am sure that manufacturers love the idea that these monitors burn in, so that they don't have to put really bad components in them to ensure continuous sales.

However, I want to ask something: what do you prefer, to live with a technology that will never have better pixel response time and will never be motion-blur free, or to live with a technology that might ask you to buy a new screen every 5 years, and, after the first 10 years, probably only every 10 years?
Trust me, I really HOPE that I won't have to live the next 20 years of my life with a technology that has motion blur. Unfortunately, the entire LCD industry is based on the fact that only a very small part of the customer base plays FPS games. And I can assure you that anyone in this world who has played FPS games on a CRT will be able to say how differently you could play them, simply because you could look for enemies without having to stop your mouse to reduce motion blur. But who else cares...
 
Lifespan, burn-in, this thread is depressing. My F520 CRT will turn 3 years old in terms of usage this November. I wonder how long it will need to last...

Maybe a more finely localized form of local dimming will find its way into an LED-backlit LCD monitor at some point. Or something....

Just make it 20 years, as they will most likely destroy the market, and the result of their war will be another useless monitor with motion blur.
 
The Sony in my sig has no burn in so far. Hopefully it stays that way...
 
Only issue that I have on my S II is noticeable burn-in on the notification bar. If I load a white page (like Google) and go into landscape mode, that part of the screen has a very obvious blue tint.

I wish that Samsung would stop screwing with "Super" AMOLED on their phones and go to the PLS panels that they're using in many of their tablets now.

I don't! I LOVE the screen on my Note!
 
I don't! I LOVE the screen on my Note!

How long have you had your Note? I've had my SGS2 for more than a year. The screen is deteriorating faster than I expected. However, I don't expect to have it for more than another year. The device will officially be 2 years old in May (since release, not since I bought it).
 