Why OLED for PC use?

It's fine if you like that, but it's only about 51 PPD, which means pixel sizes more like what a 1400-1500p desktop-sized monitor would have at its normal viewing distance. You don't hit 60 PPD until about a 29" view distance (screen surface to eyeballs) on a 43" 4k, and a bit higher PPD is even better, especially for 2D desktop graphics and imagery that get no text-ss or game AA to mask the actual pixel sizes/granularity. Higher PPD/smaller perceived pixel sizes are also better for any occasional DLSS + frame generation edge artifacts, because they will be tinier. And if you are using an OLED, higher PPD is better for its non-standard pixel structure, making the text fringing tinier and less obnoxious.

People used 1440p desktop-sized screens for years though, so it's not like they aren't usable like that or anything. It's just not optimal, and not the fine pixel size a 4k would give you; it's more like a larger field of 1400-1500p desktop-screen-sized pixels from your perspective.



https://qasimk.io/screen-ppd/


..At the human central viewing angle of 60 to 50 degrees, every 8k screen of any size gets around 128 to 154 PPD

..At the human central viewing angle of 60 to 50 degrees, every 4k screen of any size gets around 64 to 77 PPD

..At the human central viewing angle of 60 to 50 degrees, every 2560x1440 screen of any size gets only 43 to 51 PPD

..At the human central viewing angle of 60 to 50 degrees, every 1920x1080 screen of any size gets only 32 to 38 PPD
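For anyone who wants to check these numbers or their own setup, here's a minimal sketch of the math these PPD calculators use (the same approach as the qasimk.io link above, as far as I can tell): horizontal resolution divided by the horizontal field of view the screen subtends at your viewing distance. The function and the 16:9 default are my own framing.

```python
import math

def ppd(h_pixels: int, diagonal_in: float, distance_in: float,
        aspect=(16, 9)) -> float:
    """Approximate pixels per degree: horizontal resolution divided by
    the horizontal field of view (in degrees) at the given distance."""
    aw, ah = aspect
    width_in = diagonal_in * aw / math.hypot(aw, ah)  # screen width from diagonal
    fov_deg = math.degrees(2 * math.atan(width_in / (2 * distance_in)))
    return h_pixels / fov_deg

# 43" 4k, distance measured screen surface to eyeballs:
print(round(ppd(3840, 43, 24)))   # ~51 PPD at 24" -- 1440p-monitor-like granularity
print(round(ppd(3840, 43, 30)))   # ~60 PPD at ~30" -- the ~64 deg viewing angle ballpark
```

Only the subtended angle matters, which is why the per-resolution PPD ranges above hold for every panel size viewed at the same angle.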


. . .

Personally, I'd sit between 28" and 40" from a 55" 8k screen if I end up getting one.

I've always considered approximately 100 PPI at a typical desktop (~2 ft) distance, essentially the classic pixel density and distance, to be as good as necessary.

There are very few benefits to sharpness and/or clarity above that. If I were going to either sit further away from the screen or use a smaller screen, I'd use something with a lower resolution.
 
I have, and I haven't had a problem with my OLED. That said, anyone watching CNN for that long deserves burn-in.

Sounds like you'll have to wait for Micro-LED monitors to address the brightness and burn-in concerns you have with OLED.

Nah, he's sticking to LCD for life. Micro-LED is just as susceptible to burn-in as OLED is. No LCD with FALD either, because you could get burn-in from uneven wear on the backlight array. The higher the zone count, the higher the chance of burn-in! He wants a plain edge-lit LCD with a single LED (which ships with worse uniformity than the RTINGS OLED burn-in test units).
😂 Now you've gone too far. Lay off the copium, you're in another dimension 🤣



Just in case anyone isn't aware, the 2000-nit FALD LCDs (a Samsung 4k model and an 8k model) already resort to aggressive ABL, so micro-LED will still have to deal with heat somehow (re: brightness, at least in regard to ABL). The ~1500-nit ProArt FALD displays use boxy housings with grille venting along with active cooling fans (somewhat audible, from reports) on a cooling profile. Kind of like how a gaming laptop or a desktop GPU has fans and cooling profiles that ramp the fans up with temperature. We may have to resort to that, and to heatsinks, as things progress. I hope we do anyway, rather than aggressive ABL, if it were an either/or.

LG OLEDs, for example, use a white sub-pixel to "cheat" higher perceived brightness at lower energy states (less heat ~ less "burn", though the white makes the screen less accurate at higher brightness). The micro lens array in their newer screens also optimizes light output vs. energy/heat, and reportedly in 2025 they'll switch their blue emitters from fluorescent to phosphorescent blue, which has a much longer lifespan. The lower HDR peaks and sustained HDR brightness durations people sometimes complain about compared to FALD LCD HDR screens are also a big reason why the OLEDs are not at high risk of burn-in. They aren't letting you run them that hot, and what they do allow is reflexively protected by ABL (like it or not). There is also logo dimming and pixel shift. You can also use the "turn off the screen" trick when afk or paused and not giving the screen attention (see hyperlink). It just turns the emitters off, kind of like minimizing the picture until you come back.


* The wear-evening routine buffer should last years with normal usage. That said, most OLEDs are marketed as media + gaming screens, not static desktop/app screens.
LG's reserved brightness buffer: you aren't burning in because you are burning down that buffer first, for a long time (depending on how badly you abuse the screen).

From what I've read, the modern LG OLEDs reserve the top ~25% of their brightness/energy states outside of the user-available range for their wear-evening routine, which is done periodically in standby while plugged in and powered. Primarily that, along with the other brightness limiters, logo dimming, pixel shift, and the turn-off-the-"screen"-(emitters) trick if utilized, should extend the life of the screens considerably. With the ~25% wear-evening buffer, though, you won't know how much of the emitter range you are burning down until after you bottom out that buffer. As far as I know there is no way to determine what % of that buffer remains. So you could be fine abusing the screen outside of recommended usage scenarios for quite some time, thinking you aren't damaging it, and you aren't, sort of... but you will be shortening its lifespan, wearing down the buffer of all the other emitters to match your consistently abused area(s).

A taskbar, persistent toolbar, or a cross of bright window frames in the middle of the same four window positions, or whatever, might be the first thing to burn in when the time comes. But on the modern LG OLEDs I think the whole screen would be down to that buffer-less level and vulnerable at that point, as the routine would have been wearing down the rest of the screen to compensate all along.

The buffer seems like a decent system for increasing an OLED screen's lifespan, considering what we have for now. It's like having a huge array of candles that all burn down unevenly, but with 25% more candle beneath the table, so that you can push them all up a little once in a while and burn them all down level again.
Or you might think of it like a phone or tablet battery with an extra 25% charge module, except that after you turn the device on and start using it you have no idea what your charge level is. You can use power-hungry apps, disable your power-saving features and screen timeouts, run higher screen brightness when you don't need to, leave the screen on when you aren't looking at it, etc., and still get full charge performance for quite some time, but eventually you'd burn through the extra 25% battery.
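To make the candle/battery analogy concrete, here's a toy model of the idea as described above. This is purely illustrative; it is not LG's actual firmware logic, and the numbers (25% reserve, the wear rates) are stand-ins:

```python
# Toy model of a reserved wear-evening buffer -- illustrative only.
RESERVE = 0.25   # assumed ~25% reserved brightness/energize headroom

def reserve_left(hours: float, region_stress: list[float],
                 wear_per_hour: float = 1e-5) -> float:
    """The evening routine burns every region down to the most-worn one,
    then boosts output back up out of the reserve, so the remaining
    headroom effectively tracks the *worst* region's cumulative wear."""
    worst = max(region_stress) * wear_per_hour * hours
    return max(0.0, RESERVE - worst)

# Same 8000 screen-on hours, varied content vs. a static bright element
# hammering one region 3x harder:
print(reserve_left(8000, [1.0, 1.0, 1.0]))  # ~0.17 of the buffer left
print(reserve_left(8000, [1.0, 1.0, 3.0]))  # ~0.01 left -- nearly bottomed out
```

Same total hours, but the static element drains the shared buffer roughly three times as fast, which is the "you aren't damaging it, sort of" point above.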

. . . .


This is what I'm saying. So I can spend the money on the other 20 hobbies that also cost a lot, and not spend on the same thing over and over again, lol. Except I wouldn't mind 200Hz; even faster would be nice 🙂

I don't think there is any end stage for rigs/speeds~processing and display tech, etc., especially not in my lifetime (barring catastrophe of course :eek: )
You can drop out of upgrading though, even for just some year(s); no one is forcing you to upgrade. However, some game-tech advances will leave you out at some points, like G-Sync, VRR, higher rez, higher Hz, lower input lag, larger sizes, aspect ratios and curves if you're into those, higher contrast and black depths, HDR, etc. already did vs. older gens of screens. Phones and eventually AR tech will keep progressing too, as well as some other wearables, incl. probably very slim lightweight exo suits/bar-frames at some point (at least for the elderly and disabled at first, and probably some for military and construction workers, but I'm guessing it will be available to everyone eventually). Then there will probably be higher and better-performing gens of those as they advance, like anything else. There will also be autonomous cars that get more advances over time, and probably, eventually, robots, similarly.

That said, in the now, I pick and choose when I upgrade different components and can go several years between some of them. On release, screens are usually considerably more expensive than 8 to 14 months later (for most brands, but Samsungs especially have a high early-adopter price tag). I think I got my 48CX OLED for $850-$900 where at release it was $1500. The Ark was just on sale for like $2k (minus a discount, if you qualify, down to around $1500-$1600); when it was released it was $3500, I think. The other ultrawides dropped a lot in price after their first year too. If you are a patient gamer you can save a lot of money on hardware, and on games you usually get way higher performance by then, better-patched games, maybe more mods and expansions (and cheaper game prices to boot). So it can work out well on both the hardware and software end for a lot of games, outside of some multiplayer games where popularity/population is a factor.
 
About having difficulty spending money on upgrades anymore, I can relate. Moore's Law moved really fast in the lifetime of any Gen X'er like me; now it's time to optimize.

Let's not forget 4K was $10,000 in 2001 -- the IBM T221. Now it's a $299 Walmart special.
That's it, it's the cost/benefit

I sold my OLED because of the burn-in tests. I can't afford to replace a display that starts to burn in after 2000 hours when I use my monitors about 5000 hours yearly. But when they last 10,000 hours and cost 200€, it won't be a problem to change them every 2 years.
 
That's it, it's the cost/benefit

I sold my OLED because of the burn-in tests. I can't afford to replace a display that starts to burn in after 2000 hours when I use my monitors about 5000 hours yearly. But when they last 10,000 hours and cost 200€, it won't be a problem to change them every 2 years.
If you are satisfied with 200 euro spec monitors, you should not consider OLED anyway.
 
If you are satisfied with 200 euro spec monitors, you should not consider OLED anyway.
Read again what I wrote, not what you want to understand.

An LCD can last over 40,000 hours; you can justify spending 1000€ on that, using it 3-4 years before changing it for the next flashy thing, and still having more than 50% of its life expectancy left.

But 1000€ for something that's already burned in after 2000 hours of use, when I put over 5000 hours per year on the display? Yeah, nope.

2 years (10,000 hours) for 200-300€ (still way more expensive than that 1000€ top-tier LCD)? Yeah. 1000€ for 6 months of usage before it can start to wear? Nope.

So unless OLED displays drop to at least half their current prices and get a 4x improvement in burn-in time, they shouldn't be considered as a desktop display by 99% of people.
 
Read again what I wrote, not what you want to understand.

An LCD can last over 40,000 hours; you can justify spending 1000€ on that, using it 3-4 years before changing it for the next flashy thing, and still having more than 50% of its life expectancy left.

But 1000€ for something that's already burned in after 2000 hours of use, when I put over 5000 hours per year on the display? Yeah, nope.

2 years (10,000 hours) for 200-300€ (still way more expensive than that 1000€ top-tier LCD)? Yeah. 1000€ for 6 months of usage before it can start to wear? Nope.

So unless OLED displays drop to at least half their current prices and get a 4x improvement in burn-in time, they shouldn't be considered as a desktop display by 99% of people.
When OLEDs are 200 euros, there will be some other far superior tech. Thus they should not be considered at that point. Just as 200 euro monitors are considered budget now, so would OLED be then.
If you are satisfied with 200 euro tech, don't bother with 1K euro tech.
 
Thus they should not be considered at that point
So your point is that, as new technology comes to the street, none of the old tech should be considered?

Hope you change your GPU to the top tier every new gen, have a 7800X3D, an OLED 240Hz because the old 120 doesn't cut it, etc.

Must be great being so affluent and out of touch with society, but then, I don't know why you are using a 3060 12GB lmao
 
Burn-in doesn't mean it's broken though; in the case of OLED it just means degraded uniformity, because some pixels are darker than the others, and it won't even be noticeable all the time, if at all. Mostly you will notice it by displaying a solid colour background, and it won't even show with all colours.

But you know people have been using LCDs with incredibly bad uniformity for a long time, right? In the real world, those OLEDs that do develop some burn-in might still look better than a lot of LCDs manage out of the box. To even reach the burn-in point you have to use your OLED at max brightness with static content for a while, which I don't expect anyone to do in a dark room; that sounds terribly painful and tiring on the eyes.

Yes, I think OLED is wasted in not-dark rooms, but in those rooms you wouldn't even benefit from the infinite contrast ratio to begin with, since very dark grey will look pitch black already. So why even consider OLED for that use case?
 
Oh, a surprise for everyone, and pretty good news:

The QD-OLED is burned in, and completely destroyed (CNN logo), but suddenly what looked like burn-in on the LG W-OLED panel isn't anymore; it was just dirt on the camera lens RTINGS used to take the photos.






So, for the moment, W-OLED remains a possibly real contender for desktop usage, while QD-OLED clearly isn't, and no one should buy a monitor with a Samsung panel.
 
I've always considered approximately 100 PPI at a typical desktop (~2 ft) distance, essentially the classic pixel density and distance, to be as good as necessary.

There are very few benefits to sharpness and/or clarity above that. If I were going to either sit further away from the screen or use a smaller screen, I'd use something with a lower resolution.

As good as necessary for you, and that's fine, just not nearly as fine, so to speak. And little benefit going higher, to your sensibilities. I heard the same kinds of comments about 1080p vs. 1440p, 60Hz vs. 120Hz, and even 4:3 vs. 16:9. "All I'll ever need", which could be true for whoever the "I" is, opinion-wise, but still...

I used at least one 1440p screen in my array for a very long time. They look pretty good with aggressive AA masking the actual pixel sizes and with text-ss tweaked. Not unusable or horrible by any means; they were some of the best resolutions you could get for a time. But 4k at the optimal viewing angle has higher PPD, which looks better, much finer (and has a lot more desktop/app real estate at 100% scaling, 1:1 pixel mapping, of course).

. . .

My point, however, was that a lot of people shoehorning a larger 4k gaming TV onto a desk are using it at more like a 1400-1500p desktop monitor's pixel density. So they aren't getting the finer pixel density usually expected from a 4k screen - a pixel density a lot of people can appreciate even if you don't so much, and what I think most people would expect from buying a 4k screen as a resolution upgrade in terms of pixel size - just a larger area of 1400-1500p-like pixel sizes.

e.g. a 27" 1400p vs a 27" 4k set up on a desk, the 4k has a considerable difference in picture quality on native rez content. Really any 1440p of any size at optimal viewing angle will have the exact same PPD as any other size 1440 though, and any 4k of any size at optimal human central viewing angle will have the same exact PPD as any other 4k viewed optimally - so the sizes shouldn't make a difference. When you sit closer than optimal you are basically zooming the pixel grid in larger and making the pixel granularity worse. So instead of that "4k" fine pixel size, which is a big upgrade to a lot of people, you are back to the lower rez screen's pixel grid again more or less. It's fine if you are ok with that but it's measurably and perceptually a downgrade in image quality when viewed like that.

I figure a lot of people are probably doing it just because they couldn't get the same screen technology and deal in a size more appropriate for a desk (e.g. 27" up to maybe even 36" 16:9), since none with the same tech were made in those sizes. So they are seeing 1440-1500p-desktop-screen-sized pixels - and with OLEDs, that makes the fringing look a lot larger, so a lot worse. Any occasional fringe artifacts from DLSS + frame gen will also look larger, so worse there too, and like I said, the 2D desktop gets nothing to mask the pixel granularity. With fringing of any sort: bigger perceived pixel/subpixel sizes, bigger problems.




. . . . .

Worth pointing out that 4k rez (at the optimal 60 to 50 deg viewing angle, 64-77 PPD) still needs anti-aliasing and text-ss to more or less eliminate fringing on high-contrast edges, and the 2D desktop's graphics and imagery still don't get any pixel masking there. So it's not like the PPD is overkill in any of those facets at 4k. Even 8k will still need some AA and text-ss, but at optimal viewing angles it will be much finer than even 4k, so everything will look tighter and less granular. Text should look better, as well as, importantly, the completely uncompensated-for 2D desktop graphics and imagery.



There is a reason that when you check reputable sites for their listings of the best screens for coding in 2023 (lots of text, so clarity matters vs. eye fatigue), without low budget as a limitation, all of the top screens have relatively high PPI and resulting high PPD (as they are desktop-sized 4k-ish screens, mostly ~27" form factor or the UW equivalent, mounted on a desk). They are all 4k or 5k, or 4k-based 5120x2160 ultrawides, so they all get 60 PPD or higher viewed on a desk normally. I think few people today would choose a 1440p screen for heavy desktop/app use if they had higher options (and budget limitations and/or gaming were not steering them to 1440p).
 
Read again what I wrote, not what you want to understand.

An LCD can last over 40,000 hours; you can justify spending 1000€ on that, using it 3-4 years before changing it for the next flashy thing, and still having more than 50% of its life expectancy left.

But 1000€ for something that's already burned in after 2000 hours of use, when I put over 5000 hours per year on the display? Yeah, nope.

2 years (10,000 hours) for 200-300€ (still way more expensive than that 1000€ top-tier LCD)? Yeah. 1000€ for 6 months of usage before it can start to wear? Nope.

So unless OLED displays drop to at least half their current prices and get a 4x improvement in burn-in time, they shouldn't be considered as a desktop display by 99% of people.
I don't think you understand what you're actually saying. You're conflating life expectancy with burn-in, and not even factoring in that burn-in only occurs with a static image over many hours. I'm not sure if you're really that clueless, or just trying to make yourself feel better about the poverty-spec display you're running.
 
So your point is that, as new technology comes to the street, none of the old tech should be considered?

Hope you change your GPU to the top tier every new gen, have a 7800X3D, an OLED 240Hz because the old 120 doesn't cut it, etc.

Must be great being so affluent and out of touch with society, but then, I don't know why you are using a 3060 12GB lmao
I am saying that if you are satisfied with old tech, you should not consider new tech that will cost you 5x.
 
I don't think you understand what you're actually saying. You're conflating life expectancy with burn-in, and not even factoring in that burn-in only occurs with a static image over many hours. I'm not sure if you're really that clueless, or just trying to make yourself feel better about the poverty-spec display you're running.
Their point was that the speed at which burn-in occurs in the RTINGS test makes the OLEDs a bad value. Which is correct, if you watch CNN for 20 hours a day. But considering you would only notice it during commercials, maybe not an issue.
 
Oh, a surprise for everyone, and pretty good news:

The QD-OLED is burned in, and completely destroyed (CNN logo), but suddenly what looked like burn-in on the LG W-OLED panel isn't anymore; it was just dirt on the camera lens RTINGS used to take the photos.

So, for the moment, W-OLED remains a possibly real contender for desktop usage, while QD-OLED clearly isn't, and no one should buy a monitor with a Samsung panel.

Ironic, because before QD-OLED came out everyone was under the impression that it was supposed to be more durable than WOLED, hence the higher brightness specs and the 3-year burn-in warranties offered. Turns out LG's approach of limiting brightness, combined with years of refinement of their WOLED tech, has made it far superior when it comes to burn-in.
 
I don't think you understand what you're actually saying. You're conflating life expectancy with burn-in, and not even factoring in that burn-in only occurs with a static image over many hours. I'm not sure if you're really that clueless, or just trying to make yourself feel better about the poverty-spec display you're running.
I'm talking about the ratio of cost to longevity.

For example, the QD-OLED that already burned to the point of being a trash display, beyond salvation, because the CNN logo burned in within 1800 hours, costs 1300€.

A 27" 360Hz ULMB2 PG27AQN, which costs exactly the same 1300€, will last not 1800 hours but about 30x more (LCDs are expected to last at least 50,000 hours).

With that data, I say that unless you are Elon Musk, or use your monitor for movies, series, and some sporadic varied gaming, you just can't justify purchasing the QD-OLED, because the cost is not 1300€, but 1300€ every 1800 hours, and that could mean 2-3 monitors per year.

And that's my point: I don't mind spending 1300€ on a display that can last 3 or 4 years without any worry about how I use it, but I can't spend 3000€ on monitors per year.

I am saying that if you are satisfied with old tech, you should not consider new tech that will cost you 5x.
And I'm talking about the long-term costs, as I explained again in this thread.

1300€ for a display that can last 50,000 hours? No problem.

1300€ for a display that lasts 1800 hours? Nope.

If something lasts only 1800 hours it should cost pennies, because it's not 1300€-and-forget-about-it, it's 1300€ every few months. So unless you are in the top 5% of the already-top-1% of earners, you won't be able to keep up that kind of purchasing.
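For what it's worth, the arithmetic behind this cost-per-hour argument is easy to sanity-check. The lifespan figures are the poster's own claims (and the contested part of this thread), not established facts:

```python
# Cost per hour under this poster's claimed lifespans (EUR).
def cost_per_hour(price_eur: float, lifespan_hours: float) -> float:
    return price_eur / lifespan_hours

print(cost_per_hour(1300, 50_000))  # LCD at a claimed 50,000 h: ~0.026 EUR/h
print(cost_per_hour(1300, 1_800))   # QD-OLED at a claimed 1,800 h: ~0.72 EUR/h
print(5000 / 1800)                  # at 5000 h/yr: ~2.8 replacements per year
```

Whether 1800 hours (the RTINGS max-brightness CNN torture figure) is a realistic lifespan number at all is exactly what the replies below dispute.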

Ironic, because before QD-OLED came out everyone was under the impression that it was supposed to be more durable than WOLED, hence the higher brightness specs and the 3-year burn-in warranties offered. Turns out LG's approach of limiting brightness, combined with years of refinement of their WOLED tech, has made it far superior when it comes to burn-in.
Yep, not even LG was expecting to outlast Samsung, and they didn't even talk about longevity and burn-in until the first RTINGS tests came out demolishing Samsung, which LG then seized on asap as marketing material.

Let's see what second-generation QD-OLED brings in 2024, but right now no one should buy a QD-OLED for a desktop computer. Being burned in that badly within 1800 hours... my god, I would be livid as a buyer.
 
I'm talking about the ratio of cost to longevity.

For example, the QD-OLED that already burned to the point of being a trash display, beyond salvation, because the CNN logo burned in within 1800 hours, costs 1300€.

A 27" 360Hz ULMB2 PG27AQN, which costs exactly the same 1300€, will last not 1800 hours but about 30x more (LCDs are expected to last at least 50,000 hours).

With that data, I say that unless you are Elon Musk, or use your monitor for movies, series, and some sporadic varied gaming, you just can't justify purchasing the QD-OLED, because the cost is not 1300€, but 1300€ every 1800 hours, and that could mean 2-3 monitors per year.


And I'm talking about the long-term costs, as I explained again in this thread.

1300€ for a display that can last 50,000 hours? No problem.

1300€ for a display that lasts 1800 hours? Nope.

If something lasts only 1800 hours it should cost pennies, because it's not 1300€-and-forget-about-it, it's 1300€ every few months. So unless you are in the top 5% of the already-top-1% of earners, you won't be able to keep up that kind of purchasing.


Yep, not even LG was expecting to outlast Samsung, and they didn't even talk about longevity and burn-in until the first RTINGS tests came out demolishing Samsung, which LG then seized on asap as marketing material.

Let's see what second-generation QD-OLED brings in 2024, but right now no one should buy a QD-OLED for a desktop computer. Being burned in that badly within 1800 hours... my god, I would be livid as a buyer.
But OLED lasts longer than 1800 hrs without burn-in. The RTINGS test is not a realistic usage scenario, which is why LCDs are failing it as well. However, if your use case requires max brightness and static imagery, OLED is not for you... obviously. For most PC users: hide the taskbar, use dark mode, allow maintenance cycles when powered down, and you are good for several years of leading display tech. Like anything else in life, make an informed purchase.
 
The RTINGS test is not a realistic usage scenario,
The RTINGS test is a completely realistic scenario for desktop usage. It isn't for media usage, like a TV used for Netflix and some sporadic console gaming.

But desktop? Yeah, it's a totally fair scenario.

Hiding the taskbar won't save you from programs' UI elements, which you will rack up without problem over 3000 hours in a year. You maybe won't have the taskbar, but you 100% sure will burn in the Chrome/Firefox bars, or the specific competitive multiplayer game you like, etc.
We are talking about 1800 hours with burn-in that is already way past the acceptable level, not just "starting to show up". Then add on top of that using the monitor not for 1800 hours, but for maybe 4000-5000 per year, which in a lot of cases isn't unusual at all.

And on top of that, a lot of people purchase things to use them, not to be afraid of using them and to limit how and when they use them. Objects are there to serve us and to be enjoyed, not to be constantly worried about.
 
No, you don't use a display at max brightness unless you're in a bright room. But then you don't use OLED; it's the wrong tech for that. So not a realistic scenario.

Heat also matters a lot for OLED. Using it at max brightness for so many hours non-stop will make things worse, especially in fanless models, and you won't be using your display 20h a day, so that's again not realistic.
 
So not a realistic scenario.
Sure, thanks for telling me what my kind of scenario is, what I and a lot of other people use, and how we should interpret the results based on what you want them to be and not what they really are.

Any other advice on how people should live their lives and how wrong they are because it doesn't fit your views?

"If it doesn't fit my view of the world, then the thing is wrong and everyone is mistaken."

Just fucking lol with you people.
 
The RTINGS test is a completely realistic scenario for desktop usage. It isn't for media usage, like a TV used for Netflix and some sporadic console gaming.

But desktop? Yeah, it's a totally fair scenario.

Hiding the taskbar won't save you from programs' UI elements, which you will rack up without problem over 3000 hours in a year. You maybe won't have the taskbar, but you 100% sure will burn in the Chrome/Firefox bars, or the specific competitive multiplayer game you like, etc.
We are talking about 1800 hours with burn-in that is already way past the acceptable level, not just "starting to show up". Then add on top of that using the monitor not for 1800 hours, but for maybe 4000-5000 per year, which in a lot of cases isn't unusual at all.

And on top of that, a lot of people purchase things to use them, not to be afraid of using them and to limit how and when they use them. Objects are there to serve us and to be enjoyed, not to be constantly worried about.
I have 4300 hrs on my C2. I have Firefox open for hours at a time, several times a day: zero image retention. I play the same four games, usually one for hours at a time, several times a day: zero image retention.

20 hours of CNN per day over 3 months is nowhere near normal usage, even with the taskbar pinned, even with browser tabs open. Also, they may not have run the full pixel refresh yet; that should run every 2K-4K hours.
Still doesn't change the fact that some LCDs look worse or have died as well. Again, if leaving browser windows open all day, or spreadsheets, or stock tickers, etc. at max brightness is your use case, look elsewhere. But in a light-controlled environment, with reasonable brightness settings and normal usage, OLED will do fine and provide the best image quality/performance. Grab a 42" C2 next time they drop to $900, get the burn-in warranty coverage, and don't sweat it.
Don't bother replying, you are ignored.
 
I have 4300 hrs on my C2. I have Firefox open for hours at a time, several times a day: zero image retention. I play the same four games, usually one for hours at a time, several times a day: zero image retention.

20 hours of CNN per day over 3 months is nowhere near normal usage, even with the taskbar pinned, even with browser tabs open. Also, they may not have run the full pixel refresh yet; that should run every 2K-4K hours.
Still doesn't change the fact that some LCDs look worse or have died as well. Again, if leaving browser windows open all day, or spreadsheets, or stock tickers, etc. at max brightness is your use case, look elsewhere. But in a light-controlled environment, with reasonable brightness settings and normal usage, OLED will do fine and provide the best image quality/performance. Grab a 42" C2 next time they drop to $900, get the burn-in warranty coverage, and don't sweat it.
Don't bother replying, you are ignored.

Yes, just use some OLED etiquette. Plus a 42" OLED is often $900 or less at BB, and the 5-year warranty that covers burn-in is $180: like $36 a yr, $3 a month. Then stop being scared, though you should still use best-use practices. They really aren't best for static desktop/apps, though; rather media + gaming.


==============================================

. .

Burn in = "burn down" + restore as long as there is buffer remaining
=====================================================

Burn-in is some risk, but it's not like an OLED phone left on with an app that prevents the screen from timing out. OLED TVs have a ~25% reserved brightness/energize buffer. They even out the wear on all of the emitters, then boost them back up to level again. This should last years unless you are foolishly abusive of the screen: using it outside of normal gaming and media, skipping dark themes, leaving the screen on static and paused/idle, etc. (there is a turn-off-the-screen trick that just shuts off the emitters, so there's no reason to leave it lit while afk, for example). Still not a good choice for static desktop/app use imo, though it's doable.

People are a bit more abusive of their OLEDs and say "look, no burn-in"... but they're just burning down that much faster, using up more of the reserve buffer. It's a pretty clever system.

You can get the 42" LG C2 for $900 + tax currently at best buy. The 5 year best buy warranty on a c2 can be had for around $36 a year. That covers burn in if you are actually concerned about it but I doubt you'd burn in before 4+ years in normal media and gaming usage with some precautions taken. $36 a year insurance , $3 a month, $180 / 5 yr.

(The LG G series also comes with an LG 5-year burn-in warranty by default, but they start at 55".)

. .

. .


. .


I used my LG CX 48" like that for two years: ~8h of work, plus personal use on top of that. The display is still without burn-in and working fine.

This will heavily depend on how you use your display. I had some mitigations in place:

  • Dark modes where available.
  • Auto-hide the taskbar/dock/top bar. I use macOS for work.
  • Turn off the display with the remote when taking a longer break.
  • Keep the display connected to power so it can run its pixel refresh cycles.
  • Brightness calibrated to 120 nits.
  • Virtual desktops in use, so there is some movement between content.
  • Blank screensaver kicking in after 10 minutes of idle. It's faster to get out of that than display-off.
  • Display off after 20 minutes of idle.
While this may seem like a lot, it's a one-time setup. You really don't need the taskbar/dock for anything 99% of the time, so even after returning to a smaller LCD I keep it hidden.
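If you're on Windows, a couple of these mitigations can be scripted rather than clicked through. A small sketch: the registry values below are the standard Windows 10/11 personalization keys, and powercfg is the built-in power tool (note some apps only pick up the theme change after a restart):

```python
import subprocess
import winreg

# Force dark mode for apps and the shell (standard Win10/11 keys).
key = winreg.OpenKey(
    winreg.HKEY_CURRENT_USER,
    r"Software\Microsoft\Windows\CurrentVersion\Themes\Personalize",
    0, winreg.KEY_SET_VALUE)
winreg.SetValueEx(key, "AppsUseLightTheme", 0, winreg.REG_DWORD, 0)
winreg.SetValueEx(key, "SystemUsesLightTheme", 0, winreg.REG_DWORD, 0)
winreg.CloseKey(key)

# Blank the display quickly when idle (here: 10 minutes on AC power)
# instead of trusting a screensaver.
subprocess.run(["powercfg", "/change", "monitor-timeout-ac", "10"], check=True)
```

The taskbar auto-hide and window-shuffling bits are easier to do through the OS settings or tools like DisplayFusion mentioned elsewhere in the thread.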

I don't use mine as a static desktop/app screen, other than a browser once in a while or something, since I have side screens for static apps and desktop stuff. I've been using multiple monitors for years, so doing it this way is normal to me.

I think of it like the main screen in Star Trek: they aren't typically doing all of their engineering and science work and experiments on the big main viewer. All of their data is on other workstation screens while the main screen is the big show. Or you might think of the OLED screen as a "stage" for playing media and games.


That's my personal preference. Like kasakka said, there are a lot of burn-in avoidance measures, many of which he listed. If you keep ASBL on, it would be even less likely to burn "down" (see below), but most people using them for desktop/apps turn off ASBL dimming via the service menu using a remote, since it's annoying to have full-bright pages dim down.




=======================================================================

Pasting some info from my comment history here for you in case you find any of it useful:

Some burn-in (burning through your "burn-down" buffer) avoidance measures
A few reminders that might help in that vein:

....You can set up different named profiles with different brightness, peak brightness, etc., and maybe contrast, in the TV's OSD. You can break any of the original ones down completely and start from scratch, settings-wise, if you want. That way you could use one named profile with lower brightness (and perhaps contrast) for text and static app use. Just make sure to keep the game one for gaming. I keep several others set up for different kinds of media and lighting conditions.

  • Vivid
  • Standard
  • APS
  • Cinema
  • Sports
  • Game
  • FILMMAKER MODE
  • isf Expert (Bright Room)
  • isf Expert (Dark Room)
  • Cinema Home
....You can change the TV's settings several ways. Setting up the quick menu or drilling down through menus works, but it's tedious. Keying the mic button on the remote with voice control active is handy for changing named modes or doing a lot of other things. You can also use the remote control software over your LAN, even hotkeying it; you can change a lot of parameters directly via hotkeys that way. Those hotkeys could also be mapped to a Stream Deck's buttons, with icons and labels. That way you could press a Stream Deck button to change the brightness and contrast or to activate a different named setting. Using Stream Deck functions/addons you can also set up keys as toggles or multi-press, so you could toggle between two brightness settings or step through a brightness cycle, for example.

....You can also do the "turn off the screen emitters" trick via the quick menu, a voice command with the remote's mic button, or the remote-control-over-LAN software + hotkeys (+ Stream Deck, even easier). "Turn off the screen" (emitters) only turns the emitters off; it doesn't put the screen into standby mode. As far as your PC OS, monitor array, games, or apps are concerned, the TV is still on and running. The sound even keeps playing unless you mute it separately. It's almost like minimizing the whole screen when you are afk or not giving that screen face time, and restoring it when you come back. It's practically instant. I think it should save a lot of "burn down" of the 25% reserved brightness buffer over time. You might not realize how much time is cumulatively wasted with the screen displaying while not actually being viewed - especially when idling in a game or on a static desktop/app screen.

...You can also use a Stream Deck + a handful of Stream Deck addons to manage window positions, saved window-position profiles, app launch + positioning, min/restore, etc. You could optionally swap between a few different window layouts set to a few Stream Deck buttons, in order to prevent your window frames from sitting in the same place all of the time, for example.

... Dark themes in the OS and in any apps that offer one, web browser addons (Turn Off the Lights, color changers), a taskbar-hider app, a translucent-taskbar app, a plain ultra-black wallpaper, no app icons or system icons on screen (I throw mine all into a folder on my hard drive named "desktop icons"). Black screensaver, if any.

... Logo dimming on high. Pixel shift. A lot of people turn ASBL off for desktop use, but I keep it on since mine is solely for media/gaming. That's one more safety measure.

. .

Turn off the Screen (emitters only) trick

I use the "turn off the screen" feature which turns the oled emitters off. You can set that turn off the screen command icon to the quick menu so it's only 2 clicks to activate with the remote (I set mine to the bottom-most icon on the quick menu), or you can enable voice commands and then hold the mic button and say "turn off the screen". You can also use the color control software to set a hotkey to the "turn off the screen(emitters)" function, and even map that hotkey to a stream deck button if you have one. Clicking any button on the remote or via the color control software hotkeys wakes up the emitters instantly. I usually hit the right side of the navigation wheel personally if using the remote.

https://www.reddit.com/r/OLED/comments/j0mia1/quick_tip_for_a_fast_way_to_turn_off_the_screen/

While the emitters are off, everything is still running, including sound. This works great for pausing games or movies and going afk/out of the room for a while, for example. I sometimes cast Tidal HD to the Nvidia Shield in my living room from my tablet, utilizing the "turn off the screen" (emitters) feature. That allows me to control the playlists, find other material, pause, skip, etc. from my tablet with the TV emitters off when I'm not watching TV. You can do the same with YouTube material that is more about people talking than viewing anything; I do that sometimes when cooking in my kitchen, which is adjacent to my living room TV. You can probably cast or AirPlay to the TV's webOS itself similarly. Some receivers also do AirPlay/Tidal etc. directly on the receiver.
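For the remote-control-over-LAN route, the same emitters-off command can be scripted. A sketch using the community bscpylgtv library (`pip install bscpylgtv`); I'm assuming its `turn_screen_off()` helper, which wraps the TV's tvpower service endpoint, so check the library's docs for your model/firmware and swap in your TV's actual IP:

```python
import asyncio
from bscpylgtv import WebOsClient  # community LG webOS client

async def emitters_off(tv_ip: str) -> None:
    client = await WebOsClient.create(tv_ip, ping_interval=None)
    await client.connect()           # first run pairs with the TV and stores a key
    await client.turn_screen_off()   # emitters off; TV, inputs, and audio keep running
    await client.disconnect()

asyncio.run(emitters_off("192.168.1.50"))  # hypothetical TV address
```

Bound to a hotkey or a Stream Deck button, this gives you the same "minimize the picture" behavior described above without touching the remote.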

. . .
Distrust Screensavers

I wouldn't trust a screensaver, especially a PC screensaver. Not only do they fail or get blocked by apps - apps can crash and freeze on screen, and so can entire Windows sessions, or a spontaneous reboot can get stuck on the BIOS screen, etc. It's rare, but it can happen. Some apps and notifications even take the top layer above the screensaver, leaving a notification/window sitting there, static.

While on the subject: I kind of wish we could use the LG OSD to make mask areas. Like sizing one or more black boxes or circles, setting their translucency, and moving them via the remote to mask or shade a static overlay, HUD element, bright area of a stream, etc.

. .
LG's reserved brightness buffer: you aren't burning in because you are burning down that buffer first, for a long time (depending on how badly you abuse the screen).

From what I've read, the modern LG OLEDs reserve the top ~25% of their brightness/energy states outside of the user-available range for their wear-evening routine, which is done periodically in standby while plugged in and powered. Primarily that, along with the other brightness limiters, logo dimming, pixel shift, and the turn-off-the-"screen"-(emitters) trick if utilized, should extend the life of the screens considerably. With the ~25% wear-evening buffer, though, you won't know how much of the emitter range you are burning down until after you bottom out that buffer. As far as I know there is no way to determine what % of that buffer remains. So you could be fine abusing the screen outside of recommended usage scenarios for quite some time, thinking you aren't damaging it, and you aren't, sort of... but you will be shortening its lifespan, wearing down the buffer of all the other emitters to match your consistently abused area(s).

A taskbar, persistent toolbar, or a cross of bright window frames in the middle of the same four window positions, or whatever, might be the first thing to burn in when the time comes. But on the modern LG OLEDs I think the whole screen would be down to that buffer-less level and vulnerable at that point, as the routine would have been wearing down the rest of the screen to compensate all along.

The buffer seems like a decent system for increasing an OLED screen's lifespan, considering what we have for now. It's like having a huge array of candles that all burn down unevenly, but with 25% more candle beneath the table, so that you can push them all up a little once in a while and burn them all down level again.
Or you might think of it like a phone or tablet battery with an extra 25% charge module, except that after you turn the device on and start using it you have no idea what your charge level is. You can use power-hungry apps, disable your power-saving features and screen timeouts, run higher screen brightness when you don't need to, leave the screen on when you aren't looking at it, etc., and still get full charge performance for quite some time, but you'll run out of charge faster using it that way.
 
Sure, thanks for telling me what my kind of scenario is, what I and a lot of other people use, and how we should interpret the results based on what you want them to be and not what they really are.

Any other advice on how people should live their lives and how wrong they are because it doesn't fit your views?

"If it doesn't fit my view of the world, then the thing is wrong and everyone is mistaken."

Just fucking lol with you people.


OLED is flawed, and QD-OLED burns in faster - absolutely true. But LCD is flawed too, in different ways, such as grey blacks in a dark room (or blooming with FALD), and slow pixel response times.

I just don't think you understand the tech and the whole "burn-in" thing very well. There is no flawless display tech that's good for everything, all the time, everywhere.
 
The problem with OLED is that you are forced to watch it in a dark room with the lights off, or to put it a different way, "light controlled", meaning very, very dim; almost to the point where it's just barely bright enough that you can see and not trip over stuff, but otherwise a dark room.
Even then it's not bright enough at full brightness, because the whites look grey. When I brought the LG C2 home I was baffled at how dim it was at full brightness. Even in a completely dark computer room with all the lights turned off it looked miserably dark; white had a shade of grey that was very unappealing to me. That's not even getting into how terrible it looked with the lights on or with the window shades open. I literally felt like it was at 25% brightness when it was at 100%. I returned it in less than a week. It was agitating; I hated everything about it. Its pros were not enough to win me over given its massive shortcomings.
That was the C2. I'm not sure how the C3 is nowadays, but hopefully they've made massive improvements, although I still don't trust the technology just yet. If it could get as bright as my QN90B, last as long as my QN90B, and have kick-ass powerful bright images like it, then yeah, I would be interested, if it also had a higher refresh rate like 144 or 240 in about a 48" size. But until then, for my taste, nothing even competes with the QN90B, especially since I got it for such a good deal back when it was new. And for this reason I'll be holding on to this monster for as long as I can.
 
The problem with OLED is that you are forced to watch it in a dark room with the lights off, or to put it a different way, "light controlled", meaning very, very dim; almost to the point where it's just barely bright enough that you can see and not trip over stuff, but otherwise a dark room.
Even then it's not bright enough at full brightness, because the whites look grey. When I brought the LG C2 home I was baffled at how dim it was at full brightness. Even in a completely dark computer room with all the lights turned off it looked miserably dark; white had a shade of grey that was very unappealing to me. That's not even getting into how terrible it looked with the lights on or with the window shades open. I literally felt like it was at 25% brightness when it was at 100%. I returned it in less than a week. It was agitating; I hated everything about it. Its pros were not enough to win me over given its massive shortcomings.
That was the C2. I'm not sure how the C3 is nowadays, but hopefully they've made massive improvements, although I still don't trust the technology just yet. If it could get as bright as my QN90B, last as long as my QN90B, and have kick-ass powerful bright images like it, then yeah, I would be interested, if it also had a higher refresh rate like 144 or 240 in about a 48" size. But until then, for my taste, nothing even competes with the QN90B, especially since I got it for such a good deal back when it was new. And for this reason I'll be holding on to this monster for as long as I can.
You have obviously ruined your vision, then. Have you ever used a calibrated display, or do you just crank brightness and contrast to max on everything?
 
The problem with OLED is that you are forced to watch it in a dark room with the lights off, or to put it a different way, "light controlled", meaning very, very dim; almost to the point where it's just barely bright enough that you can see and not trip over stuff, but otherwise a dark room.
Guess I should stay out of the movie theaters too.
Even then it's not bright enough at full brightness, because the whites look grey. When I brought the LG C2 home I was baffled at how dim it was at full brightness. Even in a completely dark computer room with all the lights turned off it looked miserably dark; white had a shade of grey that was very unappealing to me. That's not even getting into how terrible it looked with the lights on or with the window shades open. I literally felt like it was at 25% brightness when it was at 100%. I returned it in less than a week. It was agitating; I hated everything about it. Its pros were not enough to win me over given its massive shortcomings.
When you say full brightness, are you talking about a full white screen? I'm not following your terms. In any case, OLED is very much like the plasma of yester-generation: self-emissive means better blacks (in this case pure black), but with the tradeoffs of burn-in and not getting as bright as LCD. You pick your tradeoffs.
That was the C2. I'm not sure how the C3 is nowadays, but hopefully they've made massive improvements, although I still don't trust the technology just yet. If it could get as bright as my QN90B, last as long as my QN90B, and have kick-ass powerful bright images like it, then yeah, I would be interested, if it also had a higher refresh rate like 144 or 240 in about a 48" size. But until then, for my taste, nothing even competes with the QN90B, especially since I got it for such a good deal back when it was new. And for this reason I'll be holding on to this monster for as long as I can.
Fair enough. You've made this point several times already. Those who are okay with the OLED tradeoffs will go over there and enjoy their tradeoffs. Those who aren't will buy something else. Why can't we just agree that we have preferences and call it a day? EDIT: And it will never get as bright as your QN90B AND last as long. Ain't happening. I think you know this and are using it as one more shot at OLED, as if somehow this would change any minds. We all know it won't get as bright as that TV, and I would wager most who buy OLED don't care.
 
That's a big reason why you won't wear through the wear-evening reserve buffer for a very long time. They won't let you run it bright enough to pose much risk of ripping right through that whole buffer and leaving you open to burn-in with ordinary usage, plus they have aggressive ABL on top of that.
 
Oh, a surprise for everyone, and pretty good news:

The QD-OLED is burned in, and completely destroyed (CNN logo), but suddenly what looked like burn-in on the LG W-OLED panel isn't anymore; it was just dirt on the camera lens RTINGS used to take the photos.

Not surprised. I saw the images and was not 100% convinced the faint blob was true burn-in yet (I wanted more months/intensity to be sure), considering my own (paid!) personal longevity abuse of prototype 240Hz OLEDs, dogfooding my office with OLED for productivity.

Relevant:


https://www.theverge.com/23827701/lg-oled-burn-in-warranty-two-desktop-monitor-windows

Corsair beat LG to advertising a burn-in warranty (and offers 3 years).

By the 2030s, I bet enough years will have passed that people will trust (more and more) certain OLED technologies for desktop use.

Don't forget the upcoming PHOLEDs. I'm hearing they will be brighter, more efficient, and even more burn-in resistant than WOLEDs and MicroLEDs. Alas, any direct-emission *LED display burns in -- including MicroLEDs, some prototypes of which burn in faster than the slowest-burning-in OLEDs now -- it's the same problem Jumbotrons have too (speckling, dim segments, etc.). But all of them are improving!

The problem with OLED is that you are forced to watch it in a dark room with the lights off, or to put it a different way, "light controlled", meaning very, very dim; almost to the point where it's just barely bright enough that you can see and not trip over stuff, but otherwise a dark room.
I have office lighting during the day and it's not a problem on my OLED.

I love the brighter QD-OLEDs for movies and games. But for normal use, you often don't want full brightness. You're not supposed to make a screen brighter than the fluorescent lighting above you; that's what the eye doctors tell you! I always adjusted my Dell/HP monitors (before I became Blur Busters) to a fairly comfortable brightness setting.

Brightness is not a problem as long as a window isn't behind you. I even had sun shining in on the floor 15 feet away, and my 'Flex was still happily usable, equally as bright (averaged out, for average content) as the 240Hz-360Hz monitors in the same room.

Right now I'm at the 100% brightness setting, as of two months ago, having upped it back from 75%. There's apparently more safety margin built into the 240Hz OLEDs than into 3-year-old LG televisions. Since I have two Flexes I can afford to drive one into the ground, but I'm failing so far. You can be damn assured that I don't care about burning in a free prototype in a paid-testing environment. While I do buy other monitors, or additional units from time to time, I am now merrily driving it harder than the average user, with 60-70 hour workweeks. Even at 100% brightness the pixels are still being underdriven on my WOLED, and there were times I checked and noticed there's been no "eat-in" into the "reserve", the one people talk about. It could literally last all decade, possibly, but I'm not promising; something unexpected could still happen.

Anyway, with all the great stuff happening this year -- let's calm down the chicken-littling a bit.

Burn-in is still a legit worry, but the situation is clearly improving, with more and more manufacturers now including burn-in warranties.

There is no perfect panel technology. LCD and OLED both have their flaws, but there's room for both in the office.

Again, this isn't the bad old LG C6/C7 days. It's much better now.

That's it, it's the cost/benefit

I sold my OLED because of the burn-in tests. I can't afford to replace a display that starts to burn in after 2000 hours when I use my monitors about 5000 hours yearly. But when they last 10,000 hours and cost 200€, it won't be a problem to change them every 2 years.
Fair, fair. Many people need *years* of burn-in tests. I'll do my bit too. RTINGS will probably do a third test a few years down the road.

Your buddies who own 120Hz OLEDs -- don't bash them, let them burn in. If they don't, that's another data point that calms people still wary of OLEDs.

I'm driving a 'Flex into the ground at 100% brightness now on an office desktop, just for kicks -- I do have two units. It's a bit brighter than I want it to be at night now, but most office work is during the day, and I'm in the games where I want the bright highlights. Now, if mine does, indeedy-ofcoursey, burn in, that's feedback I'm throwing back at them with a bullhorn, because, y'know, I want OLED to become even better Blur Busting technology. Strobed LCDs still greatly outperform it there, but OLED is on a fast blur-improvement progress curve at the moment; it's the prime candidate for strobeless motion blur reduction, when used in conjunction with tomorrow's 8x-ratio lagless framegen tech. I still use bullhorns even under NDAs; only those who know me well know how loudly I (privately) scream at some manufacturers. The manufacturer-shaming I do behind the scenes! But it hasn't been anything to do with burn-in; that's been better than I expected.

I often turn off the screensaver for many reasons (e.g. warm-up tests, diagnostic displays, developer tests I need to monitor). Because of that, I've even forgotten to turn the screensaver back on a few times, and once a window was animating enough to prevent the auto-dim behavior. Staticness all night long that night, at 100% brightness. No burn-in.

I'll take one for the team and keep abusing it.

There's room for everybody! Patience, peeps.
 
Just a reminder that when people say "burn-in" on modern LG OLED tech, it means they have probably exhausted their wear-evening buffer. (I read somewhere that they reserve something like 25% of the energize level on them; I'm not certain of the actual amount.)


As I understand it, you are always "burning down" OLED screens, like millions of tiny, very slow-burning candles. When the "candles" are sensed as being uneven enough, the firmware will burn them all down to level them out again, and then use a reserved energizing buffer to boost them back up, repeatedly, through the lifetime of the display. It's when you exhaust that buffer, usually years down the road under normal media and gaming usage scenarios, that you will be left buffer-less, where the TV has no reserved energizing range left to even out the emitters and boost them back up to normal output levels again.

Think of it like a desert island scenario.
It's as if you had a single-charge "lifetime" battery in a high-performing phone or tablet, rated for years of use, but incompatible with charge sensing (or the sensing broke, for whatever reason), so you'd never know how much charge was left.
After you turned on your fully charged device and started using it, you'd have no idea what your battery level was. You could use power-hungry apps and disable your power-saving features, run very high screen brightness when you don't need to or aren't viewing content that benefits from it, max the OS screen brightness because you choose to view the screen in bright daylight instead of in the shade, use high-brightness/high-contrast backgrounds, have no screen dimming kick in, leave the screen on with a very long timeout or "always on" via the OS or a phone app even when you aren't looking at it, etc. - and you'd still get full charge performance for quite some time. But eventually you'd burn through nearly the entire battery, to where the device was compromised, and you'd end up there a lot faster than someone who used the phone or tablet without those more abusive, faster-draining practices.

When viewing less abusive media and gaming (static MTG at high brightness might be bad though, lol), the burn-in + burn-down mitigation tech, combined with the safe-use practices outlined in many of these threads, plus forced ABL (and enabling logo dimming and pixel shift), should result in a much longer lifetime for the screen and buffer. Using an OLED for desktop/apps with a lower-contrast+brightness named OSD picture profile or Windows color profile would probably help some if you had to use one for desktop apps (switching to a different named picture profile for media or games), and you could use something like DisplayFusion to save window-position profiles that you could switch between to help move borders and such - but I'd just use a different screen for a workstation/desktop-app display personally and keep the OLED for media and games.
 
Actually - I just realized my personal answer to the OP's question, "why OLED for PC use?": because my CRT monitors are no more. :)

Exactly. Finally a proper successor. Or something that has the potential to be anyway...

The introduction of FALD was cool. I had the original Samsung 40" in 2008. I ended up returning it because it was damaged, and I thought the next 40" would surely be even better, but there was nothing afterwards. (Instead, edge-lit arose from, I guess, Hell, and began its reign of terror.)

Now FALD is back in force in smaller displays, but in 2023 it just seems a bit too little, too late. Though that's probably jaded me talking... :)

(And then there's the sad story of OLED BFI, which is awesome on the 2020/21 displays, but come 2022... and here we go again.)
 
Oh, a surprise for everyone, and pretty good news:

The QD-OLED is burned in and completely destroyed (CNN logo), but what looked like burn-in on the LG W-OLED panel turns out not to be burn-in at all; it was just dirt on the camera lens RTINGS used to take the photos.

[Attachment 589582: RTINGS burn-in test photo]


[Attachment 589584: RTINGS burn-in test photo]

So, for the moment, W-OLED remains a possible real contender for desktop usage, while QD-OLED clearly isn't, and no one should buy a monitor with a Samsung panel.
Good to hear it wasn't burn-in on the LG. I really want to make this work as a monitor.
 
Good to hear it wasn't burn-in on the LG. I really want to make this work as a monitor.
One thing to be careful of with the RTINGS test is that they are testing the TVs at max brightness, not at a consistent brightness, and QD-OLEDs let you drive them much brighter full-screen. That may account for some of the burn-in differences, and unless you plan to run at max brightness, it isn't likely to match your use case. For real-world use we'd want to see them set to the same brightness, probably somewhere in the 120-nit realm.
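If you did want to match two panels at ~120 nits yourself, it's easy to approximate from each screen's measured full-white range. A rough sketch, assuming a linear OSD-setting-to-luminance response (real panels aren't perfectly linear, so you'd verify with a meter; the min/max figures here are hypothetical, loosely based on numbers in this thread):

```python
# Estimate the OSD brightness setting for a target luminance, assuming a
# linear response between a measured minimum and maximum. Verify with a meter.
def osd_setting_for_nits(target_nits: float, min_nits: float, max_nits: float,
                         max_setting: int = 100) -> int:
    frac = (target_nits - min_nits) / (max_nits - min_nits)
    return round(max(0.0, min(1.0, frac)) * max_setting)

print(osd_setting_for_nits(120, 5, 250))  # ~47 on a 250-nit full-white panel
print(osd_setting_for_nits(120, 5, 150))  # ~79 on a 150-nit full-white panel
```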
 
One thing to be careful of with the RTINGS test is that they are testing the TVs at max brightness, not at a consistent brightness, and QD-OLEDs let you drive them much brighter full-screen. That may account for some of the burn-in differences, and unless you plan to run at max brightness, it isn't likely to match your use case. For real-world use we'd want to see them set to the same brightness, probably somewhere in the 120-nit realm.

Only 120 nits? Pretty sure most people buying an OLED are interested in HDR. I would prefer they do an HDR torture test so we can see the lifespan under HDR load, but instead of 20 hours straight, maybe 5 hours at a time followed by 1 hour of downtime. If it ends up getting burn-in after just 1-2k hours of HDR content, that would be a huge fail IMO.
 
HDR content still averages in the 100-120 nit range (you know how many people already complain that HDR is "too dark"). It shouldn't really burn that much faster than SDR at 120 nits, even with the highlights. Of course, if you pause the video or have a static, non-HDR-friendly overlay, it's another matter. Which reminds me of how terribly subtitles are handled in SOME video players and games: when running HDR they blast the subs at 100% signal, which is absolutely horrible in a dark room and ruins the whole picture because of how our eyes adapt.
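The "blasted at 100%" complaint falls straight out of the PQ (SMPTE ST 2084) transfer function HDR uses: full-signal white is defined as the format's 10,000-nit ceiling (panels clip or tone-map below that), while ordinary "paper white" sits near half signal. A quick sketch of the standard EOTF shows the gap:

```python
# SMPTE ST 2084 (PQ) EOTF: normalized signal (0..1) -> luminance in nits.
def pq_eotf(n: float) -> float:
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    p = n ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

print(round(pq_eotf(1.0)))    # full-code subtitle white: 10000 nits (format max)
print(round(pq_eotf(0.58)))   # ~58% signal: roughly 225 nits
print(round(pq_eotf(0.508)))  # ~51% signal: roughly 100 nits (typical "paper white")
```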
 
Only 120 nits? Pretty sure most people buying an OLED are interested in HDR. I would prefer they do an HDR torture test so we can see the lifespan under HDR load, but instead of 20 hours straight, maybe 5 hours at a time followed by 1 hour of downtime. If it ends up getting burn-in after just 1-2k hours of HDR content, that would be a huge fail IMO.
HDR doesn't usually have things like static logos/bars, which are the things you see burning in. While games and the desktop have those, they are normally, in my experience, not up in the HDR range; they're displayed in the SDR brightness range. Most HDR content on a screen is actually not above the SDR range; it really can't be, or ABL will start kicking in. There are spots where brightness goes above the SDR range, but not everywhere, and in normal content those spots move around. That kind of content is not what risks burn-in. Burn-in comes from the same thing displayed over and over in one spot.

The big risk of that, for monitor usage, would be things like the taskbar, the edges of browser windows, etc. In other words, things in SDR. You're probably not turning your monitor up to max brightness when you use it, nor your TV, and that makes a difference. Burn-in is caused by fade in the output of individual elements, and how much an LED (any LED, even inorganic ones) fades depends on how bright you drive it.

What RTINGS is doing is putting CNN on the displays all the time, which has a static logo and a lower-thirds ticker, and cranking them to maximum brightness. In the case of the Alienware QD OLED monitor, that's 250 nits; in the case of the LG 27" WOLED, it's only 150 nits.

That not only means the damage you see is probably more than you'd see in real use, but also that you have to be careful comparing monitors (or TVs) because of it. Sure, maybe the QD OLED does burn in faster than the WOLED... when driven 66% brighter, but how might they compare at equal brightness? If 150 nits is enough for you, and it would have to be if the WOLED was an option, then it would be more useful to know how they'd burn in at that setting. I'd argue that for desktop use you probably want even less, more like 120 nits. For a brightly lit office, yeah, you probably want more, but at home you should keep the room's light level controlled; it's easier on the eyes and looks better without bright reflections on the monitor.

I'm not saying people shouldn't consider burn-in when looking at an OLED, but the RTINGS test needs to be viewed in light of how it's performed. Some are treating it as gospel and saying "QD OLED sucks, it can't work as a monitor, this test proves it!" I say no: it isn't a realistic test of how you're actually likely to use the monitor.
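One footnote on "fades depends on how bright you drive it": OLED aging is commonly modeled as rising faster than linearly with luminance. A back-of-envelope comparison under that assumption (the exponents are illustrative, not measured values for these particular panels):

```python
# Relative wear under an assumed power law: wear_rate ~ luminance ** exponent.
# Exponents around 1.5-2 are often quoted for OLED lifetime acceleration;
# treat the output as illustration, not measurement.
def relative_wear(nits_a: float, nits_b: float, exponent: float) -> float:
    return (nits_a / nits_b) ** exponent

for exp in (1.0, 1.5, 2.0):
    ratio = relative_wear(250, 150, exp)
    print(f"exponent {exp}: the 250-nit panel wears ~{ratio:.1f}x faster than the 150-nit one")
```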
 
Yeah, that's true: QD-OLED may not actually burn in faster when driven at the same brightness level. That would require another set of tests, but I still like what RTINGS is doing, and it's very valuable. At least we can say "if you run 100% brightness, this is the result," since most people won't have a way to measure nits at home.
 
According to rtings, an OLED can last 2,000 hours playing CNN before the logo burns in, and then you can't hide your shame from your uncle anymore, because the CNN logo will appear over the FOX News logo when you change the channel before he comes over.

BUT if you stick it to your uncle and you never change it from CNN you'll never see the burn in. Easy win.


I present to you this equation to find out how long an OLED TV will last for you. Figure out how many hours of CNN you watch each day, then:

lifetime (days) = 2000 h ÷ (CNN hours per day)

So if you watch CNN 50 hours a day your TV will last 2000/50 = 40 days.

I watch CNN 0 hours a day so my TV will last 2000/0 = Infinite days.

Facts and Logic Checkmate libruls
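For the record, the joke arithmetic holds up; a two-line sketch (taking the ~2,000-hour figure at face value and treating zero CNN hours as an infinite lifetime):

```python
import math

def tv_lifetime_days(cnn_hours_per_day: float, burn_in_hours: float = 2000) -> float:
    """Days until the CNN logo burns in; zero CNN hours -> infinite."""
    return math.inf if cnn_hours_per_day == 0 else burn_in_hours / cnn_hours_per_day

print(tv_lifetime_days(50))  # 40.0 days
print(tv_lifetime_days(0))   # inf
```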
 
Sure, maybe the QD OLED does burn in faster than the WOLED... when driven 66% brighter, but how might they compare at equal brightness? [...]

That just means, again, that LG made the right call in limiting brightness for the sake of longevity instead of chasing brightness figures as a selling point. There's also not much point in giving consumers a choice between a 150-nit cap and something higher on the LG, since of course they'll always opt for the higher spec. If you let a car rev to 6800 rpm vs 7400 rpm, of course people are always going to rev it out to 7400.
 
According to rtings, an OLED can last 2,000 hours playing CNN before the logo burns in [...] I watch CNN 0 hours a day so my TV will last 2000/0 = Infinite days.

When I first saw them testing by playing CNN, I commented on their video that the monitor-test warehouse was probably a large percentage of the daily CNN viewership, along with waiting rooms.

All the people complaining about burn-in probably use OLED phones with static icons all day, every day, too. Just don't crank the brightness, change the wallpaper every now and again, and 99% of people will be fine. I use mine like that: wallpaper rotates every 30 minutes, taskbar hidden, 15 minutes idle to the ribbons screensaver. Easy peasy. Mine is a 48", so for desktop use I keep my programs windowed and move them around the center area, but it's mostly a gaming rig, so it's not a big deal.
 