The 32 inch 4k IPS 144hz's...(Update - this party is started) (wait for it...)

It would occur when I had a darker background with a bright object on it; for example, I would see it on Hardforum in dark mode. One way I remember replicating it for Asus support was to go to a 60-70% greyscale screen, open a white box like Notepad, and move it around with FALD turned on. You would see a yellow/orange glow about an inch off the sides as you moved it around.

Asus support told me it was a FALD anomaly, essentially like IPS glow, caused by the extremely bright object bleeding into the nearby greyscale zones on the panel. Whether that's accurate I have no idea; it's just what they told me.

If they did dim it slightly, it might no longer be an issue, because this happened on both of my launch models, which got painfully bright.
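
If anyone wants an easy way to throw up that test pattern, a little Python/tkinter script like this should do it (just a rough sketch; the grey level and box size are my guesses at roughly what I used):

```python
# Rough repro of the grey-background bloom test: a ~65% grey fullscreen
# with a white box that follows the mouse. Grey level and box size are
# approximations; tweak to taste. Press Esc to quit.
import tkinter as tk

BOX_W, BOX_H = 400, 300  # roughly a small Notepad window

root = tk.Tk()
root.attributes("-fullscreen", True)

grey = "#a6a6a6"  # ~65% greyscale
canvas = tk.Canvas(root, bg=grey, highlightthickness=0)
canvas.pack(fill="both", expand=True)

box = canvas.create_rectangle(0, 0, BOX_W, BOX_H, fill="white", outline="")

def follow(event):
    # Center the white box on the cursor and watch the surrounding grey
    # for the yellow/orange glow as the FALD zones light up.
    canvas.coords(box,
                  event.x - BOX_W // 2, event.y - BOX_H // 2,
                  event.x + BOX_W // 2, event.y + BOX_H // 2)

canvas.bind("<Motion>", follow)
root.bind("<Escape>", lambda e: root.destroy())
root.mainloop()
```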
Would you not turn off FALD on the desktop regardless?
 
I don't personally turn it off on the desktop because it's not really intrusive enough to bother. It's not like Samsung's FALD on the desktop, which dims the cursor and the zones adjacent to windows, making the entire desktop look oddly cloudy. This monitor just looks like a normal IPS most of the time, with a subtle glow around the cursor on dark backgrounds. It's probably the most usable desktop FALD implementation there is. I know the early launch versions got slaughtered for the bloom on the desktop, but IMO that is way toned down now. I'll try to post an example when it's dark enough in my room.

I'll try and recreate the orange glow thing later tonight.
 
I still tend to leave mine off. You're right that it usually isn't that big a deal, but I can see it, particularly on sites like Hardforum that use middle-grey colors. Also, I don't find high contrast to be a real benefit on the desktop, so I generally leave it off there and only turn it on in games.
 
Weird that they decided to nerf the brightness. That was what made the Asus stand out above the rest. Looks like that upcoming TCL, if it does release, will have an easier time dethroning the Asus now.
 
I leave HDR on 24/7, which is one of the benefits of this monitor, since so few handle HDR on the desktop well. With RTX HDR and Nvidia's SDR-to-HDR video enhancement, it's too inconvenient to swap back and forth based on the content I'm consuming.
 
I dunno if it really makes any difference in actual content. It still clips right around 1600 nits in the HDR Calibration app, and highlights look incredibly bright in games.
 
Fair. At this point my HDR use case is almost entirely games, so I just switch it on when I'm playing (actually, most games know how to switch it on themselves). I also use the more aggressive Level 3 dimming in HDR mode, which makes the blooming more noticeable. I find that in actual content I prefer the brightness/contrast advantage it offers, despite the slight blooming in some cases.
 
I just tried the grey-background test and do not see any yellowing on the edges.
 
Have you compared the PG32UQX against one of the 32" 240 Hz 4K QD-OLEDs? I'm considering swapping out my AW32 OLED for another PG32UQX. I'm just curious whether I have rose-tinted glasses when it comes to that monitor. Is it just the HDR impact that brought you back?
 
Just found a used zero-scratch/zero-dead-pixel XG321UG out the door for $1,100, so I thought it was worth a try at that price.
 
Yeah, I owned the MSI QD-OLED. Great motion clarity, but it looked like garbage to me in most HDR content. The first time I launched an HDR game, I went into the OSD to see if maybe brightness was set to like 20%, and then realized that was all it was really capable of.

The reason I stick with the PG32UQX is that now anything can be made to have HDR, including SDR videos and games, so it brings life to everything in a way no other monitor can, regardless of its other shortcomings. I'm someone who really doesn't care about bloom; that's just what FALD does.

If I were a sweatlord who just played competitive stuff in SDR at medium settings, I would stick with the 240 Hz models, but all I play nowadays is stuff that is graphically impressive or includes RT.
 
The drop in brightness is pretty big coming from a 1000+ nit mini-LED. I'm hoping the PG32UCDP can do 800 nits real-scene just like the PG42UQ; that would at least be twice the brightness of my QD-OLED, and I'd be fine living with that. I'm not asking for a million nits or anything. Just give me 1000 or close to it with per-pixel dimming and I'll be satisfied. The QD-OLEDs are just way too dim to be satisfactory for me.
 
I wish they could get the 10% window numbers up around where TVs are, because that seems to correspond pretty well with the kind of demand you get in most content. Sure, there are some high-APL things that'll ask for a lot more, but most content has only a few really bright areas, so the performance you see on a 10% patch is close-ish to actual performance. The S95D is pulling 1600 nits on a 10% window and 800 at 25%. That is going to be a nice bright image for most real content, and I imagine it would look very similar to the PG32UQX. That would do nicely, even if full-field it only does 280-ish.
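
As a rough back-of-the-envelope (my own toy numbers, just plugging in the figures above and assuming total light output scales with luminance times the lit fraction of the panel, which is a simplification):

```python
# Quoted S95D measurements: peak nits at each window size.
windows = {0.10: 1600, 0.25: 800, 1.00: 280}

for fraction, nits in windows.items():
    # Luminance x lit fraction ~ total light the panel is emitting,
    # which is roughly what the power/thermal budget limits.
    print(f"{int(fraction * 100):>3}% window: {nits:>4} nits "
          f"-> ~{nits * fraction:.0f} nits full-field equivalent")
```

Interestingly, the implied total output isn't constant across window sizes (160 vs. 200 vs. 280 full-field equivalent), so the small-window peaks look limited by per-pixel drive rather than just the total power budget.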

But because of the size and cooling, we don't see that with monitors. I wonder if an active cooling solution would help, or if the wires are just too small to handle the current required.

We'll see where it goes. I'm going to stick with my FALD for this generation of OLEDs, but I'll be watching the next one with great interest, particularly if Monitors Unboxed doesn't find burn-in in their test of this gen. They are doing a real-world work test, using it for office-type work every day, so if it holds up to that for a year, that should be a good indication it won't be a problem for someone like me who does plenty of desktop work but also plenty of gaming.
 
Please post back here whenever their burn-in test has results. I'd be cautious about making the jump to OLED after only a single year of data, since my main display would primarily be used for desktop work too. My NEC 3090 was my primary display for over 7.5 years, my Acer Predator XB321HK is 6.5 years old, and whatever 32" 4K 144/240 display I eventually replace it with will probably remain my primary until I can replace it with a similar-sized 6K or 8K option at a faster refresh rate.
 
Thanks for the link SoCali! That is concerning, especially as all the TVs tested had their panel-saving features enabled. I guess time will tell if the 3rd-gen QD-OLEDs show any improvement in longevity.
 
I think asking an OLED to last that long with heavy desktop use is a real stretch. What you could do, though, is purchase the Geek Squad 4-year warranty and then swap it out right at the end of the 4 years for a new one; that way you're effectively getting a display that lasts that long.
 
I wasn't really hopeful they would, but I keep seeing people claim that burn-in isn't a thing anymore, and I was curious whether anything had actually changed or whether there was still an unwritten "as long as you never show any static content on it" attached to the claims.
 
That's why I'm so interested in Hardware Unboxed's test. Basically, what they are doing is using the monitor for productivity, leaving on any screen-protecting features that are non-obtrusive (like pixel shift and compensation cycles) and turning off any that are a pain (like taskbar dimming). So the premise is "I'm going to use this like an LCD without compromise," but in a realistic setting, not a contrived burn-in test like RTINGS does. Of course, the downside is we just have to wait on results, and the better the monitor does, the longer the wait will be. But I like the idea of a sort of "worst case real world" scenario.
 
I saw that listing had no original box. Never trust randoms to ship a monitor without the original packaging.

I've learned the same thing. New or used only if you can see it in person or they're someone you trust to ship.

At least it was eBay, so returning it for a full refund is easy.
 
Ya, I told him to pack it REALLY well. It looked like he tried, a box inside a box, but the whole package basically had no rigidity, so any boxes stacked on top would press through the bubble wrap onto the panel. Normies just don't understand how much abuse packages go through. Oh well, it looks like he's going to refund quickly. Maybe I should just pull the trigger on one of those new $1,800 PG32UQXs on Amazon.
 
I dunno, man, based on your preferences I don't think you would have liked the ViewSonic at all anyway. I think it might be better to stay in OLED land for your gaming monitor, since you really value the 240 Hz and motion clarity.
 
Ya, I set my OLEDs to 144 Hz and it's just not fast enough for me. Once you play FPS games at 240+, it's hard to go back. So no PG32.
 
I like high framerates, but I just can't see cranking down the detail enough to get 240 Hz, even if I had a monitor that could handle it. I like shinies too much.
 
Having made that kind of switch a few times in the past: keep in mind that the difference might be as much the change itself as the 120 Hz/144 Hz. Going the other way might also make things feel worse at first, since the timing just gets different. Same thing with going from, say, a 27" 240 Hz to a 32" 240 Hz.
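
For perspective, the raw frame-time deltas shrink fast as refresh climbs (quick arithmetic sketch, nothing monitor-specific):

```python
# Each step up in refresh rate buys fewer milliseconds than the last.
rates = (60, 120, 144, 240, 360)

prev = None
for hz in rates:
    ms = 1000 / hz
    delta = f" ({ms - prev:+.2f} ms vs. previous)" if prev else ""
    print(f"{hz:>3} Hz -> {ms:.2f} ms per frame{delta}")
    prev = ms
```

120→144 Hz only shaves about 1.4 ms per frame, versus 8.3 ms going 60→120, which is part of why the jump can feel subtler than expected.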
 
In some games you won't be seeing 240 fps no matter what you do anyway, like Helldivers 2. I could play at 720p lowest settings and still probably wouldn't see 240 fps.
 
Ya. While 120 fps is something I can get in quite a few games, it usually has the GPU pretty close to maxed out, so even CPU limits aside, there would just be no getting to 240 fps. Plenty of games won't even hold 120 these days with all the shinies turned up. Hogwarts Legacy dipped to around 60 fps plenty with modded RT to increase the reflection resolution. I played with it back and forth, and I just liked the way it looked too much, so I took the FPS hit.

For me these days, the sweet spot is 4K, DLSS Quality (or maybe DLAA), with details as high as possible while aiming for a 60+ fps experience, hopefully averaging around 90-120.
 
Something like 160-180 Hz would probably be enough headroom for people like me who mainly play AAA single-player titles. The only games I've played in the past few years where 200+ fps was a reality were Doom Eternal and some Like a Dragon games. I'm almost through LAD: Infinite Wealth, and it runs at 4K DLAA at 120 fps with dips to something like 80 fps. Moving to DLSS and FG it would probably go much higher, but my 4K 120 Hz TV can't show the frames, so there's no point.
 