I only tinker on the first day or two, then after that I never bother with it again. Seeing so many people talk about GPUs being too expensive so PC gaming is going to die or something like that is just absolute nonsense though. Nobody needs to have a 4090 in order to enjoy PC gaming, it ain't...
You do not have to buy the top end card in order to have a good experience on PC. My nephew bought a 7900 XT for $650 and it gives him an experience far superior to consoles while the only major loss on PC vs a top end 4090 is RT performance which he doesn't care about. You used to be able to...
This has been discussed on this thread already. A node shrink is nice, but it is not a requirement for a major performance uplift. 780 Ti -> 980 Ti proves this: same 28nm node, but a ~50% performance increase.
That is exactly the point... 512 bit is exactly double of 256, so why would he even bring up the possibility of MCM if 512 bit is already possible on a monolithic die? The reason MCM was brought up was to generate buzz. Which clearly worked.
I'm no engineer so I won't comment on how hard or easy it is to do accurate EOTF tracking, but yes, you are right, it is possible if some effort is made. As others have mentioned, the launch of the 32GS95UE was probably rushed in order to get something out to market before QD OLEDs take up...
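For anyone unfamiliar with the term: "EOTF tracking" just means how closely the display's measured luminance at each signal level follows the reference curve, which for HDR10 is the SMPTE ST 2084 PQ EOTF. A minimal sketch of that reference curve in Python (the constants are the ones defined in ST 2084; the function name is my own):

```python
# Reference PQ (SMPTE ST 2084) EOTF: maps a normalized signal value
# in [0, 1] to absolute luminance in cd/m^2 (nits).
# Constants as defined in the ST 2084 specification.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal: float) -> float:
    """Target luminance in nits for a normalized PQ code value."""
    e = signal ** (1 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

# A monitor "tracks the EOTF" if its measured output at each code value
# stays close to this curve, until it runs out of brightness and rolls off.
print(pq_eotf(1.0))   # full signal -> 10000 nits
print(pq_eotf(0.5))   # roughly 92 nits
```

Accurate tracking means the panel hits these targets up to its peak capability instead of over- or under-brightening the midtones, which is the complaint people have about some of these game modes.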
We have had 512 bit memory GPUs in the past like the R9 290X. His tweet saying he doesn't know whether it's MCM or monolithic basically got everyone jumping on the possibility that it might not be monolithic, which is expected. I'm pretty sure he said it on purpose to generate some buzz because...
I do have a mini LED lol. But I also have an OLED as well because mini LED doesn't get anywhere near 240Hz OLED in terms of motion clarity. Until OLED monitors get up to at least the current TV level brightness, it won't be the death knell of anything.
Well, is it brighter because the EOTF is totally screwed when running in High Mode? The GR27 is also known to be one of the dimmest OLEDs, and LG had to refresh it with the GS27 after just 1 year, so being brighter than the GR27 isn't really a huge accomplishment.
I don't own the LG monitor so I won't say too much, but I highly doubt it is "a lot brighter" than a C2 in real content. Everything shown about it so far has been limited to a bunch of test slides. Once RTings gets one in, we will know for sure whether it is a lot...
If you're satisfied with it that's fine, but IMO yes, it is quite subpar compared to a mini LED. Peak1000 mode simply isn't usable due to the aggressive dimming, so TB400 is the best mode for the latest OLED monitors, which means peak brightness is capped to around 450 nits. That is barely...
This monitor isn't even the death knell for high end LCD monitors with mini LED, let alone cheapo LCDs. It simply doesn't have the brightness to deliver the HDR performance that mini LEDs can. If it had the same brightness levels as Apple's new iPad Pro lineup then I might say the death knell of...
DSC switch added to the OSD to enable DLDSR and DSR modes.
MPG 271QRX QD-OLED: Added HDMI PC/Console switch in the OSD. (MPG 271QRX QD-OLED default setting is “Console” mode.)
Support for a variety of aspect ratios, MPG 321URX QD-OLED allows selection between a 24.5” and 27” option, MPG 271QRX...
It would have been a decent upgrade if LG didn't decide to nerf the brightness when running in game mode.
This is the 65" version but compared to the 65" C1 it is significantly brighter. I would expect a 42" C4 to be brighter than the C2 by a good margin as well, but once you turn on game...
That's kinda what I mean, at some point the price is simply too high regardless of how much of a bargain it is relative to the 5090. They will monkey around with the pricing for sure, I'm just expecting that they will once again have to backpedal when the sales numbers don't work out.
Started playing Dead Island 2 since it came to Steam. This is one of the few games that really make use of this monitor, with fantastic HDR, perfect black levels, and the ability to run anywhere from 160-220fps on high end hardware. Daytime scenery is quite muted looking since the...
Well that only works to a certain degree. If Nvidia were to price the 5090 at $10,000 and the 5080 at $3,000, does that mean everyone would rush out to buy a 5080 because it's technically a screaming bargain compared to the 5090? I have no doubt Nvidia will try to test the waters...
If they care enough to spend $1200 on it, then that's on them for letting Nvidia have their way. I personally have never expected a flagship halo product (5090) to be reasonably priced, nor have I ever expected people who buy such products to bat an eye at pricing. The non-flagship products on...
They can't do that because the 5080 isn't the flagship. The flagship halo product can be priced at whatever ridiculous amount, it's been done before with the Titan lineup. We had the Titan V for $3000 and the Titan RTX for $2500. People will not buy a 5080 at $1200 because it is a bargain vs the...
It's quite annoying because this is basically a case of one step forward and one step back for the C series. The G series has gotten a heatsink, MLA+, and (on the G4) no brightness drop when running in game mode, while the C series gets none of that. The C series has always been the best bang for the...
This. The 5090 can cost whatever, but not the 5080 and under. If Nvidia suddenly priced the 50 series starting at $999 for the 5060 and going up to $3,000 for the 5090, just how many (or how few) sales do you think they would actually get? Sometimes all this price talk just starts getting...
I wonder why LG has decided to start cutting brightness in game mode on the newest OLEDs.
The C4 has a pretty sizeable lead over the C1, until you turn on Game Optimizer on the C4, at which point the brightness gets massively cut down.
From 590 nits down to 415 nits, and from 265 nits down to 165 nits...
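To put those cuts in percentage terms (a quick sketch, using the two before/after figures quoted above):

```python
# Game-mode brightness cuts quoted above, expressed as percentage drops.
def pct_drop(before: float, after: float) -> float:
    """Percentage drop from 'before' nits to 'after' nits."""
    return (before - after) / before * 100

print(round(pct_drop(590, 415), 1))  # 29.7 -> ~30% cut
print(round(pct_drop(265, 165), 1))  # 37.7 -> ~38% cut
```

So game mode is giving up roughly a third of the panel's brightness, which is not a rounding error.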
I'm counting on Asus to deliver the brightest 32" OLED, they currently hold the RTings crown for OLED HDR real scene brightness at 792 nits with the PG42UQ. Highly doubt LG will ever drive their panels hard because they focus too much on durability.
Honestly I couldn't care less about SDR brightness. I never use SDR. HDR brightness is what matters to me, so if they don't improve on that next year then I won't bother.
Yes and no. It has less aggressive ABL at the 50% and 100% window sizes but in real content they are practically identical. Real scene brightness of 424 nits vs 439 nits, hardly a difference.
The first gen does have MLA. HDTVTest has a video proving it with a macro shot.
As for QD OLEDs...
On the monitor side I'm thinking we won't see much brighter options until PHOLED comes around, but latest news is that there are issues with stability and that will delay adoption: https://www.oled-info.com/elec-udcs-blue-pholed-material-still-unstable-may-delay-market-introduction
Given this...
Techless compared it with QD OLED as well:
https://youtu.be/G5XUAaCBW9w?si=9HZKtuvBM-KGF4J9
In low APL scenes the QD OLED will deliver better color brightness while WOLED has brighter whites. Otherwise the two are pretty much identical. WOLED performs better in brighter lighting conditions as well.
Exactly. This is why the test patch brightness almost never lines up with the real scene brightness. Here is RTings' real scene video; they take measurements on the upper-left light source:
https://www.youtube.com/watch?v=dc6zafyvE1M
There is absolutely no way in hell these QD OLED monitors...
The two are completely different, my guy. Max fullscreen brightness is almost as useless as 1% window brightness. Both the PG42UQ and PG27AQDM have the poorest fullscreen brightness at barely over 100 nits, yet they deliver almost the best HDR experience among OLED monitors. If you are...