Heat and driving hardware. The more densely you pack it, the harder it is to cool. Not impossible, but it is an issue you have to think about. Then there are the drivers, the chips that actually set the brightness of the individual LEDs. They have...
Ya, I've had no issues with it, at home or at work. At home I have it on my laptop. I didn't bother to get the SED feature working, so it is just software encryption. At work I've gotten it to work with SED a few times, but as I said, it is more...
I do wish that drive-based encryption was easier to get working. I don't know if MS, the drive manufacturers, or both need to change things, but it DOES work right now (I've set it up); it is just harder than it should be, and most people will end up...
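If anyone wants to check which path BitLocker actually took on their machine, here's a minimal sketch (Windows, needs an elevated prompt). It just shells out to the real built-in manage-bde tool; the string matching is an assumption about its English text output:

```python
import subprocess

# manage-bde is the built-in BitLocker CLI; -status reports the
# encryption method in use for the volume.
out = subprocess.run(
    ["manage-bde", "-status", "C:"],
    capture_output=True, text=True, check=True,
).stdout

for line in out.splitlines():
    # "Hardware Encryption" means the SED path is in use; an AES mode
    # (e.g. "XTS-AES 128") means it fell back to software encryption.
    if "Encryption Method" in line:
        print(line.strip())
```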
You don't need to do that; a secure erase command (both SATA and NVMe support it) will fully blank the disk, beyond any recovery. Ok, I mean, I suppose intelligence agencies could theoretically have a way to recover data, but the commercial data...
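For reference, a sketch of what issuing those commands looks like (Linux, shelling out to the real nvme-cli and hdparm tools; the device nodes are placeholders, and obviously this is destructive):

```python
import subprocess

DEV = "/dev/nvme0n1"  # placeholder -- triple-check this, the erase is unrecoverable

# NVMe: nvme-cli's format command with a Secure Erase Setting.
# --ses=1 erases all user data; --ses=2 is a crypto erase (the drive just
# discards its internal key, effectively instant on self-encrypting drives).
subprocess.run(["nvme", "format", DEV, "--ses=1"], check=True)

# SATA equivalent via hdparm's ATA Security erase: set a temporary
# password, then issue the erase. The drive must not be in the "frozen"
# state (a suspend/resume cycle usually unfreezes it).
# subprocess.run(["hdparm", "--user-master", "u",
#                 "--security-set-pass", "p", "/dev/sdX"], check=True)
# subprocess.run(["hdparm", "--user-master", "u",
#                 "--security-erase", "p", "/dev/sdX"], check=True)
```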
I really wonder how much of that is just legacy in engines and could be improved, and how much is just kinda inherent to how games work. It seems like being single-core bound has been an issue in all kinds of game engines, and of course we've had...
I dunno, it may not improve for a bit, but I also wouldn't count it out. There have been a lot of improvements in TVs in a very short time. I'm sticking with my PG32UQX, but I'm eagerly looking forward to seeing what the next gen of OLEDs looks like.
Your OLED nut-hugging is getting way out of hand. Both technologies have their place, and I don't consider either one universally superior right now.
What you're doing is just obnoxious. You're like the OLED version of that PG32UQX guy.
Crap like that drives me crazy. I'm not sure why an accurate EOTF is so hard to achieve. Maybe it really does require extensive software development or hardware support, but I know devices can do it. The Sony A95L and the ASUS PG32UQX both have bang-on EOTF...
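For anyone wondering what "accurate EOTF" actually means here: HDR10 signals are encoded with the SMPTE ST 2084 (PQ) curve, and the display is supposed to output exactly the luminance the curve dictates. A quick sketch of the reference curve (the constants are straight from the published spec):

```python
# SMPTE ST 2084 (PQ) EOTF: maps a normalized 0-1 signal value to absolute
# luminance in nits. An "accurate EOTF" display lands on this curve.
m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal: float) -> float:
    """Normalized PQ code value (0..1) -> luminance in cd/m^2 (nits)."""
    p = signal ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

# A 50% signal should land around 92 nits, 75% around 1000, 100% at 10,000.
for s in (0.25, 0.5, 0.75, 1.0):
    print(f"{s:.2f} -> {pq_eotf(s):8.1f} nits")
```

When reviewers say the EOTF is "bang on", they mean the measured panel output lands on this curve; raised shadows or clipped highlights show up as deviations from it.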
I mean, it wouldn't surprise me if they do. nVidia has been beating the "AI" drum since before the marketing buzzword became "AI". They've been working on and pushing machine learning for a long time now. Well, these days it is paying off in...
And to be fair, I think that's a legit way to do games. I can appreciate a game that is really big and open but kinda shallow, a "make your own fun exploring and gathering" situation, just like I can appreciate a game that is a very tight...
Doesn't have to be as powerful as nVidia's to still be a nice step forward. The thing is, the better AMD's RT gets, the more games will implement RT. Right now RT is pretty much a PC-only, nVidia-only feature. Ya, you CAN use it on other stuff...
True, but they make it difficult and push it pretty hard... kinda like Windows these days. You don't have to use an MS account, I don't, they just make it a PITA not to. I don't love it, and I'm not trying to defend it as a good thing, but I don't...
Nah man, there were LOTS of people (well, angry Internet nerds) who were upset. I saw more than a few posts here of the "I'm never switching off 7, 10 sucks, MS sucks, your face sucks!" kind of thing. It's happened with every version of Windows I've...
Not a ton. I can give you a big list if you like, but most of it is pretty minor. For users here, the biggest things are better HDR support and general graphics pipeline improvements. If you game, particularly in HDR, Windows 11 is better. However...
Few will bother. Setting up a dual-boot system isn't hard, but it is a lot harder than just updating your OS. Despite what the nerd rage here may tell you, the reason people are on 10 is that it works fine and they don't bother to upgrade, not...
It's always what happens with Windows. Most people are real apathetic about upgrades, so it doesn't happen until it is forced or happens naturally through attrition.
Funny thing is, same shit happens with Linux a lot too. You don't tend to see it...
It's literally just using optical flow data to say "objects moved from here to there in these two frames, so stuff should be... here-ish for an intermediate frame." It can actually increase latency a bit, as it needs to have a...
It works pretty well, particularly at higher frame rates. It isn't perfect, but you really don't notice things looking off too much. Again, the higher the source FPS, the less you'll notice. However, it does have two big downsides:
1) The game...
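For the curious, the "objects moved from here to there, so stuff should be here-ish" idea from a couple posts up looks roughly like this as a toy sketch (OpenCV's Farneback flow; real frame generation uses dedicated flow hardware plus ML cleanup for occlusions, which this completely ignores):

```python
import cv2
import numpy as np

def midpoint_frame(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Crude 'frame generation': estimate motion between two frames and
    warp halfway. Ignores occlusion/disocclusion, which is where real
    implementations spend most of their effort."""
    gray_a = cv2.cvtColor(frame_a, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(frame_b, cv2.COLOR_BGR2GRAY)

    # Dense per-pixel motion vectors from frame A to frame B.
    flow = cv2.calcOpticalFlowFarneback(gray_a, gray_b, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)

    h, w = gray_a.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    # Sample frame A halfway along each motion vector -- "stuff should be
    # here-ish" for the in-between frame.
    map_x = (grid_x - 0.5 * flow[..., 0]).astype(np.float32)
    map_y = (grid_y - 0.5 * flow[..., 1]).astype(np.float32)
    return cv2.remap(frame_a, map_x, map_y, cv2.INTER_LINEAR)
```

Note that the generated frame can't be shown until frame B exists, which is exactly where the extra latency mentioned above comes from: the real frame gets held back so the interpolated one can be displayed first.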
Also, ML may be less of a bitch with the multi-chip problems because it is less time-sensitive. There are lots of workloads that scale real well, not just to multiple chiplets or chips but to multiple nodes. The problem with realtime graphics is that...
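To make the throughput-vs-deadline point concrete, a toy sketch (nothing vendor-specific): batch ML work only cares when the last shard finishes, while a rendered frame has one hard deadline.

```python
from concurrent.futures import ProcessPoolExecutor

def crunch(chunk):
    # Stand-in for ML work on one shard of a batch -- throughput-bound,
    # and nobody cares which shard finishes first.
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    batch = [list(range(i, i + 100_000)) for i in range(0, 1_000_000, 100_000)]
    # Shards can go to different chips, packages, or whole machines; the
    # only cost of a slow interconnect is a little more time before the
    # LAST shard lands. A real-time frame, by contrast, must be fully
    # assembled in ~7 ms for 144 Hz, so every cross-chip hop eats directly
    # into that deadline.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(crunch, batch))
    print(sum(results))
```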
I mean, it shouldn't be a huge surprise given the issues with multiple GPUs. Sure, that is going to be slower in some respects, since you are connecting over an external bus, which is going to be hard to make as fast as something on the same package...