If you find a controller you like for one-handed play, I can't recommend Spiritfarer enough for a 2-player game. It has a very deep and touching story, the gameplay mechanics are simple but fun, and it has no failure state, so in a case like yours where you are not playing at peak performance it...
In addition to all the good control suggestions people have made (the MS adaptive in particular) I can suggest some games to think about:
XCOM and XCOM 2 are both really fun turn-based games. I know you want non-turn-based as well, but of course turn-based makes things easier since it will wait...
Could be. I didn't know about that part of it, just the viewing angles, having observed them myself. It is amazing how wide the viewing angles are. Like WOLED is good, better than IPS, but it does drop off at the sides. QD-OLED you can look at practically perpendicular and it still looks great. It is...
Part of that is the lighting and QD-OLED. QD-OLEDs don't have a polarizer in them, they don't need it. The advantage of that is their viewing angles are insanely wide. Like WOLED is good, but QD-OLED is just another level. Essentially perfect. The downside is that a polarizer helps tone down the...
I mean, that's never going to happen. For one, according to the article he only controls 25% of it, meaning he doesn't get to make the decision, it is a group of people, but also because of how finances and taxes work it is pretty difficult to just "hand off" a company to someone. They have to...
That's probably part of the issue: we learn what a given franchise is supposed to give us, and we want it, and then if it doesn't deliver we aren't going to be happy. Mass Effect established that this was a story-driven world. That was the shining jewel of the first game. Like there were plenty...
For those wondering, that word vomit was from MS Copilot. I asked it to write a generic positive game review and then to make it more wordy and meandering.
:vomit:
Ah, my dear interstellar wanderers, gather 'round as I weave a cosmic tapestry of words to extol the virtues of the celestial marvel known as Galactic Odyssey. 🌌✨
A Celestial Prelude
Picture, if you will, a vast expanse of inky blackness punctuated by shimmering pinpricks of light—the cosmic...
Nah, they need to make back the money from the purchase, so the rates are going to go up. Have to pay more if you want them all in lock step, otherwise they'll designate some to slag your product.
Because, again, the loss and the bandwidth. The higher the bandwidth, the more interference and loss matter. The details get pretty technical pretty fast, but simplified: time and frequency are reciprocals of each other, and as you increase the speed at which you pulse a signal, it spreads out in the...
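That time/frequency reciprocal is easy to see numerically. Here's a toy sketch (a rectangular pulse analyzed with NumPy's FFT, purely for illustration, not a model of any real cable): halving the pulse duration doubles the width of the main spectral lobe, i.e. faster signaling occupies proportionally more bandwidth.

```python
import numpy as np

FS = 1_000_000  # sample rate in Hz (arbitrary for the demo)
N = 10_000      # total samples analyzed

def first_spectral_null_hz(pulse_len_s):
    """Width of a rectangular pulse's main spectral lobe.

    A pulse of duration T has a sinc-shaped spectrum whose first
    null sits at 1/T, so shorter pulses spread over more bandwidth.
    """
    sig = np.zeros(N)
    sig[: int(pulse_len_s * FS)] = 1.0
    spec = np.abs(np.fft.rfft(sig))
    freqs = np.fft.rfftfreq(N, 1 / FS)
    # first frequency where the magnitude drops to (numerically) zero
    null_idx = np.argmax(spec < 1e-6 * spec[0])
    return freqs[null_idx]

print(first_spectral_null_hz(1e-3))  # 1 ms pulse   -> null at 1000.0 Hz
print(first_spectral_null_hz(5e-4))  # 0.5 ms pulse -> null at 2000.0 Hz
```

Double the signaling rate, double the occupied spectrum, and the more spectrum a signal occupies, the more loss and interference it picks up.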
They also may just be lying. Lot of that shit goes on, has been since the HDMI 2.0 days. Plenty of cables that claimed they'd support something but were not certified, and when you got them it was a crapshoot if it worked.
Nah, those are garbage. As toast pointed out, they are way lower bandwidth, and they use LEDs. They are also POF (plastic) not actual glass fiber which means the cables are really lossy. You actually can get S/PDIF way farther over coax than TOSLINK because the POF cables are so lossy. You...
Cost. While fiber has gotten a lot cheaper, it still costs a lot more than copper. Not so much the fiber itself (though it does cost more) but the transceivers. For a real basic price check, we can look at a Fiberstore 100gig QSFP module. That is 100gbits over 4 lanes of data, send and receive...
It's not bad these days. They still could and hopefully will improve it further, but as it stands it works pretty well. You can easily toggle it on and off, when it is on Windows applies corrections for SDR programs when HDR is on so they look right, and you just set the brightness with a slider...
Don't worry about it too much. Like many things you can fall down a rabbit hole and start to become overly concerned with specs on things and getting "the best" :). If you want an easy metric then get a UL listed one, with a lower voltage, higher joule rating, from a good brand. Tripplite and...
Heat and driving hardware. The more dense you pack it, the harder it is to cool. Not impossible, but it is an issue you have to think about. Then there's the drivers, the things that actually set the brightness of the individual LEDs. They have to be made smaller and more responsive to deal with...
Ya, I've had no issues with it, at home or at work. At home I have it on my laptop. I didn't bother to get the SED feature working, so it is just software encryption. At work I've gotten it to work with SED a few times, but as I said, it is more difficult than it should be to get that to work, and...
I do wish that drive-based encryption was easier to make work. I don't know if MS, the drive manufacturers, or both need to change things but it DOES work right now, I've set it up, but it is harder than it should be and most people will end up doing it in software. As you say, not a big deal...
You don't need to do that, a secure erase command (both SATA and NVMe support it) will fully blank the disk, beyond any recovery. Ok, I mean, I suppose intelligence agencies could theoretically have a way to recover data, but the commercial data recovery companies can't. It works excellently...
I really wonder how much of that is just legacy in engines and could be improved, and how much is just kinda inherently how games work. It seems like being single-core bound has been an issue in all kinds of game engines, and of course we've had multi-core CPUs for a long time. It isn't like...
I dunno, it may not improve for a bit, but I also wouldn't count it out. There's been a lot of improvements in the TVs in a very short time. I'm sticking with my PG32UQX, but I'm eagerly looking forward to seeing what the next gen of OLEDs look like.
Crap like that drives me crazy. I'm not sure why an accurate EOTF is so hard. Maybe it really does require extensive software development or hardware support, but I know devices can do it. The Sony A95L and the ASUS PG32UQX both have bang on EOTF tracking. I think it is part of why I like the...
I mean, it wouldn't surprise me if they do. nVidia has been beating the "AI" drum since before the marketing buzzword became "AI". They've been working on and pushing machine learning for a long time now. Well these days, it is paying off in spades so no surprise they'd continue on that path.
And to be fair, I think that's a legit way to do games. I can appreciate a game that is really big and open, but kinda shallow, a "make your own fun exploring and gathering" situation, just like I can appreciate a game that is a very tight, curated, focused experience. We just don't have...
Doesn't have to be as powerful as nVidia to still be a nice step forward. The thing is, the better the RT AMD has, the more games that'll implement RT. Right now RT is pretty much a PC-only, nVidia-only feature. Ya you CAN use it on other stuff, but performance is often not good enough. Makes...
True, but they make it difficult; it pushes it pretty hard... kinda like Windows these days. You don't have to use an MS account, I don't, they just make it a PITA not to. I don't love it, and I'm not trying to defend it as a good thing, but I don't see why it is a dealbreaker when it really isn't on...
Nah man, there were LOTS of people (well, angry Internet nerds) who were upset. I saw more than a few posts here of "I'm never switching off 7, 10 sucks, MS sucks, your face sucks!" kind of thing. Has happened with every version of Windows I've ever seen. Nerds rage that the new one sucks, they...
Not a ton. I can give you a big list if you like but most of it is pretty minor. For users here, the biggest thing is better HDR support and general graphics pipeline improvements. If you game, particularly in HDR, Windows 11 is better. However in general it is a very minor update. A lot of OSes...
Few will bother. Setting up a dual boot system isn't hard, but it is a lot harder than just updating your OS. Despite what the nerd rage here may tell you, the reason people are on 10 is because it works fine and they don't bother to upgrade, not because 11 is bad. You aren't going to see many...
It's always what happens with Windows. Most people are real apathetic about upgrades so it doesn't happen until it is forced or happens naturally through attrition.
Funny thing is, same shit happens with Linux a lot too. You don't tend to see it in reporting since it usually all gets folded in...
It's literally just using optical flow data to say "objects moved from here to there in these two frames so stuff should be... here-ish for an intermediate frame." It can actually, potentially, increase latency a bit as it needs to have a "previous frame" and "next frame" before it can display...
It works pretty well, particularly at higher frame rates. It isn't perfect, but you really don't notice things looking off too much. Again the higher the source FPS the less you'll notice. However it does have two big downsides:
1) The game doesn't "feel" fast. This is more an issue at low...
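The "objects moved from here to there, so put stuff here-ish" idea above can be sketched in a few lines. This is a toy NumPy illustration (the warping scheme, blend weights, and nearest-neighbor sampling are my own simplifications, not what any vendor's frame generation actually ships): given two frames and a flow field, sample each one toward the midpoint in time and blend.

```python
import numpy as np

def generate_midframe(prev_f, next_f, flow, t=0.5):
    """Toy optical-flow frame generation.

    flow[y, x] = (dy, dx) is the motion of each pixel from prev_f
    to next_f. For each output pixel we sample prev_f slightly
    "behind" and next_f slightly "ahead", then blend. Real
    implementations handle occlusion and subpixel sampling; this
    nearest-neighbor version just shows the core idea.
    """
    h, w = prev_f.shape
    ys, xs = np.mgrid[0:h, 0:w]
    py = np.clip(np.round(ys - t * flow[..., 0]).astype(int), 0, h - 1)
    px = np.clip(np.round(xs - t * flow[..., 1]).astype(int), 0, w - 1)
    ny = np.clip(np.round(ys + (1 - t) * flow[..., 0]).astype(int), 0, h - 1)
    nx = np.clip(np.round(xs + (1 - t) * flow[..., 1]).astype(int), 0, w - 1)
    return 0.5 * prev_f[py, px] + 0.5 * next_f[ny, nx]

# A bright dot moving 4 pixels right lands halfway in the midframe.
prev_f = np.zeros((8, 8)); prev_f[2, 2] = 1.0
next_f = np.zeros((8, 8)); next_f[2, 6] = 1.0
flow = np.zeros((8, 8, 2)); flow[..., 1] = 4.0
mid = generate_midframe(prev_f, next_f, flow)
print(mid[2, 4])  # 1.0
```

Note that the function needs both frames up front before it can emit the one in between, which is exactly where the extra latency comes from.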
Also, ML may be less of a bitch with the multi-chip problems because it is less time sensitive. There are lots of things that scale real well, not just to multiple chiplets or chips but to multiple nodes. The problem with realtime graphics is that "realtime" bit. Want 120fps? Ok, that's 8.3ms, max...
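The frame-budget arithmetic is simple but worth spelling out (the fps targets below are just examples):

```python
def frame_budget_ms(fps):
    """Maximum time available to produce each frame, in milliseconds."""
    return 1000.0 / fps

for fps in (60, 120, 240):
    print(f"{fps} fps -> {frame_budget_ms(fps):.2f} ms per frame")
# 60 fps -> 16.67 ms, 120 fps -> 8.33 ms, 240 fps -> 4.17 ms
```

Every cross-chip synchronization has to fit inside that window, every frame, which is why loosely coupled ML workloads scale across chips and nodes so much more easily than realtime rendering does.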
I mean, it shouldn't be a huge surprise given the issues with multiple GPUs. Sure that is going to be slower in some respects since you are connecting over an external bus which is going to be hard to make as fast as something on the same package (or even board)... BUT it is still the same...
It can be pretty impressive for special effects. In Jedi Survivor when you get flash-banged it'll do as bright a full screen white as it can and it is really impactful. That said, it isn't that big a deal.
No it really wouldn't because you have to remember test patches are that bright with...
Ya. While 120fps is something I can get in quite a few games, it is usually pretty close to maxed on the GPU so, even CPU limits aside, there would just be no going to 240fps. Plenty won't even hold that these days with all the shinies turned up. Like Hogwarts Legacy, it dipped down to around 60fps...
I like high framerate, but I just can't see cranking down the detail enough to get 240Hz, even if I had a monitor that could handle it. I like shinies too much.
Ya that's where I am as well. I just do not get the nerd rage over this. Particularly with Aloy. I don't see anything egregiously wrong with her character model, she does not, in fact, "look like a chick with a dick" or any of that. It just seems like a very, very stupid thing to be getting...
Well, RTings' test has a huge issue for real-world usage, which is that they run both at their max brightness, and the QD-OLEDs have a higher max brightness. Now while I understand their idea of "we run it at whatever the max is", that's not realistic to how it'd be used. If a WOLED...
That's why I'm so interested in Hardware Unboxed's test. Basically what they are doing is using the monitor for productivity, leaving on any screen-protecting features that are non-obtrusive (like pixel shift and compensation cycles) and turning off any that are a pain (like taskbar dimming). So...