Yeah, DLDSR can be used in any game. But I was talking about using it like DLSS, with a lower internal resolution rather than a higher one. Some games can look better using DLDSR with the internal resolution slider, and it's more flexible than NIS. But NIS generally looks OK at 77%...
It has to be added to the game because it requires temporal data. You can get a similar (though slightly worse) DL upscaling experience in games that have an Image Scaling/Internal Resolution slider by combining it with Nvidia's DLDSR.
I've been playing casually the past few days (Zero Build, since I haven't played in 3+ years), and the game does look incredible maxed out at 4K. Using TSR with a 77% render scale, I'm averaging around 90 FPS. One thing that irritates me is that HDR only works on days BattlEye is being used instead of EAC.
I refunded for now. Gonna wait until it's in better shape; I doubt another quick patch is going to solve all the issues. The gameplay, visuals and HDR implementation are fantastic, I just don't want my experience ruined by performance issues.
I hope so, Nvidia did include a profile in the latest driver for the game, but probably didn't put it in the release notes because it was incomplete or something.
From my few minutes of experience with the game, it seems like it has a solid foundation for a good game.
The game is unoptimized: I'm not getting full GPU utilization at 4K max settings + RT, averaging around 50 FPS, and the CPU isn't being fully utilized either. It's not just shader compilation issues. I think they wanted to rush it out the door before Dead Space Remake comes out. I'll mess around with it a...
Accurate to the best of my knowledge on my main rig:
Intel 386SX 12 MHz (16 MHz turbo)
AMD K6 233 MHz
AMD K6-2 500 MHz (killed this one overclocking)
AMD Athlon XP 1800+
AMD Athlon XP 1700+ (for better overclocking)
AMD Athlon XP 2500+ Barton
AMD Athlon 64 4000+
AMD Athlon 64 X2 5000+ Black...
Strix X570-I Gaming has an update available. I'll check it out this evening; it only says "Improve system compatibility," so I'm not sure if it includes CO for the 5800X3D. Pretty vague.
When games start using SER there should be a greater uplift in RT performance. But yeah, in older games the penalty is about the same as previous generations.
STALKER 2: Heart of Chernobyl
Elden Ring Expansion (unconfirmed, but likely)
Resident Evil 4 Remake
Dead Space Remake
Cyberpunk: Phantom Liberty
Forza Motorsport
Starfield
1.1122.0.0
Patch Notes
Added NVIDIA Reflex support for all compatible GPUs.
Added a Contrast slider.
Added an HDR Paper White slider.
Improved the way Finishers and other two button actions are handled.
Addressed visual bugs with shadows and sunlight that could occur with certain AMD Radeon...
The lighting looks entirely different, in a good way. The anti-aliasing looks a lot better as well, but I'm not sure if they rendered the trailer at 8K and then downsampled it.
Why does it have to be the display when you can just use RSR/NIS? This is why upscaling technology on the GPU side has been advancing over the past few years. FSR and DLSS (Quality modes) would run at a 1440p internal resolution with 4K output and look close to 4K native. Even RSR/NIS get fairly...
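For anyone wondering where those numbers come from, here's a quick sketch of the render-scale math. It assumes the usual Quality-mode factor of 1/1.5 (~0.667) per axis; the exact factor can vary slightly between DLSS/FSR presets.

```python
def internal_res(out_w, out_h, scale):
    """Internal render resolution for a given per-axis scale factor."""
    return round(out_w * scale), round(out_h * scale)

# Quality mode at 4K output: ~2560x1440 internal
print(internal_res(3840, 2160, 1 / 1.5))  # (2560, 1440)

# A 77% render-scale slider at 4K: ~2957x1663 internal
print(internal_res(3840, 2160, 0.77))     # (2957, 1663)
```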
For anyone interested, the timings DRAM Calculator for Ryzen provided after importing the XMP data were significantly tighter than without importing. I'm not at home right now so I can't show the comparison, but if anyone is interested I'll post screenshots. Without importing it was...
Anyone know how I can extract the XMP data from my RAM? I would like to import it into DRAM Calculator for Ryzen. I tried Thaiphoon Burner, but I can't get it working on my system even with admin privileges and Windows Security turned off.
Edit:
Nevermind, figured it out.
The solution was to...
I would hardly call this a bad showing. Yeah, they didn't beat the 4090 in RT performance. If you calculate based on the efficiency gains (assuming they're accurate) with the 355 W TDP, the card should average 1.659x the performance of the 6950 XT, which would trade blows with the 4090 in rasterized...
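Rough sketch of how that kind of estimate works. The inputs here are assumptions (AMD's ">1.5x perf/watt" marketing claim and 335 W board power for the 6950 XT); the exact inputs behind the 1.659x figure weren't stated, so this lands slightly lower.

```python
def scaled_perf(old_power_w, new_power_w, perf_per_watt_gain):
    """Estimated performance vs. the old card: power ratio times perf/watt gain."""
    return (new_power_w / old_power_w) * perf_per_watt_gain

# Assumed: 6950 XT at 335 W, new card at 355 W, 1.54x perf/watt
est = scaled_perf(335, 355, 1.54)
print(f"~{est:.2f}x the 6950 XT")  # ~1.63x with these assumed inputs
```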
My money's on them renaming it to the 4080 Ti and confusing people even more.
In all seriousness, I don't think they pulled it because of the name or the community. They probably got information on the RX 7800 and realized the 12 GB 4080 wasn't competitive.
I think the scalping will stop within a month. Not many people are buying them, and scalpers will need to return the cards to the store within two weeks if they want their money back.
I thought about this earlier, and I agree, mainly because the card is really only good for 4K/100 Hz+ gaming. I don't think there are many of us out there. I'm guessing they want artificial scarcity to trigger FOMO in people on the fence about buying one. I didn't see the FE card in stock either; I kept...
Suddenly I lost interest in the game...
I'll get roped back in before launch, but FFS, why do they need to make this multiplayer? They barely know how to balance a single-player game; I can't imagine how unbalanced a multiplayer Witcher would be.
It might come down to the PSU manufacturer; Corsair says their PSUs will be fine. My assumption is their PSUs won't run out of spec. This will limit what some people can buy. I myself will be limited to the 4080 16GB unless I want to buy a 1000W/1200W PSU, which I likely wouldn't because...
I'm still deciding between the 4090 and the 4080 16GB, but I would have to get the 4090 Founders Edition because it's the only one that looks like it will fit in my case. Gonna wait for reviews before finalizing my decision.