Ratchet & Clank: Rift Apart | RTX IO- GPU-accelerated Storage & Loading



Ratchet & Clank: Rift Apart | 4K DLSS 3 Gameplay Comparison
 
Will be interesting to see how it does on a regular SATA SSD, or even an HDD. With GDeflate GPU decompression you can get ~1.5 GB/s effective bandwidth from a regular HDD, which is not too shabby at all.
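As a rough sketch of the arithmetic behind that kind of claim: effective bandwidth is just the raw read speed multiplied by whatever compression ratio GDeflate achieves on the assets. The drive speeds and ratio below are illustrative assumptions, not measurements.

```python
# Back-of-the-envelope effective bandwidth after GPU decompression.
# All drive speeds and the compression ratio are assumed/illustrative.

def effective_bandwidth_mb_s(raw_read_mb_s: float, compression_ratio: float) -> float:
    """Usable (decompressed) data delivered per second."""
    return raw_read_mb_s * compression_ratio

assumed_drives_mb_s = {
    "7200RPM HDD": 200,
    "SATA SSD": 550,
    "PCIe 3.0 NVMe": 3500,
}
assumed_gdeflate_ratio = 2.0  # assumed average ratio on game assets

for name, raw in assumed_drives_mb_s.items():
    eff = effective_bandwidth_mb_s(raw, assumed_gdeflate_ratio)
    print(f"{name:>14}: {raw} MB/s raw -> {eff / 1000:.2f} GB/s effective")
```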
 
I have all of those in my PC. I don't have the testing suite to have benchmark numbers, but I can find out how it feels to play on each.
 
Which could be the most relevant metric here; I'm not sure how well benchmarks (even 1% lows) will capture this.
With the way this game plays, any stutter or interruption when rifts appear will be the most important thing from a gameplay perspective. If there is barely a noticeable interruption, I think it will be acceptable.
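For reference, a minimal sketch of why an averaged 1% low figure can understate a single rift hitch, while simply counting per-frame spikes catches it. The frametimes and the 25 ms threshold are made-up illustrative numbers.

```python
# Minimal sketch: 1% low FPS vs. counting individual frametime spikes.
# Frametimes (ms) below are made-up illustrative data, not a real capture.
frametimes_ms = [8.3] * 999 + [45.0]   # mostly smooth, one big hitch

def one_percent_low_fps(frametimes):
    """Average FPS over the slowest 1% of frames."""
    worst = sorted(frametimes, reverse=True)[: max(1, len(frametimes) // 100)]
    return 1000.0 / (sum(worst) / len(worst))

def hitch_count(frametimes, threshold_ms=25.0):
    """Number of individual frames long enough to feel like a stutter."""
    return sum(1 for ft in frametimes if ft > threshold_ms)

avg_fps = 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))
print(f"average FPS:    {avg_fps:.0f}")
print(f"1% low FPS:     {one_percent_low_fps(frametimes_ms):.0f}")
print(f"hitches > 25ms: {hitch_count(frametimes_ms)}")
```

On this made-up data the 1% low still reads as a healthy ~80 fps even though there is one 45 ms hitch that would be plainly visible during a rift transition.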
 
looks like Sony only allows reviews to be posted for their PC ports right at launch time...I trust Nixxes but I wish they would lift the embargo at least a day before
 
Ratchet & Clank: Rift Apart Won't Support Ray Tracing on AMD GPUs at Launch

Ratchet & Clank: Rift Apart will not support ray tracing on AMD GPUs at launch, developer Nixxes Software confirmed...it seems like there are some technical issues behind the decision to disable ray tracing, as the developer confirmed it is working closely with AMD to enable support as soon as possible...

https://support.insomniac.games/hc/en-us/articles/17812815585549-PC-performance-tuning
 
AMD driver issue, no doubt, since this is the first game supporting RT and DirectStorage 1.2 at the same time.

Portal Prelude RTX doesn't count since it's a mod and a converted engine to Vulkan no less.

4PM UK time for unlock, gonna finish work early to get home at prime time!
 
I ran the Microsoft Bulk Load DirectStorage 1.2 Benchmark, with GPU Decompression...results are on Windows 10 with a Samsung 970 Evo Plus 1TB NVMe (PCIe 3.0)
 

Attachments

  • DS1.2.png (1.1 MB)
Could be the last minute addition of ray traced ambient occlusion that is causing issues. Correct me if I'm wrong, but I don't think there are any "AMD sponsored" games that include the effect, and it may be due to these technical issues Nixxes is referring to.
 
Ratchet & Clank: Rift Apart- Launch Trailer | PC Games



Ratchet and Clank: Rift Apart- PC Max vs PS5 vs PC Very Low- First Look


You can really see the improvements over the console version in this video. Also, the differences between SATA and NVMe are bigger than I expected.

I preloaded the game onto my 7200RPM HDD with 256MB of cache to see how it runs on that first.
 
Also, the differences between SATA and NVMe are bigger than I expected.
Some sequences were also captured on different CPU/system configurations.

At:


There the difference seems smaller, but still visible to the human eye.

It's early on, and maybe a couple of patches will change things, but for now whether the drive does 7 GB/s or the ~3.5 GB/s of a PCIe 3.0 NVMe drive does not matter.
 
Reviews on Steam are mostly positive, but the reviewers who actually seem to have played the PC version mention crashes. The positive ones seem to have played it on PS5.
 
There are defo crashes, I had a couple already and I'm 2% into the story. Will write up a proper review shortly, as I'm posting it on another board too.


Edit*


Just had a quick go, and the immediate thing I noticed is that sometimes the settings don't stick. Also, Full Screen is not full screen, it's windowed full screen, and windowed mode is just labelled as Windowed, so they got the naming convention wrong. I only noticed because my DLDSR res was not available until I set it to exclusive full screen :rolleyes:

Also, on another occasion I had to quit and reload the game twice for it to launch using the DLDSR resolution and other settings, otherwise it loaded at native 3440x1440.

The advanced graphics preset toggle only goes up to Very High, which is strange because various graphics options go to Ultra, so by selecting the max preset of Very High you aren't actually maxing out the settings and then have to manually increase those that do go to Ultra. Unlike Cyberpunk, there isn't an indicator to tell you how many notches beyond the preset each option has, which is annoying, but hey ho.

Main settings I'm using:

fGUDbjw.jpg


lBQBhYc.jpg


RCnRFan.jpg


With DLSS set to Quality at 5160x2160 on a 4090 and 12700KF I was seeing the framerate bounce from just above 60fps at times to 109fps at others, even though the scene hadn't changed, which is weird, until I glanced at the RTSS overlay and saw that a lot of the time the GPU usage, even at 5160x2160, was dropping below 90% whilst the CPU usage remained in the 40% range. So it looks like there's an optimisation issue with CPU utilisation resulting in the GPU having to wait around. It's not a straightforward CPU bottleneck, because I am at 5160x2160 and a 12700KF is more than beefy enough to handle anything.
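A hypothetical way to quantify that pattern from a logged capture; the CSV filename and column names below are assumptions for illustration, not what RTSS or HWiNFO actually export.

```python
# Hypothetical sketch: count frames where the GPU is under-utilised while
# overall CPU load is also low, i.e. the GPU appears to be waiting on
# something (one saturated thread, streaming, sync) rather than raw CPU power.
# The CSV path and column names are assumptions for illustration.
import csv

def gpu_starved_frames(path, gpu_below=90.0, cpu_below=60.0):
    starved, total = 0, 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            total += 1
            gpu = float(row["gpu_usage_pct"])
            cpu = float(row["cpu_usage_pct"])
            if gpu < gpu_below and cpu < cpu_below:
                starved += 1
    return starved, total

starved, total = gpu_starved_frames("rift_apart_capture.csv")  # hypothetical log
print(f"{starved}/{total} frames with GPU below 90% while CPU is below 60%")
```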

I suspect we will be seeing optimisation patches fairly soon, as I also noticed numerous instances of frame drops and inconsistent performance when enabling frame generation.

Screens of the inconsistencies noted above; keep an eye on the RTSS overlays... All of these are with Frame Gen off except the last one. I should add that changing to 3440x1440 (native res) didn't really change the framerate much or the frametime behaviour. A definite bug it would seem, but I will do a system reboot shortly just to be doubly sure.

What is going on here???
J8fClO3.jpg


Why is this scene 68 fps??? Oh yeah, that's why, the GPU is at 71%...
J50dKoX.jpg


But when it is working fine it's very good, look at the CPU use below, same as above, the GPU is also 98% which is where it should be:
V3FE7F9.jpg


But then walk nearby somewhere and suddenly the GPU use drops again resulting in:
oTOV6Wc.jpg


This scene was an eye opener, 15% CPU utilisation??? Multiple physical cores are either idling or parked...
WnAG6bv.jpg


And here we are back to normal. Remember, this is DLSS Quality only at 5160x2160, and when it's like this it runs so smoothly:
LNPniBJ.jpg


Here the GPU utilisation dropped randomly again:
tdEt9IH.jpg


And lastly, this is the experience with Frame Gen enabled. Whether it's at 3440x1440 or at DLDSR 5160x2160 doesn't seem to matter, the framerate just drops off and there are consistent frametime stutters, as below:
BWcn1HL.jpg


Oh I also got a crash to desktop at the start of the game too, which can be seen at the end of this clip:



Other than that, it looks great, and mouse and keyboard works well with the mechanics. There's no raw input setting though, and I did have to drop the X and Y sensitivity to 4 from the default 5, as on the XM2we I found it too sensitive at 1000 CPI and 1000 Hz, which is my 24/7 default for everything.

Now, time to watch the DF review to see if what they found matches what I found.
 
robbiekhan, have you tried not using the overlay to see if it still crashes then? Sometimes the RTSS overlay can cause them.
I did think of this. I also had Nvidia GeForce Experience enabled but not its overlay, as I use it to record game footage, but it seemed to be an issue whether it was enabled or off.

I have now used the Very High preset only, but also turned on all RT, as Very High does not enable it. In this preset with RT enabled, and the same DLSS Quality, everything is butter smooth and hovers around the 100fps mark almost all the time, only dropping or gaining a few fps.

I then manually upped the advanced settings back to Ultra where available, draw range for RT to 10 and anything else that can go higher. The fps range hasn't changed too much, 75fps+ with 80-90fps nominal. Weird, no idea what was going on before then with all the drops and spikes noted above. So for all intents and purposes, it appears to have resolved itself.... This is a mystery lol.
 

In the first screenshot it looks like your video card is running really hot.

I also saw a tip that you should restart the game after changing your graphics settings. Seems like there may be a coding issue where the rendering thread isn't properly updated while the game is still running.
 
The settings screenshot? The GPU does not run hot, it runs at normal temps and there have never been any issues with the card running hot in any game. I can path trace Cyberpunk all day long without a single issue.

But if you mean the first in-game screenshot shown then that's at 78 degrees core temp, which is normal.
 
I forgot about the decryption process when the game unlocks, so now it's decrypting on my HDD since that is where I preloaded. Getting 10MB/s when I could have had 700-800 MB/s...
The forum compressed your images, so to me it looked like the top one said 91C.
 
I decided to check it out. I've maxed all settings (including RT) with DLSS Quality, and I'm running exclusive fullscreen @ 60Hz with vsync enforced from Nvidia Control Panel (vsync disabled in game) and it's running smoothly. Not maxing out the GPU or CPU. 3440×1440, RTX 3090, i7-9700K, 32GB RAM, NVMe PCIe 4.0 SSD running at PCIe 3.0.

I never liked the variable frame rate mode on the PS5 version; I swear the physics of some objects look like they're tied to the frame rate, or maybe it's just me, but something doesn't look right when the frame rate ramps up. I think this game looks good capped at 60FPS, maybe because of the motion blur or maybe because the frame pacing is good.
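As a generic illustration of why that can happen (nothing to do with Insomniac's actual engine): if a simulation steps by the variable frame delta instead of a fixed timestep, the result changes with frame rate; with a fixed timestep it doesn't.

```python
# Generic variable vs. fixed-timestep sketch; purely illustrative,
# not the game's actual update loop.

def drop_height(frame_deltas, fixed_step=None):
    """Integrate a falling object's position over the given frame deltas (s)."""
    pos, vel, acc = 0.0, 0.0, 0.0
    for dt in frame_deltas:
        if fixed_step is None:            # variable step: depends on frame rate
            vel += -9.81 * dt
            pos += vel * dt
        else:                             # fixed step: frame-rate independent
            acc += dt
            while acc >= fixed_step:
                vel += -9.81 * fixed_step
                pos += vel * fixed_step
                acc -= fixed_step
    return pos

at_60fps = [1 / 60] * 60
at_120fps = [1 / 120] * 120
print(drop_height(at_60fps), drop_height(at_120fps))                  # differ
print(drop_height(at_60fps, 1 / 60), drop_height(at_120fps, 1 / 60))  # match
```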

Anyway, no Atmos, bummer.
 
Well, after multiple runs since my last post all seems well: 5160x2160, everything maxed and Ultra where possible. Very impressive now. Rifts don't have the slight hitch I saw in the official RTX IO showcase video from yesterday either.
 
I have a pile of other games that just rolled out (or will be out soon), but this is a definite must buy for me in the future. The previous games were great and I can't wait to see what this is all about in 4K with all the bells and whistles.
 
If anyone but Nixxes had worked on the port it would have been a disaster... Nixxes worked their magic and released a very solid PC version... I'm still going to wait for a few patches, but they deserve a lot of credit for being able to port a game built specifically for PS5 console hardware.
 
Yeah, overall it's a solid job, a few technical hitches but otherwise it seems all fine if you take heed of the quirks noted earlier and those noted by Digital Foundry!

I did a full video now too, as I had to restart the game due to the progress-breaking bug that has carried over from the PS5 version (don't leave the club to explore when you have to follow the Phantom; the doors will close and you're stuffed unless you created a manual save point before the event).

This is a run from the start to the crash site with no other interruptions, no crashes etc. You can see various framerate drops here and there, ranging from as low as 31fps at one point to over 100fps at others. CPU/GPU utilisation remains wacky.

 


Looks like the video is still processing because I'm only getting 320p in the quality option.
 
Compusemble shows that the game isn't using more than 2GB/s of bandwidth, even with Gen 4 or 5 drives. This is why it's not as fast as the PS5 version, which is fully utilizing the bandwidth and compression scheme of its SSD.

1690471735153.png

 
Decided to buy it to play mostly on the Steam Deck. Will report back my findings. I will also fire it up on my desktop PC, which should be good for 1440p medium to high with the goodies minus RT turned on, using the computer in my signature. ETA Prime shows the Steam Deck running it solidly at a mix of low/medium. My Deck is undervolted and the memory is overclocked. I plan to test with the stock 15W TDP and unlocked at 18W. The goal is 45 FPS at a mix of low/medium with FSR 2.1 in dynamic mode.
 
Is there really going to be any noticeable difference between PCIe 3.0 and PCIe 4.0 NVMe drives in terms of DirectStorage? I mean maybe a few milliseconds, or 1-2 seconds at the extreme, but I can't believe there will be any real-world difference... even the difference between NVMe and SATA SSDs is milliseconds.

Rift Sequence Loading- NVME Gen 3 vs SATA SSD vs HDD [DirectStorage 1.2]
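A quick worked sketch of why the gap is hard to feel: at these throughputs, even a sizeable rift transition takes well under a second on any NVMe drive. The 2 GB transition size and the drive throughputs are assumptions for illustration, not measurements.

```python
# Rough load-time comparison for a hypothetical 2 GB rift transition.
# The transition size and throughputs are illustrative assumptions.
transition_gb = 2.0

assumed_throughput_gb_s = {
    "7200RPM HDD": 0.2,
    "SATA SSD": 0.55,
    "PCIe 3.0 NVMe": 3.5,
    "PCIe 4.0 NVMe": 7.0,
}

for name, gb_s in assumed_throughput_gb_s.items():
    print(f"{name:>14}: {transition_gb / gb_s:.2f} s to read {transition_gb:.0f} GB")
```

On those assumed figures the PCIe 3.0 vs 4.0 gap is roughly a quarter of a second per transition, before decompression and CPU-side work even enter the picture.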

 
With the regular HDD "winning" some of the rounds, it would probably be nice to have an average; there seems to be something noisy going on (maybe very little, if anything at all, is needed from the drive in some transitions).

Maybe with fast Meteor Lake/Ryzen 9000 CPUs, DDR5-9200, game patches, DirectStorage 1.3 and faster GPUs on PCIe 5.0 it will start to show more of a difference from top to bottom, or 128-256 GB of DDR5 will become cheap and a whole game made to fit on a PS5 will simply be loaded into RAM anyway.
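A trivial sketch of the averaging being suggested above; the run counts and timings are made-up placeholders, not measured results.

```python
# Minimal sketch: report mean and spread of repeated rift-load timings
# instead of single runs. All numbers are made-up placeholders.
from statistics import mean, stdev

runs_s = {
    "NVMe Gen 3": [1.10, 1.05, 1.20, 1.08],
    "SATA SSD":   [1.30, 1.55, 1.25, 1.40],
    "HDD":        [1.45, 2.90, 1.50, 2.10],  # noisy: some rifts stream almost nothing
}

for drive, times in runs_s.items():
    print(f"{drive:>10}: {mean(times):.2f} s avg (±{stdev(times):.2f} s over {len(times)} runs)")
```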
 