5800X3D Desktop Replacement

Honestly, those CPUs are horrendous overkill for the GPU options, but I mean it is what it is.
 
Honestly, those CPUs are horrendous overkill for the GPU options, but I mean it is what it is.

I mean, you know what the kids are doing these days though right?

The YouTubers are telling them that framerate is everything, and that you have to minimize all quality settings (and even use config file and command hacks) to disable as much as possible to make sure they get 165hz or whatever their monitor supports constantly.

And these fools are buying it up. They are essentially ruining their own gaming experience in an obsession to have their framerates flatlined at the max their monitor can handle.

No sacrifice is too great. My stepson was running his fancy FreeSync2 1440p monitor at some low-ass blocky resolution with everything minimized in games, and he doesn't even need it, because he has a pretty decent GPU, and he won't change it back no matter how much I educate him, because the fools on youtube and twitch tell him otherwise, and shit, what do I know compared to them? I've only been doing this for 30 years...

It's stupid out there.
 
I mean, you know what the kids are doing these days though right?

The YouTubers are telling them that framerate is everything, and that you have to minimize all quality settings (and even use config file and command hacks) to disable as much as possible to make sure they get 165hz or whatever their monitor supports constantly.

And these fools are buying it up. They are essentially ruining their own gaming experience in an obsession to have their framerates flatlined at the max their monitor can handle.

No sacrifice is too great. My stepson was running his fancy FreeSync2 1440p monitor at some low-ass blocky resolution with everything minimized in games, and he doesn't even need it, because he has a pretty decent GPU, and he won't change it back no matter how much I educate him, because the fools on youtube and twitch tell him otherwise, and shit, what do I know compared to them? I've only been doing this for 30 years...

It's stupid out there.
While I hear you - I think "that crowd" is into competitive online FPS so they are looking for any edge they can get. My bro does that crap with Fortnite and I guess the thought is that you can see your enemies better when there's less foliage and all of that. :)

My approach is to use the highest end exotic parts possible and then max everything out and still play well - except now I am old so I need those edges even more. :)

Laptop GPUs are no slouch, especially with the level of power available to them on a laptop like this! But it's still nowhere near RTX 3090 Ti level. And never mind the 4xxx cards - laptop ones won't be out until early next year.
 
While I hear you - I think "that crowd" is into competitive online FPS so they are looking for any edge they can get. My bro does that crap with Fortnite and I guess the thought is that you can see your enemies better when there's less foliage and all of that. :)

My approach is to use the highest end exotic parts possible and then max everything out and still play well - except now I am old so I need those edges even more. :)

Laptop GPUs are no slouch, especially with the level of power available to them on a laptop like this! But it's still nowhere near RTX 3090 Ti level. And never mind the 4xxx cards - laptop ones won't be out until early next year.
I will admit my 3070 Ti laptop is a bit of a beast, just wish it had a DisplayPort output instead of an HDMI so it would work with my Rift S, but it doesn't, and the Oculus software doesn't recognize the USB-C to DisplayPort connection as a valid input source :'(
 
While I hear you - I think "that crowd" is into competitive online FPS so they are looking for any edge they can get. My bro does that crap with Fortnite and I guess the thought is that you can see your enemies better when there's less foliage and all of that. :)

My approach is to use the highest end exotic parts possible and then max everything out and still play well - except now I am old so I need those edges even more. :)

Laptop GPUs are no slouch, especially with the level of power available to them on a laptop like this! But it's still nowhere near RTX 3090 Ti level. And never mind the 4xxx cards - laptop ones won't be out until early next year.
Yep - There's a valid reason for it with competitive games. I remember when I was into StarCraft 2 in the early days of it, pretty much every pro would run on low settings to maximize FPS and also to reduce the extra effects on the screen that would just be a distraction during competitions. They kept the resolution high though, as that was important to maximize visibility of the map. They still do this today, as SC2 is still one of the most demanding titles on the CPU, one where v-cache really shines.
 
While I hear you - I think "that crowd" is into competitive online FPS so they are looking for any edge they can get. My bro does that crap with Fortnite and I guess the thought is that you can see your enemies better when there's less foliage and all of that. :)


Yeah, what I used to consider "settings hacking" back in the day.

If you are disabling things so you can see opponents in cover, like, what's the point of the damn game? Why not just shoot at a square red box sitting on an open white plane?

With all the "auto-matchmaking" tech we have today, it would be nice if games matched players with similar settings. If you like a "realistic" experience with all the foliage and everything enabled then it would be nice if they matched you with others who like that experience as well.

This is one of the reasons I've soured on playing multiplayer games over the last several years. To be fair, some kids were doing this back in the CS 1.x days as well, but there were a lot fewer of them, and since I admined the servers, I could ban those who tried to get an unfair advantage by disabling things in settings that were meant to be enabled.

People like this have to go ruin everything for everyone. I'm happy they play Fortnite, so I never have to run into them.
 
Yeah, what I used to consider "settings hacking" back in the day.

If you are disabling things so you can see opponents in cover, like, what's the point of the damn game? Why not just shoot at a square red box sitting on an open white plane?

With all the "auto-matchmaking" tech we have today, it would be nice if games matched players with similar settings. If you like a "realistic" experience with all the foliage and everything enabled then it would be nice if they matched you with others who like that experience as well.

This is one of the reasons I've soured on playing multiplayer games over the last several years. To be fair, some kids were doing this back in the CS 1.x days as well, but there were a lot fewer of them, and since I admined the servers, I could ban those who tried to get an unfair advantage by disabling things in settings that were meant to be enabled.

People like this have to go ruin everything for everyone. I'm happy they play Fortnite, so I never have to run into them.

Arguably if there were settings for them, they were meant to be disabled as well. Sounds harsh to ban players for options that were included as part of the game.
 
Arguably if there were settings for them, they were meant to be disabled as well. Sounds harsh to ban players for options that were included as part of the game.

I never banned anyone for using settings that were in the game menu, but if they dug into the console commands to try to disable stuff that it wasn't readily apparent could be disabled, that got them the ban from me.

As time went on the server got the ability to enforce some of these settings, so it became less of an issue. But geez, this was a long time ago. I can't even remember the commands or the settings anymore. We are talking WON era pre-Steam GoldSource engine multiplayer.
 
Many commands weren't allowed in CS 1.x because they could give you an unfair advantage. You could disable those server-side, so it was never a problem.

As for the idea that more FPS is supposedly unneeded, there is no rule about that. In Quake Champions 250 FPS might give you better results than 280 FPS, because the engine is that bad. In fighting games, you'd think having more FPS isn't useful since they all run at 60 FPS anyway, but there is a difference of 3 frames between a 60Hz monitor and a 360Hz one, which is a big deal in this genre. In many games having more FPS means having less input lag or being able to do things that are impossible with a lower framerate.
 
Many commands weren't allowed in CS 1.x because they could give you an unfair advantage. You could disable those server-side, so it was never a problem.

As for the idea that more FPS is supposedly unneeded, there is no rule about that. In Quake Champions 250 FPS might give you better results than 280 FPS, because the engine is that bad. In fighting games, you'd think having more FPS isn't useful since they all run at 60 FPS anyway, but there is a difference of 3 frames between a 60Hz monitor and a 360Hz one, which is a big deal in this genre. In many games having more FPS means having less input lag or being able to do things that are impossible with a lower framerate.

I'd vehemently challenge the notion that anything above - say - 100fps ever offers any advantage.

For crying out loud we are talking a frame time of 10 milliseconds.

These are people playing games, not scientific instruments.
 
I'd vehemently challenge the notion that anything above - say - 100fps ever offers any advantage.
It does because there are jumps that require you to have more, way more FPS than 100. CoD had a notorious one. CSGO too.
For crying out loud we are talking a frame time of 10 milliseconds.
1 frame ≈ 16 ms at 60 Hz, so three frames is more like around 50 ms, which is considered a bad ping by today's standards. Players can feel a 1-frame difference in fighting games, not because they are reacting to it, but because there are combos that are called "just frames". With the 3 extra frames that the monitor gives you, every action you do becomes easier.
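
For anyone following along, here is the bare frame-time arithmetic behind those numbers; a quick sketch only, ignoring input, engine, network, and panel latency:

```python
# Frame time at a given refresh rate / framerate, in milliseconds per frame.
for hz in (60, 100, 144, 240, 360):
    print(f"{hz:>3} Hz -> {1000 / hz:.2f} ms per frame")

# Three frames counted at 60 Hz is roughly the ~50 ms mentioned above;
# the same three-frame window at 360 Hz spans only ~8.3 ms.
print(f"3 frames @ 60 Hz  = {3 * 1000 / 60:.1f} ms")
print(f"3 frames @ 360 Hz = {3 * 1000 / 360:.1f} ms")
```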
 
Yeah, what I used to consider "settings hacking" back in the day.

If you are disabling things so you can see opponents in cover, like, what's the point of the damn game? Why not just shoot at a square red box sitting on an open white plane?

With all the "auto-matchmaking" tech we have today, it would be nice if games matched players with similar settings. If you like a "realistic" experience with all the foliage and everything enabled then it would be nice if they matched you with others who like that experience as well.
A lot of it is on the developer, too. The settings in today's games allow you to turn the image quality down far enough that the game will get 60fps on a smartwatch. They do this because they want to capture the widest possible audience. Globally, 90% of hardware is always 3 generations behind at any given moment. That's a lot of potential customers to cut out. On top of this, they've also been unwilling to write engines where the underlying mechanics aren't fundamentally keyed to the frame rate.

At the same time, the gamers are an incredibly whiny group. There's less crying in a typical daycare. You'd think the developers had murdered entire families of gamers just because they set minimum requirements above a 386. On top of that, gamers are convinced that they're basically all global esports champions who make their living by winning tournaments and streaming on twitch. "An extra 500ns of lag is easily noticeable and makes the game totally unplayable" typed the gamer on his BT keyboard. Meanwhile, LTT has had legitimate pro gamers be completely unable to tell the difference between 90fps and 144fps in blind testing.
 
It does because there are jumps that require you to have more, way more FPS than 100. CoD had a notorious one. CSGO too.

1 frame ≈ 16 ms at 60 Hz, so three frames is more like around 50 ms, which is considered a bad ping by today's standards. Players can feel a 1-frame difference in fighting games, not because they are reacting to it, but because there are combos that are called "just frames". With the 3 extra frames that the monitor gives you, every action you do becomes easier.

That's bullshit.

It's all in their heads. Confirmation bias. They expect to experience a difference, so their brains manufacture the feeling of a difference.

I bet in a blinded A/B comparison no one on the planet in any game on any platform can tell the difference between 100fps and anything above 100fps.

Even 100fps vs 1 billion fps.

It would be fun to look at large sample size stats on game performance and measure this perceived difference in performance.

Time and time again, once you do a blinded A/B comparison test, stuff like this falls like a house of cards. Doesn't matter the topic: golden-eared audiophile nonsense, high-resolution movie watching, high-framerate gaming.

It's all in people's damn heads.

With what we know about how unreliable our human brains (yes, every last one of us) are at interpreting the world around us, anyone who trusts their own eyes (or ears) is an ignorant fool.

Our brains are constantly trying to fool us, and unless you are actively taking steps to try to eliminate bias, you can't trust anything you experience.
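
If anyone actually wanted to run that blinded A/B test, scoring it is straightforward. A minimal sketch (trial counts are hypothetical) using a standard binomial test:

```python
# Score a hypothetical blinded A/B framerate test against pure guessing.
# Made-up numbers: 40 blind trials, the player names the higher-framerate
# setting correctly 28 times.
from scipy.stats import binomtest

trials, correct = 40, 28
result = binomtest(correct, trials, p=0.5, alternative="greater")
print(f"{correct}/{trials} correct, p-value = {result.pvalue:.4f}")
# A small p-value means the hit rate is unlikely under blind guessing;
# a large one means the data is consistent with "it's all in their heads".
```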
 
I'd vehemently challenge the notion that anything above - say - 100fps ever offers any advantage.

For crying out loud we are talking a frame time of 10 milliseconds.

These are people playing games, not scientific instruments.

Entirely depends on the game engine. For single threaded game engines, it most certainly does make a difference, because a higher frame rate means that the engine runs faster, not just what's being drawn on the screen.

Using an ancient game I'm intimately familiar with: the Half-Life engine is entirely frame-based. If you have a trash frame rate, everything in the game slows down, because the game logic has to be calculated before every frame is rendered. So at 10-15 fps the game is going to run a whole lot slower than at 60 fps or 100 fps. This is why players in HL-engine games like CS wanted the highest framerate possible: your shots would register better, and you could perform advanced movement of your character more easily. If your frame rate gets low enough, the engine can actually start to break, because it depends on game logic being updated within a minimum time span. If updates take longer than that, game logic and NPCs can start having undesirable behavior.

This extends to more or less every engine with Quake lineage, including the Source engine. There are other single threaded engines out there that do the same thing, since you have to do certain functions in certain steps before rendering the screen.

For game engines that have multiple threads, it's less of a difference. If the physics and game logic are on different, non-blocked threads from the rendering engine, the renderer can just keep rendering whatever is in the current frame buffer over and over if it doesn't get anything new. So high frame rates in games like this can be a waste of resources.
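
For readers who haven't poked at these engines, here's a toy sketch of that difference (illustrative Python, not actual GoldSrc or Source code):

```python
import time

# Frame-coupled loop: game logic only advances once per rendered frame,
# so a slow renderer literally slows the simulation down.
def frame_coupled_loop(update_game_logic, render_one_frame):
    while True:
        update_game_logic()   # one simulation tick per frame
        render_one_frame()    # slow rendering => slow simulation

# Fixed-timestep loop: the simulation advances in constant-size steps
# regardless of how fast frames are rendered.
def fixed_timestep_loop(update_game_logic, render_one_frame, dt=1 / 60):
    accumulator = 0.0
    previous = time.perf_counter()
    while True:
        now = time.perf_counter()
        accumulator += now - previous
        previous = now
        while accumulator >= dt:   # catch up in fixed-size steps
            update_game_logic()
            accumulator -= dt
        render_one_frame()         # just draws whatever state is current
```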
 
I bet in a blinded A/B comparison no one on the planet in any game on any platform can tell the difference between 100fps and anything above 100fps.
Linus Tech Tips did it, and for regular people in a sniper experiment they saw different results; there are a bunch of studies on this.



At the linked timestamp, they show the difference between the tests at 144 FPS/144 Hz vs 240 FPS/240 Hz, and it is giant for some and small but seemingly significant for others: higher average hit rate, much lower standard deviation.

[Image: Double Doors test results chart]


If those experiments were well done (blind, enough trials, etc.), it seems that for competitive shooters it would not just be voodoo: https://kr4m.com/high-fps-better-gamer/

It also shows that past 144 Hz/FPS, it is quite the diminishing return for good players, it seems.
 
Linus Tech Tips did it, and for regular people in a sniper experiment they saw different results; there are a bunch of studies on this.



At the linked timestamp, they show the difference between the tests at 144 FPS/144 Hz vs 240 FPS/240 Hz, and it is giant for some and small but seemingly significant for others: higher average hit rate, much lower standard deviation.


If those experiments were well done (blind, enough trials, etc.), it seems that for competitive shooters it would not just be voodoo: https://kr4m.com/high-fps-better-gamer/

It also shows that past 144 Hz/FPS, it is quite the diminishing return for good players, it seems.


If some ignorant fool on YouTube is the best you can cite, you are going to have to do better.

Linus doesn't know shit about shit. Nor does any YouTuber. At least any I've seen to date.

He wouldn't know his ass from his elbow when it comes to a blinded study.

He was a damn sales guy when he started making shitty online videos. Sales guys don't know shit.

Rule Number 1: If it's on YouTube, it's probably wrong.
 
He wouldn't know his ass from his elbow when it comes to a blinded study.
He did it for hard drive vs SATA SSD vs NVMe SSD; blind testing is well enough understood by now that just about anyone has some idea of what it is.

It is hard to find studies not sponsored by NVIDIA (which tend to say that without a doubt it matters, obviously) with a simple Google search; they often top out at 60 fps:
https://www.researchgate.net/figure...l-conditions-Latency-indicates_fig3_266656039
http://web.cs.wpi.edu/~claypool/papers/fr/fulltext.pdf

and they often do not involve high-level players.
 
If some ignorant fool on YouTube is the best you can cite, you are going to have to do better.

Linus doesn't know shit about shit. Nor does any YouTuber. At least any I've seen to date.

He wouldn't know his ass from his elbow when it comes to a blinded study.

He was a damn sales guy when he started making shitty online videos. Sales guys don't know shit.

Rule Number 1: If it's on YouTube, it's probably wrong.

Where is your published data at? I mean you seem to believe you are some authority on this.
 
The YouTubers are telling them that framerate is everything, and that you have to minimize all quality settings (and even use config file and command hacks) to disable as much as possible to make sure they get 165hz or whatever their monitor supports constantly.

And these fools are buying it up. They are essentially ruining their own gaming experience in an obsession to have their framerates flatlined at the max their monitor can handle.

I can't stand that, because then it's also easier for them to see. 165fps with full details, foliage detail, particles, etc. is at an obvious disadvantage compared to 256 colors, no particles, etc.
 
If some ignorant fool on YouTube is the best you can cite, you are going to have to do better.

Linus doesn't know shit about shit. Nor does any YouTuber. At least any I've seen to date.

He wouldn't know his ass from his elbow when it comes to a blinded study.

He was a damn sales guy when he started making shitty online videos. Sales guys don't know shit.

Rule Number 1: If it's on YouTube, it's probably wrong.

Your distaste for Linus blinds you to data and facts.

You obviously didn't watch the video. This was well done and proper.

And you are wrong.
 
As configured this seems excessive. I would think sticking with a 5600-5900 and just getting a pair of noise-cancelling headphones would be the better move, because that thing must sound like a helicopter.
 
If some ignorant fool on YouTube is the best you can cite, you are going to have to do better.

Linus doesn't know shit about shit. Nor does any YouTuber. At least any I've seen to date.

He wouldn't know his ass from his elbow when it comes to a blinded study.

He was a damn sales guy when he started making shitty online videos. Sales guys don't know shit.

Rule Number 1: If it's on YouTube, it's probably wrong.

Your anti-YT bias is going to bite you in the end, as nowadays just about everything is on YT: appliance repairs, minor vehicle repairs/changing filters, etc., computer part reviews, video game reviews, cell phone reviews. You can't change it, just embrace it.

Clearly there is a difference up to a certain point and that point is going to vary by the individual user and their needs. I think "competitive gamers" are ridiculously stupid, but if they can make money at it who am I to tell them otherwise? I'm perfectly fine at 1440p/165Hz with a larger monitor vs some 24" 360Hz panel like some might want for competitive gaming. I have never made a dime off of gaming.

There are plenty of other examples with other hobbies. I play golf. My clubs probably cost $2500 and my CC membership is $6000. I have no aspirations of ever playing on the PGA Tour, but I still want to have equipment that maximizes my game. I have no aspirations of changing out the motor in my vehicle. I don't have thousands of dollars in tools either, but I know lots of people who do. So I don't fault people who are paying for the best gear they can afford for their hobbies, and clearly my preferences aren't universal and people spend money on things that I wouldn't...shocking I know!

Linus is a goof, but that video by itself is interesting and likely valid. I doubt Linus writes the content by himself. If he's making any real money, he probably has people do the writing and then fact check it for him. He's just the face of a brand and a presenter. The brand suffers when the content isn't factual or is grossly misleading.
 
Your anti-YT bias is going to bite you in the end, as nowadays just about everything is on YT: appliance repairs, minor vehicle repairs/changing filters, etc., computer part reviews, video game reviews, cell phone reviews. You can't change it, just embrace it.

Clearly there is a difference up to a certain point and that point is going to vary by the individual user and their needs. I think "competitive gamers" are ridiculously stupid, but if they can make money at it who am I to tell them otherwise? I'm perfectly fine at 1440p/165Hz with a larger monitor vs some 24" 360Hz panel like some might want for competitive gaming. I have never made a dime off of gaming.

There are plenty of other examples with other hobbies. I play golf. My clubs probably cost $2500 and my CC membership is $6000. I have no aspirations of ever playing on the PGA Tour, but I still want to have equipment that maximizes my game. I have no aspirations of changing out the motor in my vehicle. I don't have thousands of dollars in tools either, but I know lots of people who do. So I don't fault people who are paying for the best gear they can afford for their hobbies, and clearly my preferences aren't universal and people spend money on things that I wouldn't...shocking I know!

Linus is a goof, but that video by itself is interesting and likely valid. I doubt Linus writes the content by himself. If he's making any real money, he probably has people do the writing and then fact check it for him. He's just the face of a brand and a presenter. The brand suffers when the content isn't factual or is grossly misleading.
Linus doesn't write the scripts. At best he reviews them with his writers, which he has an entire team of now. He has 80-something employees and is looking for more at the moment. He is also building out an entire lab to do real testing on products.
 
His analysis seems to make sense and the data they generated seems to support it.
It also makes sense that more frames are better and that diminishing returns exist because at some point your eyes and reflexes can't make up the difference.
It would also be reasonable to see that practice makes proficient, in that a practiced player who knows what they are doing will do better with better tools than somebody who is more casual, and that a practiced player with better reflexes and better hand-eye will make better use of the better tools, to a point.
I still think the existence of 360Hz monitors is insane, but I suppose I can understand why some people think they need or want them.
 
I still think the existence of 360Hz monitors is insane, but I suppose I can understand why some people think they need or want them.
The fighting game community is slowly but surely adopting 360hz monitors due to PC being the main platform now.
It's not rare to have two players pressing a button at the same time, and the player with the least input delay wins, and it's been proven that the higher the refresh rate, the lower the input delay.

Can players tell the difference between 240Hz and 360Hz in a fighting game? No. But can they see the benefits? Totally. That's what most casual players struggle to understand.

In Quake Champions projectiles are poorly registered at 250 FPS...
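
The refresh-rate half of that claim is simple arithmetic: on average a finished frame waits about half a refresh interval before it can be scanned out. A rough sketch that ignores everything else in the input chain:

```python
# Average scan-out wait from refresh rate alone (a big simplification:
# game, driver, USB polling, and panel processing latency are ignored).
for hz in (60, 144, 240, 360):
    print(f"{hz:>3} Hz: ~{0.5 * 1000 / hz:.1f} ms average wait for the next refresh")
```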
 
The fighting game community is slowly but surely adopting 360hz monitors due to PC being the main platform now.
It's not rare to have two players pressing a button at the same time, and the player with the least input delay wins, and it's been proven that the higher the refresh rate, the lower the input delay.

Can players tell the difference between 240Hz and 360Hz in a fighting game? No. But can they see the benefits? Totally. That's what most casual players struggle to understand.

In Quake Champions projectiles are poorly registered at 250 FPS...
Yeah, that's sort of my point: there are lots of anecdotal things where it's shown that faster is better. Everybody puts it in terms of FPS games, but I never thought about the fighting game people. I mean, they break down moves to frame counts, and they will measure the travel distance of button switches because that 0.1 ms makes an actual difference.
 
Haven't seen a desktop replacement in a while... I had one of these back in the Intel Northwood era... lol

https://videocardz.com/press-release/xmg-introduces-worlds-first-amd-ryzen-7-5800x3d-powered-laptop

These things are monsters despite the GPUs. XMG and Clevo try to downplay it, but they run the 16 core CPUs just fine. They're decent clevo models that are criminally under-supported. XMG has been nice enough to spearhead testing and driver updates on the previous gen (B450) and these new models.

I bought a barebones version of the previous gen (B450 model) and maxed it out as a work laptop for myself at the beginning of the pandemic times. Even two years on, it sounds retarded on paper - 3950X in a 15" laptop. I use it to run a handful of VMs, it's basically a portable server with a built-in UPS. Biggest downside to these laptops is the lack of thunderbolt / USB4 and the B450 version can't be updated to support Zen 3 (otherwise has the same specs except a 2070 instead of the 3070).
 
If some ignorant fool on YouTube is the best you can cite, you are going to have to do better.

Linus doesn't know shit about shit. Nor does any YouTuber. At least any I've seen to date.

He wouldn't know his ass from his elbow when it comes to a blinded study.

He was a damn sales guy when he started making shitty online videos. Sales guys don't know shit.

Rule Number 1: If it's on YouTube, it's probably wrong.

So I guess back in the day, "if it was written in a book, it was probably wrong."

Making a blanket claim that "Rule Number 1: If it's on YouTube, it's probably wrong" is stupid and just ignorant. Like any medium the world has seen and will see, you can find good, legit, valid data and facts, and you can also find loads of crap to go with it.

However, most tests are not well done. A true blind test is all that should be done here: the gamers know NOTHING about what they are playing on, and you let them go and collect the data.
 
I mean, you know what the kids are doing these days though right?

The YouTubers are telling them that framerate is everything, and that you have to minimize all quality settings (and even use config file and command hacks) to disable as much as possible to make sure they get 165hz or whatever their monitor supports constantly.

And these fools are buying it up. They are essentially ruining their own gaming experience in an obsession to have their framerates flatlined at the max their monitor can handle.

No sacrifice is too great. My stepson was running his fancy FreeSync2 1440p monitor at some low-ass blocky resolution with everything minimized in games, and he doesn't even need it, because he has a pretty decent GPU, and he won't change it back no matter how much I educate him, because the fools on youtube and twitch tell him otherwise, and shit, what do I know compared to them? I've only been doing this for 30 years...

It's stupid out there.

You probably start most sentences with "back in my day" while yelling at kids for being on your lawn.
 
You probably start most sentences with "back in my day" while yelling at kids for being on your lawn.

Give it time...

"Back in my day, YouTube was the collective consciousness, none of this quantum-powered metaverse bullshit! If I wanted to connect with someone, I'd put my dongle in their female port, when protection meant GFCI outlets!"
 
These things are monsters despite the GPUs. XMG and Clevo try to downplay it, but they run the 16 core CPUs just fine. They're decent clevo models that are criminally under-supported. XMG has been nice enough to spearhead testing and driver updates on the previous gen (B450) and these new models.

I bought a barebones version of the previous gen (B450 model) and maxed it out as a work laptop for myself at the beginning of the pandemic times. Even two years on, it sounds retarded on paper - 3950X in a 15" laptop. I use it to run a handful of VMs, it's basically a portable server with a built-in UPS. Biggest downside to these laptops is the lack of thunderbolt / USB4 and the B450 version can't be updated to support Zen 3 (otherwise has the same specs except a 2070 instead of the 3070).
I have this i5 7500 XPS (700 bucks) gaming laptop with a 1050 in it, and it started having stability issues after like 3-4 years. I had primarily been using it as a tax machine and occasionally as a flexible computer for gaming and personal stuff at work. Picked up a cheapo HP 17" laptop (530 bucks) that normally ships with an Athlon, downgraded everything, then bumped the CPU to a 5700U... I couldn't believe they even had the option on their website. While the graphics portion of the equation is about half of what the 1050 is, the CPU portion just beats the ever-living shit out of the not-so-old i5. I ended up tossing 16 GB of 3200 MHz RAM I had on hand and a Samsung 970 Pro into it, and now it just powers through anything I do with it. It's close to the speed of my desktop 5900X, ish. I use the HP for personal stuff at work and publishing my books. Only thing I miss is the backlit keyboard.

The Clevo models are cool as hell. I hadn't really considered them too much over the years. They always seem a bit too expensive for what you get out of 'em, and the support is nonexistent, so you kind of end up having to support the things yourself by hunting drivers, and for the most part you can, because they're almost desktops. The BIOS support tends to be iffy on these though. Like yours: all they really needed to do was update the BIOS for you to have support for the 5000 series.

I would consider going this route if CPU architectures had any sort of staying power. Back in the day advancements didn't seem to come all that fast and the next flavor of CPU and GPU wasn't right around the corner. AND even if it was, you didn't care. Because the hardware you picked up would run everything and you were satisfied, for years. My Clevo with the Intel Northwood and an ATI 9600 (IIRC) ran damn near everything forever. I didn't even look at replacing it.

Now, the 5800X3D has some staying power for the previous generation, the current generation, and maybe one or two more. So, this is an interesting purchase. If I didn't already have what I do, I would consider it. It just came out a bit later than when I was looking...
 
DLSS 3 Frame Generation disagrees with you.
DLSS 3 isn't disagreeing with him; from the little information we have so far, it's actually disagreeing with you.
Some latency numbers are starting to get published and they aren't half bad.
Grain of salt, as this is one pretty cherry-picked title, but still not bad. Just a note: DLSS 3 does not let you run it with Reflex off.

https://www.eurogamer.net/digitalfo...ia-dlss-3-ai-upscaling-enters-a-new-dimension
 
DLSS 3 isn't disagreeing with him; from the little information we have so far, it's actually disagreeing with you.
Did you even read the article you just referenced?




Higher Framerate != Lower Latency (in DLSS 3 FG)
 
DLSS 3 Frame Generation disagrees with you.

Frame generation is going to make things worse for input lag by definition. You can't make up frames between actual frames unless you know the before and after frames, but then you have to delay the after frame to insert your made-up frame. It might make the video look smoother and maybe nicer though.
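
A back-of-the-envelope timeline shows the trade-off; the numbers are invented and this is the generic interpolation case, not a claim about NVIDIA's exact pipeline:

```python
# Toy timeline for generic frame interpolation (invented numbers).
# Real frames are rendered every 20 ms (50 fps before frame generation).
real_interval_ms = 20.0

# Without interpolation, real frame B is displayed as soon as it is rendered,
# i.e. 20 ms after frame A.
display_b_without_fg = real_interval_ms

# With interpolation, B must already exist before the in-between frame can be
# built, and the in-between frame is shown first to keep output evenly paced,
# so B's own display slips by roughly half a real-frame interval (plus the
# cost of generating the extra frame, ignored here).
display_b_with_fg = real_interval_ms + real_interval_ms / 2

print(f"frame B shown at {display_b_without_fg:.0f} ms without FG, "
      f"{display_b_with_fg:.0f} ms with FG")
```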
 
I would consider going this route if CPU architectures had any sort of staying power. Back in the day advancements didn't seem to come all that fast and the next flavor of CPU and GPU wasn't right around the corner. AND even if it was, you didn't care. Because the hardware you picked up would run everything and you were satisfied, for years. My Clevo with the Intel Northwood and an ATI 9600 (IIRC) ran damn near everything forever. I didn't even look at replacing it.

Now, the 5800X3D has some staying power for the previous generation, the current generation, and maybe one or two more. So, this is an interesting purchase. If I didn't already have what I do, I would consider it. It just came out a bit later than when I was looking...
That's interesting that you chose around the ATI 9600 gen as the staying-power example, because I use around that date range as an example of when computing was exploding. Like 1990-2007, CPUs were doubling in speed every 2 years. I remember trying to use a Pentium II when Athlon Palomino was already out; it was painful. That was like a 4-year difference. I had similar feelings using Palomino when Core 2 Duo was out. These days I can use a 10-year-old Sandy Bridge laptop and get most of what I need done. It's obviously not great, but it's usable. 4-year-old CPUs don't even feel that old now. With GPUs the story is a little different.
 