Is a 10850K fast enough for an RTX 4090?

I say OP should just buy a 4090 and see whether or not he's totally satisfied with what he's getting. If it's good enough, no need to upgrade anything else. If not, THEN he can start looking at a CPU upgrade. Because regardless of whether he upgrades his CPU, the jump from a 2080 Ti to a 4090 is going to be huge.
 
💯
 
So if you don't resell parts, what do you do? Just throw them in the dumpster? I've been reselling my parts for over 20 years and have never had a single issue, and I can't even imagine the thousands of dollars I would have wasted by not doing so. I'm about to get a new bicycle, so I guess I'll just throw my old one in the dumpster too, lol.
 
And please, let's stop using Starfield as the new benchmark. I'm enjoying it and it's a fun game, but it's a poorly coded, buggy Bethesda release on a game engine that's well over a decade old.
 
Yes, okay, but how is the jump this high? As in, when did video games start requiring so much of the latest and greatest that something that was top of the line in 2021 takes a big hit on performance? Back then, even something like my i7-930 could last a decade and do just fine, even at 1440p to 4K. Now it's like every new game requires moar faster cores. My wallet can't handle this.
Welcome to the last 30 years of the 20th century, especially the 1990s.
At least now it's just games, as everything else is fine. Back then, two years later the computer in its entirety was obsolete for nearly all tasks, not just games.
 
So if you don't resell parts, what do you do? Just throw them in the dumpster? Been reselling my parts for over 20 years and never had a single issue and I can't even imagine the thousands of dollars that I would have wasted by not doing so. I'm about to get a new bicycle so I guess I'll just throw my old one in the dumpster too lol.
Typically I turn the old machine into some other computing device for the household; my 6700K, for example, is now solely a media server/HTPC. The OP putting a 4090 in his rig will be a big jump in performance, and he can save the money for Intel's next socket 12 to 18 months from now.
 
Okay, so you stay above 100 fps by playing games that are never very CPU limited. The person who started this thread probably needs to make it clear what type of games he's going to play, because odds are someone who buys a 4090 is likely to be playing some of the extremely popular games, which can be CPU limited. For example, Starfield will show an extremely large gap in performance compared to a current top-end CPU.
So I mostly play older games because those typically have mods that allow me to play in 4K 3D. The newest games I own are probably Cyberpunk 2077 and Halo Infinite. I haven't seen any new titles that actually interest me. I bought Cyberpunk 2077 at launch and stopped playing about an hour into the game due to performance at 4K.

I have to say I don't care much for RTX at the moment, maybe because my 2080 Ti hasn't been able to handle it with grace. I also game in cycles: I'll go a whole 8 months without playing a game, then I'll circle back.
I guess I need to see if there are new titles I want to play, but right now I have a backlog of older games that I need to play first, lol.

It sounds like I will be losing some performance, up to 30% as you say. I'm not an fps monger; I just wanna be able to play my games at 4K with ultra textures at 50 to 60 fps smoothly. If that means RTX is off, I don't think it bothers me. I haven't seen a night-and-day difference between RTX on and off, but then again maybe I haven't looked closely enough, or haven't played games that implement it well enough to see the difference.
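For what it's worth, that 50-to-60 fps target can be checked against the "up to 30%" figure with simple arithmetic. This is a back-of-the-envelope sketch, not a benchmark: the 30% is the worst-case claim from this thread, and the 90 fps GPU-capable figure is purely an illustrative assumption.

```python
# Hypothetical worst case: a CPU bottleneck shaves 30% off what the
# 4090 could otherwise deliver at 4K ultra.
gpu_capable_fps = 90        # assumed 4K ultra figure, purely illustrative
cpu_limited_loss = 0.30     # worst-case loss claimed earlier in the thread

# Frame rate actually delivered in that hypothetical worst case:
worst_case_fps = round(gpu_capable_fps * (1 - cpu_limited_loss), 1)
print(worst_case_fps)  # 63.0 -> still above a 50-60 fps target
```

So even taking the most pessimistic number in the thread at face value, the result can still sit above the stated target, which is why the "just buy the GPU first" advice isn't unreasonable.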
 
If you don't care about ray tracing, why not just get a 7900 XTX? It's about 2/3 the price of a 4090 and has very close raster performance. Once the new Intel socket comes out, or if you like the performance of the Ryzen 8xxx series, you can then upgrade the rest of your machine if you feel like you need to. You're not gonna be missing out on 30% of anything. Starfield is built on a shit game engine, and let's face it... it's ugly for how much of a resource hog it is. My guess is that within 6 months it will be better optimized and the performance difference will be more like 10% to 15%.
 
Because I need to stick to Nvidia for said geo-11 4K 3D mods. And the one thing I hate more than upgrading PC parts is buying something and it not being enough to last me a couple of years, especially for 3D 4K VR. A 4090 SHOULD last me a good while, no?
 
IMO a 4090 will be a significant increase in performance over your 2080 Ti, and your 10th-gen i9 won't be the huge bottleneck a few others here are saying it is. It will be a bottleneck, but not a significant one. I'll give you an example: when I went from a GTX 1080 to an RTX 3090 in my 6700K, I could play the games I wanted at 4K with the eye candy cranked at significantly higher frame rates than my GTX 1080 could manage at 1080p... and I was able to squeeze another 10 months out of my Skylake before building my current 7800X3D last week.
 
Your 7 to 10% truly-CPU-limited estimates are quite a bit off. I'll be more than happy to link some videos and reviews to show you there are games with easily a 20 to 30% loss compared to a 13900K or 7800X3D. A 9900K and a 10850K have pretty much the same IPC, and I can tell you that even just using a 4080, the difference between my 9900K and 13700K systems was absolutely night and day in CPU-limited situations in several games. None of those games cared about having more than a few really fast cores. Anyway, ignorance is bliss, as they say, and I've seen people use crazy mismatched systems and obliviously think nothing is wrong. FFS, there's a completely oblivious clown on one of the other forums running a 3570K with a 4070 Ti.
Sure, link some vids, but be sure they're at 4K+/VR res like the OP stated.
 
Not all of them were 4K, but they were meant to show his CPU struggling in a few CPU-limited cases to hold much above 60 fps while something like a 13900K was in the upper 80s. He has since said he is fine with 50 to 60 fps, so there's no point in posting that stuff when I get home later.
 
I'd still want to see it. It keeps me informed.
 
No matter what I post here there will always be excuses from some people, but unoptimized games are a real thing you need to deal with, and CPU-limited games are becoming more common. For now, here is the link for the Starfield performance I mentioned earlier. In one of the most CPU-intensive areas, the 10900K only averages 50 fps with minimums in the 30s, whereas the 13900K averages 110 fps with 80 fps minimums. This is of course not even close to normal for most games, but it shows that the brute force of a current high-end CPU can help you get the most out of a 4090 in cases like that. As I said earlier, it's a night-and-day difference in Crysis Remastered too: where my 9900K was only getting into the 60s in several wide-open areas, my 13700K is now well over 100 fps in those same locations. And The Witcher 3 with RT was a stuttery mess at times, as my 9900K could not always hold 60 fps, but it's smooth with my 13700K.

https://www.pcgameshardware.de/Star...benchmark-requirements-anforderungen-1428119/
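The relative gap in those quoted numbers can be sanity-checked with a quick calculation. The fps figures below are the averages quoted above from the linked benchmark, not my own measurements:

```python
# In a CPU-heavy scene the delivered frame rate is capped by whichever
# component is the bottleneck: min(CPU-limited fps, GPU-limited fps).
def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

# Average fps quoted for the CPU-intensive Starfield area:
fps_10900k = 50
fps_13900k = 110

# Loss relative to the faster CPU when the GPU is not the limit:
loss = 1 - fps_10900k / fps_13900k
print(f"{loss:.0%} lower average fps on the 10900K")  # -> 55% lower
```

That 55% gap in a worst-case scene is well beyond the 20-30% figure, which is the point: in genuinely CPU-bound areas a faster GPU alone changes nothing.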
 
We are looking for 4K or VR benchmarks, like the OP stated. Delivering a single CPU-intensive game benchmark review isn't contributing anything except your own narrative.
 
Maybe read a little more closely before commenting. I was talking about one example right there of a massive difference that would even impact playability, and I gave my experiences with a couple of games I had personally played on a 4080 with a 9900K and a 13700K.

Anyway, I am done here. The OP said he was fine with averaging 50 to 60 fps if need be, so nothing else matters anyway.
 