Steam players are reverting back to GTX 1660 graphics cards, despite Nvidia's best efforts

The Steam results don’t match people’s day-to-day experience with their friends, online communities, or sales charts. This is why people look for answers to explain away the data.
How many people actually have in mind sales charts for every major market (China, Japan, Korea, Europe, Australia, North America), including pre-built PCs?

[Image: Korea DIY PC Market GPU Share - NVIDIA / AMD / Intel]


It doesn’t make sense intuitively that the three most popular gaming cards would be 8 years old.
The 1060 is from 2016; the 16xx cards are from 2019.

~40% of the market is using Pascal / 16XX type cards
I count around 26.5% of the DX12 GPU market on the latest Steam hardware survey, about the same as Ampere if we exclude mobile (23.58%).
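For anyone wondering where a figure like that comes from, it's just summing the per-model lines for the Pascal and 16xx families in the survey's DX12 GPU table. A minimal sketch, assuming hypothetical placeholder shares rather than the real survey figures:

```python
# Tally a GPU family's share by summing per-model percentages from the
# Steam Hardware Survey's DX12 GPU table.
# NOTE: the model list and numbers below are placeholders, not real survey data.
placeholder_shares = {
    "GTX 1060": 4.5,        # hypothetical share, in percent
    "GTX 1050 Ti": 3.0,     # hypothetical
    "GTX 1650": 5.0,        # hypothetical
    "GTX 1660 SUPER": 2.5,  # hypothetical
    # ...remaining Pascal / 16xx models would go here
}

family_share = sum(placeholder_shares.values())
print(f"Pascal / 16xx share of DX12 GPUs: {family_share:.2f}%")
```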

I think that as long as a 1050/1060 is good enough to play some of the most popular titles in the world, they simply get sold on to a kid or a third-world gamer happy to have them, and they never drop out of the poll.
 
That remains to be seen, I think; small sample size (Aveum would point in that direction, but it's one title with an AMD sponsor, and maybe performance will re-align over time).

Immortals of Aveum, Fortnite, and Remnant II are three real game examples where AMD does better.
Layers of Fear does seem to show a slight lead for Nvidia, but they aren't killing AMD there either. Additionally, those tests were from 3-4 months ago; AMD drivers could very well have improved since, as that is right around when they really started fixing some of the strange RDNA3 performance issues in certain games.

Without hardware-accelerated Lumen RT, AMD has a clear win in Fortnite with 'software' Lumen, and with HW Lumen RT they still slightly beat Nvidia. And in Fortnite at least, while the hardware-accelerated Lumen does look a teensy bit better and also allows player-model reflections in mirrors, I would take the very close-looking, higher-performing "software" Lumen all day.
 
What does it say that, more than three years after the Xbox / PS5 launched, ~40% of the market is using Pascal / 16XX type cards?
It says that PC hardware is super expensive and it's way more economical to buy a PS5.

Only in about the past 6-8 months have you been able to buy a GPU that is better than a PS5 for less than a PS5. But that's only the GPU; an entire system is still about twice the price of a PS5.
 
If we compared PS4 active users to PS5 active users, would the split be that much different from the 25% of Pascal/16xx-type cards on the Steam survey?
 
Have we seen that? Look how well the 3060 12GB sells right now (or the giant amount Nvidia can charge for the extra 8GB on the 4060 Ti). All things being equal, consumers always go for the higher VRAM amount.

I mean yeah, we are seeing that in a lot of tech discussion forums, including this one.

The reason the 3060 12GB sold well is that people could afford to buy it. The 1060 dominated the industry for years because most people are willing/able to spend in the $250-$350 range, and they're playing Fortnite and Minecraft with that, so who cares. $500 will stretch a lot of people's budgets. This is [H] though, and at [H] we should be demanding more from GPU companies, not less.
 
That remains to be seen, I think; small sample size (Aveum would point in that direction, but it's one title with an AMD sponsor, and maybe performance will re-align over time).


Lumen and Nanite don't tank AMD GPUs like RT does, which I think is his point, and in a lot of cases AMD can pull ahead where they would be getting destroyed by RT by comparison. No real surprise, because RT is designed for Nvidia's hardware, which is fair enough because they've done the work, but it does throw a bit of cold water on the line that "AMD can't do ray tracing". It can, it just depends on the implementation.
 
Only $10 cheaper than a 4060 I think.


People asking for less hardware at the same price? How would that make sense?

Ah, I see what you're saying now; I misunderstood your comment re: the 3060. Yes, in general I do agree. My point was that there are a lot of people who are fervently trying to defend Nvidia's decision to limit VRAM on the 70-tier cards, essentially by regurgitating Nvidia's marketing lines on the subject. My response to that was that Nvidia can't justify those design decisions at the prices they want to charge for the product, and users shouldn't be justifying it for them, because if we, as consumers, decide not to demand more, then companies will be happy to offer us less.

So yes, every time someone spends $800 on a card with 12GB of VRAM, they are effectively asking for less hardware at the same price. $800 is not a "mid-tier" price point regardless of what Nvidia wants you to believe, and 12GB of VRAM does not belong on a higher end card in 2023. Mid-tier, go for it, but not when you're spending $800.

For the record, the 4070Ti was the card I was planning to buy this time around to upgrade my ageing 1070Ti. I immediately removed it from consideration when it was announced with 12GB of VRAM. Sorry, but that's simply not good enough in 2023.
 
70 watts means PCIe slot power only, no PSU connector; a nice little low-power server GPU with Tensor Cores and NVENC, better than the GTX 1630 currently in that bracket 🤔
 
70 watts means PCIe slot power only, no PSU connector; a nice little low-power server GPU with Tensor Cores and NVENC, better than the GTX 1630 currently in that bracket 🤔

Yeah, the only questions are price and, say, whether it's better than a 1660 (Super, Ti, or regular flavor).
 
holy shit! is that real?
Wccftech

& they absolutely do not cite any sources.

That said, I was expecting a 6GB 4050. A cut-down 3050 is a surprise, but it would make sense if it is a 70-watt card running purely on slot power.
 
So yes, every time someone spends $800 on a card with 12GB of VRAM, they are effectively asking for less hardware at the same price. $800 is not a "mid-tier" price point regardless of what Nvidia wants you to believe, and 12GB of VRAM does not belong on a higher end card in 2023. Mid-tier, go for it, but not when you're spending $800.
Not sure how many posters here did that, i.e. recommended the 4070 Ti over a cheaper 7900 XT (other than cases where not everything is equal: do you need Tensor core performance, RT core performance, etc.), or who didn't find the price of the 4070 Ti to be bad.
My point was that there are a lot of people who are fervently trying to defend Nvidia's decision to limit VRAM on the 70-tier cards
I feel that people are mostly exaggerating for effect, making a half-joking, half-sarcastic point about wanting less VRAM rather than more on a card; I could be wrong, though, and they could really believe it. A lot of the time the defense is not about the 12GB VRAM decision itself; it is a response to someone who (in their view) exaggerates how much of an issue it is.
 
I feel that people are mostly exaggerating for effect, making a half-joking, half-sarcastic point about wanting less VRAM rather than more on a card; I could be wrong, though, and they could really believe it. A lot of the time the defense is not about the 12GB VRAM decision itself; it is a response to someone who (in their view) exaggerates how much of an issue it is.
I think the 4070 Ti vs 7900 XT debate is a distraction. If you want to play at 1440p medium/high, go for the 4070 Ti. For 4K low to 4K medium, go for the 7900 XT. The only doubt/question is at 1440p ultra/max.

OTOH, the people who pulled the trigger on the 4070 before the 7800 XT's release forced a price cut deserve to be hostages of Nvidia...

& the 4060 Ti 16GB makes a nice card in the $300-$330 range; otherwise just buy the 12GB 3060 & use FSR 3!!!
 
You choose to opt in or not when it randomly asks you.

Overall it's the closest thing we have to a large random sample of the PC gaming community.

People tend to take certain stances on it depending on how convenient the results are for their opinions.
QFT. Intel loses 0.0001% to AMD per the Steam survey and it's time to break out the champagne bottles because Intel is dead in 3 years. Then you scroll down two charts, see a similar loss to Nvidia on the GPU side, and the narrative changes to how the Steam survey is not a true representation of PC gamers.
 
AMD's superior performance in Unreal Engine 5 with Lumen and Nanite.
That remains to be seen, I think; small sample size (Aveum would point in that direction, but it's one title with an AMD sponsor, and maybe performance will re-align over time).
A new entry in the very small sample:


[Charts: performance at 2560x1440, RT performance at 1920x1080, Fortnite, and 4060 Ti / 4070 UE 5.1 benchmarks]


The jury is still out; it could continue without a clear winner and stay case by case.
 
A new entry in the very small sample:

[Charts: performance at 2560x1440, Fortnite, and 4060 Ti / 4070 UE 5.1 benchmarks]

The jury is still out; it could continue without a clear winner and stay case by case.
Your Fortnite chart is with hardware-accelerated Lumen. With 'software' Lumen, AMD has a decent lead in Fortnite. With hardware Lumen it's remarkably close, considering Nvidia's usually much better hardware RT in some other games.


Lords of the Fallen does indeed currently perform better on Nvidia, and there is no option for hardware Lumen. Looks like AMD may not have a de facto lead in UE5 after all.

With LoTF, the developers have been doing a lot of patching and improving. It will be interesting to see where performance ends up once the game is no longer essentially a beta!
 
With 'software' Lumen
It will be interesting to see, when Immortals gets the update to support hardware Lumen, whether it reorders things (though I feel it should stay close anyway); they said at launch that a future update would add it, but I am not sure they will ever take the time.

Because of how reasonable VRAM usage seems to be with Unreal 5 (at resolutions a card can actually play), because 1440p ultra is just 6-7.5GB on both Lords of the Fallen and Immortals with the 2080 Ti and 3070 performing about the same (i.e. even 8GB is not an issue in sub-30 fps scenarios, let alone the ones an 8GB card can actually run), and because of how similar Lumen RT seems to be between the two companies, maybe a lot of the arguing between the two alternatives will feel a bit empty if UE5 becomes popular.

Just two titles and two demos, obviously, and not fully mature drivers or game patches for them as well.
 