MSI ends AMD GPU partnership due to poor sales

I'm not a huge fan of MSI anyway, given that their cards have recently been overly expensive without the quality you'd get from similarly priced competitors; Gigabyte/AORUS had a similar problem, but MSI priced their SUPRIM X above even the Asus ROG Strix.

ASUS seems to have the most expensive options, though MSI has certainly gone up in price. To get a decent cooler you have to spend a good bit over MSRP, it seems.
 
AMD may have reduced their allotment to an unsustainable level. But unless the company comes out and tells us why, we're just speculating, and the title is clickbait; even the article had no answers.
 
It's pretty obvious why. Consumers have made their choices.
 
Sales can be good but the margins on them bad,

Yet reports are otherwise. Nvidia still dominates sales but AMD has taken some market share back.
Going back to 2019 levels in market share is nice, but sales would be up from a historically low point; an over-100% jump from Q4 2022 could still be one of their worst quarters ever:
[chart attachment]


Especially outside the "good ones" like XFX, PowerColor, and Sapphire: on the Amazon top 100 best-selling GPUs there is no AMD card that is not from those three. For the likes of Asus, MSI, and Gigabyte, Radeon could be quite a small business, with a low effort -> low sales -> lower effort -> even lower sales circle going on. Given that they keep a relationship with AMD for motherboards, a situation where sales were not low would be surprising.
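
A quick sketch of that low-base arithmetic, with made-up placeholder numbers (the real figures are in the chart above, not here):

```python
# Hypothetical unit counts, purely to illustrate the low-base effect.
q4_2022 = 1.2e6            # assumed trough quarter for AMD dGPU shipments
recent = q4_2022 * 2.1     # a "110% jump" off that trough
historical_avg = 8.0e6     # assumed pre-2020 quarterly average

print(f"Post-jump quarter: {recent / 1e6:.1f}M units")
print(f"Still only {recent / historical_avg:.0%} of the assumed historical average")
```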
 
This doesn't surprise me at all given how crappy the MSI Radeon cards have been; they don't offer anything good, and they don't offer anything that's a good value. I'm honestly not sure why anyone would buy one unless it was the only thing available or a really great special deal.
 
Companies go where the money is; it's as simple as that.

If the cards are not selling well, it doesn't make any sense to invest in R&D, engineering, and support if the end result is small potatoes. So they invest those R&D and engineering dollars in products they believe will generate more profits. AMD's chips have to compete with old Nvidia chips as well as new ones. To me it is a sign that AMD overpriced themselves a bit. And while a $100 price cut across their top-end cards would help sales, that's just another bite into the profits. Significantly higher sales would offset that, but you need a highly desired product for that. AMD needs to make a chip 4 years ahead of where they are to try and stand head to head with Nvidia. They've had nearly 2 decades to achieve this...
 
MSI is not on my Santa list. I have multiple 8-card mining boxes of 3060 Ti / 3080 Ventus product, and their garbage runs a minimum of 10-15% more watts per hash vs. same make/model EVGA / Asus stuff with the exact same profile settings for core/mem/PL/volts.
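
For anyone wanting to reproduce that kind of comparison, a minimal watts-per-hash calculation (the power and hashrate figures below are assumed placeholders, not measurements):

```python
def watts_per_hash(power_w: float, hashrate_mh_s: float) -> float:
    """Mining efficiency metric: watts per MH/s, lower is better."""
    return power_w / hashrate_mh_s

# Assumed placeholder figures for two same-model 3080s with identical settings.
evga = watts_per_hash(power_w=220, hashrate_mh_s=98)
msi = watts_per_hash(power_w=250, hashrate_mh_s=98)

print(f"EVGA: {evga:.2f} W per MH/s, MSI: {msi:.2f} W per MH/s")
print(f"MSI penalty: {msi / evga - 1:.0%}")  # ~14%, within the 10-15% range claimed
```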

Not a loss, IMO. This isn't the MSI TwinFrozr HD 78/79xx company of old.
 
Sales can be good but the margins on them bad, but...


Going back to 2019 levels in market share is nice, but sales would be up from a historically low point; an over-100% jump from Q4 2022 could still be one of their worst quarters ever:
[chart attachment]

Especially outside the "good ones" like XFX, PowerColor, and Sapphire: on the Amazon top 100 best-selling GPUs there is no AMD card that is not from those three. For the likes of Asus, MSI, and Gigabyte, Radeon could be quite a small business, with a low effort -> low sales -> lower effort -> even lower sales circle going on. Given that they keep a relationship with AMD for motherboards, a situation where sales were not low would be surprising.

Yep, total shipments are way down for both, thus why total market share matters here. I mean, your own chart shows Nvidia shipments dropping to their lowest point in 2023. I have the feeling margins are not great for graphics cards; even EVGA cited that as the reason they dropped Nvidia. I feel allotments are tight with both companies, and you'd better be willing to buy big lots if you want any kind of deal where you can make a profit right now, which likely MSI and EVGA were not willing to do anymore.
 
IMHO, Nvidia has to have something "real" in the way of competition. If it's not AMD, it will be Intel.

Maybe MSI is truly betting on the latter (?)
 
Not really.

Nvidia does not need any external competition; they've pretty much dominated every market they chose to compete in. I hate monopolies, but even when Radeon was in its prime, Nvidia never had competition. They did not stop innovating and creating the markets and the rules to play by, while everyone else seems to be stuck trying to follow the rules in the markets Nvidia created.

Look at AI: I hear people say Nvidia got "lucky" with AI, yet if you look at AI/ML/DL research dating back 10+ years, 2/3 of all research papers were published by Nvidia or by teams funded by Nvidia. They were developing AI LONG before it was ever profitable, during a time when hardware wasn't anywhere near ready. They were innovating in a market that didn't exist, and they made it exist, and now everyone has to play by their rules.

Video cards are the same. Nvidia sponsored so many design studios and innovated so many graphics techniques. Look at SIGGRAPH papers and GDC presentations: dating back 20+ years, it was 2/3 Nvidia and Nvidia-sponsored teams. Nvidia was the one that worked hand-in-hand with Epic on Unreal Engine all the way back in the early 2000s and STILL maintains that relationship today, and Nvidia was doing this, pouring tons of money and resources into sponsorships and research, LONG before it was profitable.

Monopolies suck, but Nvidia was the only one playing the long game while everyone else was too busy chasing the next financial quarter to be leaders. I'm not happy about it. I want there to be competition, but I don't have high hopes.
 
Nvidia does not need any external competition,
If they want to keep governments away (regulators are already trying to hurt them a bit), or to keep big companies from deciding they don't want a critical part of their infrastructure in the hands of a single player and all popping up in-house solutions, a competitor with ~20% share at the consumer level and less in the higher-margin markets could be the perfect scenario.
 
You may "hate monopolies", but companies usually do not want to be regulated as one. Nvidia will "provide the loan" to keep the competitor alive, if that's what it takes IMHO (but we'll see, I mean it could cost Nvidia billions if they are found to be a monopoly).
 
The 6700 XT is a massively better card than the 3060, absolutely worth forty more dollars, and it has been one of the best-value cards for a couple of years.

[chart attachment]

The fact there isn't a strictly better replacement for it in its price segment is insulting to consumers and also a giant flashing sign of what's happening: GPU makers are trying to get everyone to spend more money for cards which aren't actually upgrades. So to upgrade, you've REALLY gotta spend.
I agree, but the problem is that the $320 price for the 6700 XT does fall out of people's price range. For example, good luck finding a regular 6700, because you can't. You can't even find them used on eBay, and the 6700 XT ends up the cheaper choice because it's been abandoned. Anyone who bought a regular 6700 has no plans to let it go. You also have to deal with the mystique of Nvidia, which again has better Ray-Tracing performance and DLSS.
7700 XTs were ~$350 for about two weeks, without any financing tricks or cashback, early in the year. And I was hoping that would become permanent....
Again, it's too much for people. Great card, but only compared to Nvidia's 4070. I doubt Nvidia likes knowing their outdated RTX 3060 12GB is their best-selling GPU, but for a while even the RTX 2060 sold really well. So well that Nvidia had to bring it back from the dead during the pandemic.
 
Man, there is a lot of Nvidia bias in here...lol. The 7000 series are not bad cards, to be honest, just not the best prices from either maker this time around. I just put together a 7900 XT rig for a bud: beast of a machine, performance sitting between a 4070 Ti Super and a 4080, with 20GB of VRAM, at roughly $100-150 cheaper than a 4070 Ti Super. It's unfortunate they won't have a high-end competitor this upcoming gen, but if they hit their goal of offering 4080-like performance at mid-range prices with RDNA 4, they will definitely move inventory. This gen has been plagued with bad prices on both the Nvidia and AMD fronts, but far more egregiously from Nvidia.

The biggest fight for AMD though is their software stack. As long as they position themselves well with vram amount and pricing, they can definitely take some market share away from Nvidia. But not competing directly on the software side is hurting them. ROCm took too long to release and needs time to mature, as Nvidia has a foothold with CUDA and the AI/Pro market... DLSS is also superior, better frame gen, and they aren't slowing down on adding new features and pushing the envelope even further. The pure ray tracing and rasterized performance numbers are def within striking range this time around between the two (excluding the 4090). But the only way AMD can eat up some of the market is to offer a larger amount of vram coupled with better pricing. We've seen what happens when Nvidia skimps on Vram, as they have for literally every card except the 4080/4090. For example, the PS5 has 16gb of shared memory with 12gb allocated to the gpu. Watching the mid and upper mid-range cards from Nvidia struggle with texture issues in current AAA games is just laughable. Not sure why Nvidia is so damn cheap with Vram. I understand the need to not cannibalize their higher end cards as they want to keep AI offerings separate, but 8/12gb is not acceptable in 2024. The PS5 Pro leaks point to a bump in gpu ram allocation as well, and considering the majority of AAA games are console ports, that's problematic for low vram cards.

Overall, it's sad to see that both companies have gotten so greedy. Nvidia is just pure greed, and AMD has had a window to knock Nvidia off a peg or two if they just price accordingly, but they refuse to do so; it's theirs to lose. Just hope for more competition, and that Intel can get their heads out of their asses and actually get their shit in gear, on the CPU and GPU front. More competition is a good thing.
 
I agree with you on every point.

Including the point that AMD just lets them win. Every time. It's like on the off chance that they CAN compete, they wait right at the finish line just to let Nvidia run past.

Oh, people think this generation of GeForce is WAY too overpriced? And Nvidia released them way before us? We have the opportunity to swoop in and grab some real market share by releasing some killer-priced cards that may not make us tons of cash, but will get our products into as many machines as possible and create fans of our products for future generations!

Let's just follow Nvidia's pricing instead.

Oh, we have a graphics architecture that seems to be on par with what Nvidia can do, and the chiplet scalability to build them more efficiently than Nvidia! We can use our die-size advantage against them at the high end and cut off a huge portion of their high-margin sales!

Let's cancel our high-end products instead.
 

Agreed. It's becoming aggravating watching AMD allow Nvidia to do as they wish, with no repercussions, when they have the means to strike. Every chance they have at undercutting and causing Nvidia some harm, they fumble. The same shit happens in the CPU space, but luckily Intel has not been able to compete. Although who knows: they are actually going to have a new process node for once with Arrow Lake. The gap will close, but the question is how much. If I were in AMD's shoes, I would have released a 24-32 core consumer chip with Zen 4. Intel has been literally grasping at straws trying to increase their core count, hence these efficiency-core chips that inflated their core count to 24, at the expense of 300-400 watts of power. It's best to compete with yourself if there is no competition, placing your rival further behind and forced to play catch-up, rather than give them a chance to surprise you and actually take market share. We've all seen what happens when you get complacent and milk your consumers with nothing but quad-core CPUs at 7% IPC upticks each gen while not actually innovating... You become a company in Intel's position.

In regards to AMD's RDNA 4 high end, I've heard they had a lot of issues with their high-end MCM and the RDNA 4 architecture, hence why they canceled their high end for the upcoming generation instead of throwing money at a fix. Apparently they didn't want to delay their roadmap and decided to cancel the high end. Extremely unfortunate, but I believe as long as they offer 4080 levels of performance with a good amount of VRAM at sub-$500 pricing, they will definitely take some market share. Current rumors sit at a $400-600 price range for the new RDNA top-end lineup (mid-range focus). If they can aim to keep it around $400-450, they'll have a winner... If they price it close to $600, another lost opportunity. Current info is pointing to 18 Gbps GDDR6, so it looks like they are keeping it more cost-effective instead of the 20 Gbps they were using in RDNA 3. I'm really hoping this is a cost-cutting measure to stack each card with a bare minimum of 16GB of VRAM while keeping MSRP targets lower. If they hit 4080 performance at half the price with the same or more VRAM, they will definitely move lots of inventory... After all, the vast majority of GPU sales are always in the mid range.
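
For reference, the bandwidth cost of that memory-speed choice is easy to work out; a sketch assuming a 256-bit bus for the rumored top RDNA 4 part (the bus width here is an assumption, not confirmed):

```python
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth: bus width in bytes times per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gb_s(256, 18))  # 576.0 GB/s with 18 Gbps GDDR6
print(peak_bandwidth_gb_s(256, 20))  # 640.0 GB/s with 20 Gbps GDDR6
print(peak_bandwidth_gb_s(320, 20))  # 800.0 GB/s, the 7900 XT's actual config
```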
 
I feel much the same about their CPU division. While I don't think anyone needs more than 16 cores... there was a time when nobody needed more than 4. Then suddenly, AMD whipped Intel's ass into gear, 8 cores became the norm, and anyone with 4 cores was hurting, BADLY. Software devs LEARNED TO USE all this new power and took advantage of the new i5 and Ryzen 5 CPUs having 12 goddamn threads to utilize. And before that Ryzen launch, people argued that having more than 4 cores was worthless and nobody actually needed it, the bandwidth isn't there to take advantage of higher core counts, it's only for power users who are better served by HEDT, it isn't smart to devote so much engineering effort into making so many cores, etc., etc. The same things people are saying about AMD's 16-core max today. The. Same. Exact. Excuses.
 
Definitely showing my bias by wanting more cores. I work with 3D, so I can use all the CPU cores at my disposal...haha. Aside from the fact that AMD made Threadripper Pro-exclusive for a short bit, which was annoying, how they release their HEDT CPUs at the end of an architecture's life cycle royally pisses me off. Spending that much money on a Threadripper platform should come with the comfort of knowing you have the best architecture money can buy, similar to how Nvidia releases their 90-series cards first to allow the enthusiasts nearly 2 years of top-end performance. If you buy a 32-core Threadripper, you only have a few months before a new architecture hits that has a 40-50% IPC uplift in multithreaded workloads. With the amount of money they are asking, it's infuriating.

AMD has their C-core dies. They could easily release a 24-core (standard + C die), or even a 32-core dual-C-die consumer CPU if they wanted to. It seems the Zen 6 roadmap will have a core-count bump, though I'm hoping we get a surprise announcement and get one this gen; with Arrow Lake around the corner on Intel's new node, I'm hoping AMD will take it seriously. I disapprove of the design philosophy of stagnation and limiting development to milk the consumer. You should be aiming to deliver the best products you can: don't let your competitor surprise you and steal the spotlight; it's best to outdo yourselves and build a better product each and every release, shining the spotlight even brighter on yourself. It's what Nvidia has done, and they had the foresight to position themselves appropriately to take advantage of this AI boom.

I just wish Nvidia wasn't so greedy. Their complete and utter lack of respect for and commitment to the gaming community pushes my buttons: roughly a $2-3 billion industry that got them where they are. Don't get me wrong, a company should aim to make its money, especially considering the AI market is ridiculous at the moment and analysts are putting Nvidia at $40 billion in sales for 2024. But it's a bubble (although this market segment will endure, unlike mining); the question is how long it will last. Yet they pull all this BS with VRAM scrimping and ridiculous pricing, shorting supply of the mainstream to focus on AI, allowing scalpers to have a field day while selling their "gaming" GPUs directly to AI/mining farms during their respective bubbles, trying to skirt around export rules and get as much into China as they can, etc. Their contempt for their own gaming base is just sad to see, and AMD is bungling its chance of winning over said gamers and enshrining a foothold in that space on market-share grounds.

Just really frustrating to see.
 
That many cores would choke on only 2 memory channels, but if AMD increases that, then they are stepping on their Threadrippers' toes, and they won't do that.
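
The back-of-the-envelope version of that bandwidth argument, assuming dual-channel DDR5-6000 (the speed is an assumption; dual-channel DDR5 is 128 bits wide in total):

```python
def ddr5_dual_channel_gb_s(mt_per_s: int) -> float:
    """Dual-channel DDR5 peak bandwidth: 128 bits = 16 bytes per transfer."""
    return mt_per_s * 16 / 1000

bandwidth = ddr5_dual_channel_gb_s(6000)  # ~96 GB/s total
for cores in (16, 24, 32):
    print(f"{cores} cores: {bandwidth / cores:.1f} GB/s per core")
# 16 cores: 6.0, 24 cores: 4.0, 32 cores: 3.0 -> per-core bandwidth shrinks fast
```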
 

[attached: AMD Zen 5 / 6 roadmap chart]

Well, they are, although the related dates were pushed back; Zen 5 was supposed to be released earlier but is now on the verge of coming out. They will have a 16-core die (most likely just a C die), and will push it to 32 for Zen 6. They will have their "Standard" complex, "Dense Classic", and "Client Dense", the last being specifically for enterprise. We'll definitely get a core boost for Zen 6 on the consumer side... I'm just hoping AMD won't take any chances and will release a higher-core variant for Zen 5. Arrow Lake has a good shot at actually being competitive, and if it does compete in power and performance for once this time around, it's best to cut Intel's legs out from under them by releasing a higher-core variant now to ensure a strong victory. Time will tell though.
 
https://www.tomshardware.com/news/a...t-sales-surge-in-china-following-rtx-4090-ban
https://www.techpowerup.com/316203/...in-china-following-nvidia-export-restrictions
And I will find the transcripts from their recent financials meeting, but they were touting the large increase in sales figures in the 40 countries where the US banned Nvidia from operating.

Sounds more like panic buying when the ban went into place; I would be curious if that is still going on. I know the Chinese mostly wanted the 4090 for its AI capabilities, which the 7000 series would not be as suitable for. I just think it's a bit sensational, because if AMD was truly selling everything they could in China, why would they introduce the GRE to other countries? Honestly, though, I feel the GPU market is in a mess due to capacity constraints more than anything. Perhaps Intel can make a competitive card that can break this stalemate of overpriced cards with limited availability.
 
Maybe, but supposedly even the 4060 series is near impossible to find in China.

https://m.ithome.com/html/760492.htm

For those not fluent, it basically claims Nvidia is holding back the supply of mid- and low-end cards to keep prices inflated, as demand far outstrips supply. Prices are a bare minimum of 10% over MSRP and go up from there.

This is silly because Nvidia doesn't get a cut of card sales; Nvidia makes its money by selling the chips to AIBs, not to the consumer, so those cards could be 500% above MSRP and what Nvidia got paid wouldn't change.
 
Man, there is a lot of Nvidia bias in here...lol. The 7000 series are not bad cards, to be honest, just not the best prices from either maker this time around.
The 7000 series cards are stupid, but so are Nvidia's 4000 series cards. The problem with AMD and Nvidia is that their best-selling cards are the previous generation, and Nvidia is selling a lot of RTX 3060s. The 7600 XT is just barely faster than the 6650 XT, but the same goes for the 4060 vs. the 3060 Ti. AMD didn't make more RX 6700s, so everyone is turning to the RTX 3060s. Nvidia silently released the RTX 3060 8GB model to capitalize on this, which, as you'd expect, runs slower than the 12GB model. Nobody but Nvidia is making a sub-$300 graphics card with more than 8GB. It doesn't matter if AMD's or Intel's GPUs are technically faster, because they're also technically more than $300.
The biggest fight for AMD though is their software stack.
It's their prices, not software. None of AMD's 7000 series are priced to compete with Nvidia.
As long as they position themselves well with vram amount and pricing, they can definitely take some market share away from Nvidia.
But they didn't. They made the same mistake as Nvidia and released 8GB cards with a 16GB model that costs more and barely performs better.
ROCm took too long to release and needs time to mature, as Nvidia has a foothold with CUDA and the AI/Pro market...
Nobody buying these cards will care about ROCm. They buy them to play games.
DLSS is also superior, better frame gen, and they aren't slowing down on adding new features and pushing the envelope even further.
DLSS vs FSR is not a major reason to go Nvidia. DLSS 3.5 with much better Ray-Tracing performance is a reason.
Not sure why Nvidia is so damn cheap with Vram.
Same reason Apple is. To get you to buy the more expensive and better product.
I understand the need to not cannibalize their higher end cards as they want to keep AI offerings separate, but 8/12gb is not acceptable in 2024.
8GB wasn't acceptable in 2020, let alone 2024. 12GB cards are fine today for 1080p, but you hardly see GPUs with even that much VRAM.
Overall, sad to see that both companies have gotten so greedy. Nvidia is just pure greed, and AMD has had a window to knock Nvidia off a peg or two if they just price accordingly, but they refuse to do so, it's theirs to lose. Just hope for more competition, and that intel can get their head's out of their asses and actually get their shit in gear, on the cpu and gpu front. More competition is a good thing.
If AMD doesn't start making realistic upgrades for a realistic price, then Intel could take away market share. AMD doesn't want a price war with Nvidia, but they will get it with Intel.
 
I'm sure there are some particular sales here or there. But, overall, there aren't many 3060s (12GB version) which dip well under $300 (the current best I can find is $290, an MSI Ventus 2X).

If you are REALLY pinching pennies, OK, save ~$30. Otherwise, the RX 6750 XT is actually now priced the same as the 6700 XT. Both are massively better than a 3060 and worth spending the extra $30 on (waiting another paycheck on your build if needed). The 6750 XT in particular, dang, that's a great card. That's RTX 3070 performance.

$320 at Newegg
https://www.newegg.com/xfx-radeon-rx-6750-xt-rx-675xyjfde/p/N82E16814150887?Item=N82E16814150887


**Yeah, and what Nvidia did with the 3060 8GB is straight-up predatory. It's a bait and switch. They want people to buy it thinking they are getting 3060 performance for less money. Which seems fine on paper, because 12GB isn't that useful for gaming on a card of that performance.

But it's actually just barely better than a 3050, which is a pretty poor card to have in 2023.
 
The 7000 series cards are stupid, but so are Nvidia's 4000 series cards. The problem with AMD and Nvidia is that their best-selling cards are the previous generation, and Nvidia is selling a lot of RTX 3060s. The 7600 XT is just barely faster than the 6650 XT, but the same goes for the 4060 vs. the 3060 Ti. AMD didn't make more RX 6700s, so everyone is turning to the RTX 3060s. Nvidia silently released the RTX 3060 8GB model to capitalize on this, which, as you'd expect, runs slower than the 12GB model. Nobody but Nvidia is making a sub-$300 graphics card with more than 8GB. It doesn't matter if AMD's or Intel's GPUs are technically faster, because they're also technically more than $300.

I wouldn't go as far as to call them stupid. They are not bad products; they are just priced badly. If they had released the mentioned cards at a lower price point than the previous gen while offering similar performance, that would have been a win... But they didn't for the majority of them. It is indeed quite insulting that they believe they can force-feed the gaming community with what they offered. The whole 4080 12GB and 16GB fiasco was infuriating, and they got off easy considering all the other BS they pulled in performance/pricing.

It's their prices, not software. None of AMD's 7000 series are priced to compete with Nvidia.

It's not just pricing. People see AMD as inferior due to the lack of competitive features such as DLSS and its plethora of sub-features, CUDA compatibility/ecosystem, etc. They are also constantly behind on new features, always a generation or two back. For example, inferior hardware video encoders, which hits gamers in the streaming department... although their AV1 encoding has stepped up considerably and is a good candidate, unlike the H.264/H.265 encoding. When buyers look at the whole package, they tend to go with Nvidia specifically because of this, and bias. You get so much capability when you go Nvidia. Personally, I would not mind at all going to an AMD card... But I LITERALLY CAN'T. All my software is designed and optimized around CUDA on the workstation side. A lot of my 3D software suite is optimized and built around Nvidia, and therefore I will be buying a 5090 this upcoming gen when it hits to replace my 3090.

But they didn't. They made the same mistake as Nvidia and released 8GB cards with a 16GB model that costs more and barely performs better.

You can't compare the two equally. AMD has consistently offered more VRAM across their entire lineup this gen compared to Nvidia. It's the one saving grace on the AMD side: you get more VRAM for the price, even though prices are elevated on both sides.

Nobody buying these cards will care about ROCm. They buy them to play games.

It's not just about games, man. GPUs are multipurpose: GPU encoding for workstations, AI workloads, physics simulations, as well as games. There are many people that won't give AMD GPUs a shot as they won't work with certain AI repositories, due to software compatibility, competitive performance in said apps, etc. There are quite a lot of gamers that also mess around with AI, and using AMD is just not possible for a lot of the repos out there. This ties into AMD being behind on the gaming software side as well, not able to compete with the pace of Nvidia's gaming advancements on the AI front, frame gen, upscaling, ray-tracing AI improvements, etc... An uphill battle.

DLSS vs FSR is not a major reason to go Nvidia. DLSS 3.5 with much better Ray-Tracing performance is a reason.

You are literally proving the point I made earlier. Nvidia's software stack is superior. Newer features in DLSS just keep leaving AMD behind: DLSS upscaling is superior, their frame generation is better, AI-assisted ray tracing, etc. Software... AMD is playing catch-up, and there are DEFINITELY people who buy Nvidia specifically because of the advantages on Nvidia's software side... Saying there aren't is naive, no offense.

Same reason Apple is. To get you to buy the more expensive and better product.

Yea... But PC users aren't mindless Apple lemmings.

8GB wasn't acceptable in 2020, let alone 2024. 12GB cards are fine today for 1080p, but you hardly see GPUs with even that much VRAM.

Well, 8GB was fine for 1080p in 2020, and 12GB is also fine for 1080p today, but let's be honest: when consoles and their respective titles are being built for 4K, you can't skimp on VRAM on the PC side. They are intrinsically tied, considering the ported nature of AAA titles. Even if the majority of PC gamers are still using 1080p, you can't use that fact as the standard to judge what is the right amount for PCs. The base PS5 has about 12.5GB of addressable memory, with the Pro variant bumping that up to 13.75GB. Console and PC gaming aren't mutually exclusive when development rolls from one territory into the other. The 1440p-and-above user base counts for roughly 30% based on Steam data, and that will continue to grow... especially with how common 1440p 144Hz displays around $200 are nowadays.

If AMD doesn't start making realistic upgrades for a realistic price, then Intel could take away market share. AMD doesn't want a price war with Nvidia, but they will get it with Intel.

Agree with you on this. But honestly, this is a dumb move on AMD's side. The best method they can use to gain market share is to undercut and offer more VRAM with better performance at a cheaper price point. The gaming segment being ignored by Nvidia provides an opportunity for AMD... But they are letting it pass them by.

I really do hope Intel stays in the GPU game. They made some great strides on the driver side and in shoring up their software. They just need some competitive architectures and timely releases. The fact that Battlemage is going to be rolling out alongside RDNA 4 and Blackwell is not good for Intel. My worry is Intel will pull out early, as they have shown a tendency to do in the past, and put themselves behind in the future. Competition is a good thing, and I would love to have a competent third competitor in the GPU field. The more the merrier; the consumer wins.
 
I upgraded my 6800 XT to a 7900 XT, just because of a nice offer. I didn't expect it, but I was impressed by how fast it loads games.
Some people think that the 7000 series is crap because of the lack of gimmicks; this led to low sales.

MSI jumped ship; cool, other partners will sell more.
MSI's coolers on Radeon cards are bad anyway.
 
I wouldn't go as far as to call them stupid. They are not bad products; they are just priced badly. If they had released the mentioned cards at a lower price point than the previous gen while offering similar performance, that would have been a win... But they didn't for the majority of them. It is indeed quite insulting that they believe they can force-feed the gaming community with what they offered. The whole 4080 12GB and 16GB fiasco was infuriating, and they got off easy considering all the other BS they pulled in performance/pricing.
The 7600 uses 6nm, while the 7700 and up use 5nm. The 7900 XTX name is so stupid that Gamers Nexus made a joke about it. While the 7700 and up look like upgrades over their predecessors, the 7600 still seems to be barely a performance upgrade over the 6650 XT. And yet, like Nvidia, the 7600 still came with 8GB of VRAM.
It's not just pricing. People see AMD as inferior due to the lack of competitive features such as DLSS and its plethora of sub-features, CUDA compatibility/ecosystem, etc. They are also constantly behind on new features, always a generation or two back. For example, inferior hardware video encoders, which hits gamers in the streaming department... although their AV1 encoding has stepped up considerably and is a good candidate, unlike the H.264/H.265 encoding. When buyers look at the whole package, they tend to go with Nvidia specifically because of this, and bias. You get so much capability when you go Nvidia. Personally, I would not mind at all going to an AMD card... But I LITERALLY CAN'T. All my software is designed and optimized around CUDA on the workstation side. A lot of my 3D software suite is optimized and built around Nvidia, and therefore I will be buying a 5090 this upcoming gen when it hits to replace my 3090.
Yet, AMD still doesn't price their products competitively.
You can't compare the two equally. AMD has consistently offered more VRAM across their entire lineup this gen compared to Nvidia. It's the one saving grace on the AMD side: you get more VRAM for the price, even though prices are elevated on both sides.
I can even throw Intel under the bus for the VRAM. While the A770 has a generous 16GB, the rest of the lineup has 8GB. AMD did initially charge $300 for a GPU with 8GB of VRAM. They have since lowered the price, but still.
It's not just about games, man. GPUs are multipurpose: GPU encoding for workstations, AI workloads, physics simulations, as well as games. There are many people that won't give AMD GPUs a shot as they won't work with certain AI repositories, due to software compatibility, competitive performance in said apps, etc. There are quite a lot of gamers that also mess around with AI, and using AMD is just not possible for a lot of the repos out there. This ties into AMD being behind on the gaming software side as well, not able to compete with the pace of Nvidia's gaming advancements on the AI front, frame gen, upscaling, ray-tracing AI improvements, etc... An uphill battle.
Most people buying GPUs are doing it for games. If AMD wants to sell to those who do productivity, then yes, their software sucks for it.
You are literally proving the point I made earlier. Nvidia's software stack is superior. Newer features in DLSS just keep leaving AMD behind: DLSS upscaling is superior, their frame generation is better, AI-assisted ray tracing, etc. Software... AMD is playing catch-up, and there are DEFINITELY people who buy Nvidia specifically because of the advantages on Nvidia's software side... Saying there aren't is naive, no offense.
I'm not saying Nvidia's software isn't superior, but it isn't the primary factor. Pricing is the main factor.
Yea... But PC users aren't mindless Apple lemmings.
Why do you think GPU sales are so bad?
The fact that Apple users still buy the entry laptops with 8GB of RAM is laughable to me.
We'll see.
We saw the backlash on the 4080 12/16GB models, and the rejection of Nvidia's 4000 series in terms of sales numbers, as the prices are too damn high on both sides. Nvidia and AMD both know they have to reduce their prices to sell quantity in the gaming segment, but with the AI boom the gaming market isn't as big on their agenda. They are pushing the majority of production to their AI lineup, as it's making them billions.
I doubt they price GPUs this high because, since the AI boom, they couldn't care less whether they sell GPUs. This has been going on since the R9 290.
Agree with you on this. But honestly, this is a dumb move on AMD's side. The best method they can use to gain market share is to undercut and offer more VRAM with better performance at a cheaper price point. The gaming segment being ignored by Nvidia provides an opportunity for AMD... But they are letting it pass them by.
I agree. The 128-bit memory bus cards need to be $200 and under, while the 192-bit bus cards should be $250 to $400.
I really do hope Intel stays in the GPU game. They made some great strides on the driver side and in shoring up their software. They just need some competitive architectures and timely releases. The fact that Battlemage is going to be rolling out alongside RDNA 4 and Blackwell is not good for Intel. My worry is Intel will pull out early, as they have shown a tendency to do in the past, and put themselves behind in the future. Competition is a good thing, and I would love to have a competent third competitor in the GPU field. The more the merrier; the consumer wins.
It would be crazy for Intel to leave. Without a good GPU, they won't be selling many CPUs. Also, they missed the crypto boom and now the AI boom because they stayed out of GPUs.
 
Agree with you on this. But honestly, this is a dumb move on AMD's side. The best method they can use to gain market share is to undercut and offer more VRAM with better performance at a cheaper price point. The gaming segment being ignored by Nvidia provides an opportunity for AMD... But they are letting it pass them by.
Considering how much cheaper to make Nvidia's cards seem to tend to be, this assumes that Nvidia would not adjust its prices to avoid losing share, in which case nothing changes except both making less profit.

Not saying we could know, or that you're wrong, but how would we know?

If they get to cards that cost less to make than Nvidia's, entering a price war could become a more interesting plan, and that seems to be the rumored goal of the next release.
 
The 7600 uses 6nm, while the 7700 and up use 5nm. The 7900 XTX name is so stupid that Gamers Nexus made a joke about it. While the 7700 and up look like upgrades over their predecessors, the 7600 still seems to be barely a performance upgrade over the 6650 XT. And yet, like Nvidia, the 7600 still came with 8GB of VRAM.

lol. I agree with you on the naming schemes. It would have been better if they had just left it as 7900 and 7900 XT. AIB partners aren't helping either, with their own naming schemes layered on top. The entry and pre-mid-range card lineups do need a serious uplift. AMD did release the 7600 XT 16GB, albeit later in the release cycle... But the card is not really strong enough to handle 4K gaming, so it was a bit moot to load it up with that much VRAM.

Yet, AMD still doesn't price their products competitively.

Agree... Both have been priced poorly. AMD had a chance to eat up market share by undercutting Nvidia, but chose not to, which in my eyes was a profound mistake. To make AMD cards more appealing, since they lack on the software/AI side, they need a decent advantage in performance to attract more buyers. The Nvidia bias is too strong in gamers' minds.

Most people buying GPUs are doing it for games. If AMD wants to sell to those who do productivity, then yes, their software sucks for it.

I'm not saying Nvidia's software isn't superior, but it isn't the primary factor. Pricing is the main factor.

Price is just one side of the coin though. AMD's software on the productivity and workstation side is indeed lacking, but they are also lacking in the gaming segment on software as well. We all know that Nvidia's DLSS and its features are superior compared to AMD's offerings of the same kind. Couple that with their poor H.264/H.265 encoding, and they were not a good choice for streaming either, as Nvidia's GPU encoding has been far better and as a result a better candidate for Twitch streamers. Although, as stated in my previous post, AMD stepped it up with their AV1 encoder; too bad Twitch still hasn't adopted AV1 yet.

Why do you think GPU sales are so bad?

Pricing, 100%. Wholeheartedly agree with you on this. The pricing has been far too greedy on both fronts.

I doubt they price GPUs this high because, since the AI boom, they couldn't care less whether they sell GPUs. This has been going on since the R9 290.

Both Nvidia and AMD are supply-constrained at the moment. A backlog of AI orders has piled up, and with limited fab capacity they have pushed AI GPUs to the top of their focus. This was reported a few months ago...

https://wccftech.com/nvidia-gaming-...ortages-as-ai-chips-take-production-priority/

It's a scummy move, but it all benefits Nvidia. They need to fill as many AI orders as possible, considering there is competition now with the MI300, and Intel entering the ring (although Intel won't eat up as much of the market as the other two).

It would be crazy for Intel to leave. Without a good GPU, they won't be selling many CPUs. Also, they missed the crypto boom and now the AI boom because they stayed out of GPUs.

It would, but Intel has time and time again shot themselves in the foot and left too early from markets they entered. We saw it with their prior Larrabee GPUs, 3D XPoint (a true shame), etc. Considering how they are losing money on the GPU side, I wouldn't be surprised if they make another dumb move. But considering how vital the GPU market is at the moment, they'll most likely stick with it.

Considering how much cheaper to make Nvidia's cards seem to tend to be, this assumes that Nvidia would not adjust its prices to avoid losing share, in which case nothing changes except both making less profit.

Not saying we could know, or that you're wrong, but how would we know?

If they get to cards that cost less to make than Nvidia's, entering a price war could become a more interesting plan, and that seems to be the rumored goal of the next release.

AMD's GPUs should definitely cost less than Nvidia's, considering their true MCM chiplet design. The fact they are aiming to penetrate the mid range this upcoming gen makes me believe we will definitely have a price war on our hands. I'm hopeful that AMD knows that to capture the market and build a following they need to cater to the below-$500 market. Although, based on the prices we saw from Nvidia, I doubt Nvidia will want to reduce their prices massively. Having a price war with AMD will lead to more GPU sales for both sides in the gaming segment, and, funny enough, that means less profit, as they have to shift more production to the gaming side. Nvidia wants to focus on AI, as they are making boatloads of cash there compared to the gaming segment.

Based on recent estimates, Nvidia is set to hit $40 billion in AI sales for 2024, and considering the gaming market pulls in roughly $2-3 billion for them, that's chump change compared to the AI market. They'd rather keep prices high and have slow adoption of their newer GPU line while clearing out the mass surplus of 3000 series cards they had built up for mining, which went kaput. It completely favors Nvidia... which is exactly why they are rushing out Blackwell this year. We'll have a 5080 and 5090 launch by the end of Q3 at the earliest, or Q4 at the latest, although these are going to be literal paper-launched products: very low inventory, but they need to get them out there. The AI Blackwell GPUs use the same chips, so it effectively makes sense to launch both: get those big benchmark numbers out to claim even more dominance in gaming, even while skimping on supply, while gobbling up as much of the AI market as they can with their new lineup.

They've already pulled their Vera Rubin AI architecture forward, aiming for a first-half-2025 release if they can hit it, and have switched to a yearly cadence on the AI GPU front. They want to milk and dominate the AI market as much as possible, as they know how profitable it will be in the short term. It's a very calculated and strategic move... As a result, AMD will try everything they can to bring forward the MI400, which has also been accelerated.

All this points to one thing... bad news for gamers in regards to supply/pricing. The AI market will eventually stabilize; what the stabilized market will look like in terms of profit isn't known. But securing the favor of gamers can ensure a stable $2-4 billion that is not going anywhere. AMD is letting that slip by; hopefully Intel will step up to the plate... But all of them are blinded by the green on the AI front =*(.
 
AMD's GPUs should definitely cost less than Nvidia's, considering their true MCM chiplet design.
MCM is only Navi 31/32; everything else, which should be the bulk of the market, is monolithic, I think.

And even on the MCM models: take a 7900 XT versus a 4070 Ti Super. That 530 mm² of die with 20GB of VRAM on a 320-bit bus, can it really be cheaper to make than a bottom-of-the-barrel 380 mm² die on 256 bits with 16GB of VRAM?

Or a 7700 XT/7800 XT vs a 4070/4070 Super: 346 mm²/16GB/256-bit vs 294 mm²/12GB/192-bit; how much cheaper, if at all, could they be to make?
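
Napkin math with the standard dies-per-wafer estimate and guessed wafer prices (TSMC's actual pricing is not public, so treat every number below as an assumption) suggests the silicon-cost gap is smaller than the die sizes imply:

```python
import math

def dies_per_wafer(die_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """First-order estimate; ignores yield, scribe lines, and edge exclusion."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_mm2))

wafer_cost = {"N5": 16_000, "N6": 10_000}  # USD per wafer, assumed

# Monolithic ~380 mm^2 on N5 (4070 Ti Super class):
mono = wafer_cost["N5"] / dies_per_wafer(380)
# MCM: ~300 mm^2 GCD on N5 plus five ~37 mm^2 MCDs on N6 (7900 XT class):
mcm = wafer_cost["N5"] / dies_per_wafer(300) + 5 * wafer_cost["N6"] / dies_per_wafer(37)

print(f"Monolithic: ~${mono:.0f} of silicon, MCM: ~${mcm:.0f} of silicon")
# Smaller dies also yield better, which this estimate doesn't capture.
```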
 
The 7700 XT is actually Navi 32, and it is an MCM package too. The 7600, however, is monolithic Navi 33; it doesn't multi up.

It's not only about die size. AMD utilized two processes for RDNA 3, a combination of 5nm and 6nm: the GCD was 5nm while the MCDs were 6nm. I'm definitely expecting a similar route with RDNA 4 (time will tell, although they are slated to use the N4 process). They're also using lower-speed 18 Gbps GDDR6 compared to the 20 Gbps used on the 7900 XT, whereas the 5090/5080 are aiming for GDDR7. So AMD is definitely keeping cost-cutting in mind with this new architecture. They aren't aiming to outdo their current GPUs by a large margin, but to reduce prices and offer similar performance at a cheaper price point to hit the mid range.

Time will tell if they are successful in this endeavor though. Honestly, I'm a bit bummed there won't be a high end this gen from AMD. It just allows Nvidia to do what they want with their pricing, which is unfortunate for me, as I'm looking to buy a 5090 for my workstation...haha.
 