MSI ends AMD GPU partnership due to poor sales

It's not only about die size. AMD utilized two processes for RDNA3...Combination of 5nm and 6nm.
Sure, but if you end up using so much more of it, it eats your node rebate and yield gains, and then you have a larger bus and more RAM that eat into whatever you save by not using GDDR6X.

One way to think about it: many people felt the 4080 was really a 4070, the 4070 a 4060, and the new 4060 a new kind of 4050 in terms of hardware. If there is any truth to that, it would be because those dies are quite cheap to make for where they end up in the stack, and it would be hard to win a price war against them with how RDNA 3 ended up performing.

Definitely expecting a similar route with RDNA 4...
Rumours are that it will be all monolithic; maybe that's influencing my impression that the cost-cutting operation was not as good as anticipated.
 
MCM is only Navi 31/32; everything else, which should be the bulk of the market, is monolithic I think.

And even on the MCM models: take a 7900 XT versus a 4070 Ti Super. That ~530mm² of die with 20GB of VRAM on a 320-bit bus, can it really be cheaper to make than a bottom-of-the-barrel ~380mm² die on a 256-bit bus with 16GB of VRAM?

Or a 7700 XT/7800 XT vs a 4070/4070 Super: 346mm²/16GB/256-bit vs 294mm²/12GB/192-bit. How much cheaper, if at all, could they be to make?
The interposer all those chiplets sit on isn’t free and it’s the size of the other chips combined. It might be on an older node but it’s still another wafer getting used up. Wafer prices have gone up a lot.
 
The 7700 is Navi 33 and is an MCM package.
No, the 7700 XT is a cut-down Navi 32.
AMD utilized two processes for RDNA3...Combination of 5nm and 6nm. The GCD was 5nm while the MCDs were 6nm. Definitely expecting a similar route with RDNA 4.
All rumours point to Navi 48 & Navi 44 being monolithic.
AMD had power issues with MCM chips when gaming. I am thinking that is why they will go monolithic with RDNA 4
 
Rumours are that it will be all monolithic; maybe that's influencing my impression that the cost-cutting operation was not as good as anticipated.
You have the chiplets which get good yields but then need packaging and assembly which is constrained and getting more expensive. Then you have the interposer those chiplets sit on which is yet another node and another series of wafers.
TSMC's processing fees are relatively flat right now, so the cost increase of moving from 4N to 3N, for example, is roughly proportional to the amount of wafer area you save.
Wafers themselves have doubled in price over the last 4 years, so the yield improvement isn't necessarily worth the raw material costs at this stage unless you are moving a lot of units, and AMD was not moving a lot of units. Factor in the additional costs at the packaging facilities to put those chiplets on their interposer and you could very well be looking at something with numerous downsides and virtually no cost benefit, or worse, a cost increase.
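To put very rough numbers on that trade-off, here is a back-of-the-envelope sketch using a standard dies-per-wafer approximation and a simple Poisson yield model. Every figure in it (wafer prices, defect densities, die areas, the packaging adder) is an assumption for illustration only, not actual TSMC or AMD data.

```python
# Back-of-the-envelope die cost sketch; all inputs are illustrative assumptions.
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Approximate gross dies per wafer (standard edge-loss approximation)."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(die_area_mm2, wafer_cost_usd, defects_per_cm2=0.10):
    """Per-good-die cost using a simple Poisson yield model."""
    yield_rate = math.exp(-defects_per_cm2 * die_area_mm2 / 100)
    return wafer_cost_usd / (dies_per_wafer(die_area_mm2) * yield_rate)

# Assumed wafer prices: ~$17k for the N5-class node, ~$10k for the N6-class node.
monolithic = cost_per_good_die(380, 17_000)           # ~380 mm^2 monolithic die
gcd        = cost_per_good_die(300, 17_000)           # ~300 mm^2 GCD on the newer node
mcds       = 6 * cost_per_good_die(37, 10_000, 0.08)  # six ~37 mm^2 MCDs on the older node
packaging  = 30                                       # assumed per-unit assembly/packaging adder

print(f"monolithic: ~${monolithic:.0f}")
print(f"chiplet:    ~${gcd + mcds + packaging:.0f} "
      f"(GCD {gcd:.0f} + MCDs {mcds:.0f} + packaging {packaging})")
```

With these made-up inputs the chiplet part comes out no cheaper than the monolithic one, which is the point being argued above: the small-die yield win can be eaten by the extra total silicon, the older-node wafers, and the packaging step.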
 
No, the 7700 XT is a cut-down Navi 32.

Apologies, you are correct. When I did my initial search I mistakenly clicked the RX 7700S for tech specs.

All rumours point to Navi 48 & Navi 44 being monolithic.

Ohh snap, you're right. Monolithic for RDNA 4, can't believe I missed that.
AMD had power issues with MCM chips when gaming. I am thinking that is why they will go monolithic with RDNA 4

Yea...Hence no RDNA 4 high end. Apparently they didn't want to throw money at the wall to fix the issue, and delay their roadmap...So we'll have to wait for RDNA 5. MCM in GPU is a lot more complicated compared to what they've done on the CPU front. Probably explains why Nvidia went the route they did for their MCM Blackwell chips...Although you really can't call them chiplets in the same sense as what AMD has done.
 
Yea...Hence no RDNA 4 high end. Apparently they didn't want to throw money at the wall to fix the issue, and delay their roadmap...So we'll have to wait for RDNA 5. MCM in GPU is a lot more complicated compared to what they've done on the CPU front. Probably explains why Nvidia went the route they did for their MCM Blackwell chips...Although you really can't call them chiplets in the same sense as what AMD has done.
Not to be all "Nvidia couldn't do it, so no way AMD could," but Nvidia spent a lot of money trying to make it work, and each time the results showed that the cost to implement something capable of managing the data between the different chiplets was extremely prohibitive, while the penalties for not using it were too detrimental: high latency, lower clock speeds, etc. Both are issues AMD faced with the 7900 series. Nvidia showed in multiple papers that an affordable chip to manage data coordination between the GPU chiplets had to come first, and they put up a large sum of money to make that happen. Blackwell is sort of cheating at this from what I understand, borrowing on the work done by Apple and TSMC for the M-series Pro and Max chips, but it's still a temporary stopgap because it adds a lot of money and time to the assembly side of chip construction, which is less than ideal.
 
Not to be all "Nvidia couldn't do it, so no way AMD could," but Nvidia spent a lot of money trying to make it work, and each time the results showed that the cost to implement something capable of managing the data between the different chiplets was extremely prohibitive, while the penalties for not using it were too detrimental: high latency, lower clock speeds, etc. Both are issues AMD faced with the 7900 series. Nvidia showed in multiple papers that an affordable chip to manage data coordination between the GPU chiplets had to come first, and they put up a large sum of money to make that happen.
I mean, it shouldn't be a huge surprise given the issues with multiple GPUs. Sure that is going to be slower in some respects since you are connecting over an external bus which is going to be hard to make as fast as something on the same package (or even board)... BUT it is still the same fundamental problem of trying to sync up multiple rendering units that has never worked particularly well. Also on a single board you get the added "bonus" of the chiplets competing for memory bandwidth, whereas old school multi-GPU they all got their own memory to play with.

It is a difficult problem to solve and as you note: cost is a factor. The reason to want chiplets is to reduce cost, but if getting a controller/interconnect that can make good use of them ends up driving the price up more than just one big chip, well then why do it?

I mean shit, this goes all the way back to 3dfx and their failure: One of their big issues was that they were designing things with a multi-chip design in mind. The VSA-100 was designed with the whole concept of "whack many chips on a board, do many polygons." It worked, mostly, but the issue was cost. nVidia and ATi were producing chips that were very fast and cost less as they were only single chip. In theory 3dfx could have thrown even MORE VSA's on a board and made something faster, it was supposed to scale to like 32-chip solutions, but in practice their shit was already too expensive to compete.

While chiplets aren't all the same issue as multi-chip solutions it is the same basic tradeoff: It is only worth it if it gets you more for less. If it ends up costing more, then it isn't worth it, just do a monolithic design.
 
I mean, it shouldn't be a huge surprise given the issues with multiple GPUs. Sure that is going to be slower in some respects since you are connecting over an external bus which is going to be hard to make as fast as something on the same package (or even board)... BUT it is still the same fundamental problem of trying to sync up multiple rendering units that has never worked particularly well. Also on a single board you get the added "bonus" of the chiplets competing for memory bandwidth, whereas old school multi-GPU they all got their own memory to play with.

It is a difficult problem to solve and as you note: cost is a factor. The reason to want chiplets is to reduce cost, but if getting a controller/interconnect that can make good use of them ends up driving the price up more than just one big chip, well then why do it?

I mean shit, this goes all the way back to 3dfx and their failure: One of their big issues was that they were designing things with a multi-chip design in mind. The VSA-100 was designed with the whole concept of "whack many chips on a board, do many polygons." It worked, mostly, but the issue was cost. nVidia and ATi were producing chips that were very fast and cost less as they were only single chip. In theory 3dfx could have thrown even MORE VSA's on a board and made something faster, it was supposed to scale to like 32-chip solutions, but in practice their shit was already too expensive to compete.

While chiplets aren't all the same issue as multi-chip solutions it is the same basic tradeoff: It is only worth it if it gets you more for less. If it ends up costing more, then it isn't worth it, just do a monolithic design.
AI on the other hand? Where you have companies paying top dollar for that extra 10% to be faster than their competitors is another matter, an extra $1000 in costs per board is pretty tiny when you are charging $60,000 for the board, and tagging on a support contract is ultimately nothing if it means you've locked them in for another 3-5 years of using your product. There the expensive chips needed to make a multi-chip or multi-chiplet product work actually do make sense, but in a consumer product, that is a ways off.
 
AI on the other hand? Where you have companies paying top dollar for that extra 10% to be faster than their competitors is another matter, an extra $1000 in costs per board is pretty tiny when you are charging $60,000 for the board, and tagging on a support contract is ultimately nothing if it means you've locked them in for another 3-5 years of using your product. There the expensive chips needed to make a multi-chip or multi-chiplet product work actually do make sense, but in a consumer product, that is a ways off.
Also, ML may be less of a bitch with the multi-chip problems because it is less time sensitive. There are lots of things that scale not just to multiple chiplets or chips but to multiple nodes really well. The problem with realtime graphics is that "realtime" bit. Want 120fps? Ok, that's 8.3ms max to deal with everything, including sync between your two chips, and given that you are probably spending 8.29ms of that on rendering, there's not a lot of room for "oops, this didn't get us the data in time," which leads to janky frame pacing at a minimum. Having to deliver the data in real time, at regular, very short intervals, is just a bitch when you try to sync multiple units. However, if you have a workload that isn't so sensitive, then maybe it isn't such a problem.
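For what it's worth, the budget math above is easy to make concrete; the render and sync figures below are just the assumed numbers from that paragraph, not measurements:

```python
# Frame-time budget sketch; render/sync figures are illustrative assumptions.
target_fps = 120
frame_budget_ms = 1000 / target_fps     # ~8.33 ms per frame at 120 fps
render_ms = 8.29                        # assumed time spent actually rendering
sync_overhead_ms = 0.30                 # assumed chip-to-chip sync/coordination cost

slack_ms = frame_budget_ms - render_ms - sync_overhead_ms
print(f"budget {frame_budget_ms:.2f} ms, slack after sync: {slack_ms:+.2f} ms")
# Negative or near-zero slack means missed or uneven frames -> janky pacing.
```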
 
“Longer term, we see AI at the edge as a large growth opportunity that will drive increased demand for compute across a wide range of devices,” Su said.

Gaming revenue, on the other hand, declined sequentially by 32.6 percent and 47.5 percent year-over-year to $922 million, due to lower demand for PC GPUs. AMD’s CFO, Jean Hu, said the company doesn’t expect the situation to improve this year.

“Based on the visibility we have, the first half […] we guided down sequentially more than 30 percent. We actually think the second half will be lower than the first half,” she said.

https://www.crn.com/news/components...g-product-teases-new-ai-chips-later-this-year

I knew AMD GPU sales were down; I didn't know they were that far down.
 
The oldest names for ATi/AMD are PowerColor / Sapphire / XFX because they only make video cards!
 
I knew AMD GPU sales were down; I didn't know they were that far down.
They are down for NV and AMD; market share has remained stable globally. NV's reporting in its earnings has always been somewhat "creative" on the gaming desktop category.

Still seeing huge purchasing impacts from the Coof buying spree. Pricing impacts, and buyers waiting for next-gen as well.

As an aside, Su talking about gaming in 2H'24, I do not think we will see RDNA4 till Q1'25.
 
They are down for NV and AMD; market share has remained stable globally. NV's reporting in its earnings has always been somewhat "creative" on the gaming desktop category.

Still seeing huge purchasing impacts from the Coof buying spree. Pricing impacts, and buyers waiting for next-gen as well.

As an aside, Su talking about gaming in 2H'24, I do not think we will see RDNA4 till Q1'25.
That sucks. I was hoping to be underwhelmed by AMD riding Nvidia's financial wake with a Minimum-viable-product a bit sooner than that.

Despite my gripes, I like having options... even if nobody really chooses the distant second option...
 
That sucks. I was hoping to be underwhelmed by AMD riding Nvidia's financial wake with a Minimum-viable-product a bit sooner than that.

Despite my gripes, I like having options... even if nobody really chooses the distant second option...
If Nvidia does launch the 5090 and 5080 in 2024 Q4 as rumored, I doubt AMD wants to follow that up in the same news cycle with their 8700 and 8600, as AMD already said they are skipping the high end this gen. It's just bad optics.
 
If Nvidia does launch the 5090 and 5080 in 2024 Q4 as rumored, I doubt AMD wants to follow that up in the same news cycle with their 8700 and 8600, as AMD already said they are skipping the high end this gen. It's just bad optics.
AMD could launch before. I believe the hardware is already done
 
If Nvidia does launch the 5090 and 5080 in 2024 Q4 as rumored, I doubt AMD wants to follow that up in the same news cycle with their 8700 and 8600, as AMD already said they are skipping the high end this gen. It's just bad optics.
They'll probably release something that performs like the 5060 Ti and call it the RXX 8999.99 XZTXXX RAGE MAXIMUM BLACK Edition and price it to compete with the 5080.
 
They'll probably release something that performs like the 5060 Ti and call it the RXX 8999.99 XZTXXX RAGE MAXIMUM BLACK Edition and price it to compete with the 5080.

That's strange because the card I bought from AMD is called "RX 7900XTX". I think you are confusing AIB "duplex whammy phase shifter" card names with the actual AMD product names.
 
Long time in the making. Hence why the overpriced Gaming X Trio Classic was a dud; nobody wanted it unless it was given away. AMD is putting more emphasis on their premier partners such as ASRock, PowerColor, XFX and Sapphire. Gigabyte and Asus will follow the same path as MSI. Notice there's no Strix model for AMD anymore? No more AERO or Aorus AMD GPUs either. It will happen in due time.
Gigabyte still has the Windforce (they call it the 'Elite'?) and Gaming series with AMD. Although, yes, AMD GPUs have probably been falling in quality and tech lately - the latest gen looks like a failure to me. With the 7900 XTX, all I come across - constantly - is something called 'thermal paste pump-out' on practically every card, and that includes every brand, with XFX and maybe ASRock the most prevalent. It appears the design is flawed, or these companies used cheap paste, and even after re-pastes it happens again. Ultimately, people are replacing the paste with thermal pads - graphene thermal pads, for example - or thermally conductive phase change material (PCM) to solve repetitive/ongoing temperature problems. The fact you have to take a relatively brand new GPU apart to fix a defect or bad design speaks volumes about AMD's latest GPU generation, and I'm only going to briefly mention the previous vapor chamber debacle. Supposedly, that (mostly with the reference models) was solved a while ago?
 
The oldest names for ATi/AMD are PowerColor / Sapphire / XFX because they only make video cards!
Power Color and Sapphire are definitely the oldest players, XFX wasn't making ATI cards until ~2008.

It's a bit disingenuous though to say those companies only make video cards. Sapphire currently also makes AIO CPU coolers and embedded systems (fully assembled or just the motherboard) and they used to have a fair amount of consumer motherboards. Powercolor sells some miscellaneous accessories like keyboards and mousepads. XFX currently only sells GPUs I believe, but they used to make motherboards and power supplies.
 
AMD could launch before. I believe the hardware is already done
And launch a product before one of their keynote presentations??? I think not, it would leave them nothing to present. What would Lisa do without new graphs to point at?
 
That sucks. I was hoping to be underwhelmed by AMD riding Nvidia's financial wake with a Minimum-viable-product a bit sooner than that.

Despite my gripes, I like having options... even if nobody really chooses the distant second option...
Well, going by the production schedule there is a solid chance that Nvidia is only going to get the 5090 (or whatever they call that part) out this year and the rest of the stack will be coming in 2025. Even if AMD was putting out a successor to the 7900 XTX, would it hold a candle to the 5090??? I wouldn't want to be the marketing team trying to spin that matchup.
 
Well, going by the production schedule there is a solid chance that Nvidia is only going to get the 5090 (or whatever they call that part) out this year and the rest of the stack will be coming in 2025. Even if AMD was putting out a successor to the 7900 XTX, would it hold a candle to the 5090??? I wouldn't want to be the marketing team trying to spin that matchup.
5090 only this year in retail.

No high end from AMD. Two chips, with top model being around ~7900XT perf. Q1'25.
 
AMD could launch before. I believe the hardware is already done
They could, but there are still a lot of RX 7600 through 7800s kicking around on shelves, already quite a bit below MSRP. The AIBs would shit a brick if AMD did that while ample stock remained; they would insist that AMD pay them the difference, and nobody comes out happy in that situation, except us… and that's bad for business. AMD will wait until after they have launched their new-gen CPUs so there is "inspiration" out there to build new PCs.
 
5090 only this year in retail.

No high end from AMD. Two chips, with top model being around ~7900XT perf. Q1'25.
Which is not bad at all. Something like a Ryzen 9600X paired with whatever they call that card would be a solid performer; 4K 60 shouldn't be an issue, upscaling will keep that doable for a good while, and I do believe AMD will have gotten a much better handle on ray tracing this time around. A system like that on a decent 4K TV or any mainstream monitor and you're a happy gamer.

Personally I am looking forward to the upcoming APUs; I've been wanting to build a Plex box for a while and I think one would do nicely.
 
5090 only this year in retail.

No high end from AMD. Two chips, with top model being around ~7900XT perf. Q1'25.
That'll be an interesting turn of events. For a while now, both AMD and Nvidia have released high-end cards first, and then later mid to low end. Nvidia only releasing their RTX 5090 while AMD releases their mid-range products would be quite a change. Especially with Intel's Battlemage coming out later this year or in 2025, this could leave AMD all alone in the mid-range market.
 
For a while now, both AMD and Nvidia have released high-end cards first, and then later mid to low end. Nvidia only releasing their RTX 5090 while AMD releases their mid-range products
Seems really close to, if not exactly like, the 5700 XT generation, which was followed quickly by the generation that had the 6800 XT and 6900 XT in it.

If the 5090 launches in 2024, the rest of the stack should be close to Q1 2025, and it is nearly impossible in today's GPU world to ever be alone in the mid-range market; the generational jumps are never big enough to make the previous generation weaker than the new one's mid-range, and they can always do some 4070 Super Ti Super relaunch of the same old thing with more VRAM if needed.
 
That'll be an interesting turn of events. For a while now, both AMD and Nvidia have released high-end cards first, and then later mid to low end. Nvidia only releasing their RTX 5090 while AMD releases their mid-range products would be quite a change. Especially with Intel's Battlemage coming out later this year or in 2025, this could leave AMD all alone in the mid-range market.
As fun as that sounds, that would be putting AMD's latest and greatest up against Nvidia's 4080 and 4070 lineup. For Nvidia those are done and paid for R&D-wise, so Nvidia could afford to get very competitive on price there to keep consumers complacent. Hell, give them a rebranded 4085 and 4075 series with a change in the HDMI and DisplayPorts to keep it interesting. But AMD will be far from alone in the mid range.
 
As fun as that sounds, that would be putting AMD's latest and greatest up against Nvidia's 4080 and 4070 lineup. For Nvidia those are done and paid for R&D-wise, so Nvidia could afford to get very competitive on price there to keep consumers complacent. Hell, give them a rebranded 4085 and 4075 series with a change in the HDMI and DisplayPorts to keep it interesting. But AMD will be far from alone in the mid range.
I'm ready for a price war. AMD certainly needs to price lower as AMD has confirmed that "Radeon GPU sales have nosedived". I can see why MSI left AMD.
https://www.pcgamesn.com/amd/radeon-gpu-sales-nosedived

[Attached image: amd-radeon-gpu-sales-nosedived-graph.jpg]
 
I'm ready for a price war. AMD certainly needs to price lower as AMD has confirmed that "Radeon GPU sales have nosedived". I can see why MSI left AMD.
https://www.pcgamesn.com/amd/radeon-gpu-sales-nosedived

I love that in concept, but AMD can go down 15, maybe 20%, and they will be selling these for break-even at best. Nvidia could happily match; they have a vastly better margin. But how long could AMD continue that game? Two generations, maybe, before the investors were calling for blood. And for all that effort, what you would likely see is AMD cards out of stock everywhere except from scalpers, while Nvidia leaves their MSRP alone and lets the market correct itself, and Lisa starts eyeballing a golden parachute.
For a price war to happen you need a competitor who can challenge their entrenched competition in a battle of attrition, and AMD sadly does not have that capacity.

So instead we will have to settle for AMD shifting focus: they abandon the high end and focus on a solid mid-range where they can work to optimize their silicon and margins in a market that will gobble the cards up. AMD is not fighting Nvidia right now; AMD is fighting Intel.
Intel has the cash flow and fabrication facilities to take those losses against AMD and Nvidia, and losses they are (no way is Intel even close to breaking even on the GPU front), but they are gaining market share, improving their drivers, and getting those game optimizations in place.
AMD is losing key ground to Intel on the budget end of the spectrum, and AMD is tightening up to address that fight.
 
I love that in concept, but AMD can go down 15, maybe 20%, and they will be selling these for break-even at best. Nvidia could happily match; they have a vastly better margin. But how long could AMD continue that game? Two generations, maybe, before the investors were calling for blood. And for all that effort, what you would likely see is AMD cards out of stock everywhere except from scalpers, while Nvidia leaves their MSRP alone and lets the market correct itself, and Lisa starts eyeballing a golden parachute.
For a price war to happen you need a competitor who can challenge their entrenched competition in a battle of attrition, and AMD sadly does not have that capacity.
Nobody truly knows how much it costs AMD to make these cards, but it can't be as much as they're asking for. The problem is that AMD and Nvidia are asking a lot more than what consumers are willing to pay. For several years it was the GTX 1060 that dominated Steam, and now it's the RTX 3060. Not the RTX 4060, but the 3060. AMD barely makes a scratch on Steam. Besides "AMD Radeon Graphics," which is probably just integrated graphics, the RX 580 is AMD's most popular GPU. Anything above $300 doesn't sell very well.
So instead we will have to settle for AMD shifting focus: they abandon the high end and focus on a solid mid-range where they can work to optimize their silicon and margins in a market that will gobble the cards up. AMD is not fighting Nvidia right now; AMD is fighting Intel.
AMD needs to fight Nvidia because right now Intel has even less market share compared to AMD.
Intel has the cash flow and fabrication facilities to take those losses against AMD and Nvidia, and losses they are (no way is Intel even close to breaking even on the GPU front), but they are gaining market share, improving their drivers, and getting those game optimizations in place.
AMD is losing key ground to Intel on the budget end of the spectrum, and AMD is tightening up to address that fight.
It's just the RTX 3060 12GB that's killing AMD and Intel right now. I don't think Intel is selling enough GPUs to make it profitable. Intel is learning and adjusting as they go along, hoping to win consumers. For $280, would anyone rather buy an RX 7600 8GB or an A770? I'm sure Nvidia isn't happy that the RTX 3060 is cannibalizing their 4060 sales, but it is what it is. Everyone is trying to entice consumers to spend $40 or $100 more for the much faster product, but people are spending money on overpriced rent and food and can't afford it.
 
Steam, and now it's the RTX 3060. Not the RTX 4060, but the 3060.
That's not sales, but the installed base; the 3060 is not the type of card to be thrown away (into an unused / non-gaming machine, or outright), so they do not get removed from the field. Why would their share drop below the 4060's even if the 4060 outsold them, say, 2:1 in 2024?

To look at what is selling in 2024, the delta from the last survey can give a better clue:

DEC/APR
NVIDIA GeForce RTX 3060: 5.29% / 5.71%
NVIDIA GeForce RTX 4060: 1.07% / 2.41%

To not lose share you need to sell a lot, I imagine, but the 4060 sold more, and I am not sure if it is the price tag (the best-selling 3060 on Amazon is $290, when a 4060 can be found at $300-$310); it could be the 12GB of VRAM. Either way, that would make Nvidia and the AIBs laugh all the way to the bank, selling a 4-year-old GPU close to its original MSRP...
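A toy model of the installed-base point above, with made-up counts just to show why the survey share of an older card can stay above a newer one even when the newer one outsells it 2:1 in a given year:

```python
# Illustrative installed-base model; every count below is an assumption.
installed = {"RTX 3060": 5_000_000, "RTX 4060": 1_000_000}  # assumed cards already in the field
other_cards = 90_000_000                                    # assumed rest of the surveyed population
sales_2024 = {"RTX 3060": 500_000, "RTX 4060": 1_000_000}   # 4060 outsells the 3060 2:1 this year

for card, sold in sales_2024.items():
    installed[card] += sold     # survey share reflects the installed base, not yearly sales

total = sum(installed.values()) + other_cards   # other_cards held fixed for simplicity
for card, count in installed.items():
    print(f"{card}: {count / total:.2%} of the surveyed base")
```

With those assumed numbers the 3060 still shows roughly a 5-6% share against the 4060's ~2%, which is the same kind of picture as the DEC/APR movement quoted above.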

I'm sure Nvidia isn't happy that the RTX 3060 is cannibalizing their 4060 sales, but it is what it is
If the rumours about how much 3060 stock there is are true, I am not sure why they would care which one in particular gets sold; even if the 4060 is, yes, much cheaper to make, the 3060s already exist, and the laptop version of the AD107 could be doing better relative to its Ampere counterpart.

If it were because resellers/AIBs were selling the 3060 at a cheap price, that would be something, but it's at basically the same price as the new one... and they are still able to sell the old stock, which seems like pure good news in that regard.
 
Nobody truly knows how much it costs AMD to make these cards, but it can't be as much as they're asking for. The problem is that AMD and Nvidia are asking a lot more than what consumers are willing to pay. For several years it was the GTX 1060 that dominated Steam, and now it's the RTX 3060. Not the RTX 4060, but the 3060. AMD barely makes a scratch on Steam. Besides "AMD Radeon Graphics," which is probably just integrated graphics, the RX 580 is AMD's most popular GPU. Anything above $300 doesn't sell very well.
Well, these are their operating margins according to AMD.
[Attached chart: AMD segment operating margins]

So unless they are lying on their financials they don't have a lot of wiggle room.
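Rough arithmetic behind the "not a lot of wiggle room" point: a sketch assuming a ~16% margin on a $500 card purely for illustration (these are not AMD's actual per-card numbers):

```python
# Price-cut headroom sketch; margin and price are illustrative assumptions.
operating_margin = 0.16                  # assumed margin on the card
price = 500.0                            # assumed card price in USD
cost = price * (1 - operating_margin)    # implied cost per card

for cut in (0.10, 0.15, 0.20):
    new_price = price * (1 - cut)
    print(f"{cut:.0%} price cut -> margin {(new_price - cost) / new_price:+.1%}")
```

On those assumptions a 15-20% cut is already at or below break-even, which is essentially the point made a few posts up about how short a price war could run.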
 