RDNA — Bulldozer or Zen?

Marees

What the future holds for RDNA​

If we take stock of what AMD has achieved with RDNA in four years, and appraise the overall success of the changes, the end result will fall somewhere between Bulldozer and Zen. The former was initially a near-catastrophic disaster for the company but redeemed itself over the years by being cheap to make. Zen, on the other hand, has been outstanding from the start and forced a seismic upheaval of the entire CPU market.

Although it's too hard to judge precisely how good it's been for AMD, RDNA and its two revisions are clearly neither of those. Its market share in the discrete GPU sector has fluctuated a little during this time, sometimes gaining ground on Nvidia, and losing at other times, but generally, it has remained the same.


The Gaming division has made a small but steady profit since its inception, and although margins seem to be declining at the moment, there's no sign of impending doom. In fact, in terms of margins only, it's AMD's second-best sector!

And even if it wasn't, AMD makes more than enough cash from its embedded section (thanks to the purchase of Xilinx) to stave off any short periods of overall loss.


But where does AMD go from here?


AMD seems to have all of the engineering skills and know-how they need to stay on the current course of minor architectural updates, continue to accrue small margins, and hold a narrow slice of the entire GPU market.


https://www.techspot.com/article/2741-four-years-of-amd-rdna/
 
IMO, the one thing AMD first needs to do to retain their market share is switch their release cycle.

Nvidia goes top-down: xx90 first, then xx60/xx50 last.

AMD should go in reverse to hold on to their sales: x600 first, then x700, and finally x800, x900, etc.
 

What the future holds for RDNA​

If we take stock of what AMD has achieved with RDNA in four years, and appraise the overall success of the changes, the end result will fall somewhere between Bulldozer and Zen. The former was initially a near-catastrophic disaster for the company but redeemed itself over the years by being cheap to make. Zen, on the other hand, has been outstanding from the start and forced a seismic upheaval of the entire CPU market.

Although it's too hard to judge precisely how good it's been for AMD, RDNA and its two revisions are clearly neither of those. Its market share in the discrete GPU sector has fluctuated a little during this time, sometimes gaining ground on Nvidia, and losing at other times, but generally, it has remained the same.


The Gaming division has made a small but steady profit since its inception, and although margins seem to be declining at the moment, there's no sign of impending doom. In fact, in terms of margins only, it's AMD's second-best sector!

And even if it wasn't, AMD makes more than enough cash from its embedded section (thanks to the purchase of Xilinx) to stave off any short periods of overall loss.


But where does AMD go from here?


AMD seems to have all of the engineering skills and know-how they need to stay on the current course of minor architectural updates, continue to accrue small margins, and hold a narrow slice of the entire GPU market.


https://www.techspot.com/article/2741-four-years-of-amd-rdna/
IMO, Ryzen wasn't outstanding from the start. They certainly brought the core advantage, which got them wins in multicore, but the lightly threaded performance of Zen and Zen+ didn't really compete. Zen 2 was a much better step in that direction, and the even greater core counts helped overshadow that, as Intel couldn't answer the 3950X's multicore performance until 12th gen! They were also nearly a year ahead of the market with PCIe 4.0 on Zen 2.

Zen 2 refresh was a dud.

Zen 3 finally got them essentially on par with Intel for lightly threaded work, and further increased their lead in multicore, as 10th gen's 10 cores couldn't compete with a Zen 3 12-core, let alone the 16-core. And 11th gen regressed and only offered 8 cores.

etc.


As for RDNA, it's weird how that has turned out. RDNA 1 was a solid reboot for AMD GPUs.

RDNA 2 was great. They really brought it and were able to match Nvidia on raster performance, and most RDNA 2 GPUs are more power efficient for that performance as well.

I guess they had real issues with RDNA 3. But to hear that problems will persist until at least RDNA 5... dang. I wonder what's keeping them from skipping ahead to RDNA 5. Unless that isn't sure to work yet either...
 
RDNA 2 was great. They really brought it and were able to match Nvidia on raster performance, and most RDNA 2 GPUs are more power efficient for that performance as well.

I guess they had real issues with RDNA 3. But to hear that problems will persist until at least RDNA 5... dang. I wonder what's keeping them from skipping ahead to RDNA 5. Unless that isn't sure to work yet either...
RDNA 2 was great, but it only really looked good against Nvidia because Nvidia was on the far worse Samsung node. The fact that Nvidia was at all competitive is really a testament to their architecture. Now that Nvidia is back on a good node, we can see just how far ahead of RDNA they really are.

Everything else you said is spot on, though, and I agree.
 
I wonder what's keeping them from skipping ahead to RDNA 5. Unless that isn't sure to work yet either...
AMD's RDNA 4 and 5 designs seem to be complicated.

The RDNA 4 cards planned for release could just be RDNA 3.5 on monolithic 4nm,

while RDNA 5 could be true next gen (chiplets) on 3nm.
 
Things always get complicated when you change to a different design; oftentimes you run into issues you did not expect. AMD has great engineers who will get it figured out; it's just a matter of how long that takes. Not everything works perfectly, and all manufacturers run into an issue sooner or later. I am more concerned about prices these days than about how much faster the cards are.
 
But where does AMD go from here?
AMD needs to do a few things. First, they need to be one step ahead of Nvidia when it comes to graphics tech; they were behind on ray tracing, upscaling, and AI. The second thing is putting their latest GPU tech in their APUs: they recently started shipping RDNA 3 there, but they need to keep up with their own GPU tech, since for a very long time they shipped Vega graphics. The third thing is not to price their products based on Nvidia's pricing. They need to price them based on what it costs them to manufacture plus a decent profit. They won't gain market share if they keep being the shadow of Nvidia, especially now that Intel is actually trying to make affordable GPUs.
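As a rough sketch of the difference between those two pricing approaches, here's a toy comparison in Python. Every number in it (BOM cost, target margin, competitor MSRP) is invented purely for illustration; real build costs aren't public.

```python
# Toy comparison of the two pricing strategies described above.
# All numbers are hypothetical -- real BOM costs are not public.

def cost_plus_price(bom_cost: float, target_margin: float) -> float:
    """Price derived from manufacturing cost plus a target gross margin."""
    return bom_cost * (1 + target_margin)

def shadow_price(competitor_msrp: float, discount: float) -> float:
    """Price anchored to a competitor's card, minus a small discount."""
    return competitor_msrp * (1 - discount)

bom = 320.0          # invented build cost of a midrange card, USD
nvidia_msrp = 599.0  # invented competitor MSRP, USD

print(f"cost-plus (40% margin): ${cost_plus_price(bom, 0.40):.0f}")      # $448
print(f"shadow pricing (-10%):  ${shadow_price(nvidia_msrp, 0.10):.0f}") # $539
```

Under these made-up numbers, the cost-plus card undercuts the shadow-priced one by about $90 while still clearing a healthy margin, which is the gap the post is pointing at.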
 
IMO, Ryzen wasn't outstanding from the start. They certainly brought the core advantage, which got them wins in multicore, but the lightly threaded performance of Zen and Zen+ didn't really compete. Zen 2 was a much better step in that direction, and the even greater core counts helped overshadow that, as Intel couldn't answer the 3950X's multicore performance until 12th gen! They were also nearly a year ahead of the market with PCIe 4.0 on Zen 2.

Zen 2 refresh was a dud.

Zen 3 finally got them essentially on par with Intel for lightly threaded work, and further increased their lead in multicore, as 10th gen's 10 cores couldn't compete with a Zen 3 12-core, let alone the 16-core. And 11th gen regressed and only offered 8 cores.

etc.


As for RDNA, it's weird how that has turned out. RDNA 1 was a solid reboot for AMD GPUs.

RDNA 2 was great. They really brought it and were able to match Nvidia on raster performance, and most RDNA 2 GPUs are more power efficient for that performance as well.

I guess they had real issues with RDNA 3. But to hear that problems will persist until at least RDNA 5... dang. I wonder what's keeping them from skipping ahead to RDNA 5. Unless that isn't sure to work yet either...

RDNA1 wasn't a reboot. It was a successor.
 
I think we need a whole gaming design reboot.

"The textures are too big."

Not a problem, with Nvidia they come as 2x2 pixel elements that are upscaled using AI to full glory. 8GB ftw!

"Too many elements to render in time."

Not a problem, with Nvidia, we just skip the unimportant ones, or draw some fake ones whenever we can.

"Performance drops significantly at 4K."

Not a problem: first we downscale, then we upscale, downscale, then upscale, insert some AI, and upscale again, bringing you the perfect image. It's better than the original. For example, if you play some crappy game like Half-Life 2, with Nvidia it becomes Half-Life 3, automatically, when viewed at 8K.
 
Actually, RDNA 3 is OK. It's not great because of a lack of support in development: the hardware is ready, but the API and development tools don't yet expose its full feature set. And this is bad.
Next season we'll have RDNA 4 (which should be called RDNA 3++, since it's reportedly an optimization pass on RDNA 3) without the big Navi 41, which probably won't be missed, since I don't think the development tools will be ready; they'll likely still be in beta, shaking out big bugs.
By RDNA 5 everything should be ready, and a big Navi 51 is to be expected, functional at launch.
 
IMO, the one thing AMD first needs to do to retain their market share is switch their release cycle.

Nvidia goes top-down: xx90 first, then xx60/xx50 last.

AMD should go in reverse to hold on to their sales: x600 first, then x700, and finally x800, x900, etc.
AMD doesn't really get to dictate their release schedule; it's dictated by their TSMC manufacturing windows, so a release happens a few months after TSMC produces a particular product.
Otherwise we get paper launches: either it's made during a production window where parts trickle to market a few pallets at a time or, worse, the launch happens just as the production cycle starts and we have no availability at all for a month or two after the supposed launch.

If AMD wants to shake up their launch windows, they need to move somewhere else or have someone else give up their slots. TSMC books its windows more than a year in advance, companies pay hundreds of millions if not billions for them, and TSMC gets really upset if you mess with its schedules.

And in regards to their launch order: start low and the news will run you into the ground. Nvidia launches with a 5090 and AMD puts out some 8600 XT; it would be like your dad's midlife crisis garage band following after... I'm too old to actually think of a relevant large modern headliner... I was going to write Kiss, but this isn't the '80s. Jesus, I need a nap.

Either way, they would get buried in the news and Nvidia would dominate, while every YouTube reviewer noted that at least AMD bothered to show up. And as each faster card launched, it would be greeted with some variation of "where was this X months ago?"
 
I wonder what's keeping them from skipping ahead to RDNA 5. Unless that isn't sure to work yet either...

GPUs and CPUs have a long design pipeline, often tied to process improvements that are expected to be ready when the design is ready to be fabricated.

This is why Intel CPUs were stagnant for so long when Intel 10nm didn't work like it was supposed to: a giant pipeline stall in development, because designs in progress assumed the fab would work, and then they had to go back and make new 14nm designs with no solid fab timelines, etc. What I'm saying is they can't do much to hurry up RDNA5, and dropping RDNA4 definitely won't help... so the question is, is it better to release RDNA4 or linger on RDNA3?
 
AMD needs to do a few things. First, they need to be one step ahead of Nvidia when it comes to graphics tech; they were behind on ray tracing, upscaling, and AI. The second thing is putting their latest GPU tech in their APUs: they recently started shipping RDNA 3 there, but they need to keep up with their own GPU tech, since for a very long time they shipped Vega graphics. The third thing is not to price their products based on Nvidia's pricing. They need to price them based on what it costs them to manufacture plus a decent profit. They won't gain market share if they keep being the shadow of Nvidia, especially now that Intel is actually trying to make affordable GPUs.
AMD launched new Vega-based APUs within this calendar year, so yeah…
 
Actually, RDNA 3 is OK. It's not great because of a lack of support in development: the hardware is ready, but the API and development tools don't yet expose its full feature set. And this is bad.
Next season we'll have RDNA 4 (which should be called RDNA 3++, since it's reportedly an optimization pass on RDNA 3) without the big Navi 41, which probably won't be missed, since I don't think the development tools will be ready; they'll likely still be in beta, shaking out big bugs.
By RDNA 5 everything should be ready, and a big Navi 51 is to be expected, functional at launch.
You realize that if you are correct, and the root of the issue is incomplete software and terrible documentation, that makes it worse, right?
 

What the future holds for RDNA​

If we take stock of what AMD has achieved with RDNA in four years, and appraise the overall success of the changes, the end result will fall somewhere between Bulldozer and Zen. The former was initially a near-catastrophic disaster for the company but redeemed itself over the years by being cheap to make. Zen, on the other hand, has been outstanding from the start and forced a seismic upheaval of the entire CPU market.

Although it's too hard to judge precisely how good it's been for AMD, RDNA and its two revisions are clearly neither of those. Its market share in the discrete GPU sector has fluctuated a little during this time, sometimes gaining ground on Nvidia, and losing at other times, but generally, it has remained the same.


The Gaming division has made a small but steady profit since its inception, and although margins seem to be declining at the moment, there's no sign of impending doom. In fact, in terms of margins only, it's AMD's second-best sector!

And even if it wasn't, AMD makes more than enough cash from its embedded section (thanks to the purchase of Xilinx) to stave off any short periods of overall loss.


But where does AMD go from here?


AMD seems to have all of the engineering skills and know-how they need to stay on the current course of minor architectural updates, continue to accrue small margins, and hold a narrow slice of the entire GPU market.


https://www.techspot.com/article/2741-four-years-of-amd-rdna/
There is some revisionist history there. Zen was not a hit out of the gate. What AMD did was under-promise and over-deliver: Zen was better than it claimed it would be, because AMD was conservative. The main change isn't that AMD delivered a better product, but that it delivered better than it said it would, rather than getting a hype train going with deceptive marketing statements. AMD historically overpromised and underdelivered on its CPUs, and this usually had a disastrous effect on public perception of them. Sure, you had AMD apologists, but the general industry would lambast AMD for its lies. This is what changed with Zen. It wasn't that the Ryzen series was that good; it's that it didn't disappoint, because AMD was honest from the start. Not only that, but it created a foundation for AMD to build on, something it claimed it could do with previous CPUs like Phenom and Bulldozer but never delivered on. This is most likely the result of a change in leadership. AMD used to change CEOs more often than people changed the oil in their cars.

Sure, Zen brought additional cores to the desktop, but Zen-based CPUs and their successors had a number of weaknesses. Primarily, Zen, Zen+, and Zen 2 were all relatively weak as gaming CPUs. The internal latency between CCX complexes and CCDs was a problem. You can even argue that a lot of these problems are still present today, though AMD has mitigated them over time with more and more L3 cache.

RDNA PR crap aside, the fact is AMD's GPU market share is well behind NVIDIA's, and it really only competes at the midrange, and even there only on the price/performance metric, as it has to shuffle offerings around frequently to compete with NVIDIA in that range. At the top, they don't really compete at all, having to slot their best GPU and its pricing somewhere around NVIDIA's second best, or right behind even that. AMD also lacks the brand recognition and reputation in this segment. While it's not really true anymore, AMD still has a reputation for bad drivers, and on the software side it does lag behind NVIDIA in a number of ways: FSR versus DLSS, etc.

I hear the argument about the mid-range being the bread-and-butter market for sales, and while this is true, it's the top of the stack that drives brand perception for the general public. People assume that if you are the fastest at the top, it trickles down to the rest of the product stack, whether that's true or not. When most people think of General Motors, they think of the GMC pickups or the Corvette, not the POS economy cars and shitty crossover SUVs that people actually buy. This is where AMD struggles, as it has never really been able to decisively take the performance crown from NVIDIA the way it has often done with Intel in recent years.

As far as finances go, never underestimate AMD's ability to operate while losing money. It's something they've proven adept at doing over the last three decades; it's truly shocking how it has managed to stay afloat while not making any money. AMD has gone so far as to cannibalize its profitable divisions to support its floundering CPU business in the past. It has also received bailout money from the German government, which wanted to support what is now the GlobalFoundries plant in Dresden.
 
That's interesting.

I would never have guessed embedded would be the top segment. I would have thought that was Enterprise/Data center, what with how much they have been prioritizing EPYC and the margins they can charge on them.

It's disappointing to see them taking a loss on client, but I guess that is the way the entire market is going. Non-gaming computers have been contracting for over a decade now. People just use their phones.

I don't know how they are happy with that, but maybe that is just me. I find my phone an amazingly convenient device to use when I am on the go, but if I want to collect my thoughts and do anything serious, or really read up on a topic, I am sitting down at my desktop. I see phones as only good for temporary use when you don't have any other option, because you are out of the house.
 
That's interesting.

I would never have guessed embedded would be the top segment. I would have thought that was Enterprise/Data center, what with how much they have been prioritizing EPYC and the margins they can charge on them.

It's disappointing to see them taking a loss on client, but I guess that is the way the entire market is going. Non-gaming computers have been contracting for over a decade now. People just use their phones.

I don't know how they are happy with that, but maybe that is just me. I find my phone an amazingly convenient device to use when I am on the go, but if I want to collect my thoughts and do anything serious, or really read up on a topic, I am sitting down at my desktop. I see phones as only good for temporary use when you don't have any other option, because you are out of the house.

I'm wondering if embedded includes the console market. As for data centers, AMD has very little footprint there compared to Intel.

That has more to do with brand recognition and the purchase practices of large organizations.
 
AMD has figured out tiling GPUs with the MI300. They can make any capacity of APU they want now.

They can also focus on upgrading just the GPU tiles while the GPU I/O chip stays the same and the card stays validated, so they can do much faster cycles.

But they've got to get their architecture competitive.

nVidia isn't built for us. Their chips are cut-down versions of designs for customers who are happy to put multiple kilowatts per machine in a rack of such machines. Like imagine the bitcoin shit, but nVidia actually focusing on mining performance.
 
nVidia isn't built for us. Their chips are cut-down versions of designs for customers who are happy to put multiple kilowatts per machine in a rack of such machines. Like imagine the bitcoin shit, but nVidia actually focusing on mining performance.
The current offerings are really quite good on features, performance, and efficiency. Price, however, is... not.

It will be interesting to see how they navigate the forces of the "boom du jour", which makes a ton of money while it lasts, and the ever-present "gamers will always want stuff", where they can't wring as much out of them. Not for lack of trying, of course.
 
The issue is that Zen hit the market when Intel was basically not innovating at all: the same 4 cores/8 threads with a 5% uplift each year.

Nvidia, on the other hand, keeps making REALLY good GPUs, and charges enough to show they know it.
 
Bulldozer was bad hardware and a bad experience; RDNA's hardware is fine, but the experience is mediocre at best.

AMD simply isn't aggressive in the discrete GPU market. They knock 5-10% off the price of an equivalent nVidia GPU, but that nVidia GPU is superior across the board: software, hardware, features. nVidia goes from strength to strength while AMD worries mostly about being slightly less expensive, but not enough less to justify the worse software and reduced feature set. It's really hard to catch up to a company that is as much in the driver's seat as nVidia, and AMD doesn't want to sell cards for much less than nVidia, so they never make much progress.

Never mind how little discrete gaming GPUs really mean to these two companies right now. I don't think either one is sweating it, or giving much of a shit about our whining. AMD has consoles and nVidia has AI.
 
There is some revisionist history there. Zen was not a hit out of the gate. What AMD did was under-promise and over-deliver: Zen was better than it claimed it would be, because AMD was conservative. The main change isn't that AMD delivered a better product, but that it delivered better than it said it would, rather than getting a hype train going with deceptive marketing statements. AMD historically overpromised and underdelivered on its CPUs, and this usually had a disastrous effect on public perception of them. Sure, you had AMD apologists, but the general industry would lambast AMD for its lies. This is what changed with Zen. It wasn't that the Ryzen series was that good; it's that it didn't disappoint, because AMD was honest from the start. Not only that, but it created a foundation for AMD to build on, something it claimed it could do with previous CPUs like Phenom and Bulldozer but never delivered on. This is most likely the result of a change in leadership. AMD used to change CEOs more often than people changed the oil in their cars.
When Ryzen was released, it offered basically Haswell levels of performance, which was behind Intel at the time. But this is also one of the reasons why Apple left Intel: Intel wasn't trying to push for better products. The benefits of Ryzen were that...

  1. It was cheap, including motherboards.
  2. You often got twice as many cores.
  3. SMT was normally included for free, unlike Intel's Hyper-Threading.
  4. You could install ECC RAM, provided the motherboard manufacturer supported it.
  5. If you bought their APUs, the graphics performance was far beyond that of Intel's graphics.

For what could be anywhere from a 3% to 5% performance difference in Intel's favor, you gained a lot more from buying Ryzen.
RDNA PR crap aside, the fact is AMD's GPU market share is well behind NVIDIA's, and it really only competes at the midrange, and even there only on the price/performance metric, as it has to shuffle offerings around frequently to compete with NVIDIA in that range. At the top, they don't really compete at all, having to slot their best GPU and its pricing somewhere around NVIDIA's second best, or right behind even that. AMD also lacks the brand recognition and reputation in this segment. While it's not really true anymore, AMD still has a reputation for bad drivers, and on the software side it does lag behind NVIDIA in a number of ways: FSR versus DLSS, etc.
As much as that sounds like it matters, it really doesn't. When AMD started to drop their prices, they sold more than Nvidia did this year. AMD wants to be recognized as a top-quality GPU brand, but nobody really cares about that. What people care about is the best price to performance, and AMD, like Nvidia, had priced themselves out of the competition. The only time you care about things like FSR vs DLSS is when you're paying more than $400 for a GPU.
I hear the argument about the mid-range being the bread-and-butter market for sales, and while this is true, it's the top of the stack that drives brand perception for the general public. People assume that if you are the fastest at the top, it trickles down to the rest of the product stack, whether that's true or not. When most people think of General Motors, they think of the GMC pickups or the Corvette, not the POS economy cars and shitty crossover SUVs that people actually buy. This is where AMD struggles, as it has never really been able to decisively take the performance crown from NVIDIA the way it has often done with Intel in recent years.
This is true. Just as a person with a Tahoe thinks they have a Corvette because it has a 5.3L iron version of the Corvette's LS engine, so too do RTX 4060 owners think they have an RTX 4090, even though their RTX 4060 gets beaten by the RTX 3060 12GB if a game needs enough VRAM.
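To put rough numbers on that VRAM point, here's a back-of-envelope sketch in Python. It assumes uncompressed RGBA8 textures; real games use block compression and smarter streaming, so treat these figures as an illustrative upper bound rather than measured data.

```python
# Back-of-envelope: memory for one uncompressed 4096x4096 RGBA8 texture,
# including its full mipmap chain (which adds roughly one third on top).
width = height = 4096
bytes_per_pixel = 4                       # RGBA8, uncompressed (assumption)
base = width * height * bytes_per_pixel   # 64 MiB
with_mips = base * 4 / 3                  # ~85 MiB with the mip chain

print(f"one 4K texture: {with_mips / 2**20:.0f} MiB")

# A scene streaming on the order of a hundred such textures (plus
# framebuffers and geometry) blows past 8 GB, and the card starts paging
# over PCIe -- at which point a slower GPU with more VRAM can pull ahead.
textures = 100  # invented scene budget, for illustration only
print(f"{textures} textures: {textures * with_mips / 2**30:.1f} GiB")
```

Under these assumptions, about a hundred such textures already total roughly 8.3 GiB, which is the mechanism behind the 8 GB vs 12 GB flip described above.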
As far as finances go, never underestimate AMD's ability to operate while losing money. It's something they've proven adept at doing over the last three decades; it's truly shocking how it has managed to stay afloat while not making any money. AMD has gone so far as to cannibalize its profitable divisions to support its floundering CPU business in the past. It has also received bailout money from the German government, which wanted to support what is now the GlobalFoundries plant in Dresden.
AMD just doesn't throw money around on projects like Nvidia often does. This is also why Nvidia had ray tracing first, along with DLSS and AI. Nvidia is dependent on AMD and Intel to keep itself in business: as much as people love to harp on ARM, nobody here uses an Nvidia GPU on an ARM chip, and no, the Nintendo Switch doesn't count. Nvidia's main source of business is running its GPUs in either AMD or Intel systems. This is why Nvidia often invests in things like ARM; if they could, they'd only sell their GPUs with their own ARM chips, but as is often the case, hardly anyone buys them. This is also why Nvidia technology like DLSS is Nvidia-only: that locked-in ecosystem is what keeps Nvidia from having to battle AMD on pricing. Nvidia is not selling you a GPU but a unique experience, and that is hard to put a price on, and Nvidia knows it. AMD open-sources FSR because AMD's core business isn't graphics but CPUs; the GPU is there to support the CPU market, much like Google created Android to support its search engine business, which indirectly supports its advertising business. AMD's lack of funding for graphics innovation is also why there's such a stark contrast with the ATI days, when graphics was the core business. There isn't a whole lot of innovation from AMD today, just a lot of catching up to Nvidia.
 
The issue is that Zen hit the market when Intel was basically not innovating at all: the same 4 cores/8 threads with a 5% uplift each year.

Nvidia, on the other hand, keeps making REALLY good GPUs, and charges enough to show they know it.
Intel had plenty of 6- and 8-core chips for many years, including during the vaunted Bulldozer days. They simply sold them as HEDT and priced them to match. Just because they were expensive doesn't mean they didn't have them. AMD simply moved high core counts down the price chain.

BTW, moar cores isn't a panacea for anything related to gaming. To this day, 8-core Intel chips game just as well as, if not better than, AMD 8-core chips.
 
GPUs and CPUs have a long design pipeline, often tied to process improvements that are expected to be ready when the design is ready to be fabricated.

This is why Intel CPUs were stagnant for so long when Intel 10nm didn't work like it was supposed to: a giant pipeline stall in development, because designs in progress assumed the fab would work, and then they had to go back and make new 14nm designs with no solid fab timelines, etc. What I'm saying is they can't do much to hurry up RDNA5, and dropping RDNA4 definitely won't help... so the question is, is it better to release RDNA4 or linger on RDNA3?
Yeah, the 10nm debacle forced Intel to scrap three core designs and scramble to reuse their existing designs for even more iterations; I remember one interview where they said it set them back something like seven years.
Supposedly Lunar Lake finally gets to make use of some of the designs they had planned for 10nm, which they couldn't use because the GAA tech they hoped to implement back then is only panning out now.
 
As an aside, I would like to add that I really appreciate that this has (so far) been a productive and even-keeled conversation as opposed to the usual fan boy drivel these threads devolve into. Refreshing and appreciated.
It's only a problem when it's beyond constructive criticism. When we start using feels to explain something, that's when the conversation devolves.
Intel had plenty of 6- and 8-core chips for many years, including during the vaunted Bulldozer days. They simply sold them as HEDT and priced them to match. Just because they were expensive doesn't mean they didn't have them. AMD simply moved high core counts down the price chain.
You mean the 6800K and 6900K chips? Yeah, they had them and then some. I have an 8-core Intel Xeon from the Sandy Bridge era, I think, which also has 16 threads. It's not that Intel didn't have them; it's the pricing. That Xeon has ECC RAM too. You know how much cheaper my Ryzen 1700 was in comparison? Less than $200 for the CPU, and so were my 2700X and my 5700X, all using the same motherboard.
BTW, moar cores isn't a panacea for anything related to gaming. To this day, 8-core Intel chips game just as well as, if not better than, AMD 8-core chips.
Today an 8-core is a good idea. Going back to 2017, most games were perfectly happy with a dual core, because that's what most people used until Ryzen was released. Quad-core i5s were for those with deep pockets, and if you wanted a K at the end, you paid more. The fancy people had i7s with their Hyper-Threading; even i3s had it, and sometimes the Pentiums. Now I can't tell what Intel's naming means anymore, since they're up to i9 with big numbers. Even back then the naming made no sense: an i5 in a desktop PC was expected to have 4 cores, but not in a laptop. Now with E-cores, it's an even bigger mess.
 
AMD could massively slash their GPU prices and go for market share... but where would they fab them?

The save-the-company, win-the-future moves are where you get 100x+ markups, and those customers are buying every kind of accelerator as fast as they can be shipped.

I feel like the next generation's top tier will be all shit-binned data center chips. Which I don't mind. But it would be odd if nVidia bothered to make a halo gaming product that isn't a cut-down AI chip.
 
AMD could massively slash their GPU prices and go for market share... but where would they fab them?

The save-the-company, win-the-future moves are where you get 100x+ markups, and those customers are buying every kind of accelerator as fast as they can be shipped.

I feel like the next generation's top tier will be all shit-binned data center chips. Which I don't mind. But it would be odd if nVidia bothered to make a halo gaming product that isn't a cut-down AI chip.
When you sell 100% of what you make, shareholders will not let you “slash” prices.
Market share for a race-to-the-bottom product is not something AMD cares about.
 
When you sell 100% of what you make, shareholders will not let you “slash” prices.
Market share for a race-to-the-bottom product is not something AMD cares about.
While this is true, that doesn't mean AMD shouldn't care about market share or cut prices. Also, it's not as if AMD was selling everything it made; they had to slash prices to clear their inventory this year. Nvidia is the only company here with the luxury of avoiding price cuts, because they have most of the market share. While AMD was comfortable being #2, there were only two GPU manufacturers in the PC market, and AMD was there to scoop up the sales Nvidia couldn't fulfill, like during the crypto boom and even now with the AI boom. The problem for AMD is that Intel has entered the market, and Intel does seem to care more about market share: Intel's BattleMage is rumored to be their first real flagship GPU. AMD could always end up as #3, which I imagine shareholders wouldn't be very pleased with. Being a publicly traded company is a handicap when you always need to worry about quarterly profits, but you really need to plan 2-3 years down the road and start doing things now that may not be good for the near term but can pay off down the road.
 
While this is true, that doesn't mean AMD shouldn't care about market share or cut prices. Also, it's not as if AMD was selling everything it made; they had to slash prices to clear their inventory this year. Nvidia is the only company here with the luxury of avoiding price cuts, because they have most of the market share. While AMD was comfortable being #2, there were only two GPU manufacturers in the PC market, and AMD was there to scoop up the sales Nvidia couldn't fulfill, like during the crypto boom and even now with the AI boom. The problem for AMD is that Intel has entered the market, and Intel does seem to care more about market share: Intel's BattleMage is rumored to be their first real flagship GPU. AMD could always end up as #3, which I imagine shareholders wouldn't be very pleased with. Being a publicly traded company is a handicap when you always need to worry about quarterly profits, but you really need to plan 2-3 years down the road and start doing things now that may not be good for the near term but can pay off down the road.
Problem is, they already sell everything they make; going after market share means they have to back off in another market, or somehow obtain more fab time. TSMC is at capacity, and AMD's designs don't really work well on other fabs (Exynos disasters, anybody?), so more fab time isn't in the cards either, unless they start making things on nodes larger than 7nm or design something for a Samsung or Intel node. Designing a bespoke product for a budget launch in an attempt to claw market share doesn't pan out; they would be throwing money away, and Lisa is not at all about throwing money away.
The mass 6000 series overstock was an outlier, a huge one, and one they will not repeat; the size of their MI300 series won't allow it, as they don't have the wafers to spare.
 
Problem is, they already sell everything they make; going after market share means they have to back off in another market, or somehow obtain more fab time. TSMC is at capacity, and AMD's designs don't really work well on other fabs (Exynos disasters, anybody?), so more fab time isn't in the cards either, unless they start making things on nodes larger than 7nm or design something for a Samsung or Intel node. Designing a bespoke product for a budget launch in an attempt to claw market share doesn't pan out; they would be throwing money away, and Lisa is not at all about throwing money away.
The mass 6000 series overstock was an outlier, a huge one, and one they will not repeat; the size of their MI300 series won't allow it, as they don't have the wafers to spare.
OK, but what about Intel? Their GPUs are made at TSMC as well. They're built on 6nm, but so is the RX 7600, and even the OLED Steam Deck's AMD chip uses 6nm. AMD is reported to be looking at Samsung's 4nm as well. AMD's Q3 was down by around 8%, while Nvidia seems to be up by 15%. They all use TSMC, except Nvidia is on 4nm, unlike AMD and Intel. I guess if you make less you always sell out, but they're also selling less. It's not as if Intel won't have access to its own manufacturing facilities to make its own GPUs in the future.
 
OK, but what about Intel? Their GPUs are made at TSMC as well. They're built on 6nm, but so is the RX 7600, and even the OLED Steam Deck's AMD chip uses 6nm. AMD is reported to be looking at Samsung's 4nm as well. AMD's Q3 was down by around 8%, while Nvidia seems to be up by 15%. They all use TSMC, except Nvidia is on 4nm, unlike AMD and Intel. I guess if you make less you always sell out, but they're also selling less. It's not as if Intel won't have access to its own manufacturing facilities to make its own GPUs in the future.
That was half my point: Intel is currently a terrible fab for a GPU; the way they structure their transistors is not good for it at volume. They can make them, but the failure rate is too high for Intel to support the volume they would need.
The AMD/Samsung 4nm LPP partnership will be for the low-power ultra-mobile stuff, like the 7x30 series of APUs that go in sub-$500 laptops: a market that requires cheap parts and has nonexistent margins, and one where Intel currently has the better-performing CPU and iGPU, because AMD once again put Vega and Zen 3 into play there.
So if the Samsung 4LPP parts come, those will be their market-penetration chips, but we're talking 780M or less in terms of performance.
The new 4LPP MOL does power delivery better than the TSMC equivalent, as they took some notes from Intel on backside power delivery. But again, that is for low-power stuff, 15W and below.
 
I honestly want AMD to focus more on APUs and SoCs that can be augmented via external memory and a discrete GPU. I see that as a growth market for them, and it plays to their strengths: the console-ification of the gaming PC, if you will.
The MI300A is a goddamned beast. Picture, if you will, a 600W gaming SoC that paired an 8-core X3D part and something akin to the 7900 XTX, bundled together with 64GB of unified HBM3e memory on a single piece of fabric. Give us some boards with the option for additional memory (as they do for these chips on the server side) and for multi-GPU synchronization (again, like they do on the server side), and that would be something interesting and innovative that plays to all of AMD's strengths.

Mini Rant:
My internal monologue constantly bitches about how much I hate that Nvidia has pushed the idea that 4K or 8K gaming is an actual thing. Yeah, something like 3% of PC gamers out there have a 4K display, so it obviously is a thing, but Jesus, even the 4090 can't do it raw unless you start looking at titles from 10+ years ago.
Enter all the upscaling, frame generation, and ray tracing shortcuts and shenanigans that they then have to invent to actually make it something you can do, and even then you've got to drop serious money to make it happen. But that's fine; it's their money, and if they can do it, more power to them.
The rub comes in with how much time, energy, and resources developers spend on making 4K look and work as well as they can, to the direct detriment of all the resolutions below. Nvidia has basically created a scenario where developers need to cater to 3% of the PC gaming population to ensure positive reviews, and need to implement Nvidia's fancy-ass shortcuts to make it doable, which keeps AMD and Intel on the back foot, scrambling on the software front, where software is 100% Nvidia's strongest point.
Nvidia created a shitty narrative that forces developers and competitors to march to their drum to appease the smallest fraction of gamers, which takes up the bulk of the game-review space, while crapping on everybody who isn't dropping $5,000 on a gaming rig. Then they run the narrative that for the "low" price of only $500 you can get a similar experience to all those $2,500 cards, "if you use our hardware and the developers use our software, so make your voice heard and buy our stuff and push for our software"... GAAA, it makes me... not mad... depressed???
 
At the minimum, I have been impressed with AMD's mid range gaming GPUs this generation. I think they are certainly getting there and offer a good value. A bit behind on the software front, but I think the hardware is very solid for the price.

I've typically used AMD CPUs, only ever used one Intel desktop CPU. So I am glad that Zen worked out well.
 
My internal monologue constantly bitches about how much I hate that Nvidia has pushed the idea that 4K or 8K gaming is an actual thing. Yeah, something like 3% of PC gamers out there have a 4K display, so it obviously is a thing, but Jesus, even the 4090 can't do it raw unless you start looking at titles from 10+ years ago.

There's so much compute thrown away on pixels we can't even see, and then so much time spent fighting to get things to work with all those pixels. You don't need display scaling to work if your pixels are a reasonable size, and you use less RAM, less GPU, and less CPU if you don't have to push 4x the pixels... But this is just how it is, I guess; you can't get a nice 1080p HDTV anymore either. You've got to get 4K if you want nice, and then everything has to work harder when you could just have chunky pixels and be happy.
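For reference, here's a quick Python sketch of the raw pixel counts behind that "4x the pixels" remark. Shading cost scales roughly (though not perfectly) with pixel count, so the ratios are only a first-order proxy for the extra GPU work.

```python
# Pixels per frame at common resolutions, relative to 1080p.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px:>10,} pixels ({px / base:.1f}x 1080p)")
# 4K is exactly 4.0x the pixels of 1080p; 8K is 16.0x.
```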
 