Details on Apple ARM agreement

Lakados

[H]F Junkie
Joined
Feb 3, 2014
Messages
10,733
https://www.tomshardware.com/news/a...-cents-per-chip-in-royalties-new-report-says#

Variations of this article are making the rounds, some more clickbait than others.

TLDR;
Apple pays less than $0.30 per ARM chip it ships, and the deal extends past 2040.
SoftBank tried a few times to renegotiate the deal but Apple said no and told them to take a hike.
Apple is putting more than the usual effort into RISC-V but is still a few years out from it being a viable alternative. But Apple has threatened to make the change should ARM push too hard on upping the fees.
 
It's funny that some people who posted in this forum make it seem like Apple and ARM are good friends working together. They are obviously not. If Apple were to dump ARM for RISC-V, especially considering they just migrated their computers over to ARM, it would be an even bigger mistake than the ARM move already was. As another reminder, they aren't doing so hot this year. I do not envy Apple users having to deal with the transition over to ARM, and possibly over to RISC-V, all because Apple doesn't want to pay more than $0.30 per device.
 
It's funny that some people who posted in this forum make it seem like Apple and ARM are good friends working together. They are obviously not. If Apple were to dump ARM for RISC-V, especially considering they just migrated their computers over to ARM, it would be an even bigger mistake than the ARM move already was. As another reminder, they aren't doing so hot this year. I do not envy Apple users having to deal with the transition over to ARM, and possibly over to RISC-V, all because Apple doesn't want to pay more than $0.30 per device.
SoftBank isn’t anybody’s friend. Apple and ARM were friends, but ARM is dead and gone and has been for years; it’s just a CPU division of SoftBank now. If you go back to the article Tom’s is reporting from, it seems SoftBank knew the Apple terms in advance but was confident it could convince Apple to sign a different one.

And really the ARM transition didn’t hurt Apple or really do anything. And you have to admit the Intel CPU options before Apple dumped them were kind of trash, 8th Gen Mobile was not a nice platform. Would have been great if Apple could have done AMD but AMD couldn’t feed Apple.
Apple forced Intel and AMD to step up their game, which they are doing; not terribly well, but they are doing it.
From a user perspective the only thing that changed was a huge performance increase, better battery life, and the laptops got thinner and cooler.
On the developer side, all we had to do for our internal apps was update our version of Xcode, check 2 additional boxes in settings, and recompile, then upload 3 versions to the repository instead of our normal 2.
One of the apps threw an issue, but that was a 2-hour fix; we had missed replacing a few deprecated functions that the new version didn’t like. But that would have happened with or without the ARM build, as it was an OS-related thing and Apple announced the deprecation at least a year before the ARM announcement was made.
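For anyone curious what those extra upload targets actually are: what Xcode spits out are Mach-O "universal" (fat) binaries with one slice per architecture. A toy Python sketch of reading the fat header, using the documented constants from Apple's <mach-o/fat.h> and a synthetic header rather than a real binary:

```python
import struct

# Mach-O universal ("fat") binaries start with FAT_MAGIC, then a count of
# per-architecture entries. Constants come from Apple's <mach-o/fat.h>.
FAT_MAGIC = 0xCAFEBABE
CPU_NAMES = {0x01000007: "x86_64", 0x0100000C: "arm64"}

def fat_slices(blob: bytes):
    """Return the architecture names inside a fat binary header."""
    magic, nfat = struct.unpack(">II", blob[:8])
    if magic != FAT_MAGIC:
        return []  # thin (single-arch) binary, or not Mach-O at all
    archs = []
    for i in range(nfat):
        # each fat_arch entry is 5 big-endian uint32s:
        # cputype, cpusubtype, offset, size, align
        cputype, = struct.unpack(">I", blob[8 + i * 20 : 12 + i * 20])
        archs.append(CPU_NAMES.get(cputype, hex(cputype)))
    return archs

# Synthetic two-slice header, like a universal app built for Intel + Apple silicon
header = struct.pack(">II", FAT_MAGIC, 2)
header += struct.pack(">5I", 0x01000007, 3, 0x1000, 0x4000, 12)  # x86_64 slice
header += struct.pack(">5I", 0x0100000C, 0, 0x8000, 0x4000, 14)  # arm64 slice
print(fat_slices(header))  # → ['x86_64', 'arm64']
```

The third "version" in our uploads was just this: the same app with both slices glued together, which is why the recompile was the whole job.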

You have to remember Apples are stupid machines; there are no settings or wizards that are user-configurable and forward-facing. You turn them on, enter a few usernames and passwords, and the most complicated step in their configuration is choosing between the Light and Dark themes.
The architecture is 100% hidden.
 
Last edited:
Not exactly surprising that Apple is paying less. Is there any other major player designing their own cores on an architecture license, rather than just licensing the cores outright from ARM?
 
It's funny that some people who posted in this forum make it seem like Apple and ARM are good friends working together. They are obviously not. If Apple were to dump ARM for RISC-V, especially considering they just migrated their computers over to ARM, it would be an even bigger mistake than the ARM move already was. As another reminder, they aren't doing so hot this year. I do not envy Apple users having to deal with the transition over to ARM, and possibly over to RISC-V, all because Apple doesn't want to pay more than $0.30 per device.

The transition to in-house ARM chips is great for Apple, and most of the Mac users I'm friends with are happy about the transition. They do have some annoying software issues from time to time, but that's part of the Apple "It Just Works (but if it doesn't, you're Just Screwed)" philosophy.

When Apple runs mainstream chips, it's too easy to compare with other laptops and turns their laptops into a commodity. When they run their own chips, they don't have to worry about perf comparisons to other laptops with the same chip, but better thermal designs. Intel and AMD face market pressure to release chips that clock to the moon, and Apple doesn't need to, so they can explore the design space differently. Integration has benefits here.
 
You have to remember Apples are stupid machines,
Users love them. And the UIs are the best in the business.

there are no settings or wizards that are user-configurable and forward-facing. You turn them on, enter a few usernames and passwords, and the most complicated step in their configuration is choosing between the Light and Dark themes.
I think this is why users love them.

The architecture is 100% hidden.
No hacks, or in UNIXese, .ini files?
 
Users love them. And the UIs are the best in the business.
I don’t know about best, but it’s certainly clean and functional.
I think this is why users love them.
It’s certainly a strong reason, if you are buying an expensive tool having to spend weeks tinkering and tweaking to get things working isn’t exactly desirable. Turn it on and get to work.
No hacks, or in UNIXese, .ini files?
They exist, but they aren’t user-facing; you have to dig and hunt, and you are going there because you’re on a mission, probably sent from some deity, likely an unholy one.
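For the deity-sent: those buried settings are mostly property-list files under ~/Library/Preferences, which is what `defaults read` walks. A small sketch with Python's stock plistlib; the two keys are just illustrative examples of a visible and a hidden preference:

```python
import io
import plistlib

# macOS preference domains are .plist files under ~/Library/Preferences;
# the GUI never surfaces most keys, but `defaults read` (or plistlib) can.
sample_prefs = {
    "AppleInterfaceStyle": "Dark",                # the one setting users do see
    "NSAutomaticWindowAnimationsEnabled": False,  # one they generally don't
}

# Write it out in the same binary plist format macOS uses on disk
buf = io.BytesIO()
plistlib.dump(sample_prefs, buf, fmt=plistlib.FMT_BINARY)

# Reading it back is the programmatic equivalent of digging through defaults
buf.seek(0)
restored = plistlib.load(buf)
print(restored["AppleInterfaceStyle"])  # → Dark
```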
 
"What goes around comes around"

"Ain't karma a bitch"

"Wherever you go, there you are"

In case you don't get what I'm sayin here, just suffice it to say that for many, many years, Apple has (successfully) only pursued deals that benefit THEM and their bank accounts....so the fact that they are using their business acumen and market position to pay as little as possible for ARM stuff should not surprise anyone.....

And also, they used RISC chips for several years in the early Macs and so did Atari, at least until Motorola's chip division went poopoo...so returning to them would really not be that big of a surprise, not to me anywayz. Granted the newer generation is way more advanced, but still :D
 
"What goes around comes around"

"Ain't karma a bitch"

"Wherever you go, there you are"

In case you don't get what I'm sayin here, just suffice it to say that for many, many years, Apple has (successfully) only pursued deals that benefit THEM and their bank accounts....so the fact that they are using their business acumen and market position to pay as little as possible for ARM stuff should not surprise anyone.....

And also, they used RISC chips for several years in the early Macs and so did Atari, at least until Motorola's chip division went poopoo...so returning to them would really not be that big of a surprise, not to me anywayz. Granted the newer generation is way more advanced, but still :D
Well, $0.30 a unit was a decent deal in 1991, maybe 1992, when they signed the 50-year deal for the Newton handheld devices.
Not Apple’s fault inflation kicked everybody in the teeth far harder than anybody thought possible, or that some 15 years later they would find a way to make the Newton into a phone.

RISC falls very much into Apple’s top-down wheelhouse. RISC can be used for custom memory and storage controllers; it works as a GPU, CPU, NPU, and AI accelerator; and there are some cool things people are doing with it as an analog processor, as that’s making a comeback.

It would let Apple go all Apple, top to bottom, should they want to.
 
Last edited:
This will probably work out as well as Apple going from Power to x86.

The one thing Apple is really good at is making stupid architecture changes.
 
SoftBank isn’t anybody’s friend. Apple and ARM were friends, but ARM is dead and gone and has been for years; it’s just a CPU division of SoftBank now. If you go back to the article Tom’s is reporting from, it seems SoftBank knew the Apple terms in advance but was confident it could convince Apple to sign a different one.
SoftBank is ARM, and until someone else owns ARM, this will continue.
And really the ARM transition didn’t hurt Apple or really do anything.
What about the 34% decline year over year? It's certainly not due to the iPhone, because that's said to be doing better.
And you have to admit the Intel CPU options before Apple dumped them were kind of trash, 8th Gen Mobile was not a nice platform. Would have been great if Apple could have done AMD but AMD couldn’t feed Apple.
I'm not sure what was wrong with Intel CPUs at the time, other than battery life, which couldn't have been doing well. I do know that the old Intels can still be good for productivity. Admittedly the render time on the M1 and M3 is drastically faster, even though that's probably due to the Media Engine. Older Intel-based Macs could use multiple monitors while the base model M series can't. Thunderbolt on the M series isn't as compatible as on the older Intel ones.

View: https://youtu.be/dn_iTP4v8UE?si=uSyTAHVqekT8Zzb0
Apple forced Intel and AMD to step up their game, which they are doing; not terribly well, but they are doing it.
More so with Intel than AMD. Intel adopted the big little core design that Apple is using for their M series chips, as well as hardware video encoding. Intel was already working on their own GPU, but that's probably due to Nvidia taking their server market share away. AMD hasn't done anything different other than work on power efficiency, but it was due to Apple.
From a user perspective the only thing that changed was a huge performance increase, better battery life, and the laptops got thinner and cooler.
On the flip side you now get 50% of the performance on all older x86 applications, and no compatibility with 32-bit applications. You lost Boot Camp, which means your ability to run Windows applications at full speed and compatibility is gone. While better battery life was a thing back in 2021, the newer AMD Ryzen Dragon Range chips, I believe, have caught up. Not sure where you get performance, because no M series chip has been proven faster unless you run Geekbench or video encoding.
On the developer side, all we had to do for our internal apps was update our version of Xcode, check 2 additional boxes in settings, and recompile, then upload 3 versions to the repository instead of our normal 2.
Just like it's that simple to add DLSS support to games?
The architecture is 100% hidden.
So is the engine in a car, but people forget to put oil in them and throw a rod.
The transition to in-house ARM chips is great for Apple, and most of the Mac users I'm friends with are happy about the transition. They do have some annoying software issues from time to time, but that's part of the Apple "It Just Works (but if it doesn't, you're Just Screwed)" philosophy.
As a Linux user I have annoying software issues from time to time, but obviously nobody wants to deal with it. The difference is that I paid nothing for my software problems, while that Macbook isn't cheap. Linux users have better choices when it comes to trying to get Windows software working.
When Apple runs mainstream chips, it's too easy to compare with other laptops and turns their laptops into a commodity. When they run their own chips, they don't have to worry about perf comparisons to other laptops with the same chip, but better thermal designs. Intel and AMD face market pressure to release chips that clock to the moon, and Apple doesn't need to, so they can explore the design space differently. Integration has benefits here.
This is why PowerPC eventually fell out of Apple's grace; ARM isn't exactly Apple's first foray into a RISC-based CPU. At some point Apple couldn't convince people with Photoshop benchmarks that their G4s and G5s were superior to Intel. Even though at the time it was AMD who was faster with their Athlon XPs and 64s, Apple has never considered AMD a serious option. Eventually the constant competition between AMD and Intel pushed Intel to create the superior chip with their Core 2 Duo designs. With IBM falling behind its goals to keep up, Apple made the correct choice and went with Intel.

The parallels between back then and today are extremely similar, but with Intel being the one falling behind AMD. Instead of Photoshop, we now have video editing as the key workload Apple uses to showcase performance. The reason Intel fell behind was that AMD's Bulldozer wasn't a competitor to Sandy Bridge. Without any real competition, most CPU architectures fade out of existence. Anyone remember MIPS? They were big in the 90s and even into the early 2000s, until hungrier ARM and PowerPC displaced them. The true dream for any CPU manufacturer is to lock customers into their architecture so there's no competition. Who in their right mind would want to retool all their software for another CPU architecture?
Well .30 a unit was a decent deal in 1991, maybe 1992? when they signed the 50 year deal for the Newton handheld devices.
Not Apples fault Inflation kicked everybody in the teeth far beyond anybody thought possible or that some 15 years later they would find a way to make the Newton into a phone.
At the time, ARM wasn't exactly in a good financial situation. ARM taking the 30¢ deal was their way of keeping the company afloat. 15 years' worth of inflation isn't doing ARM any favors. If Apple doesn't pay them more, you'll see stagnation in future ARM designs. This may work in Apple's favor, since they can add their own instructions, but that would also make software development on future Apple products problematic.


View: https://youtu.be/nIwdhPOVOUk?si=PlI77_-_QlMrlaiv
 
Does any of your understanding of technology come from sources other than youtubers doing ahegao faces on their title slides?
 
"What goes around comes around"

"Ain't karma a bitch"

"Wherever you go, there you are"

In case you don't get what I'm sayin here, just suffice it to say that for many, many years, Apple has (successfully) only pursued deals that benefit THEM and their bank accounts....so the fact that they are using their business acumen and market position to pay as little as possible for ARM stuff should not surprise anyone.....

And also, they used RISC chips for several years in the early Macs and so did Atari, at least until Motorola's chip division went poopoo...so returning to them would really not be that big of a surprise, not to me anywayz. Granted the newer generation is way more advanced, but still :D

With the exception of when they were using Intel CPUs, Apple never left RISC. RISC is a CPU architecture/design philosophy that displaced the previous one.

PowerPC is RISC. ARM is RISC. The people initially behind RISC-V just took the acronym for use in their branding.

And for the last 25+ years, Intel's and AMD's x86 chips have been effectively RISC internally once you're past the instruction-decode step. Meanwhile, while they've avoided complexities in CISC designs that were found to cause more problems than they were worth, modern high-end RISC designs have long since moved beyond the minimalism that was the 'R' in Reduced Instruction Set Computer.

RISC-V has some advantages over ARM in that, being a newer design, it can shed some design baggage ARM is stuck with for legacy reasons. It also has a degree of appeal to larger companies in that, at its base, it's a royalty-free open standard.

The flip side is that everything beyond the most basic parts of the standard is optional; that makes software tooling more difficult and code targeting it less portable, which would be a massive impediment to its use in general-purpose computers. It's much less of an issue in the embedded world, where a chip will only ever run a single program written by a single company. Phones are halfway between the two in that, whether Android or iOS, 99.9% of software written for them is much more strongly abstracted away from the base hardware than is true in Windows or Linux.
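That optionality is visible right in the ISA naming string a RISC-V core advertises (e.g. rv64gc). A toy parser, not any official tool, showing how two compliant chips can legitimately support very different feature sets:

```python
# RISC-V spells out what a core supports in its ISA string: "rv64" plus one
# letter per standard extension (i=integer, m=mul/div, a=atomics, f/d=float,
# c=compressed). "g" is shorthand for "imafd". Everything past the base is optional.
def parse_isa(isa: str):
    base, exts = isa[:4], isa[4:]
    exts = exts.replace("g", "imafd")  # expand the general-purpose shorthand
    return base, set(exts)

base, server_chip = parse_isa("rv64gc")  # a typical Linux-capable configuration
_, tiny_mcu = parse_isa("rv32i")         # a bare-bones microcontroller
print(sorted(server_chip - tiny_mcu))    # → ['a', 'c', 'd', 'f', 'm']
```

A binary meant to run on both can only assume the intersection of those feature sets, which is exactly why distro targets and tooling are harder here than on ARM or x86.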

Google's also been tinkering with running Android on RISC-V chips, but at least for the next few years I don't expect anything to come out of their or Apple's experimentation. Looking 10+ years out instead of 3-5, it seems a lot more possible. With time the tooling should continue to improve, and a few baseline configurations that serve as de facto standards for what to include could form and gain significant ground. In Apple's case, their scale relative to all the embedded companies that are RISC-V's current natural target market is big enough that a RISC-V design they used for either mobile or laptop use would become a de facto standard.
 
Does any of your understanding of technology come from sources other than youtubers doing ahegao faces on their title slides?
Does any of your understanding of technology come from sources other than Apple's promotional slides? Do you really think that Unified Ram is still superior? Videos or not, you wouldn't like any of my sources anyway. Only Apple approved sources for you.

 
SoftBank is ARM, and until someone else owns ARM, this will continue.
ARM and Apple predate ARM and Softbank by 16 years. With the 50-year agreement in place, Apple is still dealing with its pre-SoftBank agreement, and unless Apple chooses to change the terms there is nothing SoftBank can do about it. Sometime in 2041 SoftBank can start proposing new terms to Apple for a possible 2042 agreement, but who knows what the processor space will look like by then; I have seen some interesting things with analog CPUs, so who knows, maybe we'll make the 50s come alive again in 2040.
What about the 34% decline year over year? It's certainly not due to the iPhone, because that's said to be doing better.
In one quarter, after nearly tripling the number of macOS deployments active in the wild, they saw a bigger hit because fewer Apple users felt the need to keep their laptops on a 3-year cycle, and it was the quarter directly preceding the known launch date of the new model. Are you going to spend upwards of $2K on a piece of computer hardware when you know its replacement is just 4 months out and will be a significant performance increase over the existing one? I mean, their 4th quarter is up more than 16% from last year, at a record $22B, and they managed to convert another 2.3% of the global Android population during that time too.
I'm not sure what was wrong with Intel CPUs at the time, other than battery life, which couldn't have been doing well. I do know that the old Intels can still be good for productivity. Admittedly the render time on the M1 and M3 is drastically faster, even though that's probably due to the Media Engine. Older Intel-based Macs could use multiple monitors while the base model M series can't. Thunderbolt on the M series isn't as compatible as on the older Intel ones.
Intel's 8th Gen mobile platform was unstable and had bad BIOS issues; the microcode updates for it are now globally infamous for how badly they tanked performance; the chips run hot and have a high failure rate because of bad solder under the heat spreader. Multiple monitors on the base M series is easy: you just need to install DisplayLink Manager, which is available from the DisplayLink site https://www.synaptics.com/products/displaylink-graphics/downloads/macos.
More so with Intel than AMD. Intel adopted the big little core design that Apple is using for their M series chips, as well as hardware video encoding. Intel was already working on their own GPU, but that's probably due to Nvidia taking their server market share away. AMD hasn't done anything different other than work on power efficiency, but it was due to Apple.
We can go round and round on who to attribute what to here, but competition is finally stepping up in the mobile space; it has too long been filled with stagnant Intel offerings, and it still is. Apple leaving Intel cost Intel a pretty penny; it was a $3-4B-a-year deal, and that stings no matter who you are. I am just glad AMD is stepping up here; I just hope they get some actual silicon out. Lisa presents a mean slideshow, but I do not see a lot of options on the shelves outside the top-end gaming models, which isn't usually what I shop for in a laptop.
On the flip side you now get 50% of the performance on all older x86 applications, and no compatibility with 32-bit applications. You lost Boot Camp, which means your ability to run Windows applications at full speed and compatibility is gone. While better battery life was a thing back in 2021, the newer AMD Ryzen Dragon Range chips, I believe, have caught up. Not sure where you get performance, because no M series chip has been proven faster unless you run Geekbench or video encoding.
The thing with the legacy x86 programs is that if they weren't pinning the Intel options, they still weren't pinning the M1; if you are not at max system utilization, there is no difference. Can you tell if a program running at 20% system load is running better than if it were using 40%? But the faster storage and memory in the M series certainly made them feel snappier. Anything not updated to run ARM-native by the end of the first year was replaced with new software that had been, so it made many devs who had essentially abandoned their software actually make updates, or just lose their customer base. Boot Camp was terrible; nobody had time to stop what they were doing, close all their windows, manually save all their work, shut down, boot into Windows, start up and do other jobs, then repeat the whole shutdown process to get back into macOS. Parallels worked great on day 2 of the M1 launch, and it still does. The NT-based accounting software we run for pensions and payroll runs better on Parallels with Windows 11 ARM than it runs on anything else I have running 64-bit Windows 11. I don't know why this is; I can't answer it and neither can the support company. We figure it has something to do with the Apple networking stack, but it loads windows noticeably faster when moving between the menus, because it is still a terminal-based software set.
But outside of gaming, the M series either ties or crushes each Intel or AMD option it's placed against, especially if you are doing anything video-, audio-, or IO-intensive. This is why Apple has seen nearly a 300% increase in their units in the workplace: because they are working.
Just like it's that simple to add DLSS support to games?
Yeah. Xcode is a very nice developer environment; not sure what DLSS has to do with this, but I work alongside teaching staff who can get kids in grade 8 to add DLSS, and now Frame Gen (we got the funding grant), to their projects during the environment setup they cover in the first 75-minute class. If an underpaid teacher can lead a room of pubescent shit-disturbers through the process, then I am not sure what the excuse of a lot of the professionals out there is.
At the time, ARM wasn't exactly in a good financial situation. ARM taking the 30¢ deal was their way of keeping the company afloat. 15 years' worth of inflation isn't doing ARM any favors. If Apple doesn't pay them more, you'll see stagnation in future ARM designs. This may work in Apple's favor, since they can add their own instructions, but that would also make software development on future Apple products problematic.
In the early 90s, $0.30 per unit was a good deal for everybody involved. At that time ARM was charging between 1-2% of the MSRP of the unit sold, but the chips were mostly being used in cheap electronics as microcontrollers. Those controllers might end up in an expensive appliance or whatnot, but Company A would license the ARM chip, build a circuit board with it, and then sell it to Company B for $10, who would then put it in a $700 appliance. So ARM would get something like $0.20 for that chip. Apple wanted a long agreement and had plans for it, so at the time they offered up a contract where they were paying ARM significantly more than its average, which is where that $0.30 came in, along with the 50-year term on the license.
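Sanity-checking those numbers (the $10 board and the 2% rate are the illustrative figures above, not actual contract terms):

```python
# Royalty models from the post: a percentage of what the chip buyer pays,
# vs. Apple's flat per-unit fee. All figures are the post's illustrative ones.
def percent_royalty(sale_price: float, rate: float) -> float:
    return sale_price * rate

board_price = 10.00  # Company A sells the ARM-based board for $10
typical = percent_royalty(board_price, 0.02)  # 2% of MSRP
apple_flat = 0.30    # Apple's per-unit rate from the agreement

print(f"typical royalty: ${typical:.2f}")  # → typical royalty: $0.20
print(f"Apple premium:   ${apple_flat - typical:.2f} more per unit")
```

The percentage model only pays ARM on the cheap board's price, not the $700 appliance, which is why Apple's flat $0.30 looked generous at the time.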
Apple controls the developer API and environment. If you want to build a program for an Apple device, you are programming in Xcode, using Swift, and Swift is to Objective-C what C# is to C++; if you are familiar with one, then transitioning to the other isn't that steep of a learning curve.
SoftBank has done a shit job of getting Apple, Qualcomm, Broadcom, MediaTek, etc. to update their licenses so far, and they were all long agreements for pennies.
Those agreements are coming up for renewal for most, though, which is why those companies were so absolutely against Nvidia buying ARM; they knew Nvidia would be the one dictating the terms of the license renewals, and they figured they had a better bet with SoftBank. But SoftBank took ARM public, which is how we are getting all this Apple/ARM dirty laundry to begin with, and the new terms are anything but kind... It's like SoftBank is angry at them for some strange reason...

Funny enough, all our doom and gloom about what would happen if Nvidia bought ARM was tame compared to what SoftBank is doing with the licenses. SoftBank is fighting Qualcomm, Broadcom, and a few others in court right now because it does not want to renew any of its licensing agreements with those companies; instead, OEMs will be the ARM license holders and will contract out the fabrication just as Apple does for its chips (TSMC, Intel, Samsung, etc.). So they are looking not only to do away with the concept of an architecture license agreement (so everybody instead licenses full SoCs), but also to completely remove Qualcomm, Broadcom, etc. as the middlemen in the process. Samsung would be relatively unaffected, as they are the OEM, but they would no longer be able to continue their AMD partnership for the Exynos chips and would instead need to use the Mali or Immortalis GPU architecture. The ARM board is looking to bring the company back into the black, which it hasn't been for basically the whole time SoftBank had it private. If they manage to proceed, the only companies after 2026 that would have the old license types are Apple and Nvidia, where Nvidia's wouldn't expire until 2060, so they get a full 20 years on Apple after that.

Crazy times for ARM right now, so for all we know ARM implodes in the next few years and gets chopped up and sold to patent trolls, which would also be kind of funny.
 
Last edited:
Does any of your understanding of technology come from sources other than Apple's promotional slides? Do you really think that Unified Ram is still superior? Videos or not, you wouldn't like any of my sources anyway. Only Apple approved sources for you.

Yeah, gotta give Apple props for saying that with a straight face, they lifted that line from an Nvidia playbook for sure... You're holding it wrong 2.0
 
With the exception of when they were using Intel CPUs Apple never left RISC. RISC is an CPU architecture/design philosophy that displaced the previous one.
Not quite, in the 1980s and early 1990s Apple was using Motorola m68k CISC CPUs.
Those were then replaced by IBM's then-new PowerPC RISC CPUs circa 1993, since Motorola had no roadmap for m68k beyond the 68060 (a Pentium 1 equivalent), which was already outclassed by the PowerPC 601.

The writing was on the wall for Apple's move to ARM in 2020 considering Intel's bumblings at the time.
Now, due to SoftBank's bumblings, it will not be shocking to see Apple move to RISC-V in the 2030s. Unlike the Intel move, however, this one won't be due to architectural inefficiencies with ARM as much as contractual obligations and financials with SoftBank.
 
ARM and Apple predate ARM and Softbank by 16 years.
To the point where ARM was known as Acorn RISC Machine.
With a 50-year agreement in place, Apple is still dealing with their pre-SoftBank agreement, and unless Apple chooses to change the terms there is nothing SoftBank can do about it. Sometime in 2041 SoftBank can start proposing new terms for a possible 2042 agreement, but who knows what the processor space will look like by then. I have seen some interesting things with analog CPUs, so who knows, maybe we are going to make the '50s come alive again in 2040.
Assuming that ARM lasts that long, considering their financial situation. I know it benefits Apple to continue this agreement, but this could push them back to bankruptcy and maybe Nvidia owning ARM.
In one quarter, after nearly tripling the number of macOS deployments active in the wild, they saw a bigger hit because fewer Apple users felt the need to keep their laptops on a three-year cycle, and it was the quarter directly preceding the known launch date of the new model.
Four straight quarters of shrinkage, not one. Don't forget earlier this year they were down 40%.
Are you going to spend upwards of $2K on a piece of computer hardware when you know its replacement is just four months out and will be a significant performance increase over the existing one? I mean, their fourth quarter is up more than 16% from last year, at a record $22B, and they managed to convert another 2.3% of the global Android population during that time too.
Hard to forget that earlier this year Apple released the M2 Pro and Max, and already has the M3 lineup out in the same year. Makes buying their products uncomfortable when you're not sure if they'll be outdated within the same year.
The thing with legacy x86 programs is that if they weren't pinning the Intel options, they still weren't pinning the M1. So if you are not at max system utilization, there is no difference: can you tell if a program running at 20% system load is running better than one using 40%?
If it were a light application then no, you wouldn't. It's when you run applications that haven't been ported to ARM and that max out the CPU that you'll notice that 50% decline. Not many people want to buy hardware that carries an automatic 50% performance penalty on older applications. It gets costly when you need to upgrade an application just to get the same performance you had on the older Intel Mac. I'm sure there are some people like paradoxical who have unlimited funds, but most people do not want to pay more to get less.
but the faster storage and memory in the M series certainly made them feel snappier.
Unless you bought the base model M2, where the 256GB SSD has half the speed because it has half the NAND chips. Even the base model M3 with 8GB is dramatically slower than the 16GB models, to the point where it can be one-fifth the speed. These are not cheap machines, and anyone buying the base model, which I imagine is the majority, will be extremely disappointed.
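To put rough numbers on the half-the-NAND point: sequential throughput scales with how many packages the controller can stripe across. A toy sketch in Python, with an illustrative per-package figure (not a measured Apple spec):

```python
# Toy model: SSD sequential throughput scales with the number of NAND
# packages the controller can stripe across in parallel.
# The per-package figure is illustrative, not a measured Apple spec.
PER_PACKAGE_MBPS = 1500

def seq_throughput_mbps(packages: int) -> int:
    """Aggregate sequential throughput across parallel NAND packages."""
    return packages * PER_PACKAGE_MBPS

print(seq_throughput_mbps(1))  # 1500 -> single-package base model
print(seq_throughput_mbps(2))  # 3000 -> two packages, twice the stripe width
```

Half the packages, half the stripe width, which is exactly the halved benchmark numbers people saw on the 256GB models.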
But outside of gaming the M series either ties or crushes each Intel or AMD option it's placed against, especially if you are doing anything video-, audio-, or IO-intensive. This is why Apple has seen nearly a 300% increase in units in the workplace: because they are working.
Where exactly are you seeing the M series beating anything AMD and Intel has? Again, no Geekbench.
Yeah. Xcode is a very nice developer environment; not sure what DLSS has to do with this.
The point is that just because it seems easy doesn't mean you won't run into problems.
Considering ARM was up for sale not long ago, it shows that these license agreements don't work, at least for the long term. Nvidia would have brought out their team of endless lawyers to fight everyone for more money, or at least been creative in how to circumvent the agreements.
You missed the "12" in front of the 8 bro.
You know what else has the number "12" in front of it? The amount of ram in a Samsung Galaxy S21 Ultra from 2021.
 
You know what else has the number "12" in front of it? The amount of ram in a Samsung Galaxy S21 Ultra from 2021.

You didn't really just say this, did you? LMFAO this dude doesn't even know the difference between DDR and flash storage!

My Macbook has 128gb of ram. Your phone has 128gb of storage. Keep watching youtube vids, one day you'll figure out this technology stuff.
 
Considering ARM was up for sale not long ago, it shows that these license agreements don't work, at least for the long term. Nvidia would have brought out their team of endless lawyers to fight everyone for more money, or at least been creative in how to circumvent the agreements.

Their license agreements were terrible, but Softbank knew that when they purchased them and were happy enough just kicking the can down the road until it was somebody else's problem. I am certain when they bought ARM the intent was to restructure it, "increase" its value, then flip it over to another tech giant. But they failed to do anything with its value, and didn't do anything to address the glaring issues with the licensing structure. The license structure was designed to be disruptive (subsidized losses): gain market share, then convert that market share to profits later once they had established a footprint. Well, ARM became the de facto embedded-device CPU of choice around the world, but SoftBank never did anything to convert from the subsidized-loss model to a profit-based one, because they are lazy AF; their abysmal track record with every tech project they have ever touched shows that.

If you do any work with Blender, the Adobe suite, or anything to do with audio editing and mastering, then the Apple platform is the way to go; no other mobile solution lets you have as many lossless audio or 4K video files open in an active project as the Apple ones currently do. If you move over to the full desktop range, that changes. The Studio lineup is an exception to that, because I classify it as a desktop, and it badly needs a refresh: dollar for dollar you can get other OEM systems that will beat it, as long as you aren't buying your memory from those OEMs, because FML they are charging some stupid prices for DDR5 right now. WTF, did a warehouse "flood" again and I just missed that in the news? But the fact remains that you can buy aftermarket memory for an OEM system and install it, and given the prices that's the only sensible option, unless you are an audio engineer.
Audio engineers have been bitching at Microsoft for a long while. Windows 10 does not correctly handle audio from USB sources, and until 23H1 neither did Windows 11: there is too much latency in the processing, and it causes corruption in the incoming stream which then needs to be dealt with. It's made worse by Microsoft's class-compliance issues, which can interfere with MIDI keyboards and DAW controllers, so you can find your expensive boards and mixers just refuse to work on your new Windows device, but work on an older one, or work differently on the new one than they did on the old one, because neither is properly compliant. Depending on the age of the hardware, Windows drivers may or may not be available, and is saving a few hundred dollars on the computer hardware worth risking multiple thousands of dollars in musical equipment?
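The latency half of that complaint is simple arithmetic: a buffer of N frames at sample rate R takes N/R seconds to fill, and every extra buffering stage the OS inserts into the audio path stacks another one of these on top. A quick sketch (the frame counts are just typical interface settings, not anything Windows-specific):

```python
# Latency contributed by one audio buffer: a buffer of `frames` samples
# at `sample_rate` Hz takes frames / sample_rate seconds to fill before
# it can be handed to the next stage of the pipeline.
def buffer_latency_ms(frames: int, sample_rate: int) -> float:
    return frames / sample_rate * 1000.0

print(round(buffer_latency_ms(128, 48000), 2))   # 2.67 ms per buffer
print(round(buffer_latency_ms(1024, 48000), 2))  # 21.33 ms per buffer
```

So a mix path forced to use large buffers (or several of them back to back) quickly lands in the tens of milliseconds, which is exactly where tracking musicians start to notice.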
 
Apple did start out with CISC and yes actually left RISC.

m68k (CISC) to ppc (RISC)
ppc to x86 (CISC with caveats)
and x86 to arm (RISC).
I never said Apple started on RISC, and I called out Intel in the part of my post you quoted.
 
I never said Apple started on RISC, and I called out Intel in the part of my post you quoted.
I never said you were wrong. "Apple never left RISC" is ambiguous, though, so the fact that they started out with CISC is an important clarification.
 
You didn't really just say this, did you? LMFAO this dude doesn't even know the difference between DDR and flash storage!

My Macbook has 128gb of ram. Your phone has 128gb of storage. Keep watching youtube vids, one day you'll figure out this technology stuff.
I'm not referring to the NAND. You may notice that the S21 Ultra has an option for 16GB of RAM too. The prevailing joke on the internet about the base 8GB M3 models is that some phones have more RAM in their base model. The S21 Ultra gets mentioned a lot as a comparison for the joke. It just so happens that the base model S21 Ultra has 12GB of RAM.
https://www.gsmarena.com/samsung_galaxy_s21_ultra_5g-10596.php
 
Meh, Apple is a phone company that sells computers on the side.
They consistently take 10-15% of the PC consumer market (depending on the quarter), which isn't small by any stretch of the imagination. People keep repeating this same line over and over as if it's Apple's fault that the mobile phone market is more lucrative.

I'll tell you right now, if Dell could be Apple and have the biggest market cap and make the most profit every year, they'd switch places in a second and "deal with" the epithet of being called "a phone company".
 
Their license agreements were terrible, but Softbank knew that when they purchased them and were happy enough just kicking the can down the road until it was somebody else's problem. I am certain when they bought ARM the intent was to restructure it, "increase" its value, then flip it over to another tech giant. But they failed to do anything with its value, and didn't do anything to address the glaring issues with the licensing structure. The license structure was designed to be disruptive (subsidized losses): gain market share, then convert that market share to profits later once they had established a footprint. Well, ARM became the de facto embedded-device CPU of choice around the world, but SoftBank never did anything to convert from the subsidized-loss model to a profit-based one, because they are lazy AF; their abysmal track record with every tech project they have ever touched shows that.
How can SoftBank convert ARM to a more profitable model if Apple and Samsung have ironclad agreements that say they won't pay a penny more for decades? All I see them doing now is going after everyone else, like Qualcomm.
If you do any work with Blender, the Adobe suite, or anything to do with audio editing and mastering, then the Apple platform is the way to go; no other mobile solution lets you have as many lossless audio or 4K video files open in an active project as the Apple ones currently do. If you move over to the full desktop range, that changes. The Studio lineup is an exception to that, because I classify it as a desktop, and it badly needs a refresh: dollar for dollar you can get other OEM systems that will beat it, as long as you aren't buying your memory from those OEMs, because FML they are charging some stupid prices for DDR5 right now. WTF, did a warehouse "flood" again and I just missed that in the news? But the fact remains that you can buy aftermarket memory for an OEM system and install it, and given the prices that's the only sensible option, unless you are an audio engineer.
It's hard to find comparisons between the M3s and anything AMD/Intel, but sadly Max Tech seems to be the only ones who've done it. I can already hear paradoxical foaming at the mouth because it's a YouTube link. If you go two-thirds of the way through the video (most of it is just reading spec sheets), he does actually run benchmarks. In GPU performance the M3 Pro is destroyed by the RTX 4070. In Blender it's faster on the Intel machine. With DaVinci Resolve, the Intel machine was faster; it helps that Intel has hardware AV1 encoding while the M3s do not. The Intel machine did lose in Adobe Lightroom Classic and Geekbench, but nobody should listen to Geekbench scores. Surprisingly, the ray-tracing hardware did help with Blender, as the M2 Pro was twice as slow. I don't know if the Intel machine is faster in audio editing, and I'm sure it slows down dramatically when unplugged, but I wouldn't say that Apple is the way to go.

View: https://youtu.be/Teb4HlsW2PI?t=396
Audio engineers have been bitching at Microsoft for a long while. Windows 10 does not correctly handle audio from USB sources, and until 23H1 neither did Windows 11: there is too much latency in the processing, and it causes corruption in the incoming stream which then needs to be dealt with. It's made worse by Microsoft's class-compliance issues, which can interfere with MIDI keyboards and DAW controllers, so you can find your expensive boards and mixers just refuse to work on your new Windows device, but work on an older one, or work differently on the new one than they did on the old one, because neither is properly compliant. Depending on the age of the hardware, Windows drivers may or may not be available, and is saving a few hundred dollars on the computer hardware worth risking multiple thousands of dollars in musical equipment?
As someone who's on Linux, I can confirm that a lot of my audio issues just vanished when I moved away from Windows 10. Stuff just works on Linux Mint, and I've jumped onto PipeWire, which is not ready for prime time but still has fewer issues than Windows 10.
 
How can SoftBank convert ARM to a more profitable model if Apple and Samsung have ironclad agreements that say they won't pay a penny more for decades? All I see them doing now is going after everyone else, like Qualcomm.
Qualcomm, Broadcom, and MediaTek have their previous 20+ year agreements expiring in 2026. Changing to a model where OEMs directly license chips from ARM, cutting out the middleman, is how.
Selling ARM CPUs makes up the bulk of Qualcomm and Broadcom's profits, and almost all of MediaTek's; they make more building and selling stock ARM chips than ARM makes selling them licenses, by an overwhelming amount.
By changing the licensing terms, licensing directly to the manufacturers, and cutting out the middleman, they are looking to increase their profits while decreasing manufacturing costs. So that's how they plan on doing it, and that's why ARM is in court: companies that really only exist to resell ARM IP are about to be cut loose.
MediaTek is already prepping for this and partnering up with Nvidia, and Rockchip will likely go with licensing from ARM China, which is still independent of SoftBank's ARM.
 
You didn't really just say this, did you? LMFAO this dude doesn't even know the difference between DDR and flash storage!

My Macbook has 128gb of ram. Your phone has 128gb of storage. Keep watching youtube vids, one day you'll figure out this technology stuff.
DukenukemX meant that the Samsung Galaxy S21 Ultra has 12GB RAM, hence the "12" in front of it.
Storage was never mentioned, so apparently you are the one who doesn't know the difference between RAM and NAND.

Your reading comprehension needs some work.
More memory than processing. :p

Their license agreements were terrible, but Softbank knew that when they purchased them and were happy enough just kicking the can down the road until it was somebody else's problem. I am certain when they bought ARM the intent was to restructure it, "increase" its value, then flip it over to another tech giant.
Agreed, the sale of ARM really should have gone to NVIDIA, an infinitely better megacorp with the drive to actually push forward the ARM ISA instead of license-restricting it into oblivion like SoftBank.
 
Agreed, the sale of ARM really should have gone to NVIDIA, an infinitely better megacorp with the drive to actually push forward the ARM ISA instead of license-restricting it into oblivion like SoftBank.
Honestly ARM is now doing what I assume Nvidia planned to do with it. Only Nvidia would have paired it with their GPU solutions and gotten rid of Mali and Immortalis. A cellphone is the perfect use case for DLSS; you can upscale from 420p using traditional methods and still have it look great at that screen size.

Qualcomm and Broadcom were right to fear their future should Nvidia own ARM, but ARM needs to start turning a profit even more so now that it’s public. Nvidia would have ended in the same place ARM is going but they would have done it gradually and smoothed the path with some partnerships and deals. ARM is just going in dry and drinking their tears.
 
DukenukemX meant that the Samsung Galaxy S21 Ultra has 12GB RAM, hence the "12" in front of it.
Storage was never mentioned, so apparently you are the one who doesn't know the difference between RAM and NAND.

Your reading comprehension needs some work.
More memory than processing. :p

There are only two ways to interpret his poor wording: either he made the worst, lamest comeback ever (which is possible) or he doesn't know the difference between storage and RAM.

1. He posts a meme about how Apple laptops have 8GB of RAM and thinks it's acceptable.
2. I post a screenshot showing my personal laptop with 128GB of ram and say "he missed the 12 in front of the 8" (hence, making a three digit number)
3. He says "You know what else has a 12 in front of it? The amount of ram in a Samsung Galaxy S21 Ultra" SICK BURN (except nobody understands how this is even relevant)

Essentially, what he said is pants-on-head dumb either way. The only reading that makes sense is that he was trying to throw a dig at Apple by showing that a phone from 2021 had the same amount of RAM as my MacBook (which of course it didn't), but it did have 128GB of storage, which is where he possibly got confused. If that wasn't what he was saying, I guess he just decided to say something entirely random with no connection to the discussion.

Me: "My Macbook Pro is badass and has 128GB of ram"
Him: "A Samsung phone has 12gb of ram"
Everyone else. "ok....and?????"
 
There are only two options to interpret his poor wording. Either he made the worst, lamest comeback ever (possible) or he doesn't know the difference between storage and ram.
Nobody here mentioned storage but you.
1. He posts a meme about how Apple laptops have 8GB of RAM and thinks it's acceptable.
I don't think having 8GB of ram in a $1000 laptop is acceptable.
2. I post a screenshot showing my personal laptop with 128GB of ram and say "he missed the 12 in front of the 8" (hence, making a three digit number)
Cause a lot of people are going to buy a $7k Macbook?
3. He says "You know what else has a 12 in front of it? The amount of ram in a Samsung Galaxy S21 Ultra" SICK BURN (except nobody understands how this is even relevant)
It's... it's more RAM than a base model MacBook. It's an issue, and Apple addressed it by saying something stupid like 8GB is like 16GB on a Windows PC.

View: https://youtu.be/u1dxOI_kYG8?si=qy6wDwKC6WbSHyml

Honestly ARM is now doing what I assume Nvidia planned to do with it. Only Nvidia would have paired it with their GPU solutions and gotten rid of Mali and Immortalis. A cellphone is the perfect use case for DLSS; you can upscale from 420p using traditional methods and still have it look great at that screen size.

Qualcomm and Broadcom were right to fear their future should Nvidia own ARM, but ARM needs to start turning a profit even more so now that it’s public. Nvidia would have ended in the same place ARM is going but they would have done it gradually and smoothed the path with some partnerships and deals. ARM is just going in dry and drinking their tears.
The only problem is that ARM may not have enough money for lawyers to make it work. Nvidia, on the other hand, does, but Nvidia could also have engineered ARM in such a way that a new agreement would be something Apple, Samsung, etc. would seek out. I still don't like the idea of Nvidia owning ARM, but on the other hand it's exactly what ARM needed.
 
1. He posts a meme about how Apple laptops have 8GB of RAM and thinks it's acceptable.
While the picture is a meme, since the M1 debuted in 2020 I have seen many individuals actively defend Apple by saying 8GB of unified RAM is enough on their platforms, even though it has been proven that it clearly is not sufficient and puts additional wear on the NAND storage due to swap usage.
Ergo, the meme speaks the truth.
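The NAND-wear half of that claim is easy to sanity-check with back-of-the-envelope math; both numbers below are assumptions for illustration, not measured M-series figures:

```python
# Rough SSD endurance estimate: years until swap traffic alone consumes
# the drive's rated TBW (terabytes written). Both inputs are assumed
# illustrative values, not measurements from any Apple machine.
def years_until_tbw(daily_swap_gb: float, tbw_rating_tb: float) -> float:
    daily_tb = daily_swap_gb / 1000.0
    return tbw_rating_tb / daily_tb / 365.0

# Assume ~150 TBW for a small 256GB drive, and 100 GB/day of swap churn
# on an 8GB machine under sustained memory pressure.
print(round(years_until_tbw(100, 150), 1))  # ~4.1 years from swap alone
```

Halve the swap traffic and the estimate doubles, so the concern scales directly with how hard you lean on that 8GB.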

Me: "My Macbook Pro is badass and has 128GB of ram"
Just because someone owns a Ferrari doesn't automatically mean they know shit about cars.
Having nice things and being knowledgeable are not mutually exclusive.

I don't think having 8GB of ram in a $1000 laptop is acceptable.
I agree with this and especially at that price point, but this is Apple and one does pay the Apple tax for the privilege to use their equipment and software, good as it may be.
8GB of RAM was barely sufficient for casual usage circa 2019.

The minimum that should be used in any laptop or desktop for both Windows and macOS in 2023 should be 16GB, just due to how much software has ballooned in resource usage over the last few years, regardless of whether the ISA is x86-64 or AArch64.
Obviously one can get away with 8GB of RAM, but as you mentioned earlier even Samsung Galaxy smartphones have had 8-16GB of RAM since 2020, and those are phones.
 
While the picture is a meme, since the M1 debuted in 2020 I have seen many individuals actively defend Apple by saying 8GB of unified RAM is enough on their platforms, even though it has been proven that it clearly is not sufficient and puts additional wear on the NAND storage due to swap usage.
Ergo, the meme speaks the truth.


Just because someone owns a Ferrari doesn't automatically mean they know shit about cars.
Having nice things and being knowledgeable are not mutually exclusive.


I agree with this and especially at that price point, but this is Apple and one does pay the Apple tax for the privilege to use their equipment and software, good as it may be.
8GB of RAM was barely sufficient for casual usage circa 2019.

The minimum that should be used in any laptop or desktop for both Windows and macOS in 2023 should be 16GB, just due to how much software has ballooned in resource usage over the last few years, regardless of whether the ISA is x86-64 or AArch64.
Obviously one can get away with 8GB of RAM, but as you mentioned earlier even Samsung Galaxy smartphones have had 8-16GB of RAM since 2020, and those are phones.
8 gigs is enough if you never run anything more complicated than the apps built into the MacBook, but if you so much as glance at Chrome or Office, or god forbid anything Adobe, you will shred that NAND like fresh vinyl.
Apple saying their 8GB is enough is like HP and Dell in 2020 still shipping systems with 4GB of RAM standard as an acceptable minimum; spin it all you want, it's garbage.
 
And you have to admit the Intel CPU options before Apple dumped them were kind of trash, 8th Gen Mobile was not a nice platform.

Let's be honest here. Apple's laptop designs haven't been known for great engineering over the past 10-12 years, despite the propaganda. Not that Intel's chips were super great, but the way around Intel's issues was to design computers that could actually dissipate heat and use the chips to their full potential without thermal throttling. But you know, muh pretty pretty shinies. Form over function, all to get the thinnest laptop.
 