Apple, the M1 Chip, and What It Means to the Rest of Us
The heart of any computer is its central processing unit (CPU). There’s a fair amount of mystery surrounding this part, which does its work hidden away from our eyes.
Apple is transitioning away from Intel, which has provided Apple’s CPUs since 2005, to its own Apple Silicon designs. There are good reasons for this – Apple is a major customer of Intel’s and stands to improve its profit margins significantly by cutting out a supplier and bringing CPU design in-house.
By choosing Intel CPUs, Apple has been building compatibility with the rest of the computing world – including unused or long-gone legacy features – into every desktop and mobile machine it sells. This stack of computing legacy is costly, particularly in terms of energy efficiency, an increasingly important consideration not only in the battery-powered portable market but also in homes and offices. The need for cooling translates into noisy fans, unnecessarily large enclosures, added cost, and higher energy consumption. So, reason number one for the M1 chip is clearly improved efficiency over the Intel hardware.
Add to this the benefits of moving to a single chipset family for mobile and desktop computing, which will accelerate the convergence of mobile and desktop operating systems over the coming years. A single platform will attract thousands of app developers, simplify machine inventory management, and, of course, bring us users a huge benefit: our learned skills will apply across all our devices. While this is speculative, we can see it coming from where we stand now.
The good news is we have been here before. Apple has a proven track record of making bold choices and managing transitions. The last big lift was in 2005, when Apple exited the PowerPC alliance it had with Motorola and IBM and transitioned to Intel chips. This is when the translation software named Rosetta made its debut, in its first iteration. It represented the older generation of chips in software – at a penalty in computing power, but the newer, faster hardware more than made up for it. Then there was the 32- to 64-bit transition: for a while, Apple shipped its own software in “fat” bundles containing both 32- and 64-bit code. Mac OS X 10.6 Snow Leopard was the first of Apple’s operating systems that could run 64-bit software natively, allowing a single app to break through the 4-gigabyte memory ceiling – and it still ran on 32-bit machines. I would bet that few, if any, readers remember running into issues.
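That 4-gigabyte ceiling, by the way, falls straight out of pointer width: a 32-bit pointer can distinguish at most 2^32 byte addresses. A quick back-of-the-envelope sketch (illustrative only):

```python
# A 32-bit pointer has 32 address bits, so a process can address
# at most 2**32 distinct bytes.
addressable_bytes = 2 ** 32

# Expressed in binary gigabytes (1 GiB = 1024**3 bytes):
limit_gib = addressable_bytes / 1024 ** 3

print(addressable_bytes)  # 4294967296
print(limit_gib)          # 4.0 -- the familiar 4 GB per-process ceiling

# A 64-bit pointer raises the theoretical ceiling to 2**64 bytes,
# i.e. 16 exbibytes -- far beyond any installed RAM.
print(2 ** 64 // 1024 ** 6)  # 16
```

In practice, 32-bit systems reserved part of that address space for the kernel, so applications saw even less than the full 4 GB.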
Further back in the distant past was the transition to OS X – a true milestone and not even vaguely related to the classic Mac OS. Many feared being left with a brand-new operating system and no applications, since existing software would have to be rewritten before it could run inside OS X. Apple had long planned for this transition and gave us the Blue Box, better known as Classic: an environment where the old operating system could run as an application, with legacy software running inside it.
When many of the current Mac-tech crew were literally or figuratively still in diapers, we saw the first major transition we can think of: the move from the Motorola 68000 family of CPUs to the PowerPC, a huge leap into the then-future by the aforementioned alliance of Motorola, IBM, and Apple. Macintosh System 7 shipped with an emulator that translated software written for the old Motorola chips on the fly and ran it on the brand-new processor.
We now see Rosetta 2 doing the same thing yet again: the vast computational power of the newer CPUs is only minimally taxed by running older machine code in emulation. Chances are, you won’t notice. It is only a matter of time until all our applications are recompiled to run natively on Macs powered by the M1 and later generations of Apple Silicon, and as more and more titles are recompiled, we will see a gradual improvement in performance. There are a few hard noes with M1-based computers – virtualization software compiled for Intel, for example – but solutions are starting to emerge even for this special use case.
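Those recompiled apps typically ship, like the old “fat” bundles, as universal binaries: one file carrying a machine-code slice per architecture, with a small header announcing what is inside. A minimal sketch of that header layout, parsed from synthetic bytes I construct for illustration (constants are from Apple’s published Mach-O headers; the function name and dummy offsets are my own):

```python
import struct

# Mach-O "fat" (universal) binaries begin with a fat_header followed by
# one fat_arch record per architecture slice; all fields are big-endian.
FAT_MAGIC = 0xCAFEBABE
CPU_TYPE_X86_64 = 0x01000007   # Intel 64-bit slice
CPU_TYPE_ARM64 = 0x0100000C    # Apple Silicon slice

def list_architectures(data: bytes):
    """Return the cputype values advertised by a fat binary's header."""
    magic, nfat_arch = struct.unpack_from(">II", data, 0)
    if magic != FAT_MAGIC:
        raise ValueError("not a fat binary")
    archs = []
    for i in range(nfat_arch):
        # fat_arch fields: cputype, cpusubtype, offset, size, align
        cputype, _, _, _, _ = struct.unpack_from(">IIIII", data, 8 + i * 20)
        archs.append(cputype)
    return archs

# Synthetic header: a universal binary carrying both an x86_64 slice
# and an arm64 slice (offsets, sizes, and alignments are dummy values).
header = struct.pack(">II", FAT_MAGIC, 2)
header += struct.pack(">IIIII", CPU_TYPE_X86_64, 3, 0x4000, 0x1000, 14)
header += struct.pack(">IIIII", CPU_TYPE_ARM64, 0, 0x8000, 0x1000, 14)

print([hex(a) for a in list_architectures(header)])
# ['0x1000007', '0x100000c']
```

When a universal app launches on an M1 Mac, the system simply picks the arm64 slice and skips Rosetta entirely; on an Intel Mac, it picks the x86_64 slice.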
What’s next, and how do we deal with the hardware transition already in progress? What considerations do we need to make when investing in a new computer – or ten, or perhaps a hundred for our organizations? Together, we will figure it out.