“Make me one with everything.” – Buddhist monk to a hot dog vendor
Why don’t polar bears eat penguins? Easy: because polar bears live at the North Pole and penguins live at the South Pole. A polar bear in the wild has never seen a penguin.
Did a Tyrannosaurus rex ever hunt a Stegosaurus? Nope, because the two species were never contemporaneous; they lived some 80 million years apart. In fact, we’re closer in time to T. rex than any Stegosaurus ever was.
Will ARM replace the x86 as the dominant CPU species? Maybe, but only for a short while. In the grand scheme of things, both will be just brief spots on the technological timeline. I doubt the starship Enterprise will use an upgraded version of either, just as our machines today don’t use upgraded steam engines or water wheels. Stuff changes.
Apple’s recent announcement of new products based on “Apple Silicon” processor chips (the company’s name for its own ARM-based designs) has tongues wagging. Will Apple disrupt the entire PC industry? Will x86 continue to roam the earth? Or is this nothing more than one company hyping its own products, as companies are wont to do?
There was a time when computer companies always designed and built their own processors. IBM created its CPUs from clean-sheet designs and assembled them in its own factories. Same for Burroughs, Data General, Digital Equipment Corp., Motorola, Sun Microsystems, Texas Instruments, Fujitsu, Rockwell, Hewlett-Packard, and on and on. That’s what you did. You created your own processors for pride and differentiation.
Somewhere along the line, these companies figured out that (a) processors are really expensive to design and maintain; (b) customers don’t care; and (c) there isn’t actually much differentiation anyway. Everyone was working from the same set of rules, attending the same seminars, reading the same research papers, and solving the same problems. Sure, these CPUs all differed in their details. But to a first-order approximation they all delivered comparable performance, power efficiency (if anyone cared), and flexibility. And customers still didn’t care.
Economically, it made a lot more sense to consolidate all that CPU design work. One by one, computer companies started buying CPUs instead of making them in-house. Naturally, each one wanted its CPU to become the standard for everyone else, so IBM (PowerPC), Sun (SPARC), MIPS Computer Systems (MIPS), Intergraph (Clipper), and most of the others tried and failed to become merchant microprocessor vendors. Some threw in the towel early; a few fought to the end. Some are still hanging in there (ahem, Oracle). But Intel ultimately won out, for reasons that aren’t entirely clear. Eventually someone had to win, and it’s tempting to retroactively declare that the outcome was completely logical and inevitable, but I suspect it was mostly luck that saw Intel’s x86 come out on top. Not that there’s anything wrong with that. Evolution is messy.
Running a big monolithic CPU company like Intel, AMD, or a few others made economic sense. They could centralize CPU development and amortize its cost over many computer companies and their combined volume. The profits got plowed back into a single R&D budget instead of getting split up amongst all the competing CPU vendors. Processor performance surged forward, and computer companies had one less thing to worry about. The downside was that (almost) all computers used the same processor but, again, customers didn’t care.
In marketing terms, the computer companies disaggregated. They disaggregated again when the CPU vendors started outsourcing their silicon manufacturing. AMD and Nvidia might design their own processors, but the chips are fabbed elsewhere. Turns out, the cost of semiconductor R&D is even higher than it is for CPU design. Centralize the cost, amortize it across more customers, remove a measure of differentiation, and move the state of the art forward. We’re getting good at this.
Now IBM’s ThinkPad PCs are assembled and branded by Lenovo, they use processors from AMD, and the silicon is fabricated by TSMC. Technical support is likely handled by a third-party specialist firm as well. And that’s from a company that once prided itself on its vertical integration.
Intel is pretty much the last holdout of the “big smokestack industry” model where everything from design to manufacturing to marketing is done under one roof. But even that might be changing soon, amid recent slips in the company’s manufacturing technology and upper management’s reevaluation of its path forward.
ARM and the other processor-IP vendors added yet another layer to this disaggregation. Now the CPU is designed in one place, integrated in another, fabricated in another, and soldered into yet another company’s product. The product company – say, Nest or Dell or Chevrolet or Samsung – has no say whatsoever in the processor’s design, pinout, feature set, cost, fabrication, or timeline.
Apple chose to reverse some, but not all, of that trend. The Cupertino company hasn’t gone back to the Late Cretaceous period of full vertical integration, but it has wound back the clock a bit. Its M1 processor chip is based on an architecture licensed from ARM, but Apple creates its own implementation. That removes several links in the chain between instruction set and end product. Instead of buying a readymade chip and then designing a product to fit around it, Apple can design the chip it wants from the outset. The instruction set is still generic, but we know that doesn’t matter. Customers still don’t care, and Apple is adept at swapping out ISAs and plastering over the differences.
The reviews have been glowing. Apple’s new MacBook Air and other M1-based products perform as well as their x86-based counterparts, which is remarkable in itself. Performance, power efficiency, and even software compatibility all get positive marks. Apple fans inevitably predict a revolution and a new king of processors.
Maybe. But Apple’s real innovation was in going back to basics. The M1 is a nice part, but it’s not particularly revolutionary. It’s simply designed for its intended use. Instead of shoehorning in a mass-market processor that will be shared with all of its competitors, Apple created exactly what it wanted. The buses, the caches, the instruction mix, the coprocessors, the auxiliary hardware assists – they’re all designed with the end product in mind. As a result, the M1 is the right part in the right place. It’s efficient. The surrounding logic was designed for it, and vice versa. The operating system was designed for it, and vice versa. It was a holistic approach, a Zen design process, at one with everything.
If that’s the wave of the future, bring it on. I’m all for efficient and effective design. It’s not for everybody – we can’t all have Apple’s R&D budget – but it’s glorious when it works. Let’s not forget, though, that the M1 isn’t special because of some secret ingredient. It has the same basic CPU architecture, the same logic, and the same TSMC fabrication that others use. Apple hasn’t cheated physics or sidestepped reality (this time). It’s just the result of a good top-down design philosophy and a focus on integration instead of disaggregation. So, when do MacBooks start eating PCs?
Wall Street hype pushes Apple’s stock price up. Don’t let this information out.
What would happen if we put some polar bears in Antarctica and penguins at the North Pole?
Apple is just the latest “IBM”. I’m from that ancient age of “IBM and the seven dwarfs”.
It is never really about the underlying technology. It is simply “Marketing, marketing, marketing…” That is why I own zero Apple products. And a lot of “extra” cash 🙂
I’m also a licensed “oenologist”. Can you taste the “black currants” or the “grassiness”?
…Sun Microsystems…
While the overall argument is sound, the examples are not really spot-on. Sun grew up with the commercial chip model (the 68000 family for three generations) and resorted to a bespoke design (SPARC) when Motorola dropped the ball.
True that the early Sun was 68K and other standard/generic hardware, and the company was proud of it. But for most of its history, it was SPARC all the way.
Great article, thanks! One important point to add, though: it’s not just the staggering cost of engineering the CPU, it’s the staggering cost of engineering all the supporting toolchains required. We obtain enormous benefit from using GCC or ARM compilers for the ARM architecture. It’s *really* expensive to build, maintain, and continually advance the toolchains required for customers to actually *use* the products.
Next up: all the MCU vendors continue to roll their own peripherals. Once again, really expensive, and they’re all buggy as hell (both the hardware and the provided drivers we need to actually use the products). Guess where this will lead…
Thanks again for the article,
Best Regards, Dave