“The young always inherit the revolution.” – Huey Newton
Karl Marx, Charles Babbage, and Adam Smith walk into a bar.
After ordering drinks, the fathers of socialism, modern computing, and capitalism have a little discussion about the best ways to improve engineers’ productivity and efficiency. Babbage the engineer says that creativity and inspiration are key to innovation; that individual designers and inventors are the foundation of our technological progress.
Smith says that competition improves the breed. When individuals and corporations compete with one another, the whole industry benefits. Winners overcome losers in the marketplace of ideas.
Marx, peering through his beard, says that’s nonsense, and wasteful besides. Central planning improves productivity by eliminating redundancy. Competition is inherently inefficient because it results in duplicated effort, much of which will be discarded or wasted. Better to choose a path early and march along it.
The discussion gets more heated after the drinks arrive. But who’s right? If engineering is all about efficiency, why do we operate in such an inefficient manner?
Take microprocessors, operating systems, and programming languages as just three examples. Why do we have so many? Is an ARM processor really all that different from an x86, MIPS, RISC-V, or the dozens of other CPU architectures clogging the supply pipeline? Any Turing-complete machine can, by definition, perform the same tasks as any other. So why don’t we just pick a $#@% chip family and move on?
It’d be great. All software would be compiled for the same instruction set, so there’d be no more compatibility issues. Engineering students could learn the same standardized instruction-set architecture and programming model. No more big/little-endian byte-ordering problems. Code size would be predictable, and disassembly much easier. And microprocessor manufacturers could focus on value-added features instead of these hugely expensive architectural expeditions.
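(For the uninitiated, here’s a minimal C sketch of the byte-ordering problem mentioned above; it isn’t from the column, just an illustration. The same 32-bit value sits in memory in a different byte order on little-endian chips, such as x86 and most ARM configurations, than on big-endian ones, which is why any shared binary format has to pick a convention and convert.)

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* Store a known 32-bit value and inspect how its bytes land in memory. */
    uint32_t value = 0x11223344;
    const uint8_t *bytes = (const uint8_t *)&value;

    /* Little-endian CPUs put the low-order byte (0x44) first;
       big-endian CPUs put the high-order byte (0x11) first. */
    printf("in-memory layout: %02x %02x %02x %02x\n",
           bytes[0], bytes[1], bytes[2], bytes[3]);

    if (bytes[0] == 0x44)
        printf("this machine is little-endian\n");
    else
        printf("this machine is big-endian\n");

    return 0;
}
```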
Just look at all the ambitious CPU vanity projects that have failed: Intel and HP spent billions developing, building, and marketing Itanium. That’s space-program money… completely wasted. Sun/Oracle gave up on SPARC chips. PowerPC has finished circling the drain. Clipper, Alpha, and dozens of other high-end architectures all came to nothing, but only after squandering all the money and the man-hours. If we had just agreed on a single processor – any processor – that effort could have been put to more productive use.
Same goes for operating systems and programming languages. If everyone used C++ (or Python, or whatever) we could all share code, helpful tips, and source libraries. Programmers would be largely interchangeable from one project to another, and from one company to another. You’d have a much easier time finding a coding job if you didn’t have to narrow your search to the one or two languages you wield with proficiency.
We already do this on a smaller scale. Most companies use the same CPU, OS, and development tools for all their internal projects. You don’t often see a design team jump from Linux-on-x86 to RTOS-on-ARM and back again. They want to leverage their experience and amortize their investments. They’d also like to be quicker and more efficient, and sticking with the tools they know is the best way to do that. Change is bad.
All this fragmentation and balkanization doesn’t help the product. Did you buy your smartphone, video game, or TV based on its processor, OS, or development toolchain? The hardware is just a means to an end, and it’s the app layer, not the OS, that provides differentiation. That app development would’ve gone a lot quicker if the programmers had all relied on the same underlying software.
Marx smiled through his Weißbier, certain he’d made his case for efficiency and standardization.
Smith wasn’t having any of it. Carefully setting down his single-malt, he decried any form of mass coordination as unethically monopolistic and against human nature. It’s in our character, our spirit, to compete and search for better solutions. That’s how progress is made. How can we hope to find the best solution to anything – be it hardware, software, or otherwise – with a single decision, never to be overturned? We can’t have innovation by fiat. Surely the nature of engineering challenges changes over time, and we should change with them. Different problems call for different solutions, and who better to discover that solution than the individual inventor, developer, engineer, or programmer? We’ll crowdsource the revolution.
Furthermore, the prospect of individual gain adds another impetus. Some will pursue engineering challenges purely for the enjoyment of it, but others are driven by the financial upside. We wouldn’t have venture capital without the carrot at the end of that stick. All to the good; the customer benefits either way. Let each developer create whatever they may, however they please, and let the invisible hand of the marketplace decide. Yes, there will inevitably be losers, bankruptcies, and layoffs. But there will also be huge successes rewarded by the very customers they’re serving. Competition leads to agility; standardization is stultification.
Babbage finished his ale in time to observe that we already practice a little of both philosophies. Defense contractors usually do their programming in Ada because it’s required. The language’s constraints make testing, debugging, and maintenance easier, and it levels the playing field among competing contractors. Networking protocols are standardized to enable interoperability. And thousands of hardware interfaces, from disk drives to USB dongles to RJ45 jacks, are also standardized. We may not be able to agree on what an AC power plug should look like in each country, but we convert them all to the same 5VDC for our phone chargers.
The bartender, being of course the wisest of the group, suggested that these technology trends run in cycles. For a while, everyone wants customized logic, in-house software, and specialized user interfaces. Then the pendulum swings the other way, and everyone rushes toward computing-as-a-service, virtualized hardware, and standard APIs. It seems to follow generational cycles, he observed. Every new crop of engineers thinks their predecessors were idiots. They tear up the rulebook (or so they think) and start over again.
“It’s all fine with me,” he concluded, “as long as they keep coming into my bar.” Let’s start another round.