“That’s one small step for [a] man, one giant leap for mankind.” – Neil Armstrong
We all know that computers get faster over time, but it’s sobering to realize just how much faster they’ve gotten.
Next month will mark the 51st anniversary of the first crewed landing on the moon, thanks to the Apollo 11 astronauts and all the engineers and scientists who put them there. By now most of us have heard the story of how lunar module (LM) commander Neil Armstrong had to ignore the onboard computer’s warnings and fly the LM down to the lunar surface when the machine got overwhelmed.
Unlike Gemini capsules, which were flown manually like an airplane, Apollo was fly-by-wire. It was computer controlled, with a combination of input methods including a pilot’s joystick, a calculator-style keypad (a first, since there were no calculators), and electroluminescent seven-segment displays driven by mechanical relays.
The Apollo Guidance Computer (AGC) was state-of-the-art for its time but would be breathtakingly underpowered by today’s standards. It’s not nearly as fast as your laptop, or even your phone – doesn’t matter what phone. It had less computing power than a digital watch. Less than a cable modem or a Wi-Fi card. Less performance, even, than a USB charger.
Programmer Forrest Heller has done a quick analysis of the differences between NASA’s AGC circa 1969 and a couple of different modern USB chargers. His findings? A good USB charger isn’t just faster than the AGC; it’s more than 500× faster than the computer that took us to the moon and back. Strewth.
Contrary to popular belief, the AGC was built using silicon integrated circuits. Real ICs! More than 2500 of them, in fact, each one an identical dual 3-input NOR gate. That’s it. With no other logic, NASA’s engineers had to synthesize all other functions and operations from nothing but 3-input NORs. Those thousands of ICs were wire-wrapped to a passive circuit board. A nearly identical AGC lived in the Apollo 11 Command Module.
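If that sounds abstract, here’s a minimal sketch in C (a language nobody at the Instrumentation Lab was using, obviously) of the same trick in software: treat a 3-input NOR as the only primitive you’re allowed and build NOT, OR, and AND out of it, much as the AGC’s designers did in hardware, with unused inputs tied low.

```c
#include <stdio.h>
#include <stdbool.h>

/* The only primitive available: a 3-input NOR gate.
 * Output is 1 only when all three inputs are 0. */
static bool nor3(bool a, bool b, bool c) { return !(a || b || c); }

/* Everything else is wired up from NORs. */
static bool not1(bool a)         { return nor3(a, a, a); }                   /* NOR of a with itself */
static bool or2(bool a, bool b)  { return not1(nor3(a, b, b)); }             /* invert the NOR */
static bool and2(bool a, bool b) { return nor3(not1(a), not1(b), false); }   /* De Morgan; third input tied low */

int main(void) {
    for (int a = 0; a <= 1; a++)
        for (int b = 0; b <= 1; b++)
            printf("a=%d b=%d  NOT a=%d  a OR b=%d  a AND b=%d\n",
                   a, b, not1(a), or2(a, b), and2(a, b));
    return 0;
}
```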
Memory was mostly wire-wound magnetic cores. There were 2048 words (about 4KB) of RAM and 36 kilowords of read-only memory for program storage. In memory, a “word” was 15 bits of data plus a parity bit; inside the all-NOR “processor,” it was 14 data bits, a sign bit, and an overflow bit. The machine used one’s-complement integer arithmetic; there was no floating-point capability, despite the obvious applications in spaceflight.
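For the curious, here’s a rough sketch of what one’s-complement arithmetic looks like on a 15-bit word: negation is just bit inversion, addition wraps any carry out of the top bit back around to the bottom, and you end up with both a positive and a negative zero. This is a simplified illustration; the real AGC’s overflow and double-precision handling were hairier.

```c
#include <stdio.h>
#include <stdint.h>

#define WORD_MASK 0x7FFF   /* 15-bit word: a sign bit plus 14 data bits */

/* One's-complement negation: invert every bit of the word. */
static uint16_t oc_neg(uint16_t x) { return (~x) & WORD_MASK; }

/* One's-complement addition with end-around carry:
 * any carry out of bit 14 is wrapped back into bit 0. */
static uint16_t oc_add(uint16_t a, uint16_t b) {
    uint32_t sum = (uint32_t)a + b;
    if (sum > WORD_MASK) sum = (sum & WORD_MASK) + 1;   /* end-around carry */
    return (uint16_t)sum;
}

/* Interpret a 15-bit one's-complement word as a signed integer. */
static int oc_value(uint16_t x) {
    return (x & 0x4000) ? -(int)oc_neg(x) : (int)x;
}

int main(void) {
    uint16_t five = 5, minus_three = oc_neg(3);
    printf("5 + (-3) = %d\n", oc_value(oc_add(five, minus_three)));   /* prints 2 */
    printf("+0 = %d, -0 = %d\n", oc_value(0), oc_value(oc_neg(0)));   /* two bit patterns, same value */
    return 0;
}
```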
The AGC was programmable, of course. It had registers, a program counter, a 12-bit address map, and a fully defined instruction set. It could do addition, subtraction, multiplication, and even division. It had conditional and unconditional branches (with a one-deep return stack), bitwise logic functions, and lots of timers. It makes the x86 architecture seem positively orthogonal.
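That one-deep return stack deserves a moment. In practice it meant the return address lived in a single register (the AGC’s Q register), so a subroutine that called another subroutine had to stash the old return address itself or lose its way home. The toy C sketch below (not real AGC code or opcodes, just an illustration of the idea) shows the failure mode.

```c
#include <stdio.h>

/* Toy illustration only: a "one-deep return stack" is just a single
 * register holding the return address. One call works fine; a nested
 * call overwrites the saved address unless software copies it first. */

static int return_reg = -1;          /* the lone return-address register */

static void call(int from, int *pc, int target) {
    return_reg = from + 1;           /* remember where to come back to... */
    *pc = target;                    /* ...and jump */
}

static void ret(int *pc) {
    *pc = return_reg;                /* whatever is in the register wins */
}

int main(void) {
    int pc = 100;
    call(pc, &pc, 200);              /* outer call: return_reg = 101 */
    printf("after outer call:  pc=%d, saved return=%d\n", pc, return_reg);

    call(pc, &pc, 300);              /* nested call: return_reg = 201, and 101 is gone */
    printf("after nested call: pc=%d, saved return=%d\n", pc, return_reg);

    ret(&pc);                        /* returns to 201, not to the original caller */
    printf("after return:      pc=%d\n", pc);
    return 0;
}
```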
Every AGC instruction took at least 12 clock cycles to execute, and often 24 or 36 cycles were needed. Integer division took 72 cycles. There was a low-power standby mode, but it was never used in flight. The machine itself was considerably bigger than a shoebox and weighed about 70 pounds in its metal case. Keypad sold separately.
The AGC ran at a 1,024-kHz (1.024-MHz) clock frequency, which works out to around 1 microsecond per cycle, or more than 11 microseconds for even the quickest instructions.
Compared to that, Anker’s $49 USB charger has one of Cypress Semiconductor’s CYPD4225 USB-C microcontrollers, which is based on a Cortex-M0 CPU running at 48 MHz. Right off the bat, the USB charger enjoys almost a 47× advantage in clock speed. More significant, however, is the fact that the Cortex-M0 executes most instructions in a single cycle, not dozens of cycles like the AGC. Assuming that ARM instructions are more or less equivalent to AGC instructions – and they are – that works out to well over a 500:1 ratio in performance.
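Here’s that back-of-the-envelope arithmetic spelled out, taking the figures above (1.024 MHz and a 12-cycle minimum per instruction for the AGC, 48 MHz and roughly one cycle per instruction for the Cortex-M0) as the assumptions:

```c
#include <stdio.h>

int main(void) {
    /* Figures quoted in the article; real workloads will vary. */
    const double agc_clock_hz = 1.024e6;   /* AGC clock */
    const double agc_cycles   = 12.0;      /* fastest AGC instruction */
    const double m0_clock_hz  = 48.0e6;    /* Cortex-M0 in the charger */
    const double m0_cycles    = 1.0;       /* most M0 instructions */

    double agc_ips = agc_clock_hz / agc_cycles;   /* ~85,000 instructions/second */
    double m0_ips  = m0_clock_hz  / m0_cycles;    /* 48,000,000 instructions/second */

    printf("AGC:   %.0f instructions/second\n", agc_ips);
    printf("M0:    %.0f instructions/second\n", m0_ips);
    printf("Ratio: about %.0f : 1\n", m0_ips / agc_ips);
    return 0;
}
```

Run it and you get roughly 563:1, comfortably past the 500× mark.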
The USB charger also has twice as much RAM and nearly twice as much ROM. On the other hand, 32-bit ARM instructions take up more memory space than AGC’s 16-bit words, so we could argue that the USB charger has less effective ROM space. “Going to the moon on 8KB sounds hard,” Heller points out.
I don’t know which is more impressive: that we sent people to the moon and back with so little computing power, or that we’ve improved so radically since then. Fortunately, we don’t have to choose. Both are real.