
A History of Early Microcontrollers, Part 4: The Intel 8048 and 8748

Intel announced the first commercially successful microprocessor, the 4004, in late 1971. By 1974, Intel had introduced four microprocessors: the 4-bit, “low-end” 4004 and the upgraded 4040, and the 8-bit 8008 and 8080. Intel’s 4-bit 4004 and 4040 microprocessors were used primarily for embedded control applications, where I/O capabilities and lower part cost outweighed the superior processing performance of 8-bit devices. However, Intel no longer had the microprocessor market to itself. Several other semiconductor vendors had introduced competing microprocessors by 1974, notably the 4-bit Rockwell PPS-4, the 8-bit Motorola 6800, the multichip Fairchild F8, and National Semiconductor’s 16-bit, multichip IMP-16. Toshiba had designed, built, and delivered the 12-bit TLCS-12 microprocessor specifically as an engine controller for Ford, and many more microprocessors were on the way.

Even worse, the 4-bit TMS1000 microcontroller family, introduced by Texas Instruments (TI) in 1974, put a 4-bit CPU, RAM, ROM, and I/O circuitry on one chip, which simplified system design and significantly cut the cost of processor-based control. The introduction of TI’s TMS1000 family certainly got the attention of several prospective customers (and several semiconductor makers). Intel, under siege, was starting to lose embedded system designs to competing microprocessor and microcontroller vendors. That had to stop. Intel needed to respond, and the company knew it.

When Henry Blume Jr. arrived at Intel in October 1974, there was already an internal agreement that the company would develop a microcontroller. Intel was already making microprocessors, RAM chips, ROMs, and UV-erasable EPROMs and had the process technologies to make these parts in house. The main decision yet to be made was whether the microcontroller would have a 4-bit CPU, like the TMS1000, or an 8-bit CPU. According to an oral history panel conducted by the Computer History Museum (CHM) in 2008 with many of the people responsible for developing the 8048, Ed Gelbach, Intel’s Senior Vice President of Corporate Marketing, made the decision: Intel’s microcontroller would have an 8-bit CPU because a 4-bit CPU didn’t have enough sex appeal. (Note: The 8048 oral history cited in the reference section contains several references to sex appeal – not in a sexual sense; the phrase seems to have been a frequent figure of merit at Intel in those days.)

Although the 8048 is the generic name that’s traditionally used for Intel’s first microcontroller – the actual family name is MCS-48 – the oral history makes it clear that the 8748 EPROM version appeared first. That’s because the 8748 could be used immediately for software development and early prototyping, and it would take a year or two for customers to develop their software to the point where they could order ROM-based 8048 devices. The 8748 would also be a more difficult device to manufacture because Intel’s EPROM process technology was quite different from the company’s other process technologies, so it made sense to have that device ready first and then take the next year to design the ROM-based 8048. Finally, Intel could and would charge a lot more money for a reusable 8748, which meant the company could collect more revenue, sooner.

David Stamm joined Intel in January 1974 and initially worked on fixing bugs in the 4004 and 4040 microprocessors, which were in production at the time. He then designed the Intel 4308, a support chip for the Intel 4040 that combined a 1-Kbyte ROM with some I/O ports. The 4308 packed in the functions of four Intel 4001 chips – each a 256-byte ROM paired with an I/O port – so it contained about half a microcontroller, lacking only a CPU and RAM. After designing the 4308, Stamm was assigned to the 8048 project. He states in the 8048 oral history that he was single at the time and so he “lived and breathed the 8048 for two-plus years, night and day.”

Stamm’s first responsibility was to develop the microcontroller’s instruction set. It would be all new, to suit an 8-bit microcontroller instead of the microprocessors that Intel was already making. In the oral history, Stamm recalls:

“There were three of us who were full-time on the project: myself, I was in charge of the instruction set and the logic design and the overall chip schedule; David Budde, who was in charge of the complex tasks associated with integrating the EPROM technology and all of the EPROM programming logic and the sensing logic for the EPROM components, as well as most of the circuit design; and then Dwayne Hook who was responsible for all of the layout of the chip…

“The first phase was really the instruction set design. So here I am, I’m out of college a year, had a Bachelor’s degree, thinking to myself what business do I have developing the next generation instruction set for these components? And really, nobody was looking that carefully over my shoulder, but I said, okay, fine I’ll go for it. Luckily, in college I had taken assembly language programming in a language called COMPASS, which was part of CDC, which is no more.

“But I learned a lot about assembly language programming there, and then I’d worked on the 4040 and I looked at all the other instruction sets in order to try to come up with what kind of instruction set might make sense. The challenge was [that] you had to consider the instruction set in terms of how much complexity in die-cost it would be.

“So, for example, subtractions and compares would be really valuable instructions, but they added dramatically in additional chip area. So, I decided we really couldn’t afford those, especially since we were going to go into an 8-bit design. I had been a big proponent of 8-bit. One of the counter arguments was cost and die-size. So, during these design steps I did everything I could to jettison features and functions that I thought was going to add extra die-size. Looking back at it now, I think I probably went a bit overboard — although at the time it would’ve been difficult to have made that conclusion.”
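
An aside that isn’t in the oral history: with no subtract or compare instructions, MCS-48 programmers synthesized subtraction from the complement and add instructions the chip did have – the often-cited CPL A / ADD A,Rr / CPL A sequence. Here is a minimal C sketch (my own illustration, with a hypothetical function name) of the 8-bit identity behind that idiom:

```c
#include <stdint.h>
#include <stdio.h>

/* Illustration only: the 8048 has no SUB instruction, so A - R was computed
   with complement, add, complement (CPL A / ADD A,Rr / CPL A).
   In 8-bit modular arithmetic, ~(~a + b) == a - b.                        */
static uint8_t sub_via_cpl_add(uint8_t a, uint8_t b)
{
    uint8_t t = (uint8_t)~a;      /* CPL A              */
    t = (uint8_t)(t + b);         /* ADD A, Rr          */
    return (uint8_t)~t;           /* CPL A  ->  a - b   */
}

int main(void)
{
    printf("0x40 - 0x0F = 0x%02X\n", sub_via_cpl_add(0x40, 0x0F)); /* prints 0x31 */
    return 0;
}
```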

At one point in the oral history, Blume interjects:

“I also want to point out Dave’s two famous – or his favorite instructions were SEX and SIN, which stood for set external mode and set internal mode. And then when the people in systems took over, they removed SEX and SIN… they renamed them.”

The design limitations imposed to ensure that the chip could be manufactured with the available process technology were not restricted to the instruction set; program memory space was another important constraint. The original 8048 had a 4-Kbyte address-space limit for program memory, and even that was really split into two 2-Kbyte banks. The 8048’s program counter ostensibly had 12 bits, but the most significant bit came from a separate register that could be used for bank switching. Initially, the 8048 incorporated a 1-Kbyte program memory, so this decision didn’t pinch at first. Eventually, it would. Stamm explains, “…there were lots and lots of limitations that existed, primarily in order to keep the die-size down.”
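
To make the bank split concrete, here is a toy C sketch (my own illustration, not Intel’s logic) of how a 12-bit program-memory address can be viewed as an 11-bit offset within a 2-Kbyte bank plus a bank-select bit. On the real chip, that bit comes from a flip-flop set with the SEL MB0 and SEL MB1 instructions and is applied to the program counter on the next JMP or CALL:

```c
#include <stdint.h>
#include <stdio.h>

/* Toy model for illustration: form a 12-bit program-memory address from an
   11-bit offset within a 2-Kbyte bank plus a bank-select bit (PC bit 11). */
static uint16_t program_address(uint16_t offset11, uint8_t bank_bit)
{
    return (uint16_t)(((bank_bit & 1u) << 11) | (offset11 & 0x07FFu));
}

int main(void)
{
    printf("bank 0, offset 0x123 -> 0x%03X\n", program_address(0x123, 0)); /* 0x123 */
    printf("bank 1, offset 0x123 -> 0x%03X\n", program_address(0x123, 1)); /* 0x923 */
    return 0;
}
```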

Intel had no simulation software in those early days, so the design team built a breadboard instead. Stamm recalls:

“So, we built this large breadboard, which in and of itself was yet a whole other design project, using the same type of combinatorial logic that we were hoping to eliminate through the development of the 8048… we were using TTL and DTL devices, and the breadboard was huge as I recall. It was probably five foot tall by two or three feet wide, just completely covered with wiring on the back.”

Such was the state of the art for chip design in those days.

Before releasing the new microcontroller to production, the design team had to meet with Intel’s executives – Andy Grove, Gordon Moore, and Les Vadasz – and convince them that the 8748 was ready to enter the market. A demo was in order, and Stamm, who loved to gamble and frequented Lake Tahoe casinos, decided to write a blackjack program. The programmed 8748 would drive a dumb terminal, presumably through an RS-232 voltage translator. Stamm recalls that he discovered the limitations of his own microcontroller instruction set while writing the blackjack program. He also filled the 8748’s program memory before he could add a double-down feature to the game. The very first application program written for the 8048 immediately brought its main limitations to light.

To add sex appeal to the demo, the team decided to power the 8748 with a battery made from strips of copper and zinc stuck into a lemon or an orange. At least that’s how it’s described in the oral history. The 8748 needs 5 volts at roughly 100 milliamps to run, while a copper-zinc cell made from a single orange generates a volt or less at only a tiny fraction of that current, so either a string of fruit cells powered this demo or the story is apocryphal. It’s a great story though.

Intel announced the MCS-48 microcontroller family, which included the 8048 and 8748, in late 1976, with the goal of shipping 1000 revenue units, all 8748 devices, in the first quarter of 1977. Blume recalls that the actual number of revenue units shipped was 770, which everyone considered a success and which was no doubt the result of a lot of pre-selling. The Intel 8748 was a hit. Product manager Howard Raphael recalls that early customers included Gilbarco (gasoline pumps), Tektronix, and Chrysler.

Magnavox based its Odyssey2 video game console on an 8048. The 8048 was used extensively to power a variety of analog music synthesizer keyboards including the Korg Trident series, the Korg Poly-61, the Roland Jupiter-4, and the Roland ProMars.

Like many analog music synthesizers, the Roland Jupiter-4 used an Intel 8048 as a controller. Image credit: Raymangold22, Wikimedia Commons

The Sinclair QL personal computer used an Intel 8049 (an 8048 with a 2-Kbyte ROM) to manage its keyboard, joystick ports, RS-232 inputs, and audio output. Nintendo used the ROM-less 8035 microcontroller (quite possibly an 8748 with a bad EPROM) in its original Donkey Kong arcade game to generate the game’s music. My friend and colleague Wally Wahlen incorporated an Intel 8048 as a controller into his design of the Hewlett-Packard 9876 thermal page printer, which was introduced in 1979.

Nintendo used an 8035 (a ROM-less 8048) to generate music in its Donkey Kong video game arcade console. Image credit: Rob Boudon, New York City, USA, Wikimedia Commons

Eventually, the IBM PC would use an Intel 8048 as a keyboard controller. However, the IBM PC was not the first computer to use the 8048 this way. That milestone belongs to the Tandy TRS-80 Model II, which used a cost-reduced, 28-pin version of the 8048 called the 8021 to manage its detachable keyboard and scan the keys.

The Tandy TRS-80 Model II personal computer used an Intel 8021, a cost-reduced member of the 8048 family, as a keyboard controller. Image credit: Piergiovanna Grossi, Wikimedia Commons

The Intel 8048 family became a huge success for Intel, but its design limitations surfaced almost immediately. The limitation on program address space began to noticeably pinch customers by 1977, and, by the fourth quarter of that year, just a year after its introduction, Intel started to define the 8048’s successor, which would be called the 8051. It would become an even bigger success and could easily be called the microcontroller that would not die, at least not yet. However, that story must wait its turn.

References

Oral History Panel on the Development and Promotion of the Intel 8048 Microcontroller, Computer History Museum, July 30, 2008
