In 1952, the same year that IBM introduced its first electronic computer, the tube-based Model 701, the company became an original licensee of the Bell Labs transistor patents, and Thomas Watson Jr. became IBM’s president. Before 1952, IBM specialized in punched-card machines: card readers, card punches, card tabulators, card calculators, and of course, the punched cards themselves. Before the IBM 701, the company essentially made electromechanical equipment; starting with the IBM 701, it entered the electronics era. Just five years later, in 1957, Watson Jr., by then IBM’s CEO, issued a product-development policy proclaiming: “It shall be the policy of IBM to use solid-state circuitry in all machine developments. Furthermore, no new commercial machines or devices shall be announced which make primary use of tube circuitry.” That policy would put IBM into the transistor business – the bipolar transistor business.
Watson Jr.’s concerns about the company’s lack of R&D, along with recommendations from IBM executives and outside consultants, led IBM to create IBM Research in 1956. IBM Research was structured to conduct research independent of the company’s product development and manufacturing needs; its projects had a long-range outlook on technology, with five- and ten-year horizons. Also in 1956, IBM started a separate corporate semiconductor research group, which from the start took responsibility for meeting IBM’s immediate semiconductor needs while IBM Research was tasked with long-range projects.
Initially, IBM did not make its own semiconductors. It bought them on the commercial market. Its first major transistor supplier was Texas Instruments (TI), under an agreement that made TI IBM’s primary semiconductor supplier and also provided for joint semiconductor research projects between the two companies.
In 1960, IBM formed an organization that shortly became the company’s Components Division (CD). The division’s mission was to develop and make semiconductor devices specifically for IBM computers and other IBM equipment. Although TI and Fairchild had independently invented the integrated circuit by 1959, IBM’s CD chose a different path to component integration, which it called “solid logic technology” (SLT). This approach leveraged the thick-film, ceramic hybrid technologies developed in the 1950s. One Fairchild development that CD did adopt was the planar semiconductor manufacturing process, which CD licensed to make bipolar transistors, either packaged individually or used in SLT devices.
IBM developed SLT devices for the upcoming IBM 360 computer family, which would further establish IBM as the leading mainframe computer manufacturer. IBM also used an SLT variant to build the Launch Vehicle Digital Computer (LVDC), which was used to guide and control the Saturn V rocket used by the US Apollo program to reach the moon.
An SLT device consists of one or more bipolar diode or transistor die mounted on a ceramic substrate along with several passive components. The transistors were mounted face down, enabled by early versions of flip-chip packaging and solder-bump technology. SLT devices were the equivalent of an IC, but they were not monolithic. To make SLT devices cost competitive, IBM invested a large sum of money to automate SLT manufacture, echoing the manufacturing automation of Project Tinkertoy, the modular-electronics program that the US National Bureau of Standards developed for the US Navy in the 1950s. IBM’s heritage was electromechanical, so the company was able to develop machinery that could crank out bushels of SLT devices.
Because IBM was the volume leader in computer mainframes, it could drive the component volumes needed to make SLT economically viable in IBM’s pocket universe, where component cost mattered less because the company sold (mostly leased) complete computer systems. IBM was not a merchant semiconductor vendor, so component-level pricing was largely an internal accounting matter. IBM assumed that SLT would remain a viable technology for at least five years, and by 1962, managers in IBM’s CD were dead set against ICs, apparently because ICs threatened the division’s investment in SLT manufacturing. That attitude left IBM Research, with its five- and ten-year project horizons, free to pursue IC research and development work.
To be fair, IBM was able to put SLT into production much faster than it would have been able to develop ICs. However, SLT contained a time bomb. Integrating just a dozen or so components on an SLT hybrid was eminently practical, but the technology would not remain practical when the number of integrated components grew to hundreds or thousands.
IBM Research had previously been precluded, by an agreement with CD, from any research on semiconductor devices based on germanium or silicon. However, the work on MOS transistors at Fairchild and RCA, underscored by the many MOSFET papers published at semiconductor conferences, gave IBM Research the opening it needed to start IC development work. No company had successfully developed commercial MOS ICs by 1963, but the possibility of putting large numbers of MOS transistors on one monolithic die was an irresistible lure, sufficient to cause IBM Research to dare to cross the line drawn by CD.
Although CD had experimented with MOS transistors, they were too slow to be practical, for the near term at least, which strongly suggested that IBM Research should be looking into addressing that shortcoming. IBM Research started its LSI program to develop large-scale ICs in 1963. Within months, CD was challenging IBM Research’s right to do so. The matter went to IBM’s R&D board, where CD argued that any research on semiconductor materials in use at the time belonged in CD. The R&D board sided with IBM Research. The LSI research program continued.
At the time, IBM’s goal was building the world’s fastest computers. That goal was part of being the world’s biggest mainframe manufacturer. Early MOS transistors were the opposite of fast. They were a hundred times slower than bipolar transistors. By 1965, IBM Research’s MOS/LSI program had made progress in the development of MOS ICs, but the lack of speed made these chips of marginal value to IBM. SLT devices and even bipolar ICs purchased from other vendors were more suitable.
In October 1965, after presentations made by CD (which had become part of IBM’s Systems Development Division) in support of bipolar IC development and against MOS/LSI development, IBM’s Corporate Technology Board directed IBM Research to curtail the MOS/LSI research program. However, the company’s Corporate Technology Board had no authority to enforce its decision. It was an advisory panel. IBM Research ignored the board’s decision and the MOS/LSI program rolled on. However, even though the MOS/LSI program continued, finding customers for the slow technology proved difficult.
In December 1963 and again in August 1964, IBM’s Americo DiPietro visited General Microelectronics, where he met Frank Wanlass, the Johnny Appleseed of MOS development. Wanlass made it his life’s work to help MOS semiconductors attain the high-density IC destiny he’d envisioned while earning his physics PhD at the University of Utah. He learned what he needed to know at Fairchild Semiconductor and then began making MOS transistors and ICs at General Microelectronics.
For most people, to meet Wanlass was to become a MOS disciple, and that’s exactly what happened to DiPietro. He returned to IBM and became a strong advocate for ICs over SLT, for MOS technology, and for IBM Research’s LSI program. By 1966, DiPietro was working for J.A. Haddad, IBM’s corporate Director of Technology and Engineering and a member of the Corporate Technology Board. Eventually, DiPietro convinced Haddad that ICs were the path forward. (DiPietro, who later became IBM’s Director of Technology, retired in 1986.)
During these early days of MOS development in the 1960s, all companies, including IBM, focused on p-channel MOS transistors. It was nearly impossible to make working n-channel devices, a problem that was eventually traced to sodium contamination. At Fairchild, Wanlass had discovered that sodium contamination caused p-channel MOSFETs to exhibit severe parametric drift and rendered n-channel MOSFETs essentially non-operational.
IBM’s bipolar transistor problem was one of reliability. Some of the transistors in its SLT devices would fail due to electrical leakage, and CD’s work on the problem uncovered both the culprit and the solution. The culprit was sodium; the solution was phosphorus. By 1964, IBM had discovered that adding trace amounts of phosphorus to the silicon dioxide layer in the planar process causes the phosphorus to bond with the sodium, preventing it from migrating into and poisoning the silicon below. Further work determined that applying a negative voltage to the silicon substrate alleviated the sodium-contamination problem still further by increasing the MOSFET’s threshold voltage.
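The substrate-bias (body) effect just described can be sketched with the standard textbook relation for an n-channel MOSFET’s threshold voltage (this equation is general MOSFET theory, not taken from the IBM history itself):

```latex
% Body effect: reverse-biasing the substrate (body) relative to the source
% raises an n-channel MOSFET's threshold voltage. V_{T0} is the zero-bias
% threshold, \gamma the body-effect coefficient, \phi_F the Fermi potential,
% and V_{SB} the source-to-body voltage (positive when the substrate is
% biased negative relative to the source, as in IBM's fix).
\[
  V_T = V_{T0} + \gamma\left(\sqrt{2\phi_F + V_{SB}} - \sqrt{2\phi_F}\right)
\]
```

A higher threshold meant that sodium-induced charge in the oxide was less likely to turn a device on unintentionally, which is why the negative substrate bias helped stabilize these early transistors.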
By 1965, IBM Research had shifted the focus of its MOS/LSI program to developing n-channel MOSFETs, which are two or three times faster than p-channel devices because electron mobility in silicon is roughly two to three times hole mobility, and IBM’s systems groups wanted speed. Instead of researching the characteristics of MOS transistors and MOS processing, the group directed its efforts at making MOSFETs attractive to CD, devoting significant resources to making the MOS manufacturing process cleaner so that it produced more stable MOSFETs. By 1966, it was possible to reliably make stable n-channel devices. By solving the sodium-contamination problem in its bipolar manufacturing process, the company had also made the manufacture of n-channel MOSFETs practical. Soon, the entire semiconductor industry would follow IBM’s lead and turn to n-channel MOSFETs, at least for a few years.
IBM’s systems development group in Poughkeepsie then identified the perfect vehicle for MOS ICs: memory. The incumbent memory technology, magnetic core, was slow, with access and cycle times on the order of microseconds. MOS memory ICs would be faster than that, and they could significantly cut memory’s cost per bit while speeding access times. In 1966, the systems development group offered IBM Research a contract to develop MOS memory chips. Meanwhile, CD showed no interest; it was working on bipolar memory ICs.
IBM Research published a study suggesting that MOS memory would cut the cost of memory in IBM’s mainframes by 40 to 80 percent relative to bipolar memories, due to the increased device density on the IC. Frank Wanlass’s original 1962 vision was becoming reality.
Eventually, IBM succeeded in making MOS memories but was beaten to market by Intel, which announced the groundbreaking 256-bit 1101 MOS SRAM in 1969, followed a year later by the 1-kbit 1103 MOS DRAM. Nevertheless, the work IBM Research invested in MOSFETs paid off. The technology was adopted by the company’s semiconductor manufacturing plant in Burlington, Vermont, which began manufacturing MOS ICs in 1971. By 1972, IBM had introduced two computer systems, the IBM System/370 Models 158 and 168, which used 1-kbit MOS memory chips made by IBM. The following year, IBM announced that it would upgrade to 2-kbit memory chips. By 1975, IBM’s MOS memory manufacturing program was one of the world’s largest. MOS ICs had found a home at IBM after more than a decade of work. But by then, it wasn’t just IBM making MOS ICs. There were many players.
Memory ICs firmly established MOS manufacturing at IBM, and IBM Research continued to innovate and drive MOS development. IBM Research pioneered or helped to pioneer several manufacturing techniques that have allowed MOS technology to scale for decades, including ion implantation, chemical-mechanical polishing (CMP), copper interconnect and the damascene process required to make copper interconnect practical, high-k dielectrics, silicon-germanium transistors, FinFETs, extreme ultraviolet (EUV) lithography, and nanosheet/gate-all-around (GAA) transistors. IBM Research has since developed a 2nm MOS process technology and is partnering with Rapidus in Japan to commercialize it. The history of IBM Research’s involvement with MOSFETs is nearly synonymous with the MOSFET’s history, and IBM Research’s work was clearly instrumental in finding a home for MOSFETs, the technology that no one wanted.
References
Ross Knox Bassett, To the Digital Age: Research Labs, Start-Up Companies, and the Rise of MOS Technology, Johns Hopkins University Press, 2002
For a fascinating analysis of IBM’s SLT devices and their use in the Apollo moon program, see “The core memory inside a Saturn V rocket’s computer” by Ken Shirriff.