
Steve Sanghi’s new “Up And To The Right” book chronicles Microchip’s journey from nearly bankrupt to Top-20 semiconductor maker

Although he did not know it at the time, when Steve Sanghi left Wafer Scale Integration (WSI) to join Microchip in April 1990, he had jumped from the frying pan directly into the fire. Nine months prior, Sanghi had looked at Microchip’s status in the hopes of justifying a merger between ailing WSI and Microchip. At that point, Microchip had $10 million remaining from a Series A funding round and looked to be heading up. However, the merger didn’t happen, and WSI continued to spiral downward. After WSI’s CEO, VP of technology, VP of sales, and VP of process technology all jumped ship, Sanghi, who was the company’s VP of operations, decided to leave the company, too. Initially, he planned to start a new semiconductor company, but Microchip’s investors, who were also WSI’s investors, convinced him to join Microchip instead.

On the day he joined Microchip, the company announced that it would be looking for a replacement for its CEO. He also discovered that the company’s cash position had deteriorated during those intervening nine months. At that point, Microchip had $70 million in annual revenues and was losing money at a rate of $10 million per year. Three months later, in July 1990, Sanghi became the new CEO of Microchip.

By 1990, Microchip was primarily an EPROM manufacturer, selling 256-Kbit chips. EPROMs were commodity parts, and many companies were making essentially identical EPROMs at the time. “It was like selling ice to Eskimos,” says Sanghi. Because Microchip was in a commodity chip business, margins were thin, so yields made the difference between profitability and financial failure. Microchip’s yields were terrible.

As Sanghi explains in his new, self-published book, “Up And To The Right: My Personal and Business Journey Building the Microchip Technology Juggernaut,” the factory yield for wafers coming out of the fab was 70 percent. Of 100 wafers started in the fab, only 70 finished wafers would come out due to misprocessing, out-of-spec wafers, and breakage. For the 70 percent of the wafers that managed to make it through the fab, only 50 percent of the die on the wafers tested good. Assembly and test operations on the back end would scrap another 20 percent of the product. That meant the overall manufacturing yield was just 28 percent (70% × 50% × 80%). As Sanghi explains, any business that manufactures 100 of anything (chips, tires, cars, houses) and ships only 28 will soon go out of business.
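The way per-stage yields compound can be shown in a few lines of Python. This is just an illustrative sketch (the function name and stage groupings are mine); the percentages are the ones Sanghi reports:

```python
from math import prod

def overall_yield(stage_yields):
    """Multiply per-stage yields to get the fraction of started units that ship."""
    return prod(stage_yields)

# Microchip's 1990 figures: fab (70%), wafer sort (50%), assembly/test (80%)
print(f"{overall_yield([0.70, 0.50, 0.80]):.0%}")  # prints 28%
```

Plugging in the post-turnaround figures reported later in the article (96%, 97%, and 99%) gives roughly 92 percent, matching the book's numbers.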


Image credit: Steve Sanghi 

There were no quick fixes in sight for Microchip’s yield problems. For example, Sanghi writes:

“Our process engineers would look at the results of the previous wafer lot and tweak the equipment continuously to try to make the process fall in the center of the distribution. In my opinion, they were making the process go even further away from the center. So I asked the process engineers not to tweak the process and leave it alone for one week. They told me that it would not work and that yields would become very low because our processes needed continuous tweaking. I said I would take responsibility. So, no tweaking for a week. After a week, the engineer came to me and told me we had just had the highest yield from the Fab. A light bulb went on. That led to substantial statistical process control and change management training inside our Fabs.”

Sanghi and his management team initiated as many as 60 process and product improvement teams throughout Microchip in those early days. In the fab, wafer yields climbed from 70 percent to 96 percent. Die yield per wafer went from 50 percent to 97 percent. Assembly and test yield climbed from 80 percent to 99 percent. Overall yield went from 28 percent to 92 percent as a result. Finally, Microchip could make and sell chips at a profit. However, the company was still playing in the commodity EPROM business, which had low gross margins. Microchip needed a more profitable product line.

At this point, it’s important to understand how Microchip managed to devolve into a company selling commodity EPROMs at a loss. It wasn’t always this way. The story starts with General Instrument (GI), an electronics manufacturer founded in 1923 in New York City. During the 1950s, the company’s president, Moses Shapiro, started an ambitious acquisition program, focusing on East Coast companies. GI purchased General Transistor Corporation – located in Jamaica, New York – in 1960, which put the company firmly in the semiconductor business. General Transistor became a seed crystal for GI’s Semiconductor Division. At the end of 1965, GI’s Semiconductor Division hired Frank Wanlass – the Johnny Appleseed of MOS technology – who set the division’s direction firmly towards a MOS future. (See “A Brief History of the MOS transistor, Part 3: Frank Wanlass – MOS Evangelist, Inventor of CMOS.”)

By the end of the 1960s and well into the 1970s, GI’s Semiconductor Division, which was eventually renamed GI Microelectronics, had become a leading provider of advanced MOS integrated circuits with a huge product line. Some of the company’s MOS LSI chips that I recall seeing in the company’s thick product catalogs included:

· Early calculator chips

· Clock and clock-radio chips and modules

· Frequency counter chips

· TV tuner and TV game chips (including a “Pong”-style game)

· Music chips and sound generators

· A text-to-speech synthesizer (I’ve got one of these in an unopened Radio Shack retail package)

· Digital voltmeter chips

· ROMs, SRAMs, and EAROMs (electrically alterable ROMs)

· Keyboard controllers

· Communications chips, including UARTs (universal asynchronous receiver/transmitters) and P/SARs and P/SATs (programmable synchronous/asynchronous receivers and transmitters)

GI developed an early 16-bit microprocessor called the CP1600 with Honeywell as a partner and launched the device in 1975. The CP1600 microprocessor used memory-mapped I/O, but GI didn’t develop conventional peripheral chips for this microprocessor’s bus. Instead, GI developed an I/O microcontroller called the PIC1650 to expand the CP1600’s I/O capabilities. GI introduced the PIC1650 microcontroller in 1976. (See “A History of Early Microcontrollers, Part 9: The General Instruments PIC1650.”)

However, the semiconductor business is a tough one. Perhaps even cruel. In his fantasy novel “Through the Looking-Glass,” published in 1871, Lewis Carroll precisely captured the future plight of semiconductor manufacturers in a statement made by the Red Queen to Alice:

“Now, here, you see, it takes all the running you can do, to keep in the same place. If you want to get somewhere else, you must run at least twice as fast as that!”

Most of the company’s older chip designs became obsolete and were discontinued during the 1980s. GI Microelectronics’ parent company proved unwilling to make the large investments in semiconductor process technology and chip design needed to keep pace with the competition. GI Microelectronics was losing money by the mid-1980s while GI’s cable television and gaming (electronic off-track betting) operations were doing quite well. GI saw no reason to invest its profits in a division that was losing money when it had other, very profitable businesses to nurture.

Consequently, GI lost interest in making chips and spun out GI Microelectronics in 1987 as a wholly owned subsidiary. The new company’s name became Microchip Technology, and GI took a sink-or-swim approach with its new subsidiary. The newly spun-out subsidiary continued to sink. A consortium of investors led by Sequoia Capital purchased Microchip in 1989 and funded it with $12 million in a Series A funding round. Steve Sanghi joined the company a year later, just as the company’s new investors started their search for a new CEO and after the company had burned through $10 million of its Series A funding.

Despite promises to the contrary, the investors declined to put more money into Microchip, so the company teetered on the brink of bankruptcy for nine months. Eventually, the company pulled in another $10.5 million in Series B funding while Microchip’s management team struggled to make the company’s semiconductor process technology profitable. With new funding in place and a profitable process technology, Sanghi and his executives started looking for a way to enter more lucrative markets with the smallest possible investment, using existing company resources.

The strategy that Microchip selected was to focus on microcontrollers. Microchip was still selling ROM-based microcontrollers using its old PIC1650 architecture. Because its design and process technology were so old and because the PIC1650 development tools were rudimentary, Microchip ranked 23rd in microcontroller sales in 1990, the year that Sanghi took over as the company’s CEO. That same year, Motorola’s Semiconductor Products Sector (SPS) ranked first among microcontroller vendors. Microchip had plenty of room for improvement in its sales position.

In 1990, you had two choices for microcontroller chips: ROM-based and EPROM-based. ROM-based parts were inexpensive on a unit-cost basis, but it took many weeks for a vendor to start cranking out parts with a new ROM image, the risk of finding a bug after the design was committed to the fab was high, and there was a large NRE (non-recurring engineering) charge to contend with. EPROM-based microcontrollers sold for ten times the unit cost of ROM-based parts because they required expensive packages with an optical window that allowed strong UV light to reach the microcontroller die during erasure.

Consequently, there was a pent-up demand for low-cost, field-programmable microcontrollers, and Microchip realized it already had the technologies needed to meet this demand. By pairing its on-chip nonvolatile memory technology with its 8-bit PIC1650 architecture, Microchip created an electrically erasable, field-programmable microcontroller that could be packaged in an inexpensive plastic DIP. The company introduced the industry’s first field-programmable microcontroller with on-chip EEPROM program memory, the PIC16C84, in 1993.

However, silicon alone doesn’t win market share in the microcontroller market. Microcontrollers are software-programmable devices and require a tool chain that includes software and development kits to go with the silicon. Microchip released Windows-based development tools and low-cost development kits for its PIC microcontrollers. Microcontrollers also benefit from a robust ecosystem that includes training and consulting resources, so Microchip started building an ecosystem for its PIC microcontrollers.

The company also focused on three underserved groups in the microcontroller market. First, electronics distributors were largely locked out of the microcontroller market because ROM-based parts had to come directly from the factory. EPROM-based parts were not very popular because of their high unit cost, so distributors couldn’t sell many of them. Microchip’s low-cost, field-programmable PIC microcontrollers could be stocked in bulk at distributors, which could also program the parts for customers as a value-added service. As a result, Microchip quickly became popular with distributors.

The availability of Microchip’s PIC microcontrollers, free software, and low-cost development tools also helped to engage two other underserved groups: engineering students and consultants. These groups typically had low-volume requirements, so they relied upon distributors to obtain parts. The PIC microcontroller’s field-programmability met their needs for low-cost parts that could be programmed and reprogrammed. Microchip initiated aggressive programs to support students and consultants. Many of these people eventually joined larger companies and brought their familiarity with PIC microcontrollers with them. That was also part of Microchip’s strategy.

A chart on page 24 of Sanghi’s book tells the company’s entire microcontroller success story. In 1990, Motorola was the top microcontroller supplier and Microchip was 23rd. By 1993, when Microchip introduced the PIC16C84, the company’s ranking had already risen to 13th. By 2005, Microchip had risen to the #3 slot. Meanwhile, Motorola divested SPS in 2004, SPS became Freescale Semiconductor, and private investors bought the company in 2006. (Freescale was later absorbed by NXP.) By 2006, Microchip had taken the #1 slot, and Freescale had dropped to third place. Microchip had gone from 23rd place to 1st place in 16 years. By 2013, Microchip was shipping more than one billion microcontrollers every year.

Once the company was on a firm financial and technical footing, Microchip embarked on an aggressive acquisition strategy. The idea was to capture all of the sockets surrounding the microcontroller in a customer’s design, in addition to the microcontroller socket itself. Microchip bought semiconductor companies, software companies, and additional fab capacity when the prices were right. Among Microchip’s semiconductor, software, and system acquisitions are:

· KeeLoq (wireless security, 1995)

· TelCom Semiconductor (analog and mixed-signal ICs, 2000)

· Hampshire Company (touch-screen technology, 2008)

· HI-TECH Software (C compilers, 2009)

· ZeroG Wireless (WiFi chips, 2010)

· Silicon Storage Technology (non-volatile memory chips, 2010)

· Ident Technology (capacitive sensing, 2012)

· Roving Networks (WiFi and Bluetooth ICs, 2012)

· Standard Microsystems (networking controllers and flash media cards, 2012)

· Novocell Semiconductor (non-volatile memory IP, 2013)

· EqcoLogic (equalizer and coaxial transceiver chips, 2014)

· ISSC Technologies (Bluetooth SoCs, 2014)

· Supertex (mixed-signal and high-voltage ICs, 2014)

· Micrel (networking and consumer ICs, 2015)

· Atmel (microcontrollers, 2016)

· Microsemi (FPGAs, analog, RF, and mixed-signal ICs, 2018)

· Tekron International (GPS and precision-timing ICs, 2020)

According to the company’s latest investor presentation in May, Microchip saw $8.4 billion in revenue for its 2023 fiscal year. That’s hard evidence of a successful strategy.

After spending 31 years as the company’s CEO, Sanghi relinquished that role in 2021 and turned the reins over to Ganesh Moorthy. He’s now the company’s Executive Chairman. His new book chronicles the strategies he and his management team used to transform Microchip from a money-losing semiconductor maker into a solid Top-20 semiconductor maker.

Steve Sanghi attending a FIRST robotics competition. Image credit: Steve Sanghi

If you’re interested in reading a case study that covers both the technical and business strategies of a highly successful technology turnaround, Sanghi’s “Up And To The Right” is clearly the right book for you.
