
Optane, We Hardly Knew Yeh

Buried in Intel’s recent and somewhat dismal second-quarter 2022 financial results was a line item under “Non-GAAP adjustment or measure” that read: “Optane inventory impairment” for the sum of $559 million. Ouch. About 60 percent of the way through the “Forward-Looking Statements” section of the financial earnings statement is the phrase “the wind-down of our Intel Optane memory business.” Bye, bye Optane.

This is a CEO-level decision, and it can’t have been an easy one for Pat Gelsinger. Optane Persistent Memory has been a foundational cornerstone of Intel’s differentiation strategy for its Xeon server CPUs and its Core CPUs for PCs and laptops.

Intel’s relationship with Optane non-volatile memory has not been a quick fling. Optane started as a joint project between Intel and Micron a decade ago but was formally announced in July 2015. Back then, it was called 3D XPoint by Micron and Optane by Intel. Although the technical and material science details underlying Optane memory were never fully explained, it’s generally considered to be a phase-change type of resistive RAM (ReRAM).

Optane non-volatile memory was available in two forms: SSDs (SATA and NVMe) and “Persistent Memory” DIMMs that could plug into existing DIMM slots, if the system’s memory controller understood Optane DIMMs’ unique timing requirements, which differed from SDRAM timing requirements. Only Intel CPUs and some Intel FPGAs incorporated memory controllers that understood Optane Persistent Memory DIMMs, which was both a competitive advantage for Intel and somewhat of a hindrance to wider use of Optane Persistent Memory DIMMs.
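
For readers who want a concrete sense of what “Persistent Memory” meant to software, here is a minimal, hypothetical sketch in C of the programming model that Optane Persistent Memory DIMMs enabled in App Direct mode: a file on a DAX-capable filesystem backed by persistent memory is memory-mapped, updated with ordinary load and store instructions, and then explicitly flushed so the data survives a power loss. The mount point and file name are assumptions for illustration, and real applications would typically use Intel’s PMDK library and cache-line flush instructions rather than raw mmap and msync, but the byte-addressable idea is the same.

/* Minimal sketch: load/store access to persistent memory in App Direct mode.
 * Assumes a DAX-capable filesystem (e.g., ext4 or xfs mounted with -o dax)
 * backed by a persistent-memory namespace; the path below is hypothetical. */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>
#include <unistd.h>

#define PMEM_PATH "/mnt/pmem/example.dat"   /* assumed DAX mount point */
#define PMEM_SIZE (64 * 1024)

int main(void)
{
    int fd = open(PMEM_PATH, O_CREAT | O_RDWR, 0644);
    if (fd < 0) { perror("open"); return 1; }
    if (ftruncate(fd, PMEM_SIZE) != 0) { perror("ftruncate"); return 1; }

    /* Map the file. With DAX, loads and stores reach the media directly,
     * with no page cache in between. */
    char *pmem = mmap(NULL, PMEM_SIZE, PROT_READ | PROT_WRITE,
                      MAP_SHARED, fd, 0);
    if (pmem == MAP_FAILED) { perror("mmap"); return 1; }

    /* Ordinary store instructions update the persistent memory... */
    strcpy(pmem, "hello, persistent world");

    /* ...but the data isn't guaranteed durable until it is flushed.
     * PMDK would use CLWB/CLFLUSHOPT here; msync is the portable fallback. */
    if (msync(pmem, PMEM_SIZE, MS_SYNC) != 0) { perror("msync"); return 1; }

    munmap(pmem, PMEM_SIZE);
    close(fd);
    return 0;
}

The interesting part is the final step: instead of issuing block I/O through a storage driver, as with an Optane SSD, the application persists data at cache-line granularity with loads, stores, and flushes, which is what made the DIMM form factor attractive in the first place.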

Optane has always been caught between DRAM and NAND Flash memory. DRAM is a bit faster and somewhat more expensive on a per-bit basis than Optane memory, while NAND Flash memory is much slower and much less expensive on a per-bit basis. Optane Persistent Memory DIMMs also offered more bit density than SDRAM DIMMs. The latest (and presumably last) Optane selection guide says that Intel offered non-volatile Optane Persistent Memory DIMMs in 128, 256, and 512 GByte capacities.

Although a few companies adopted Optane Persistent Memory, the benefits of Optane technology never won over most customers, and the envisioned sales volumes needed to drive down manufacturing costs didn’t materialize. Worse, storage analyst and consultant Tom Coughlin believes that “… Intel has been subsidizing its Optane products in order to keep the price low enough to generate demand (roughly half the price of DRAM, in $/GB).”

The End of Optane

Micron announced that it would cease development of 3D XPoint in March 2021. Further, the company said that it was selling the lone 3D XPoint semiconductor fab in Lehi, Utah to Texas Instruments, which meant that Intel would have to find another fab to host Optane production if it wanted to continue selling Optane products. Given Optane’s unique materials and processing requirements, I can just hear the managers of Intel’s processor fabs yelling, “Not in my back yard!”

Intel’s idea of using Optane Persistent Memory to differentiate its processors isn’t a bad one, but clearly the idea wasn’t boosting CPU sales. Q2 2022 revenue for the Client Computing Group (that’s PC and laptop CPUs) was down 25% versus Q2 2021, while sales for the Datacenter and AI Group (that’s where Xeon lives) were down 16%. This is hardly Optane’s fault. The year 2021 was a booming year for Intel, thanks to accelerated purchases of PCs, laptops, and servers caused by COVID-related changes to the way people work. Last year, every semiconductor maker, including Intel, was running fabs around the clock and shipping every chip that tested good. It was a great year for chip makers. This year was bound to be much less exciting in terms of growth for semiconductor makers.

Optane’s termination is not the first time Intel has exited a memory business. Most recently, in October 2020, just a few months before Gelsinger took over as Intel’s CEO, Intel announced that it was selling its NAND Flash business along with its SSD businesses based on NAND Flash and Optane memory. SK hynix bought the NAND Flash business, and a new company, Solidigm (an SK hynix subsidiary), took over Intel’s SSD business. Intel entered the Flash memory business in 1988 with NOR Flash devices and later focused on NAND Flash, which started selling in much greater volumes. That’s because NAND Flash memory, especially multi-level and 3D NAND Flash memory, costs much less per bit than NOR Flash devices.

An even bigger memory divestiture occurred in 1985, when Intel famously exited the DRAM business. Robert Noyce and Gordon Moore left Fairchild Semiconductor and founded Intel specifically to make semiconductor memories. The company literally created the commercial DRAM business with the introduction of the 1103 DRAM in October 1970, and DRAM sales drove the company’s top and bottom lines for several years. However, DRAM was perhaps too successful a product. Everyone wanted to play in the DRAM market. Exactly fifteen years after announcing the world’s first commercial DRAM, Intel announced that it was exiting the DRAM business due to a glut of worldwide DRAM manufacturing capacity, mostly in Japan, which had eroded DRAM pricing and margins to unacceptable levels, at least for Intel.

Andy Grove and Gordon Moore, respectively Intel’s president and CEO at the time, had a famous, private conversation about DRAMs in 1985, when Intel’s DRAM market share was clearly eroding. Grove asked, “What would happen if somebody took us over, got rid of us — what would the new guy do?” “Get out of the memory business,” Moore replied.

Grove and Moore didn’t wait to be replaced. They quickly exited the DRAM business. Grove later described that decision in a 2012 interview with National Public Radio (NPR) correspondent Laura Sydell: “It was an emotional decision. We had been the first to introduce the product and build the business … In retrospect, getting out of DRAMs when we did was the best business decision we ever made.” After Moore and Grove made the decision to terminate the DRAM business, Intel laid off roughly a third of its workforce, more than 7,000 employees, and closed multiple manufacturing plants. During that same 2012 NPR interview, Grove said it took ten years to turn Intel around and get the company back on track after that fateful decision.

Nearly four decades later, Intel CEO Pat Gelsinger found himself facing a similar decision regarding Optane non-volatile memory. With the decision to jettison NAND Flash memory and SSDs already made before his return to Intel, Gelsinger has now ditched Optane as well. No doubt that’s because he needs a leaner, trimmer company to go up against chief CPU rival AMD.

AMD’s CPUs have become much more competitive with Intel’s products with respect to both performance and efficiency, so there’s no room at Intel for money-losing products. Coincidentally, Intel also put its drone business, Intel Drone Light Shows, on the chopping block, likely for the same reason. (Kimbal Musk, Elon’s brother, bought that business from Intel.) At Intel, every employee and every product line is now at battle stations and on high alert with deflector shields set to double front.

The Endless Quest for Storage Class Memory

Intel’s Optane was one of many entries – perhaps the most successful entry to date – in the quest for “storage-class memory,” or SCM, which attempts to deliver low-cost, non-volatile bit storage, like disk drives and SSDs, but with greatly improved, DRAM-like access times. Optane largely achieved this goal, but apparently not the “low-cost” aspect, or at least not low manufacturing cost. So, the quest for an SCM technology that truly challenges SDRAM continues.

There are many candidates for such a technology, and these candidates have been around for a long time. I’ve been writing about SCM for at least a decade and a half and about alternative non-volatile memory technologies for four decades. As of today, none of these technologies are competitive against SDRAM with respect to capacity or cost per bit of data storage, but that doesn’t prevent aspirational SCM vendors from continuing to develop their non-volatile memory technologies. There’s a pot of gold at the end of the SCM rainbow. You simply need to find the right leprechaun.

Personally, I’m rooting for MRAM – magnetic RAM – not because I think that MRAM is close; it’s been “close” for way too long. MRAM devices have been on the market for years, but so far they cost too much per bit, and their device-level capacities are too low. No, I’m rooting for MRAM because I’d feel a certain sense of closure in seeing magnetic memory return to its previous position at the pinnacle of the memory hierarchy.

I’m just barely old enough to remember when Intel announced the 1103 DRAM in 1970. Within five years of its introduction, DRAM dethroned magnetic-core memory, which had reigned for two decades. MIT’s Jay Forrester developed the first successful magnetic-core memory plane and installed it in the groundbreaking MIT Whirlwind computer on August 8, 1953.

Forrester’s magnetic-core memory replaced an older electrostatic memory technology that stored bits in specially built cathode ray tubes (CRTs). Whirlwind’s electrostatic CRT memory had a very slow access time of 25 microseconds, many times slower than what was required. The slow electrostatic memory bogged down the entire computer and prevented the machine from reaching its performance goals. The installation of magnetic-core memory reduced Whirlwind’s memory-access time to 9 microseconds and eased that problem.

Whirlwind’s groundbreaking magnetic cores were installed just one month before I was born. For the next 20 years or so, magnetic-core memory sat atop the computer memory ziggurat, until Intel’s DRAM toppled core memory’s rule. The computer industry’s complete conversion from magnetic cores to DRAM was so swift that many third-party companies making core-memory boards for popular mainframes and minicomputers – companies with now-unfamiliar names like Ampex, DATARAM, Electronic Memories, and Fabri-Tek – either switched from magnetic cores to DRAMs or vanished seemingly overnight. (In reality, it took about five years for core memory to dwindle and die.) DRAM has been king of memory mountain for 50 years now.

Many technologies seek to topple the king and there are several candidates aspiring to become the one true SCM technology:

  • Faster Flash
  • Ferroelectric RAM
  • Resistive RAM
  • Memristors
  • Phase-change memory (PCM) – Optane was in this class


I’ve written about all these candidate SCM technologies over the years. They always seem to be just over the horizon. Like practical fusion power generation, they seem to be perpetually ten years off. Somehow, they all seem a bit further away now that Optane has fallen by the wayside. Until there’s a breakthrough, which could happen any minute (or not), DRAM continues to reign. Long live the king. So long, Optane. We hardly knew yeh.
