
Big Data in Semi Manufacturing

Rummaging for Better Results

He looked yet one more time out onto the crime scene. The answer was out there somewhere. But how many times could he come back “with fresh eyes,” hoping to see something he hadn’t seen the last time? Dried leaves turned a particular way… dirt compacted ever so slightly more here than there… distinctions hard to make out with the naked eye, but, given enough data and some way to sort through it all, he knew a pattern would emerge that would lead him to the answers he needed.

A few weeks back, we looked at how Big Data was impacting the world of design management. But that’s not the only corner of the semiconductor world infected by the Big Data bug. At this summer’s Semicon West show, I met up with a couple of companies using data analysis to improve different aspects of semiconductor manufacturing.

The idea behind any of these ventures is that there’s a ton of data out there. We use it to a limited extent, but there’s tons more we could learn if we aggregated it and sliced and diced it the proper way. And, if we automate some of that – once we figure out exactly what recipe we want to follow – then we can integrate it into the actual manufacturing flow, providing control feedback as well as alerts and notifications that something might be amiss. Or we might be able to build small, inexpensive equipment that competes with much larger units.

One of the new things about Big Data is a different way of organizing the data crudely (or, more accurately, not organizing it) so that you can keep up with the firehose feeding yet more data into the system, even as impatient analysts want to see the latest NOW and make inferences and see trends before the Other Guy does. The poster child for this new approach is the Hadoop project.

However, one of the reasons for taking that approach is the broad, unpredictable variety of things that need to be cached away – it’s particularly helpful when the incoming data has little or no inherent structure. That’s not the case with the companies we’re going to discuss. They have highly structured data, so they don’t use a Hadoop-like approach. They’re still faced, however, with the challenge of processing the data quickly enough to provide low-latency feedback on an active production line.

The first company we’ll look at is called Optimal+ (in an earlier life, called Optimal Test). Their focus is on analyzing reams and reams of test data and drawing conclusions that can impact current work in progress. They recently announced that they’re aligning with HP Vertica as a database platform for supporting the processing performance they require.

Their baseline product is called Global Ops, and you might call it generic in that it’s simply a way of taking data from all aspects of the manufacturing flow, learning something, and potentially changing the flow. We’re talking, in many cases, incremental tweaks, and the trick is always walking that tightrope to improve yield or throughput without reducing quality.

The idea is to analyze the bejeebus out of the data and come up with ideas for new “rules.” Those rules can then be tested across the range of historical data to show what might have happened had those rules been in place back then. If you like the way things turn out, then – and this is key – you can push the new rules out for immediate implementation across the supply chain. That means fabless companies impacting wafer production or fabful companies impacting an offshore test house.
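To make the back-testing idea concrete, here is a minimal sketch in Python. It is illustrative only, not Optimal+’s actual implementation: a candidate rule is expressed as a simple predicate over historical test records, and replaying it tells you how many previously passing die it would have thrown away (the field names and the load_history call are hypothetical).

```python
# Hypothetical sketch of back-testing a candidate test rule against
# historical lot data. Illustrative only; not Optimal+'s actual code.
from dataclasses import dataclass
from typing import Callable, Iterable

@dataclass
class TestRecord:
    die_id: str
    lot_id: str
    passed: bool          # original pass/fail result
    iddq_ua: float        # example parametric measurement (assumed field)

Rule = Callable[[TestRecord], bool]   # True => rule would reject this die

def backtest(rule: Rule, history: Iterable[TestRecord]) -> dict:
    """Replay a candidate rule over historical records and report
    how many previously passing die it would have rejected."""
    total = passed = newly_rejected = 0
    for rec in history:
        total += 1
        passed += rec.passed
        if rec.passed and rule(rec):
            newly_rejected += 1
    return {
        "dice_evaluated": total,
        "original_yield": passed / total if total else 0.0,
        "yield_hit_from_rule": newly_rejected / total if total else 0.0,
    }

# Example candidate rule: reject die whose quiescent current looks
# suspiciously high even though it passed the spec limit.
suspicious_iddq = lambda rec: rec.iddq_ua > 450.0

# report = backtest(suspicious_iddq, load_history("lot_archive"))
```

If the projected yield hit is acceptable for the quality gained, that rule is the kind of thing that gets pushed out across the supply chain.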

More specific products can be laid over this Global Ops platform. There’s one for managing the test floor, one for reducing test time, one for detecting outliers, one for preventing escapes, and one optimized for high-volume production – managing the ramp of a new product or the yield of an established product.
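As a flavor of what an outlier screen can look like, here is a hedged sketch of a part-average-testing-style check, a widely used industry idea rather than Optimal+’s specific algorithm: flag die whose parametric reading sits far from the rest of their wafer even though it passes the spec limits.

```python
# Hypothetical part-average-testing-style outlier screen: flag die that
# pass spec limits but sit many sigma away from their wafer's population.
# Illustrative only; vendors implement their own variants.
import statistics

def flag_outliers(readings: dict[str, float], sigma_limit: float = 6.0) -> list[str]:
    """readings maps die_id -> one parametric measurement for a wafer.
    Returns die whose reading is more than sigma_limit robust-sigmas
    from the wafer median."""
    values = list(readings.values())
    median = statistics.median(values)
    # Robust spread estimate: median absolute deviation scaled to ~sigma.
    mad = statistics.median(abs(v - median) for v in values) * 1.4826 or 1e-9
    return [die for die, v in readings.items()
            if abs(v - median) / mad > sigma_limit]

# wafer = {"x3_y7": 412.0, "x3_y8": 418.5, "x4_y7": 512.0}
# suspects = flag_outliers(wafer)
```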

Note that all of these solutions are centered on test data. No process monitor data or other internal fab information is used in the analysis. Latency from new test data to completed analysis is in the range of 7-10 minutes.

The second company I spoke with is Nanotronics, and there appear to be a couple of things going on there. These guys focus on inspection at the atomic level – atomic-force microscopy (AFM). They claim to do with relatively small systems a range of things normally covered by families of big inspection platforms from folks like Applied and KLA-Tencor.

These guys instead use computational photography techniques to improve feature resolution, and they use data analysis both on a single die to identify minuscule features and across an entire wafer to identify macro-level features like a big scratch. Such a defect, crossing multiple dice, would be apparent only by analyzing results for the entire wafer rather than for an individual die.
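Here is a hedged sketch of that wafer-level idea, assuming per-die defect coordinates are available (the data layout and thresholds are assumptions, not Nanotronics’ algorithm): each die on its own may show only a tiny blemish, but once the defect locations are stitched into wafer coordinates, a simple line fit can reveal a scratch spanning many dice.

```python
# Hypothetical sketch: detect a wafer-spanning scratch from per-die defect
# maps by stitching defects into wafer coordinates and testing collinearity.
# Illustrative only; not Nanotronics' actual algorithm.
import numpy as np

def stitch_to_wafer(per_die_defects, die_pitch=(10.0, 10.0)):
    """per_die_defects: {(col, row): [(x_um, y_um), ...]} defect positions
    local to each die. Returns an (N, 2) array in wafer coordinates (mm)."""
    points = []
    for (col, row), defects in per_die_defects.items():
        for x_um, y_um in defects:
            points.append((col * die_pitch[0] + x_um / 1000.0,
                           row * die_pitch[1] + y_um / 1000.0))
    return np.array(points)

def looks_like_scratch(points, residual_mm=0.5, min_points=8):
    """Crude test: do the stitched defects fall near one straight line?
    Fit y = a*x + b by least squares and check the RMS residual."""
    if len(points) < min_points:
        return False
    x, y = points[:, 0], points[:, 1]
    a, b = np.polyfit(x, y, 1)
    rms = np.sqrt(np.mean((y - (a * x + b)) ** 2))
    return rms < residual_mm

# defects = {(3, 5): [(120.0, 80.0)], (4, 5): [(40.0, 95.0)]}
# if looks_like_scratch(stitch_to_wafer(defects)): print("possible scratch")
```

A real system would of course have to cope with vertical scratches, rings, and other defect signatures, but the principle is the same: the pattern only appears when the per-die results are aggregated.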

While the larger competing platforms tend to use a variety of wavelengths and approaches for different jobs in order to ensure a reliable “signal,” Nanotronics claims to be able to get that same signal without all of that varied hardware. That said, however, we are talking about a table-top setup, so, even though they provide automation tools like a wafer handler, it’s hard to imagine this running high volumes and putting the other guys out of business.

The aspect of the platform that flies the Big Data flag most prominently is its learning capability – both unsupervised and supervised – on which their feature recognition relies. The other Big-Data-like characteristic is that, from a single scan, they can run multiple different analyses on the resulting dataset (examples they give are die yield analysis and defect detection).

While much of their focus is on semiconductors, they’re also playing in the bioscience arena, analyzing viruses, bacteria, and cells. Same basic approach and scale; different features.

One of the big distinctions between these two data-oriented platforms is the openness of the analysis. Optimal+ is specifically about giving data to users and letting them customize the analysis and the learning. Nanotronics, by contrast, keeps its algorithms under the hood, using them as a vehicle for improving the user’s inspection experience. In one case Big Data rises to the fore; in the other, it sinks into the background.

In either case, it changes the rules of the game. Once novel, Big Data is rapidly becoming commonplace as we figure out how to rummage through it all in a productive manner.

 

More info:

Nanotronics

Optimal+

 
