
Teradyne’s Tactics to Tackle Twenty-First Century Test

When I was a young sprout, I used to work for a pair of sister companies called Cirrus Designs and Cirrus Computers in the UK. While at Cirrus Designs, I learned all* about testing integrated circuits (ICs) and printed circuit boards (PCBs). Meanwhile, at Cirrus Computers, I learned all* about digital logic simulation, automatic test pattern generation (ATPG), and automatic test equipment (ATE). (*When I say “all” in this context, I don’t mean I learned everything there is to know; rather, that I learned everything I could squeeze into my meager mind.)

Eventually, both these entities were acquired by the American company General Radio, or GenRad for short. Founded in 1915, GenRad was an early manufacturer of electronic tools and instruments. During the 1950s, it became a major player in the ATE market, manufacturing a line of testers for assembled PCBs.

Another famous American company in this space is Teradyne, which was founded in 1960 when the semiconductor industry was in its infancy. Teradyne started by building testers for discretely packaged components, like diodes, and quickly evolved to building testers for ICs. In the mid-1960s, to address the ever-increasing complexity of these devices, Teradyne began to include minicomputers in its testing products. This was a radical idea that spawned a new market, which Teradyne essentially owned until the mid-1970s.

Following the 1970s, Teradyne experienced a lot of ups and downs before eventually transmogrifying into the multibillion-dollar behemoth we know and love today. In 2001, Teradyne acquired GenRad, which—in a strange sort of way—makes me feel like I’m a junior member of the Teradyne family.

I was just reading up on the Ancient Greek mathematician, physicist, engineer, astronomer, and inventor, Archimedes, who is regarded as one of the leading scientists in classical antiquity and one of the greatest mathematicians of all time. My interest in Archimedes was spurred by a chat with Eli Roth, who is Product Manager for Smart Manufacturing at Teradyne. Eli was bringing me up to date with the recent launch of Teradyne’s Archimedes Analytics Solution.

Eli commenced by referencing my recent column on chiplets (see Are You Ready for the Chiplet Age?). He noted that ongoing advancements in semiconductors are driving quality challenges, chiplets being one such advancement. There are new materials, new types of devices, new architectures, and new packaging technologies, all of which provide new computational capability, and all of which demand higher quality standards.

Semiconductor enhancements drive quality challenges (Source: Teradyne)

There are a couple of things that jumped out at me when I first cast my orbs over the above illustration. Look for yourself and see if anything causes you to raise a furrowed eyebrow. The first point is that there’s no mention of chiplets, but Eli assures me that these little scamps are embraced by the “3D stacking and advanced packaging” bubble, so that’s all right.

How about the “Dark Silicon” bubble—what the heck is this? I had a quick Google while no one was looking. The Wikipedia was unusually unhelpful, informing me only that dark silicon refers to the amount of circuitry on an IC that cannot be powered on at the nominal operating voltage for a given thermal design power (TDP) constraint, which effectively told me nothing I wanted to know.

Fortunately, the Semiconductor Engineering website was a little more forthcoming, explaining that dark silicon refers to a method of conserving power in ICs by powering down segments of a chip when they are not in use. In slightly more detail, it says: “[…] This failure of Dennard scaling has introduced the era of what designers call ‘dark silicon.’ If the number of transistors doubles, but the power budget for the circuit as a whole stays the same—or goes down, thanks to the proliferation of mobile devices—then the available power for each transistor is cut in half. If the threshold voltage stays the same, then the number of transistors that can operate at one time is also cut in half. These non-operational transistors are dark silicon, measured as a fraction of the chip’s total area.”
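The arithmetic in that quote compounds quickly. Here’s a back-of-envelope sketch (my own illustration, not anyone’s published model) of what happens under a fixed power budget if each process generation doubles the transistor count while per-transistor power stays flat:

```python
# Back-of-envelope dark silicon estimate, per the quoted reasoning:
# with Dennard scaling over, a fixed power budget can light up only a
# fixed number of transistors, so each doubling darkens half the chip.

def dark_fraction(generations: int) -> float:
    """Fraction of transistors that must stay dark after n doublings."""
    transistors = 2 ** generations  # transistor count relative to generation 0
    powered = 1                     # the budget lights up the gen-0 count
    return 1 - powered / transistors

for n in range(4):
    print(f"after {n} doubling(s): {dark_fraction(n):.0%} dark")
# One doubling leaves 50% of the chip dark, two doublings 75%, and so on.
```

Which is why “powering down segments of a chip when they are not in use” isn’t an optimization so much as a necessity.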

However, we digress…

In the case of chiplets, which we piece together on a common substrate, how do we get known good die and how do we attach and package up these chiplets? If we don’t detect a failed chiplet until the final packaging step, that’s far too late in the game.

As another example, take the automotive industry. In the past, failures that resulted in warranty hits were typically electromechanical, residing in the engine or the drivetrain. Now, most failures resulting in warranty hits can be root-caused back to a semiconductor device. As a result, the automotive industry is pushing for failure rates in semiconductor devices of one part per billion (eeek).

Eli notes that one way to increase yield and improve quality is to employ advanced analytics, which—and I can only imagine your surprise—neatly returns us to the Archimedes Analytics Solution.

Archimedes analytics solution (Source: Teradyne)

We don’t have the time to go through all these items individually, but we can certainly cherry-pick a couple. Take real-time data, for example. What has historically happened is that the wafer tester generates a log file and—at the end of a run—that log file is uploaded to the cloud for big data processing. This is where you do classic statistical process control. You can also look at devices individually, but this may take minutes, hours, or even days after you test the wafer. By comparison, the real-time data streaming capabilities of the Archimedes Analytics Solution mean everything happens at the time the device is being tested—not minutes, hours, or days later; we’re talking milliseconds of time to data availability.
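To make the contrast concrete, here’s a hypothetical sketch of the idea (my own illustration, not Teradyne’s Archimedes API): a classic ±3-sigma statistical process control check applied to each measurement as it streams off the tester, rather than after a batch log upload.

```python
# Hypothetical sketch: streaming +/-3-sigma SPC check on tester readings.
# Each measurement is checked the millisecond it arrives, instead of
# waiting for an end-of-run log file to be uploaded and batch-processed.
from statistics import mean, stdev

def make_spc_check(baseline):
    """Build a checker from baseline measurements of a known-good process."""
    mu, sigma = mean(baseline), stdev(baseline)
    lo, hi = mu - 3 * sigma, mu + 3 * sigma

    def in_control(measurement):
        # True if this die's reading falls within the control limits.
        return lo <= measurement <= hi

    return in_control

in_control = make_spc_check([1.02, 0.98, 1.01, 0.99, 1.00, 1.03, 0.97])
print(in_control(1.01))   # True: a typical reading
print(in_control(1.40))   # False: an excursion, flagged in real time
```

The point isn’t the statistics—that part is decades old—it’s that the check runs in the test cell while the wafer is still on the chuck, so the result can actually change what happens next.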

And then we have security. The data must be secure at all times and during all stages of the process. You can’t let anybody jump in and modify the device, modify the model, or snoop on the data, so the Archimedes Analytics Solution employs a Zero-Trust (ZT) model in which data is encrypted throughout. For example, you don’t want anybody to take data off the tester and manipulate it before shipping it out. Thus, Teradyne ensures that all the data coming off the tester is genuine and is essentially “stamped” as being “Teradyne known good”; that is, this is genuine data that came directly off the tester and cannot be manipulated by nefarious scoundrels—it’s data customers can count on.
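Teradyne hasn’t published the details of its stamping mechanism, but one common way to make data tamper-evident is to have the tester sign each record with a key it alone holds, so any later modification fails verification. A minimal sketch of that general idea, with entirely hypothetical field names:

```python
# Illustrative sketch only -- not Teradyne's actual mechanism. The tester
# signs each record with a secret key; anyone who later alters the data
# (or the stamp) causes verification to fail.
import hashlib
import hmac
import json

TESTER_KEY = b"secret-key-held-by-the-tester"  # hypothetical key

def stamp(record: dict) -> dict:
    """Attach an HMAC-SHA256 'known good' stamp to a test record."""
    payload = json.dumps(record, sort_keys=True).encode()
    tag = hmac.new(TESTER_KEY, payload, hashlib.sha256).hexdigest()
    return {"record": record, "stamp": tag}

def verify(stamped: dict) -> bool:
    """True only if the record is unmodified since the tester stamped it."""
    payload = json.dumps(stamped["record"], sort_keys=True).encode()
    expected = hmac.new(TESTER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, stamped["stamp"])

data = stamp({"die": "x17,y04", "vdd_ma": 31.2, "pass": True})
print(verify(data))               # True: genuine tester data
data["record"]["pass"] = False    # nefarious scoundrel at work
print(verify(data))               # False: manipulation detected
```

A production zero-trust scheme would use asymmetric signatures and managed keys rather than a shared secret, but the effect is the same: the data either verifies as “came directly off the tester” or it doesn’t.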

Two main usage scenarios are depicted below. In the first scenario, data from a Teradyne tester is streamed directly to Teradyne’s edge device, the UltraEdge2000. Featuring the lowest-latency, highest-performance parallel compute processing Teradyne offers, the UltraEdge2000 lives directly in the test cell, thereby providing real-time analytics and real-time actionable data that can be used to make real-time decisions. In addition to Teradyne’s out-of-the-box analytical solutions, customers can employ their own homegrown solutions or solutions from other analytics providers.

Archimedes analytics solution (Source: Teradyne)

Also, the data can be streamed to large data pools in the cloud. This allows analytics to be performed across multiple test cells, possibly across multiple manufacturing sites. And there’s even a potential future where analytics will be performed across data sets from multiple companies (the data from individual companies could be obfuscated) to facilitate the analysis of manufacturing issues across the entire industry.

I know that nostalgia is not what it used to be, but talking about testing has caused a wave of nostalgia to wash over me. I had a lot of fun hand-creating test programs for ICs and PCBs back in the day, and I loved working with automatic test pattern generation and automatic test equipment. How about you? Do you have any test-related tales of derring-do you’d care to share with the rest of us?


