
Treading on Thin Air

Engineering the Second Generation

Somewhere, in a nondescript cubicle in building number umpteen of a multi-billion-dollar multinational multi-technology conglomerate, an engineer sits at a lab bench staring at an eye diagram on a six-figure scope. It’s the same every day. Any time he is not in a meeting or writing a status report, he sits in this lab and eats and breathes signal integrity. He has almost no concept of the end product that will incorporate his work. His entire universe is jitter, pre-emphasis, equalization, noise, amplitudes, and bit-error rates. For him, time stands still – in the picoseconds.

Across town, another engineer is also worried about signal integrity. He is including some DDR4 memory on his FPGA board, and he needs to be sure his design will work. He will worry about signal integrity for less than one day. He has prototyped his project on an FPGA development board with a couple of add-on modules plugged in via expansion connectors. His system is running an application he wrote on top of a Linux TCP/IP stack, a six-axis gyro/accelerometer, a GPS receiver, a camera, and some brushless motors – all coordinated by an FPGA with an embedded processor. He will check his DDR4 interface with an EDA tool that simplifies the operation almost to a pushbutton. This engineer is doing just about every aspect of the design, from the FPGA work to the board layout to the applications software. He is intensely aware of the requirements of his end product.

The second engineer relies almost completely on engineers like the first. He is truly working in a “jack of all trades, master of none” scenario. In order for him to be successful, an entire legion of engineers like the first must do their jobs, boil the essence down to a few variables or API calls, and package up the whole thing into a tight bundle of encapsulated engineering expertise. It is this kind of collaboration with those we’ve never met that empowers us to create the remarkable systems we are seeing today. 

The technology ecosystem truly has a food chain, beginning with bare-metal technologies and working its way – layer by layer – up the pyramid to the system-level design. As the structure gets taller, the lower layers become ever broader, requiring a greater number of increasingly specialized engineering disciplines to keep the whole thing supported. The number of distinct electronic engineering specialties today is greater than it has ever been before, and there is no sign of the trend letting up.

Our postgraduate engineering education is designed to focus us on particular narrow areas of technology. This is an ever-moving target. Universities fund departments to put together programs in what are perceived as the “hot” topics of the day, and, at any given time, the most popular tracks are those tackling today’s most exciting and interesting problems. Few electronic engineering programs divert students into mature, stable areas of development, even if there is considerable work remaining to be done there. 

The result is an age gradient in the technology chain, with the lower levels occupied by older engineers who have spent long careers refining their solutions at what have become supporting levels of technology. The higher levels trend toward younger engineers who went to school when various higher-level challenges were in fashion. Almost like sedimentary rock, the layers of technology are attached to the generations of engineers who produced and refined them. 

This presents an obvious and disturbing problem. What happens when the bottom of the pyramid goes into retirement? Does our whole technological structure run the risk of collapsing in on itself because nobody is left who truly understands the deepest and most fundamental levels of technology?

Fortunately, in semiconductors, the base level has been re-engineered every two years for the past five decades, which has kept a good deal of fresh blood circulating to the bottom of the pyramid. The engineers designing with 14nm FinFETs could not rest on the laurels of those who understood one-micron planar CMOS. The bottom-to-top churn of Moore’s Law has had the side effect of constantly renewing our attention on every level of the food chain, preventing knowledge decay at the lower levels. 

But what about software? Does our arsenal of algorithms become so robust and canned that we lose touch with the fundamentals? Will there be a time in EDA, for example, when nobody understands the innermost workings of some of the most sophisticated tools? Is there a risk that the brilliant and subtle optimizations and refinements of engineers of the past will be lost as art, existing only deep in the lines of the legacy code they left behind? 

We are just now reaching the time when the first generation of engineers who created the Moore’s Law revolution have retired. We have not yet seen the age that the second and later generations will experience, in which our base technologies were all optimized by engineering ghosts from the past.

As our systems become more complex, constructed by plugging-and-playing scores of independent components whose entangled fractal complexity is completely understood by no one, will we reach a point where our systems take on organic characteristics? When nobody fully understands a system, will its behavior become less deterministic, and will something more akin to a personality emerge? And does this mean there will be a time when we diagnose and treat our designs in something that more closely resembles today’s medical practice? Will concrete engineering decisions of the distant past become the subject of future research discoveries?

It is entirely possible that engineering archeology will one day be an important discipline, and those who can reverse-engineer the lost miracles of the past will be just as highly regarded as those who create new solutions for the future. 

As I stare at the latest FPGA development board and realize that I cannot identify all of the components mounted on it, I can feel my contact with the bare-metal layer begin to erode. Even though that board will enable me to create the most sophisticated designs of my career, that increased sophistication comes with a loss of innocence, a frightening failure of gravity that lifts our technological feet out of contact with terra firma. Taking the plunge of trust and depending on technology we do not fully understand is more challenging for engineers than for most other personalities. Our identity is often wrapped up in our command of our domain at almost a molecular level, and feeling that positive contact slip away can be terrifying.
