Death of the Hardware Engineer

A Dirge for the Digital Designer

Exactly two hundred years ago this June, Augustus De Morgan was born. Arguably, before that time, there were no logic designers in the world. For the next 200 years, logic designers steadily increased in number until today, when we walk the earth in six- or seven-digit numbers. In the big picture, however, the time for our species may be drawing to a close. Self-made storm clouds have been gathering for a while now, the extinction-level meteors are headed for earth, and the distant dirge of death for the digital design profession as we know it grows ever louder.

Any engineering discipline done well should ultimately be self-eradicating. The key problems should be solved from the bottom up, and the creative genius of each generation should be absorbed into the collective tooling, IP, and best-practice methodologies of the next. Today, digital design bears little resemblance to what I learned in school twenty-something years ago. For many of today’s bright young engineers, De Morgan equivalents are something they learned in an introductory logic design class, but not anything they apply in their day-to-day work. They’re much more likely to be worried about whether the Ethernet stack they are dropping into the software side of their system is compatible with the version of the MAC they bought from their silicon IP supplier, whether the design will meet timing without some manual tweaking of the chip layout, and whether electromigration will cause a reliability problem in their 90nm process technology at the junction temperatures they’re likely to be running.

The abstraction level of digital engineers’ thinking has gone from transistor to subsystem over the course of four decades. Sure, there are people still optimizing the design of the common transistor today, but the vast majority of the engineering world takes their work for granted – including the biennial doubling of density and frequency. Atop that transistorized foundation, a framework of ever-higher-level structures has been designed, refined, repeated, and commoditized so that future re-design is mostly unnecessary. While we engineers may be propelled by the “not invented here” syndrome to re-invent the wheel a few times, eventually we tire of the exercise and want to move on to the rest of the car.

In our case, the “rest of the car” has moved from gates to multipliers to arithmetic logic units, and now to subsystems like processors, memory, and I/O modules connected by standardized plug-and-play interconnect fabrics. Once we conquer a level, we rarely go back to visit except for occasional tuning. The result? Our concerns have moved gradually from the core to the periphery, and more and more often the thing we’re designing is an environment in which software can encapsulate most of the true complexity of our application. A growing number of electronic systems today amount to “design a high-performance computing system that fits in this form factor, attaches to these peripherals, and burns this much power.”

Given the state of platform assembly tools, even today, the complexity of that task is rapidly approaching zero. I’ve sat through a number of demonstrations in which marketers, company executives, and even supremely unqualified technical editors were enlisted to show that complex computing systems can be created by relative neophytes in just a few minutes and with just a few mouse clicks. What’s left for the real, trained hardware engineer to do?

Clearly, our work here is not yet done. There are newer, faster interfaces to invent, better buses to build, and more powerful processors to pursue. However, the true frontiers of electronic technology exploration are increasingly moving to where the electronics touch the real world – mechanical interfaces – and to where the new complexity resides – software. Think about one of today’s challenging system design problems – hardware/software partitioning. Why do we do this? Obviously, anything that we can handle effectively in software is best done that way. What’s left is what we should put into hardware. Usually, that means functions for which we can’t get the desired performance in software. For many of these, there are existing IP blocks that we can grab and plug in. For the diminishing number of functions that remain, we need to custom-design some hardware – using ASIC, FPGA, or another implementation technology, with hardware description languages like VHDL or Verilog. This is the main domain of the endangered EE.
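
For concreteness, here is the flavor of that work – a minimal, hand-written Verilog sketch of a 4-tap FIR filter, the sort of small, performance-critical function that still ends up in custom hardware when software can’t keep up with the sample rate. The module name, coefficients, and widths are illustrative assumptions, not taken from any real design:

// A minimal sketch, not production RTL: a 4-tap FIR filter with
// illustrative coefficients. One new sample enters per clock; four
// multiplies and three adds happen in parallel every cycle.
module fir4 (
    input  wire               clk,
    input  wire               rst,
    input  wire signed [7:0]  sample_in,   // one new sample per clock
    output reg  signed [17:0] sample_out   // filtered output
);
    // Coefficients for y[n] = 3x[n] + 7x[n-1] + 7x[n-2] + 3x[n-3]
    localparam signed [7:0] C0 = 8'sd3, C1 = 8'sd7, C2 = 8'sd7, C3 = 8'sd3;

    reg signed [7:0] tap0, tap1, tap2;     // delay line: x[n-1], x[n-2], x[n-3]

    always @(posedge clk) begin
        if (rst) begin
            tap0 <= 8'sd0; tap1 <= 8'sd0; tap2 <= 8'sd0;
            sample_out <= 18'sd0;
        end else begin
            // shift the delay line and accept the new sample
            tap2 <= tap1;
            tap1 <= tap0;
            tap0 <= sample_in;
            // all four products computed in the same cycle
            sample_out <= C0*sample_in + C1*tap0 + C2*tap1 + C3*tap2;
        end
    end
endmodule

In software, those same four multiplies would execute one after another; in silicon, they all happen at once, every cycle – which is exactly why this function lands on the hardware side of the partition.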

Now we have new tools – somewhere between primitive and semi-sophisticated – that allow even these parts of many designs to be written by a software engineer and then compiled into custom hardware accelerators. Over time, these tools, combined with improved programmable hardware fabrics like FPGAs, will allow software engineers to take over many of these acceleration tasks. The realm where true custom logic design is actually needed will be rarefied even further.
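
As a rough illustration of what those tools promise, here is a hypothetical sketch of the kind of accelerator such a compiler might generate from a software engineer’s loop. The source loop appears in the comment, and every detail below – module name, widths, unroll factor – is invented for illustration, not the output of any real tool:

// Hypothetical tool output, sketched by hand for illustration.
// The software engineer wrote only a loop, roughly:
//
//     for (i = 0; i < n; i++) y[i] = a[i] * b[i] + c;
//
// A hardware compiler can unroll it so that four iterations complete
// in parallel on every clock -- no Verilog written by hand.
module vec_mac4 (
    input  wire        clk,
    input  wire        valid_in,          // four input pairs are ready
    input  wire [31:0] a0, a1, a2, a3,    // four elements of a[]
    input  wire [31:0] b0, b1, b2, b3,    // four elements of b[]
    input  wire [31:0] c,                 // the loop-invariant constant
    output reg         valid_out,
    output reg  [31:0] y0, y1, y2, y3     // four results per cycle
);
    always @(posedge clk) begin
        valid_out <= valid_in;
        // one loop iteration per multiplier, all in the same cycle
        y0 <= a0 * b0 + c;
        y1 <= a1 * b1 + c;
        y2 <= a2 * b2 + c;
        y3 <= a3 * b3 + c;
    end
endmodule

Whether that generated logic is as efficient as a hand-crafted design is precisely where today’s tools sit between “primitive” and “semi-sophisticated.”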

There are analogies to this type of revolution in many other disciplines. Assembly programming, for example, is not dead – there are still a few applications that require the bit- and register-level discipline that can be achieved only with such detailed coding – but the mainstream has moved on to higher levels of abstraction. What is happening today is that the majority of the functionality of any embedded system is moving up the food chain to the software engineer. Over the coming years, specialized digital hardware design skills will be required less and less frequently. We’ll all gradually follow in the footsteps of the one-hour photo lab.

So – what are we to do? Should we hardware types all give up, turn in our soldering irons, take Java classes online, and join the unwashed masses of keyboard-bound pizza gobblers pounding out programs for peanuts? (Oh man! I can hear that comments box filling up already.) Is “digital system designer” joining the ranks of “typist,” “telegraph operator,” “keypunch technician,” and countless others in the Smithsonian Museum of Obsolete Occupations? Should we lobby our governments for protection of the profession – levying taxes on software-based functionality to make it more attractive to implement new things in hardware? Do we set about creating a subversive plan to upset the foundation – moving to three-state logic or attempting some other stunt that would guarantee us all work for the next few decades re-inventing the last century’s achievements?

Probably such drastic measures are not required. Our profession will most likely evolve rather than die. As more of digital system design moves into software, more of the burden of designing to power, form-factor, and interface standards will fall on a single class of super-systems engineers working with both hardware and software concepts and supported by highly sophisticated tools and technology. This new professional will not be the same software developer who creates applications for desktop computers. This person will also need a new type of education that our educational system does not yet provide. This category of designer will be born in industry, and their discipline will later be formalized in academia. In fact, their forerunners are out there already – creating the latest consumer and industrial embedded systems with newer, more advanced design techniques, and pioneering new methodologies for getting more complex designs to market faster. Their creative breakthroughs will be tomorrow’s pedagogy. It always works that way. Bright minds cannot help but find challenging problems to solve.
