
Advancing FPGA Design Efficiency: A Proven Standard Solution

For decades, the SoC design community has consistently lost ground in the battle to match design productivity to the growth of available silicon technology. The silicon roadmap has long been chronicled by Moore's law, so how could the design community allow the well-known "Design Gap" to persist? Understanding how we got to this point makes that question easier to answer, and makes reasonable adjustments possible for the future, especially since there are obvious lessons in the evolution of many analogous industries.

Even in the advanced design field of FPGAs today, most FPGA IP is vendor-supplied and locks the purchasing company to that vendor's devices and tools. This paradigm limits how much the industry can grow. For the market to grow as a whole, FPGA IP vendors must support and adopt industry standards. By embracing more standards in FPGA design, we can realize the time-to-market and cost advantages that standard FPGA IP promises.
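To make that point concrete, consider what a standard IP packaging format buys the ecosystem. The sketch below is a hypothetical illustration, not any vendor's actual flow: it assumes a deliberately simplified XML component description in the spirit of IP-XACT, the IP packaging standard developed by the SPIRIT consortium discussed later in this article. Because the format is vendor-neutral, one generic script can inventory ports for IP from any supplier.

```python
# Minimal sketch: a generic tool reading a simplified, hypothetical
# IP-XACT-style component description. The element names here are
# illustrative assumptions, not the real IP-XACT schema.
import xml.etree.ElementTree as ET

COMPONENT_XML = """\
<component>
  <vendor>example.com</vendor>
  <name>uart_lite</name>
  <version>1.0</version>
  <port><name>clk</name><direction>in</direction><width>1</width></port>
  <port><name>tx</name><direction>out</direction><width>1</width></port>
</component>
"""

def list_ports(xml_text):
    """Return (name, direction, width) for each port in the description."""
    root = ET.fromstring(xml_text)
    return [
        (p.findtext("name"), p.findtext("direction"), int(p.findtext("width")))
        for p in root.iter("port")
    ]

if __name__ == "__main__":
    # The script knows nothing about which vendor produced the IP;
    # only the shared format matters.
    for name, direction, width in list_ports(COMPONENT_XML):
        print(f"{name:8s} {direction:3s} width={width}")
```

The same idea scales up in real flows: when component metadata follows one standard, integration tooling can be written once rather than once per vendor.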

Despite any and all management attempts, marketing programs, and training, FPGA development remains "product driven" in how design methodology, tools, and flows are conceived and implemented. This is largely unavoidable, because the entire practice of design automation has evolved from, and been built around, the immediate gratification of "just automating what the designer is doing": simply take the activity, implement it in code or hardware, and string it to the next activity in the design chain.

That worked well in the beginning and did make the design process faster, though it initially left tool integration as a largely manual activity, patched into place as so-called integrated tool flows evolved. New tools are individually grafted in as older versions break under the demands of increasing design complexity, and so tool "integration" has progressed as best it can. "Best of breed" tool alternatives, often with incompatible interfaces, are sometimes brute-forced into flows, especially when the alternative is a prolonged wait for a proprietary solution from a vendor to which you are often inevitably tied.

Hence the existence of the design gap is entirely predictable. Design technologies today are operational, segmented, and compartmentalized around design-function specialists, clearly lagging what the FPGA community needs to create products with the right features, on time, and fully utilizing the available silicon technology. Had we done better with the design process, could we have demanded, and achieved, faster evolution in silicon (and other) implementation technology?

Nevertheless, there are signs of industry maturation on the horizon. The last five years have seen a clear acknowledgement among major design teams and corporate visionaries that things need to change. We have moved (see Figure 1) from an "integrate or die" mentality of brute-force building and managing massive SoC design teams, through "collaborate or die," where one or two major players pool resources on critical, high-profile programs, and now to "standardize or die." In this more mature phase, groups of companies are establishing nimble consortiums to broadly address design infrastructure needs that have grown too large for any individual company to meet alone.

These consortiums are led by the rapidly rising prominence of groups such as OCP-IP, SPIRIT, and ECSI out of Europe, and the trend is also evidenced by the much earlier appearances of VSIA in the U.S. and STARC in Japan. The new arrivals are up, running, and building structure, while the older protagonists have evolved and continue to refine and tune their roles for the membership they serve.

So what does the model we have followed look like, and what might we do in the future to better advance SoC design? Consider the "Industry Development Model" (see Figure 2). Here we plainly see how products (P) progress. Most companies start with a well-defined, native product and make as much headway as possible selling it in its minimum form, until competition dictates they must bring more to the table to survive and thrive.

Thus, products next evolve to a value-added form, and competition drops off as competitors fail to keep up or shift their efforts to more viable ground for their circumstances. Still, competition stiffens further, and another evolution takes place wherein "whole products" emerge (a term popularized by Geoffrey Moore). A whole product is an even more compelling offering: the purchaser can often replace an entire set of products and internal operations or development by buying a superior external solution. The buyer faces a simpler "make or buy" decision, provided he trusts outsourcing a once-internal solution. Again (per our model), from the provider's perspective, the number of competitors drops off and a few providers remain, but in a much stronger and more prominent industry position than when they entered that market with their simple, standalone native product.

But what does the industry get as a result? Generally, we then have a few strong players competing for other related providers to add to their own infrastructure and make their proprietary product the only real business alternative. Herein lies the problem: what counts as an ideal infrastructure is determined by a commercial, proprietary entity. Great solutions can and often do emerge, but if they are not well suited to the diverse needs of users, this is a real setback.

A natural evolution of the whole product is when that product itself supports a standard (S). When this transition occurs, the total number of infrastructures needing support shrinks dramatically. Indeed, the whole products themselves can become part of the "standard" infrastructure. Suddenly we can center efforts around a single infrastructure and bring clarity and focus to a community of providers. A shared infrastructure can dramatically focus industry efforts, and it actually helps proprietary whole-product providers by letting them concentrate on their product rather than fighting for an industry following to build what is often a token "complete" solution.

Thus, everyone can focus on known needs, and all players can expand their individual core offerings with the diversity a much larger market can embrace. So, how do we evolve to the infrastructure model? In general, following the product-driven corporate path yields good technical progress and corporate growth, but if we wish to truly accelerate the growth of an industry, we must evolve with specific direction and shared effort. The bottom line: an industry grows and matures faster when there are shared, common resources and practices, and these can come ONLY through the common ground of standards.

The inevitable question is: "How do we accelerate this process?" Unfortunately, in many ways this is an evolutionary event. As the discussion above shows, there is already progress afoot, with real cross-company coordination. Lessons for this maturation process are all around us in most adjacent technological spaces. As industries mature, the major players serve segments of those industries, so for a new generation of technology to emerge, it is normal to see major players get together and define standards for exchanging work. We see this in aerospace, telecommunications, shipbuilding, and indeed all major industries where problems grow so large that shared needs ultimately outweigh individual goals. Major players seek competitive advantage by sharing to grow, in the larger total markets created as a result. If the FPGA community bands together to support and adopt open standards, pressure is put on the larger companies to integrate FPGAs in a uniform way.

What does this mean for the design industry? Expect to see more collaboration and standardization between major users and providers, and hope to see real collaboration and coordination between the groups. As we inevitably embrace the need to advance design productivity more rapidly and get on a different design integration path, we will abandon the "product first" approach and build on standards, so that we capitalize on the massive shared infrastructure they inherently produce.
