
A Few Rounds on EDA360

Several months ago, Cadence announced a strategy called EDA360. It was accompanied by a whitepaper that was eventually made public. It was, and is, billed as the brainchild of Cadence CMO John Bruggeman and was issued as a call to arms – a manifesto, even – to the EDA industry.

It can be summarized as, “It’s the software, stupid.”

And it caused a bit of a stir, some of it the kind Cadence would like, some less so.

Well, now that some time has passed and everyone has had a chance to calm down, it seems like a good time to come back to this topic in the hope that rational heads can prevail. What does this mean for Cadence and for the industry?

OK, that sounds way too generic. Here’s the real question people ask: is this just marketing hype, or is there some substance behind it? The question keeps coming up because most of the examples Cadence gives of EDA360 in action point to products that existed long before EDA360, which almost makes the strategy look like a repositioning of existing products.

In fact, to hear Cadence describe what’s happened since then, there’s both substance and marketing going on here. But… I’m getting ahead of things. First, let’s back up and review what the hoo-hah is all about.

Motivated by a so-called “glass ceiling” where EDA can’t seem to raise its functionality above synthesis (not to mention raise its collective revenues), the concept is that creating chips shouldn’t be the primary focus of EDA. The industry should travel further upstream to that which executes on the chips: the software application. Or applications (a system architect’s “application engine” is a chip designer’s “mode”).

There was a time when chips did things. They no longer do so much. Software does things, and the chips let the software do its thing. So you could say that designing a chip without knowing the software is like filling a spice rack without knowing which type of cuisine you’re going to cook. It’s not all just paprika and garlic salt anymore.

Cadence divides the effort needed to bridge between application and IC into three layers. On the inside is the IC layer, which covers most of what is done in EDA today. It’s how you make a chip.

Today’s bigger chips are monolithic embedded systems – the so-called System on Chip (SoC). The next layer that Cadence defines is SoC realization. This takes the SoC itself and slathers over it the basic software required to connect the hardware with other software. It’s a patina of software, a kind of passivation layer (to pacify software guys) that makes hardware look like software. Basically, drivers. And debug support. Low-level stuff. They also include integration of hard silicon IP here.

At the top is system realization. This pretty much includes everything else: operating system, virtualization, middleware, application.

Pretty simple. Software on top, hardware on the bottom, and an adaptation layer in between.
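For the software-inclined, a tiny concrete picture of what lives in that middle layer might help. Here’s a minimal sketch in C of the sort of driver code SoC realization is about; the peripheral, register addresses, and names are all hypothetical, invented purely for illustration:

/* A hypothetical memory-mapped UART, reduced to two registers. The
 * addresses and bit definitions below are invented for illustration only. */
#include <stdint.h>

#define UART0_BASE   0x40001000u
#define UART_DATA    (*(volatile uint32_t *)(UART0_BASE + 0x0))
#define UART_STATUS  (*(volatile uint32_t *)(UART0_BASE + 0x4))
#define TX_READY     (1u << 0)

/* The “passivation layer” at work: to the software stack above, the
 * hardware now looks like an ordinary function call. */
void uart_putc(char c)
{
    while (!(UART_STATUS & TX_READY))   /* spin until the transmitter  */
        ;                               /* is ready to accept a byte   */
    UART_DATA = (uint32_t)c;            /* hand the byte to hardware   */
}

Everything above this in the stack (operating system, middleware, application) calls functions; everything below it toggles registers.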

And with something like this, there’s always more than one schema that can represent the order of things, so, while people can quibble about whether it’s the right one, it’s definitely not a wrong one.

And here’s where the questions start. Cadence and the rest of EDA have most visibly occupied the lower (or inner) layer: the silicon one. Actually… that’s not quite true. Mentor, for example, has been doing embedded stuff and hardware/software co-design since before “seamless” was trendy. But, in fairness, if you look at the amount of focus on what it takes to implement designs at 28 nm, well, yeah, there’s a lot going into silicon, no doubt.

And, in fact, when you look at Cadence’s products and their recent “holistic approach” announcement, it focuses on the silicon side of things. (That’s not my interpretation; they’re very up front about it.)

A discussion with Cadence’s EDA360 Evangelist Steve Leibson sheds some further light. (As an aside, being an evangelist isn’t always an enviable job… it can carry the connotation of a message that must be delivered to ears that are not always receptive…). He points out that Cadence has completely restructured along the lines of the three layers, and that people are busily executing away against the vision. Not a small matter for a large company.

So is it substance or marketing? Yes. Both. It’s clear they’re tying existing products into the vision. That’s all they have for now. And, frankly, if you have a vision in which your current products have no place, you’ve got a problem. It also sounds like they’re working on a lot of stuff they’re not talking about yet. So, until they announce new stuff, it’s all going to be about the old stuff.

So if they are doing more, are they tackling the entire vision or just parts of it? Here things get a bit more vague. Leibson claims that Bruggeman’s intention is to “create cooperation where competition doesn’t do any good.” Again, it’s hard to point to specific new initiatives since the two examples mentioned (CPF and UVM) predate the strategy.

The whole concept of standards and openness and such seems to hit a bit of a sore spot. After all, CPF does co-exist with UPF, a less-than-desirable situation. And some of that dispute was about openness. So rather than open all of that up again, let’s just say that, to paraphrase Steve, human nature has to be factored into all of this. Bottom line: Cadence isn’t necessarily taking on the entire thing; they’ll be looking for willing collaborators. Exactly how much they do remains to be seen.

Exactly what Cadence plans to do on its own is further clouded by their position on having a full flow. Steve correctly points to the countless IC flows and the number of people employed not in creating ICs, but in managing the flows. These flows comprise numerous tools (presumably considered “best of breed”) stitched and scripted together into a tottering, wobbly, amorphous mass that demands silence in its proximity lest the whole thing crumble in the acoustic wake.

So Cadence’s solution is simple: you should use a full-flow provider. (Us.) Don’t mess with all those point tools out there. One flow can easily provide designs that customers “will be happy with.” Which sounds like “good enough is good enough.” Which is often true. What about in this case?

When asked about the “best of breed” concept, Steve points out that the notion is actually very difficult to define since most tools provide a range of results depending on the design, and that the ranges from competing tools tend to overlap a lot. So every company can point to the particular set of benchmarks that prove its tool is the best. That’s not so much “good enough is good enough” as much as it is “there’s no such thing as ‘best of breed’; the tools are more or less all the same.”

When asked about the start-ups that tend to bring focused new innovation and then get gobbled up by some big guy, he just doesn’t see the little guys being able to out-innovate the well-funded large companies to the point where they could create that much differentiation. Yeah, you might be 10% better, but that’s not going to turn many heads if you’re a small unknown company.

That doesn’t necessarily square with what’s happened in the industry over the last while, but, then again, this whole full-flow discussion goes back and forth, and each company has to re-hash the issue at some point. Magma was focused on full flow and then decided to limit their efforts to areas where they could provide unique value; they’ll now argue that a full flow provides no value. (Ooo, wait, if they changed their mind again, then they’d have come full circle and could have a clever new campaign about 360°… oh… wait… never mind.)

Something tells me we won’t be settling the flow issue any time soon.

So, for all the Sturm und Drang associated with this initiative, we’ve uncovered very few areas that would drive more than perhaps some animated discussion. So what is it about this that got everyone’s goat? Do people think the strategy is fundamentally wrong?

The answer to that last question is a vigorous “No.” No one is saying that the concept of software taking on primacy is wrong. In fact, plenty of people have been saying it for quite a while. And that’s where people get annoyed: Cadence has been pointing to recent keynotes and other speeches (and only recent ones) in which Synopsys’s Aart de Geus and Mentor’s Wally Rhines have spoken of the importance of software, citing them as evidence that Synopsys and Mentor are coming on board and endorsing the Cadence message.

And the hackles rise.

Both Synopsys and Mentor say they’ve been articulating this message and executing against it for some time now.

Synopsys’s John Chilton says, “As software has consumed a larger part of the design cycle, the complexity problem increasingly has become a software problem. In fact, this has been a key plank in Aart’s messaging for several years.” He points to their efforts and acquisitions in the IP and systems space as examples of execution both organically and by going outside for technology.

Mentor’s Ry Schwark notes that, “We’ve been in the embedded software market… for about 15 years because we believe in the importance of software in design. Our embedded software is in more than half of all cellphones on the market. Odds are our embedded software is in your handset.”

Feathers clearly remain ruffled. Said one person who declined to be named, “[It’s] like crashing a party then telling the host you’ve decided it’s your house.”

So, beating a discreet retreat, let’s summarize what this all amounts to.

It comes back to this: Cadence is saying, “It’s the software, stupid,” and re-orienting their business around it.

(And everyone else is saying, “Yeah, we knew that.”)

