feature article

More Tools in the Cloud

Cadence and Elastifile Debut

Seems like it wasn’t long ago that we did a piece on EDA in the cloud. And yet here we are again, with yet more developments. One comes from an EDA vendor, the other from a company focused on the cloud infrastructure needed to handle EDA in the cloud. We’ll take them in that order.

Cadence’s New Cloud Approach

A long while back, we looked at Cadence’s original cloud venture. It featured a private Cadence-hosted cloud that their customers could utilize. And… things were kind of quiet after that. This was the era when EDA companies were recognizing the value that could be had with cloud-based tools, but it was also the era when no one trusted sending their crown jewels up to the cloud, for fear they might be hacked.

Well, times have changed, and management has relaxed in the last few years, making cloud-based EDA more attractive.

As we’ve noted before, there are really two different sets of designers that the cloud will appeal to. On the one hand, you’ve got big companies with big (OK, never big enough or growing fast enough) budgets for tools, with locally installed and customized flows. They like the cloud for bursty requirements: it means they don’t have to invest in permanent capital for short-term needs.

On the other hand, you have small companies without an adequate EDA budget, and, for them, all of their designs can be done in the cloud, saving huge amounts of capital expenditure.

Three Options

So Cadence has come back now with their new, revamped cloud plan. And it has three possible embodiments, as illustrated below.

(Image courtesy Cadence)

Let’s look at this piece-by-piece.

On the left, we have the “customer-managed” side: Cloud Passport. This is for large companies that have internal teams that manage their cloud presence. They may already have a preferred cloud provider that they use, and they work with Cadence to install the Cadence tools and license servers.

Then, on the right, is the “Cadence-managed” side of things, and this has two – well, actually three – components. First is the basic cloud-hosted design solution. Here Cadence manages the cloud; the customer sees a Cadence portal, and the cloud details are abstracted away from the designer. Entire flows can be provisioned using this approach.

The Express Access bit, just to the right of the cloud-hosted solution, is the easiest way to access a single tool. So, for example, a small company may want to host an entire flow in the cloud – that would be the cloud-hosted design solution. A big company, however, may want to access – say – one verification tool to add capacity during a burst of activity. Express Access gets them to that specific tool.

Finally, on the very right, we have the Palladium Cloud. As you may know, Palladium is Cadence’s emulation solution. Like all emulator providers, they’ve moved to a datacenter-centric model where the emulators can be ganged in a datacenter for access by anyone with the right authority. Many big companies manage such datacenters, and this has given them flexibility, allowing a single machine to be used by more teams.

But what if you’re not a huge company with a giant capital budget for emulators? That’s where the Palladium Cloud comes in. Cadence is provisioning their own Palladium farm, and they’ll act like – no, they’ll truly be – emulators in the cloud. Or emulation-as-a-service (EaaS?) (Kidding!). Here again, this will give big companies room to expand temporarily for bursty business, while it will give small companies access to emulators that they might not have been able to afford before.

Allowing access to emulation without having to install emulators is a bigger deal than simply not having to shell out for capital. Palladium emulators use a fair bit of power, and they dissipate that power – so much so that the machines are water-cooled. So you can’t buy these and set them on a desk somewhere. Your facilities folks need to run water to – and from – them. So not having them onsite saves a lot of work.

Directly connecting to clouds like AWS is essential for enterprises. Cadence has also added some security layers atop the cloud providers’ existing security: they keep an eye on firewall rules; they run their own event-logging system that they monitor for suspicious activity; and they have built their own secure access system, so there’s no need for a VPN to reach the cloud.
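As a rough illustration of the kind of event-log monitoring described above – this is not Cadence’s actual system; the event shape and threshold are invented for the sketch – flagging sources with repeated authentication failures might look like:

```python
# Illustrative sketch only: flag any source IP that fails authentication
# more than `threshold` times. The (source_ip, outcome) event format and
# the threshold value are hypothetical, not any vendor's real schema.
from collections import Counter


def flag_suspicious(events, threshold=3):
    """Return source IPs with at least `threshold` auth failures, sorted."""
    failures = Counter(ip for ip, outcome in events if outcome == "auth_fail")
    return sorted(ip for ip, count in failures.items() if count >= threshold)
```

The point isn’t the specific rule; it’s that the cloud vendor, not the customer, watches the logs and applies rules like this.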

Is EDA Natural for the Cloud?

You might think that, since cloud servers are simply servers in someone else’s building, it’s just a matter of installing tools up there instead of down here. And, once upon a time, that might have been true. But the cloud folks have been busy building abstraction and laying down infrastructure. And EDA hasn’t been at the top of their priority list.

Here’s the thing: cloud storage is set up for objects. This is an abstract approach that can be applied broadly to many different kinds of software in wildly diverse applications. But EDA tools, when they access storage, don’t look for objects; they want a standard NFS file system. While Cadence says that some cloud providers are working on making this natively available, it’s not there yet. So how does an EDA company make their tools work on a cloud server?

One approach would be to change all the EDA tools so that they speak object instead of NFS. That’s a lot of work, and it probably requires ripping up bits of software that have been working just fine for a long time. No one wants to disturb that kind of code, lest that cause all kinds of new issues. So the alternative is to provide NFS access in the cloud so that these tools will work up there like they do down here. But who’s going to do that work?
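The mismatch between the two storage models can be sketched in a few lines. This is purely illustrative – the `ObjectStore` class below is a hypothetical in-memory stand-in for an S3-style API, not any provider’s real SDK:

```python
class ObjectStore:
    """Hypothetical in-memory object store: whole-object put/get only."""
    def __init__(self):
        self._objects = {}

    def put_object(self, key, data):
        self._objects[key] = data            # replaces the entire object

    def get_object(self, key, default=b""):
        return self._objects.get(key, default)


def posix_append(path, line):
    # NFS/POSIX semantics, which EDA tools expect: append in place --
    # only the new bytes are written.
    with open(path, "a") as f:
        f.write(line + "\n")


def object_append(store, key, line):
    # Object-store semantics: no partial writes, so every small change
    # means reading, modifying, and re-uploading the whole object.
    data = store.get_object(key)
    store.put_object(key, data + line.encode() + b"\n")
```

A tool that appends to a log file thousands of times during a run is cheap under the first model and painful under the second – which is why the tools want NFS rather than a rewrite to “speak object.”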

Well, in Cadence’s case, it seems like they have already done the work themselves. I also checked with Synopsys with regard to their cloud offering; they declined comment. Mentor has built a Veloce emulator farm that feeds a virtual private cloud that their customers can use. As this is purpose-built, it can use whatever requirements Veloce sets up. (I wasn’t able to get more info on any other cloud-based Mentor tools; if I get that later, I’ll update here.)

But for companies that don’t wish to do it themselves, a company called Elastifile is providing an NFS file system for cloud providers. While it will work for any industry, they’re particularly targeting EDA for the moment.

It’s not an abstraction atop the object-storage system; it’s built as a native system. That said, object storage remains the most straightforward way to handle things, so permanent archival storage is still handled by object; the NFS system spins up for use only while the tools are running. Bridging the gap is what they call CloudConnect, which connects files with objects. So, for example, when the cloud is used for burst capacity, the on-premises datacenter resources, in the form of NFS files, can move to the cloud transparently.

CloudConnect uses a proprietary object format to handle EDA-related files. This format includes metadata about the file. It’s not human-readable, so they feel that it will be more secure than using the cloud-native object format.
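To make the file-to-object idea concrete, here’s a hypothetical sketch of packing a file plus its metadata into a single opaque blob and unpacking it again. The format here is invented for illustration; Elastifile’s actual format is proprietary and not documented in detail.

```python
# Invented pack/unpack format for illustration only -- NOT Elastifile's
# actual CloudConnect format. A length-prefixed JSON header carries the
# metadata; the payload is compressed, so the blob isn't human-readable.
import json
import zlib


def pack_file(name, contents, metadata):
    header = json.dumps({"name": name, "meta": metadata}).encode()
    return len(header).to_bytes(4, "big") + header + zlib.compress(contents)


def unpack_file(blob):
    header_len = int.from_bytes(blob[:4], "big")
    header = json.loads(blob[4:4 + header_len])
    contents = zlib.decompress(blob[4 + header_len:])
    return header["name"], contents, header["meta"]
```

The essential property is the round trip: a file archived as an object comes back byte-for-byte identical, metadata included, when the NFS system spins back up.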

(Image courtesy Elastifile)

Given the change in attitudes towards designing in the cloud and access to NFS capabilities, perhaps more EDA tools will find their way up there.


More info:

Cadence cloud offering

Elastifile

4 thoughts on “More Tools in the Cloud”

  1. Design a system where the important IP design data stays with the customer, uses highly secure communications, and the cloud does ONLY the processing, but NEVER retains ANY DATA past a working session. AND provide REGULAR RANDOM active audits of the cloud facility by 3rd parties that can certify that the customer data isn’t being skimmed, stored, or stolen by monitoring local storage and external communications.

    When the facility wants you to store your data there, or has terms in the licenses that allow them to collect your data for whatever purpose …. run. Because their facility will be targeted to get your data … either by bribing staff on site, or hacking into the site.

    If you are working on open source, open hardware, who cares?

  2. As engineers we often are required to do work on very old projects, because some critical customer needs a critical update …. that’s reasonably easy when you archive entire development systems (hardware, system software, tools, and the design).

    It gets a lot harder when you are unable to create these long-term archives that span a decade or two, because the data and development system were lost when your cloud vendor canceled the service or went out of business.

    If you have a pump and dump product, and no commitments to your customer base, who cares?

  3. I think I’d still ask for a locally installable product with a perpetual license for existing designs that allows bringing critical projects in house for security and archival … even if much of the work is done on the cloud.
