Modelling: not just for big boys?

Years ago, when I worked in PR, I used to visit a telecoms company every few weeks with a colleague. (PR agency people always seem to travel to clients in pairs – I have never understood why.) The visits were usually to be briefed on a new product or a new release, and the briefings were conducted by engineers. On the way, we used to make small bets on how long it would be before the briefing engineer got up and started using the white-board: it was rarely more than five minutes into the meeting. Now, when I am on the receiving end of the PowerPoint presentations I used to help develop, and the presenter is, or once was, an engineer, a notebook frequently comes out and diagrams are sketched to make a point more clearly.

Engineers seem to think better when they draw diagrams – for them a picture may be worth at least a thousand words. And yet modelling, model-driven development, UML, SysML (all the different names bandied around for defining a system through pictures rather than a screed of words) have not made significant headway outside areas like defence and aerospace. In fact, within defence, the Unified Modeling Language (UML) is mandatory for projects in the US and UK.

There are tools from a range of sources. At the top end of the pile for embedded is Telelogic, which swallowed up a number of players in the modelling field, such as I-Logix, to create a broad range of tools, and has itself now been bought by IBM. IBM had previously bought Rational Software, a leader in using UML for developing enterprise IT applications. Telelogic’s tool range is large, but the two most important products for embedded applications are the DOORS requirements specification tool, which we are not going to look at, and Rhapsody, a suite of UML/SysML-based modules for modelling and related activities. (SysML – the Systems Modeling Language – is an extended subset of UML, tuned for describing systems rather than software.)

The biggest independent player is Artisan, whose Studio is just about to enter release 7.0. It is aimed at embedded projects, but Artisan stresses the centrality of its tools to collaborative development, with “work as one” as a slogan. There are other commercial players, but not at this scale.

IAR has taken a slightly different approach, using the state machine subset of UML as the basis for its visualSTATE modelling and code generation suite. While not full system modelling, it is optimised to produce compact C/C++ code for constrained embedded systems.

All three of these companies are concerned with making their tools integrate with other parts of the tool chain. Telelogic was already moving towards Eclipse compatibility when it was bought by IBM, the company that created Eclipse and donated it to the software community. Artisan has just acquired High Integrity Solutions (HIS) to get full ownership of Vds (V-Design System), a framework like Eclipse but aimed at mission- and safety-critical systems and software development. And IAR stresses how well visualSTATE integrates with its Embedded Workbench suite of compiler and debug tools.

If you Google for shareware in this area, you will find a lot of free and low-cost tools, many of which seem to generate Java as a first choice. Rhapsody’s modelling module is also available as a free download, with the intention of sucking you into the process, and Artisan provides a free, time-limited evaluation version of Studio.

So what is involved? In simple terms, you draw diagrams of your system, formalised versions of the white-board images you use when describing something to other engineers (or even PR people). Depending on the conventions you use, different shaped blocks have different meanings, and the lines joining them use different symbols to indicate different interactions. Most modelling tools let you drag and drop these symbols and connections. You can start at the very highest level and then decompose the high-level blocks into lower-level ones. These diagrams can be used for communication, and, at the risk of repetition, they allow you to demonstrate to the end user or customer that you understand the problem much more easily than with a telephone-book-sized written description.
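
To make that concrete, here is a deliberately simple sketch of the kind of information even a basic UML-style state diagram captures: named states, the events that can occur, and which transitions between states are legal. The motor-controller example and every name in it are invented purely for illustration; they are not taken from any particular tool or notation.

/* Illustrative only: the states, events and transitions that a simple
   UML-style state diagram might capture for a hypothetical motor
   controller. Every name here is invented for the example. */
typedef enum { STATE_IDLE, STATE_RUNNING, STATE_FAULT } state_t;
typedef enum { EVENT_START, EVENT_STOP, EVENT_ERROR, EVENT_RESET } event_t;

typedef struct {
    state_t from;    /* state the transition leaves     */
    event_t trigger; /* event that fires the transition */
    state_t to;      /* state the transition enters     */
} transition_t;

/* Each row corresponds to one arrow on the diagram. */
static const transition_t transitions[] = {
    { STATE_IDLE,    EVENT_START, STATE_RUNNING },
    { STATE_RUNNING, EVENT_STOP,  STATE_IDLE    },
    { STATE_RUNNING, EVENT_ERROR, STATE_FAULT   },
    { STATE_FAULT,   EVENT_RESET, STATE_IDLE    },
};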

But then you can do more. You can check that your model is correct: most modelling tools have built-in verification that can find mistakes in the basic design, mistakes that would appear totally legal to code inspection and to normal debugging approaches.
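
To give a tiny flavour of what that means, here is a hand-rolled check over the illustrative transition table sketched above: it asks whether every state can actually be reached, so a “dead” state in the design would be caught before any real code exists. Real modelling tools perform this sort of check, and far richer ones, on the model itself rather than on hand-written C.

#include <stdbool.h>
#include <stddef.h>

/* Illustrative only: a reachability check over the transition table
   sketched earlier. STATE_IDLE is assumed to be the initial state. */
static bool state_is_reachable(state_t s)
{
    if (s == STATE_IDLE) {
        return true; /* the initial state is reachable by definition */
    }
    for (size_t i = 0; i < sizeof transitions / sizeof transitions[0]; i++) {
        if (transitions[i].to == s) {
            return true; /* some transition leads into this state */
        }
    }
    return false; /* no arrow ever enters this state: a design error */
}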

The next step is neat – the model then generates code. It does it quickly and to defined standards. The mainstream modelling tools normally generate C/C++ and a range of other languages, while many of those from the open source community generate Java. At this stage, you are back in the normal development cycle, ready to compile, integrate, debug, etc. Most tools allow you to add your own hand-coded segments to the generated code, and with Rhapsody it is possible to go backwards from legacy code into the model. IAR claims that visualSTATE is outstanding in creating very compact code for memory-limited applications.
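
As a rough idea of what that generated code can look like (styles vary widely between tools; this is a hand-written sketch for the hypothetical motor controller above, not the output of Rhapsody, visualSTATE or any other real product):

/* Hand-written sketch of the sort of C a state-machine code generator
   might emit for the hypothetical motor controller; not the output of
   any real tool. */
static state_t current_state = STATE_IDLE;

void handle_event(event_t ev)
{
    switch (current_state) {
    case STATE_IDLE:
        if (ev == EVENT_START) { current_state = STATE_RUNNING; }
        break;
    case STATE_RUNNING:
        if (ev == EVENT_STOP)       { current_state = STATE_IDLE;  }
        else if (ev == EVENT_ERROR) { current_state = STATE_FAULT; }
        break;
    case STATE_FAULT:
        if (ev == EVENT_RESET) { current_state = STATE_IDLE; }
        break;
    }
    /* Events that are not legal in the current state are simply
       ignored here; a real tool would let you choose the policy. */
}

The point is not the elegance of the result but that it falls out of the diagram mechanically: change the diagram and the code changes with it.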

Why would you want to go through these hoops when it is possible to create systems in other ways? Advocates of the modelling approach have an array of arguments. It makes more efficient use of human resources. From the management viewpoint the objective is to get a system developed and into the market as efficiently as possible. That means quickly and cheaply. If the best brains in the organisation are not hacking code or figuring out why a particular bug keeps appearing, but instead applying their intelligence to the highest level of design, that has to be good, surely?

By using models, it is easier to get buy-in on the design from other parts of the company: marketing people can more easily grasp the functionality, and it is easier to incorporate changes and refine the design when working with a model. Many systems now provide an option for generating a human interface – buttons, output screens, etc. – to simulate the behaviour of the final design: another way to ensure that what is being designed is what the specifier required.

The code created by a code generator may not be a marvel of elegance, but it can be compact, is correct, and, importantly, is produced quickly. Almost as significantly, changing the initial model and generating new code is much easier when producing variants. It is easier, for example, to create a family with the same basic hardware and different features embodied in software.

A feature that all the proprietary tools seem to have is automatic documentation: at the press of a button the current state of the design can be recorded and filed for future use in a variety of formats, including Word documents, spreadsheets, state diagrams or even the model itself.

But if this approach is so great, why aren’t people falling over themselves to use these tools? Three main reasons stand out: the match (or mismatch) between tool and application, cost, and cultural issues.

There is, for many developers, a mismatch between their needs and the mainstream modelling tools. The major suppliers’ products are expensive, often tens of thousands of dollars or more, and geared to group working: the tools are designed for defence/aerospace and safety-critical applications, where the end user accepts high prices in return for products that are bullet-proof. (Sometimes literally.) If you are working on a small project, on your own or as part of a typical small team, they are clearly over-sophisticated. (It is rather like the way Microsoft, by gearing Office towards enterprise use, group working and so on, has made it no longer such a useful tool for the single user.)

Cost is a significant barrier to the wider adoption of model-based design. There are free and shareware tools, but these often have limitations and minimal support. If you want tools from a proprietary source, then it costs. The big suppliers are expensive, as we have discussed. Even IAR’s visualSTATE is priced in thousands of dollars, which is not expensive compared with the cost of a programmer’s time spent debugging a product, but it still runs into the barrier that most companies won’t spend more than $1K on a tool.

Then there is a cultural issue: if you have spent the last few years honing your skills as a C++ programmer, let alone as an ace assembly language programmer for really memory-restricted applications, you are not going to want to hear that a piece of software can replace you. I remember assembler programmers for IBM mainframes in the late 1960s complaining that FORTRAN was so inefficient it would never be useful. (And as for COBOL….)

Another argument is that the tools take time to learn, particularly when project deadlines are tight. Well, none of them claim to be completely intuitive, but the trade-off is increased productivity, not just on the project but on all future ones.

So is modelling just for big boys? Well, I hate to say it, but unless companies change their views on an appropriate level of expenditure for tools, yes, today it is. However, I have heard that one company is planning an announcement that could change this scenario quite markedly. And if the rumour is true, there may soon be a tool that is very cost-effective for the single user. And then modelling will no longer be just for the big boys.
