
Minimizing the Pain of RTL Design Reviews

Design reviews conjure images of engineers carrying reams of code printouts, filing into a room single file and heads down to be judged by others. The positive impact of design reviews has been proven through many studies, but does the preparation and process of the review have to be so painful? This paper provides a practical approach to design reviews that eases the process and actually results in a positive experience.

The key to a painless design review process is to apply techniques and tools that contribute to a quality solution as the code is developed, minimizing the time spent in the review. Review time can then be spent on important project debates, such as algorithmic or architectural implementation decisions, instead of drowning in the minutiae of line-by-line code discussions. In a perfect world, automated design review techniques would take the place of most manual techniques. In reality, there is always a mix of automated and manual techniques, as Figure 1 shows.

Figure 1: Automated techniques provide the most efficient design review.

The Source Code

Upfront techniques applied to source code provide an excellent platform for painless design reviews. Consider the following ideas:

  • Version management. If a version management tool is in use, make sure the correct version of the code is checked out for review, and create a difference report on the code base between the last reviewed version and the latest (a minimal diff-report sketch follows this list). If version management is not in place, consider adopting it on this project before the review takes place.

  • Comments. Designers tend to ignore comments in code, but they can help facilitate a smooth design review. Comments should take a narrative approach, explaining what the code is doing and why certain choices were made. This approach has the welcome effect of the author finding logic flaws and defects before the review takes place. It also provides a guided conversation during the code review.

  • Layout. The team should agree to a consistent layout of code modules. For example: one module per file, order of statements, and use labels on constructs. A standard layout cuts down on review time because the code flows in a consistent manner from module to module.

  • Checklists. Consider creating a checklist for each code developer that contains a set of common errors to check as the code is written. These items can include particular style guidelines, typical mistakes seen in the past, and known best practices. Each designer can add his/her own items to watch out for, based on past experience.

  • Linting. Consider using a lint tool to automate the checklist process. A lint tool can catch many coding mistakes as the code is written, report on code complexity and other metrics, and enforce code layout preferences. Some lint tools can score the code based on the violations found, and teams can set a minimum score required before the review takes place. During the review the author can present the lint report and concentrate only on why he/she believes an exception should be made for any remaining violations. A lightweight rule-checker along these lines is sketched after this list.
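
For teams using Git, a short script along the following lines can assemble the difference report mentioned above. It is a minimal sketch that assumes a tag named "last-review" marks the previously reviewed snapshot and that the RTL lives in *.v, *.sv, or *.vhd files; substitute the team's own tag and file conventions.

    # diff_report.py -- summarize what changed since the last reviewed version.
    # Assumes a Git repository with a tag named "last-review" marking the
    # previously reviewed snapshot; adjust the tag and pathspecs to taste.
    import subprocess

    def diff_report(since_tag="last-review", out_file="review_diff.txt"):
        # Per-file change summary followed by the full diff of the RTL sources.
        stat = subprocess.run(
            ["git", "diff", "--stat", f"{since_tag}..HEAD"],
            capture_output=True, text=True, check=True).stdout
        full = subprocess.run(
            ["git", "diff", f"{since_tag}..HEAD", "--", "*.v", "*.sv", "*.vhd"],
            capture_output=True, text=True, check=True).stdout
        with open(out_file, "w") as f:
            f.write("Changes since last review\n\n" + stat + "\n" + full)

    if __name__ == "__main__":
        diff_report()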
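
Likewise, even a lightweight script can automate the team checklist while a full lint tool is being evaluated. The sketch below is only an illustration: it assumes SystemVerilog sources and checks three example rules (tabs in indentation, leftover TODO/FIXME markers, and case statements without a default branch). The real rule set would come from the team's own checklist.

    # checklist.py -- run a team checklist over the RTL before the review.
    # The rules below are examples only; replace them with the items on
    # your own checklist, or hand the job to a real lint tool.
    import re
    import sys
    from pathlib import Path

    LINE_RULES = [
        ("tab used for indentation", re.compile(r"^\t")),
        ("TODO/FIXME left in code", re.compile(r"\b(TODO|FIXME)\b")),
    ]

    def check_file(path):
        violations = []
        lines = path.read_text(errors="replace").splitlines()
        for lineno, line in enumerate(lines, start=1):
            for name, pattern in LINE_RULES:
                if pattern.search(line):
                    violations.append(f"{path}:{lineno}: {name}")
        # Very rough whole-file check for a case statement with no default branch.
        for m in re.finditer(r"\bcase\b.*?\bendcase\b", "\n".join(lines), re.S):
            if "default" not in m.group(0):
                violations.append(f"{path}: case without default")
        return violations

    if __name__ == "__main__":
        src = Path(sys.argv[1]) if len(sys.argv) > 1 else Path(".")
        found = [v for f in src.rglob("*.sv") for v in check_file(f)]
        print("\n".join(found))
        sys.exit(1 if found else 0)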

The Visuals

In some cases, a picture of the code provides a better basis for high-level review discussions. Some EDA tools can provide these views automatically or they can be created manually. Consider creating the following visual aids:

  • Reference map. Create a map of all the references to files external to the code under review. References can be include files, import files, packages, and macros. If the code is object oriented, create a class tree in order to track down inherited methods. The maps often take the form of a spreadsheet or a bubble diagram, and they are used to quickly trace back to the source during the review (a scanning sketch follows this list).

  • File map. If the code under review is spread out into multiple files, a simple file map can be useful in order to understand the topography. This map is often combined with the reference map.

  • Requirement trace map. This map shows the relationships between functional and verification requirements and the source code being reviewed. Sophisticated maps also trace to the testbench and simulation results. During the review process, missing requirements can quickly be identified, and incorrect implementation of requirements at the code level becomes evident (see the trace sketch after this list).

  • Visualizations. A high-level block diagram is useful for hierarchical structural code. Complex algorithmic code is easier to review using state machine or flowchart visualizations.
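
As a starting point for the reference map, a short script can scan the sources for `include directives, package imports, and macro definitions and dump them into a spreadsheet-friendly CSV file. This sketch assumes SystemVerilog sources; the patterns would need to be extended for VHDL or other languages.

    # ref_map.py -- build a simple reference map: one CSV row for each
    # `include, package import, or `define found in the sources.
    # Assumes SystemVerilog; extend the patterns for other languages.
    import csv
    import re
    from pathlib import Path

    PATTERNS = {
        "include": re.compile(r'`include\s+"([^"]+)"'),
        "import":  re.compile(r'\bimport\s+(\w+)\s*::'),
        "macro":   re.compile(r'`define\s+(\w+)'),
    }

    def build_map(src_dir=".", out_file="reference_map.csv"):
        with open(out_file, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["file", "line", "kind", "target"])
            for path in Path(src_dir).rglob("*.sv"):
                text = path.read_text(errors="replace").splitlines()
                for lineno, line in enumerate(text, start=1):
                    for kind, pattern in PATTERNS.items():
                        for m in pattern.finditer(line):
                            writer.writerow([str(path), lineno, kind, m.group(1)])

    if __name__ == "__main__":
        build_map()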
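
The requirement trace map lends itself to the same treatment when the team tags its code with requirement identifiers in comments. The sketch below assumes a hypothetical "REQ-nnn" tagging convention and a plain-text requirements list; both are stand-ins for whatever tracing scheme the project actually uses.

    # req_trace.py -- cross-reference requirement tags found in the code
    # against the requirements list, flagging requirements with no
    # implementation. The "REQ-nnn" tag and requirements.txt file are
    # assumed conventions, not a standard.
    import re
    from pathlib import Path

    TAG = re.compile(r"REQ-\d+")

    def trace(req_file="requirements.txt", src_dir="."):
        required = set(TAG.findall(Path(req_file).read_text()))
        implemented = {}          # requirement id -> list of code locations
        for path in Path(src_dir).rglob("*.sv"):
            for lineno, line in enumerate(path.read_text(errors="replace").splitlines(), 1):
                for req in TAG.findall(line):
                    implemented.setdefault(req, []).append(f"{path}:{lineno}")
        for req in sorted(required):
            locations = implemented.get(req)
            print(req, "->", ", ".join(locations) if locations else "MISSING")
        for req in sorted(set(implemented) - required):
            print(req, "-> tagged in code but absent from the requirements list")

    if __name__ == "__main__":
        trace()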

Review Process Considerations

The team chooses the review process. It can be a manual code review or interactive collaboration over the web. Regardless of the process, consider the following concepts:

  • Website. Consider assembling the code, associated documents, maps, and visualizations on a website. This allows easy access to the material through any HTML browser. Tools exist that automate this process, or the website can be created manually (an index-generator sketch follows this list).

  • Scope. Studies have consistently shown that the most effective review is performed in an hour on 200 lines of code or less. Keep this in mind when targeting the code for review.

  • Version Management. Manage code changes using the version management tool.

  • Bug tracking. Record any defects found into a bug tracking system. If the team does not currently use a tool, consider an open source tool such as Bugzilla. Finding defects is not useful if they do not get fixed. Bug tracking tools manage this process.

  • Note template. Establish a template for recording the review feedback. It should minimally contain what was reviewed and discovered, who was involved, and any defect resolutions. Tie these notes into the bug tracking system and the version management system (a minimal template sketch follows this list).

  • Review. Review the review process. Were defects found? What worked well? What needs improvement? Address these issues before the next review. Also, update those checklists or linter rules based on the defects that you found.
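
Assembling the review website can be as simple as generating a single index page that links the code, the maps, and the reports. The artifact names in the sketch below are assumptions; list whatever the review package actually contains.

    # review_index.py -- generate a one-page HTML index for the review package.
    # The artifact file names are placeholders for whatever the team produces.
    import html
    from pathlib import Path

    ARTIFACTS = [
        ("Diff since last review", "review_diff.txt"),
        ("Checklist / lint report", "lint_report.txt"),
        ("Reference map", "reference_map.csv"),
        ("Requirement trace", "req_trace.txt"),
    ]

    def build_index(src_dir="rtl", out_file="index.html"):
        items = [f'<li><a href="{name}">{html.escape(title)}</a></li>'
                 for title, name in ARTIFACTS]
        items += [f'<li><a href="{path}">{html.escape(str(path))}</a></li>'
                  for path in sorted(Path(src_dir).rglob("*.sv"))]
        Path(out_file).write_text(
            "<html><body><h1>Design review package</h1><ul>\n"
            + "\n".join(items) + "\n</ul></body></html>\n")

    if __name__ == "__main__":
        build_index()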
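
The note template can also be kept as a small script so that every review produces the same record, ready to be attached to the bug tracker and checked into version management. The fields follow the minimum list above; the output file name is an arbitrary choice.

    # review_note.py -- write a review note covering the minimum fields:
    # what was reviewed, who attended, what was found, and how each finding
    # was resolved. The output file name is an arbitrary choice.
    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class ReviewNote:
        reviewed: str                 # files or modules reviewed
        attendees: list
        findings: list = field(default_factory=list)  # (description, resolution)
        held_on: date = field(default_factory=date.today)

        def write(self, out_file="review_note.txt"):
            lines = [f"Design review held {self.held_on}",
                     f"Reviewed: {self.reviewed}",
                     f"Attendees: {', '.join(self.attendees)}",
                     "Findings:"]
            lines += [f"  - {desc} [{resolution}]" for desc, resolution in self.findings]
            with open(out_file, "w") as f:
                f.write("\n".join(lines) + "\n")

    if __name__ == "__main__":
        ReviewNote(reviewed="fifo_ctrl.sv",
                   attendees=["author", "two peer reviewers"],
                   findings=[("case without default in arbiter", "bug #1234")]).write()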

Conclusion

Implement a few of the techniques presented before the next design review and determine what works. Then, explore ways to automate the process. There are many tools out there that help remove the drudgery of preparing for and performing design reviews. These tools have the interesting side effect of improving the overall design process.
