Not too long ago, secure software was the sole domain of the military and a few select governmental and financial agencies. Safe software was the sole domain of industries such as aerospace, medicine, and transportation. Software was either safe or secure; in most cases, there was no need for it to be both. But the times, they are a-changing. As digital devices become more ubiquitous and the convergence of multiple applications on a single device continues, safe and secure software now has a place in automotive, aerospace, and industrial applications, to name a few.
Device complexity and feature counts have grown to the point where, today, keeping electronic data secure is critical to the safe operation of electronic devices.
Secure Software Systems – Then and Now
As with many technologies, the military was among the first to use computers in both its communications infrastructure and its mobile equipment. It was a natural evolution for software to move into civilian applications such as avionics. First used in communication, diagnostics, and guidance systems, software control systems have since moved into the arena of flight control. The European Airbus A380 is a perfect example of an aircraft flown entirely by computer. Can you imagine the collaboration that must have been required to ensure the safe and secure operation of software in this context?
Medical devices are another area where software plays a key role in ensuring both operator and patient safety. Programmable electronic devices are deployed in everything from portable blood glucose monitors to implanted heart defibrillators. It is also an area where care providers face increasing pressure to reduce expenses while patients demand faster, better-quality care.
Increasingly, automobile manufacturers are adding more computing power to their products. The reasons range from safety and environmental concerns to overall cost. Engine management software keeps exhaust emissions in check. Anti-lock braking system software maximizes stopping power. It is not uncommon for a modern luxury vehicle to contain upward of eighty programmable electronic devices.
As more personal, financial, and otherwise critical data is transmitted and stored by a variety of embedded devices, it is essential that software developers be keenly aware of securing all electronic data. As the convergence of mobile phones, iPods, PDAs, and even automotive systems continues, we are witnessing highly sensitive data moving to less secure devices that are more vulnerable to attacks and security breaches.
Safety Standards Do Exist
As pervasive connectivity continues to grow, the ability to deploy safety-critical devices that are also secure becomes a major concern. Connectivity opens a Pandora's box of ways, malicious or accidental, for a safety-critical device to be subverted. So while connectivity allows better data collection, coordination, and responsiveness, it also risks exposing private data to those who have no need for it or, worse, allowing access to the safety-related functionality of the system.
The fact of the matter is that today's end users are not fully aware of the potential for unsafe operation or failure of their devices. It is incumbent upon us, the conscientious software developers, to work with established standards (even though they may not be required) to ensure that any potential failure is anticipated and accounted for.
From a historical perspective, many US industries have already adopted standards of this kind; military, avionics, aerospace, nuclear power, rail, and medical are a few examples. These standards provide guidance on how both software and devices should be designed and deployed. They vary in their rigor, guidance, application, and impact on development, but their goals are ultimately the same – to produce safe and reliable devices.
Software safety, software security, and software reliability are not one and the same. For example, while the first goal of safe software is reliable and safe operation, a second tenet of developing safe software is to plan for failure and to design the system so that, when it does fail, it fails in a safe fashion.
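To make this fail-safe principle concrete, the sketch below is a minimal illustration only, not code drawn from any particular standard or product; the sensor limits, helper routines, and heater example are all hypothetical. It shows a control loop that drives its output to a known safe state whenever a sensor reading is implausible or a watchdog check fails, rather than continuing to operate on bad data.

    #include <stdbool.h>
    #include <stdint.h>

    /* Hypothetical hardware-access helpers; a real project would map these
     * to its board support package. */
    extern int32_t read_temperature_sensor(void);   /* tenths of a degree C */
    extern void    set_heater_output(uint8_t duty); /* 0 = off, 100 = full power */
    extern bool    watchdog_ok(void);               /* true while the watchdog is healthy */

    #define TEMP_MIN_PLAUSIBLE  (-400)  /* -40.0 C: below this, assume a faulty sensor */
    #define TEMP_MAX_PLAUSIBLE  (1500)  /* 150.0 C: above this, assume a faulty sensor */
    #define TEMP_SETPOINT       (370)   /*  37.0 C control target */

    /* Drive the output to a state that cannot cause harm and stay there. */
    static void enter_safe_state(void)
    {
        set_heater_output(0);   /* heater off is the safe state in this example */
        for (;;) {
            /* Remain in the safe state until an operator resets the device. */
        }
    }

    void control_loop_step(void)
    {
        int32_t temp = read_temperature_sensor();

        /* Fail-safe checks: any implausible reading or watchdog fault sends the
         * system to its safe state instead of continuing with "best guess" control. */
        if (!watchdog_ok() || temp < TEMP_MIN_PLAUSIBLE || temp > TEMP_MAX_PLAUSIBLE) {
            enter_safe_state();
        }

        /* Simple bang-bang control when everything is healthy. */
        set_heater_output(temp < TEMP_SETPOINT ? 100u : 0u);
    }

The design choice worth noting is that the failure path does not try to keep the device useful; it simply makes the device harmless until a human intervenes, which is the essence of failing safely as opposed to failing reliably.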
International Electrotechnical Commission Weighs In
A decade ago, the International Electrotechnical Commission (IEC) issued the final version of its IEC 61508 specification governing the development of electrical, electronic, and programmable electronic safety-related systems.
The main objective of IEC 61508 was to provide guidance for developing devices that are functionally safe. In the context of IEC 61508, functional safety is defined as: “Functional safety is part of the overall safety that depends on a system or equipment operating correctly in response to its inputs. Functional safety is achieved when every specified safety function is carried out and the level of performance required of each safety function is met.”
The standard ensures that safety systems perform as specified, and if they fail, they fail in a manner that is safe. When discussing safety in this context, reliability is not implied, only that if there is a failure, it is a safe failure.
Security Trends to Consider
The convergence of device functionality and the explosive growth of new technologies mean that standard-compliant devices will be considered for use while others will be disqualified due to non-compliance. A few of the security standards and regulations driving this trend include:
Federal Information Processing Standard
On May 26, 2006, the Federal Information Processing Standard (FIPS) 140-2, “Security Requirements for Cryptographic Modules,” went into effect. The standard was developed in conjunction with the NSA and is published by the National Institute of Standards and Technology (NIST). It describes the requirements that a hardware and/or software product must meet to be purchased for sensitive but unclassified (SBU) use by the government.
The standard has been adopted by the Canadian Communications Security Establishment as well as the American National Standards Institute.
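One concrete FIPS 140-2 requirement that embedded developers run into is the power-up self-test: a cryptographic module must verify its own algorithms, typically with known-answer tests, before it may offer cryptographic services. The sketch below illustrates that idea only; it is not code from any validated module, and the sha256() routine is assumed to be supplied by the product's cryptographic library. The expected digest is the published SHA-256 test vector for the message "abc".

    #include <stdbool.h>
    #include <stddef.h>
    #include <stdint.h>
    #include <string.h>

    /* Assumed to be provided by the product's cryptographic library:
     * computes the SHA-256 digest of 'len' bytes at 'data' into 'digest[32]'. */
    extern void sha256(const uint8_t *data, size_t len, uint8_t digest[32]);

    /* Known-answer test: SHA-256("abc") from the FIPS 180 test vectors. */
    static const uint8_t kat_input[] = { 'a', 'b', 'c' };
    static const uint8_t kat_expected[32] = {
        0xba, 0x78, 0x16, 0xbf, 0x8f, 0x01, 0xcf, 0xea,
        0x41, 0x41, 0x40, 0xde, 0x5d, 0xae, 0x22, 0x23,
        0xb0, 0x03, 0x61, 0xa3, 0x96, 0x17, 0x7a, 0x9c,
        0xb4, 0x10, 0xff, 0x61, 0xf2, 0x00, 0x15, 0xad,
    };

    static bool crypto_self_test_passed = false;

    /* Run once at power-up; cryptographic services stay disabled on failure. */
    bool crypto_power_up_self_test(void)
    {
        uint8_t digest[32];

        sha256(kat_input, sizeof kat_input, digest);
        crypto_self_test_passed = (memcmp(digest, kat_expected, sizeof digest) == 0);
        return crypto_self_test_passed;
    }

    /* Every cryptographic entry point would check this flag before doing work. */
    bool crypto_services_available(void)
    {
        return crypto_self_test_passed;
    }

The point of the pattern is that a module which cannot prove its algorithms are producing correct answers refuses to provide cryptographic services at all, rather than silently producing output that might be wrong.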
Health Insurance Portability and Accountability Act
To improve the efficiency and effectiveness of the health care system, the Health Insurance Portability and Accountability Act (HIPAA) of 1996, Public Law 104-191, included “Administrative Simplification” provisions that required the Department of Health and Human Services to adopt national standards for electronic healthcare transactions. At the same time, Congress recognized that advances in electronic technology could erode the privacy of health information. Consequently, Congress enacted HIPAA provisions that mandated the adoption of federal privacy protections for individual health information.
Common Criteria Standard
The Common Criteria (CC) standard (ISO/IEC 15408) is an international security standard for computer systems. It differs from standards such as FIPS 140 in that it does not provide a list of security requirements. Instead, it outlines a framework in which system architects specify the security requirements a product must meet. The implementation of the product's security features can then be evaluated, by certified testing laboratories, against those requirements. The goal of CC is to provide a standard method to define, implement, and evaluate a product used in a secure environment.
Savvy device manufacturers have recognized that producing standard-compliant devices, or devices that help an organization achieve compliance, brings distinct advantages. Not only can manufacturers market to niche areas, but compliance also differentiates their products from the fray by implying greater quality, since the standards have been met.
Being compliant also enables entry into regional markets where governments use lack of compliance as a barrier to trade. (One example is the many products in the United States that carry the Underwriters Laboratories (UL) seal to show that they have been tested and approved.) Needless to say, starting with a proven secure operating system and certified software stacks can turn compliance issues into competitive strengths.
Conclusion
Secure software is no longer the exclusive concern of a few secretive government agencies, just as safe software is no longer confined to vertical industries such as aerospace, transportation, or nuclear plants, where the physical isolation of stand-alone units once delivered some level of security. With the rapidly growing trends of new product capabilities and convergence across all industries, the writing is on the wall: electronic devices, from individual handhelds to the older stand-alone units found in data centers and IT infrastructures, must be developed so that all data is safe and protected from unauthorized access.
It is increasingly apparent that “private data” used on a public system must be protected. From your trusted cell phone or PDA, to a hospital encoder, to the copy machine at the local Kinko’s, electronically transmitted and stored data at every level must be made safe and secure. If standards are not yet part of your industry, Mentor has stayed ahead of the curve by offering the Nucleus OS – Secure Kernel, security software, and certified protocol stacks. Working with the Nucleus OS gives developers a best-in-class foundation on which to build their products. This is one level of safety and security developers can apply to their embedded designs right now.
The standards to adopt vary depending on the type of embedded device, its application, and its exposure to the surrounding threat environment. Embedded system architects need to follow the standards and regulations specific to their market if they are to deliver the level of security and safety demanded by the marketplace today and in the future. Adopting standards now may provide a distinct differentiation for your product and establish marketable value ahead of your competition.
Todd Brian is a product manager for the Embedded Systems Division of Mentor Graphics, where he is responsible for Nucleus OS and related products. Brian has spent more than 15 years working with embedded systems and software in consumer and office electronics. Prior to Mentor, he served in a variety of engineering and marketing positions at Konica Minolta, Inc. Brian earned both a B.S. and an M.S. in Computer Science from the University of South Alabama, as well as a Master of Business Administration from Spring Hill College.