Safety coding techniques for your application
The modernization of programming languages and the importance of better coding techniques are directly related to the evolution from mechanical computers to modern software development processes. We moved from a highly specialized, mostly mathematical notation to high-level programming languages whose syntax is close to human language [1]. This shift was made possible by compiler technology, but it also opened the door to defects. High-level programming languages like C and C++ contain a large number of undefined behaviors that different compilers can interpret slightly differently; this can cause unknown or unwanted side effects that translate into defects.
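To make this concrete, here is a minimal sketch (the function names are invented for illustration) of how one well-known undefined behavior in C, signed integer overflow, can turn into a defect: a compiler is allowed to assume the overflow never happens and may silently remove a check written to detect it.

```c
#include <limits.h>

/* Signed integer overflow is undefined behavior in C. */
int doubled(int value)
{
    return value * 2;   /* undefined for large values, e.g. value > INT_MAX / 2 */
}

/* Because the compiler may assume overflow never happens, it is allowed to
   fold this check to "always false" and remove it, so the defect only shows
   up later, somewhere else in the program. */
int add_would_overflow(int value)
{
    return value + 1 < value;
}
```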
Hunting and fixing defects can consume up to 80% of development time, depending on the maturity of your development organization. This brings us to the obvious conclusion that code quality is a major concern, so why not avoid the defects in the first place and spend less time debugging?
As a side note, we still use the concept of “bugs” and debugging in software, but the term goes back to Harvard University's early relay-based computers, when a moth stuck in a relay was documented as the first system “bug”, or defect, in computer history.
Repeating the same mistakes over and over
It is well known that developers in web, app, desktop or embedded development tend to accidentally inject the same kinds of mistakes into their source code over and over. This conclusion comes from several surveys and studies run by respected institutions such as NASA, Bell Labs and MITRE. Typical pitfalls include allocations without matching deallocations in C++ (or C) code, and functions called without a prototype in scope, so you do not get rigorous type checking at compile time. One outcome of this research is lists of recommended programming practices that identify risky and bad coding behavior.
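A deliberately flawed sketch of both pitfalls might look like the following; report_error() and the other names are hypothetical and exist only to illustrate the point.

```c
#include <stdio.h>
#include <stdlib.h>

/* Pitfall 1: allocation without a matching deallocation. Every call to
   this function leaks one int's worth of heap memory. */
void remember_sample(int sample)
{
    int *copy = malloc(sizeof *copy);
    if (copy != NULL) {
        *copy = sample;
        /* ... the pointer is lost on return and the memory is never freed */
    }
}

/* Pitfall 2: a K&R-style declaration with an empty parameter list is not
   a prototype, so the compiler cannot type-check the arguments below. */
int report_error();

void log_failure(void)
{
    report_error("sensor timeout");   /* expects an int, gets a string: no diagnostic */
}

/* Definition placed later (or in another file): it actually expects an int code. */
int report_error(int code)
{
    printf("error %d\n", code);
    return 0;
}
```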
There are many guidelines and coding practices available for improving code quality, based on common mistakes and how to avoid them. Some of these techniques and practices have become well-known standards (e.g. MISRA C and CERT C), especially in critical industries like automotive, medical and railway, where they help ensure code safety and code security in the application. Functional safety standards such as IEC 61508 [2], EN 50128 [3] and ISO 26262 [4] recommend (or highly recommend, depending on the Safety Integrity Level (SIL) or Automotive Safety Integrity Level (ASIL)) the use of static and runtime analysis tools to achieve compliance. Defects in safety-critical systems can lead to severe consequences such as loss of life or environmental damage.
Focus on reliability
Safety coding techniques are the combination of code quality, code safety and code security. Code safety focuses on the reliability of the software, while code security is about preventing unwanted activity and keeping the system secure during an attack. Both rely heavily on code quality, the foundation of every solid application.
Safety coding techniques and standards drive software safety to ensure the needed reliability, but it is also crucial to enhance the readability and maintainability of the source code. More efficient and readable code means future-proofing the source code with fewer defects, and it also makes code reuse possible.
MISRA C is one of the most well-established software development standards for avoiding common pitfalls and vulnerabilities, but there are additional guidelines, like CWE and the CERT C coding standard, that are highly recommended for any embedded application. Let’s explore these coding standards a bit further.
The MISRA C standard
MISRA C was developed by the Motor Industry Software Reliability Association. Its aims are to facilitate code safety, portability and reliability in the context of embedded systems, specifically those systems programmed in ISO C.
The first edition of the MISRA C standard, "Guidelines for the use of the C language in vehicle based software", was published in 1998 and is officially known as MISRA-C:1998. It was updated in 2004 and again in 2012 to add more rules, and there is also a MISRA C++:2008 standard based on C++ 2003. More recently, MISRA C:2012 Amendment 1 added 14 rules focused on security concerns highlighted by the ISO C Secure Guidelines. Several of these rules address issues related to the use of untrustworthy data, a well-known security vulnerability in many embedded applications.
MISRA helps you find issues before you check code into a formal build, and a defect caught that early is, in effect, a defect that never happened. MISRA rules are designed with safety and reliability in mind, but they also make code more portable across tools and architectures.
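To make this concrete, here is an illustrative sketch (not taken from the standard itself) of the kind of construct a MISRA C checker typically flags long before a formal build; for example, MISRA C:2012 Rule 13.4 advises against using the result of an assignment, which catches the classic “=” versus “==” slip.

```c
#include <stdbool.h>

static bool door_is_closed(int sensor_state)
{
    bool closed = false;

    /* Likely meant "sensor_state == 1": the assignment compiles, always
       evaluates to true, and overwrites the sensor reading. A MISRA checker
       flags the use of an assignment result inside a condition. */
    if (sensor_state = 1)
    {
        closed = true;
    }
    return closed;
}
```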
CWE and CERT C/C++
CWE, the Common Weakness Enumeration, is a community-developed dictionary of software weakness types. CWE provides a unified, measurable set of software weaknesses in order to better understand and manage them and to enable efficient software security tools and services that can find them.
The CERT C/C++ Secure Coding Standards, published by the Computer Emergency Response Team (CERT), provide rules and recommendations for secure coding in the C and C++ programming languages.
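As an illustration of the kind of weakness both catalogs describe, the hypothetical sketch below contains a use-after-free, which is cataloged as CWE-416 and forbidden by CERT C rule MEM30-C ("Do not access freed memory").

```c
#include <stdlib.h>
#include <string.h>

void build_message(const char *text)
{
    char *buf = malloc(64);
    if (buf == NULL) {
        return;
    }
    strncpy(buf, text, 63);
    buf[63] = '\0';
    free(buf);

    buf[0] = '!';   /* use after free: CWE-416, violates CERT MEM30-C */
}
```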
Enforcing safety coding techniques
As a general recommendation, every embedded application should at least follow the CWE and CERT C/C++ standards. MISRA C is mandatory for safety-critical systems.
Following the same concept, at runtime you can still be susceptible to arithmetic issues, buffer overflows, bounds violations, heap integrity problems and memory leaks. Such errors can be detected by inserting specific instrumentation code or asserts everywhere a potential error can occur. However, manually adding instructions to check each condition and somehow report the issue at runtime is a very time-consuming task.
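A minimal sketch of such manual instrumentation, using an invented sample buffer, shows the idea, and also why repeating it by hand across a whole code base quickly becomes tedious.

```c
#include <assert.h>
#include <stdint.h>

#define BUFFER_SIZE 16u

static uint8_t buffer[BUFFER_SIZE];

/* Every access is guarded by a hand-written assert that checks the bound at
   runtime. Repeating this for every array, pointer and arithmetic operation
   in a code base is the time-consuming part that runtime analysis tools automate. */
void write_sample(uint32_t index, uint8_t value)
{
    assert(index < BUFFER_SIZE);
    buffer[index] = value;
}
```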
Applying all guidelines and standards means that you need to be compliant with almost 700 rules and requirements in addition to instrumenting your code. So, how can you enforce the safety coding techniques and keep up with all the rules?
Use automated tools
The best way to enforce software quality, safety and security is to use automated tools. This can be achieved with a high-quality compiler and linker, preferably certified for functional safety, combined with automated static analysis and runtime analysis.
The compiler and linker should support modern language standards, such as the latest C (ISO/IEC 9899:2018) and C++ (ISO/IEC 14882, with C++17 as the latest revision), so that they can generate warnings for suspicious constructs or syntax weaknesses, e.g. volatile memory accesses whose order of evaluation could affect the logic of the application.
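As a small illustration (the register addresses are made up), reading two volatile objects in a single expression leaves the order of the two hardware accesses unspecified, which is exactly the kind of construct a good compiler can warn about.

```c
#include <stdint.h>

/* Hypothetical memory-mapped registers; the addresses are purely illustrative. */
#define STATUS_REG (*(volatile uint32_t *)0x40000000u)
#define DATA_REG   (*(volatile uint32_t *)0x40000004u)

uint32_t read_status_and_data(void)
{
    /* Both operands are volatile accesses, and C does not define which one is
       performed first, so the hardware may be touched in either order. Splitting
       the reads into separate statements makes the order explicit. */
    return STATUS_REG + DATA_REG;
}
```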
Warnings are your first-pass static analysis check and should never be ignored, particularly in a functional safety setting. The best recommendation is to change the compiler settings so that all warnings are treated as errors. This forces developers to fix every ambiguity in the code, since each issue is handled as a genuine problem.
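For example, with warnings promoted to errors (e.g. -Wall -Wextra -Werror on GCC or Clang, or the corresponding option in your embedded toolchain), a loop like the hypothetical one below stops the build instead of slipping through as a note in the log.

```c
#include <stdint.h>

/* Searches backwards for a free slot (value 0). */
int find_free_slot(const uint8_t *slots, uint32_t count)
{
    for (uint32_t i = count - 1u; i >= 0u; i--) {
        /* warning: comparison of unsigned expression >= 0 is always true */
        if (slots[i] == 0u) {
            return (int)i;
        }
    }
    return -1;   /* unreachable as written: i wraps around instead of going negative */
}
```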
Static analysis tools help you find the most common sources of defects in your code, but they also uncover problems that developers tend not to think or worry about while writing code, especially when they are just putting up scaffolding to get something working. These tools help you develop better code because they enforce the coding standards. Additionally, there are dynamic or runtime analysis tools that catch defects that only show up during execution. A runtime analysis tool can find real and potential errors in the code while the program runs under a software debugger.
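As a sketch of a defect that is easier to catch at runtime than statically, the invented lookup table below is only read out of bounds for certain input values, so an instrumented run with a bad input traps immediately, while a static tool may only report it as a possibility.

```c
#include <stdint.h>

#define TABLE_SIZE 8u

static const uint16_t scale_table[TABLE_SIZE] = {
    10u, 20u, 30u, 40u, 50u, 60u, 70u, 80u
};

/* Correct only while code < TABLE_SIZE; a value of, say, 200 received from a
   message or sensor at runtime reads far past the end of the table. */
uint16_t decode_scale(uint8_t code)
{
    return scale_table[code];
}
```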
So when you look at all the defects that could be in your system, static analysis is good at finding some defects and runtime analysis is good at finding others. Sometimes there is overlap, but sometimes a defect can only be detected in one domain or the other. To get the best possible code analysis, you need to use both in conjunction with one another, integrated with your build tools. The matrix below best represents the complete defect coverage achieved when combining the different tools.
Catching loopholes
This effect is best explained by the picture below, taken at Bielefeld University in Germany [5]. The photo was shot by an anonymous contributor and circulated widely in 2005:
The easiest way to break a system is often to circumvent it rather than defeat it head-on, and this is frequently the case with software vulnerabilities rooted in insecure coding practices. The picture above is a great analogy: the gate is in place per the recommendations and functions correctly according to the specifications. However, the security measure is easily bypassed, and until a kind of runtime analysis was in place (in this case, snow), the flaw in the security system was probably difficult to spot. Automated runtime analysis that scans your code for potential loopholes is a great way to detect these types of issues.
In this case, the vulnerability was fixed in hardware: according to Google Street View in 2020, concrete bollards have since been installed [6]:
Coding standards help to future-proof your code and ensure ease of reuse. In other words, the code’s quality determines the code’s reusability, and this mindset is part of the culture of mature organizations when they develop new products. Enforcing safety coding techniques is a virtuous cycle, and it reaffirms our premise: everything simply starts with code quality.
This article was written by Rafael Taubinger, our Technical Marketing Specialist.
References
[1] Hopper (1978) p. 16.
[2] https://www.iec.ch/functionalsafety/standards/page2.htm
[3] https://standards.globalspec.com/std/14256883/EN%2050128
[4] https://www.iso.org/standard/43464.html
[5] https://wiki.sei.cmu.edu/confluence/display/seccode/Top+10+Secure+Coding+Practices?focusedCommentId=88044413
[6] https://www.google.cz/maps/@52.0366688,8.4912691,3a,75y,35.86h,98.92t/data=!3m6!1e1!3m4!1soRzmyEEoGqUoVhlyGWmyEw!2e0!7i13312!8i6656