White Paper: Reaping the Benefits of Effective Verification in Avionics Compliance (DO-178C)

In this article, Jim Thomas, Director of Software Testing at T&VS, discusses verification techniques that can benefit organisations seeking compliance with avionics certification standards. Many organisations developing avionics software still suffer from poor verification practices: they get their software through certification in the end, but by failing to employ effective verification they waste the opportunity to improve software quality while meeting their delivery schedules and actually reducing development costs.

Jim has a PhD in Mathematics from Bristol University and over thirty years of experience in the software industry with a focus on high integrity software development.

For additional information on the Compliance solutions from T&VS, click on asureCOMPLY.


Complying with avionics standards in software development is sometimes seen as an expensive overhead: necessary, but an unavoidable cost of being in the business.

Paying more attention to effective verification during development, and embracing unit testing as the foundation for stepwise integration of pre-tested components, brings shorter development times, dramatically shorter integration times and significantly fewer residual defects in the delivered software. More than this, the evidence for demonstrating compliance is produced as development proceeds rather than as a separate post-development activity, making certification a much less painful affair. Ultimately a higher quality, more reliable system is delivered to the end customer without the all-too-common software delivery delays.

While avionics standards encourage a structured approach to software development, many organisations don't execute this well. In many cases there is pressure on the software team to start coding in order to meet timescales, so software requirements and design aren't fully analysed and defined, unit testing is skipped, and testing is carried out on large amounts of software at once. An external company is then paid to produce the documentation and unit tests for the approvals process, based on the final software. This can happen even when the development has complied with a standard such as DO-178B, and now with the DO-178C revisions.

Such an approach may find a number of defects in the end, but it typically isn't very effective: the delivered software contains more residual defects than expected. Problems with the initial requirements specification and software design are inevitably uncovered during testing, and the significant numbers of defects found at that stage can mean the software integration phase takes three or four times longer than planned, or that the whole project has to be scrapped and restarted. While this meets the letter of the standard, it does not benefit the developer: deadlines are missed, development costs are much higher than expected and software quality is lower than anticipated.

Avionics standards

These standards deliberately do not define a particular life cycle or methodology. They define objectives and outputs, which can be produced in different orders, for the planning, requirements, design, coding, integration, configuration management, process assurance and verification processes. There are clear outputs for all these processes, but how they are managed and delivered is not defined. The number of objectives to meet increases with the assurance level, and at the higher levels independence becomes even more important.

In an avionics design, the outputs are well known and clearly defined and so will have to be produced. Rather than producing them as an afterthought, integrating these outputs into the development process right from the beginning and verifying them as they are produced creates a higher quality product with a much greater chance of success.

Shift left

‘Shift left’ is the term used for verification approaches that put more emphasis on improving the quality of the outputs of the early stages of the development lifecycle (the left side of the V-shaped development, below), with more rigorous requirements and design stages. They are about shifting ‘testing’ in its widest sense to the left side of the V-model.

Figure 1: The traditional V-shaped software development lifecycle

This idea of shift left starts with the requirements stage. Every avionics project begins with requirements, but this stage is not necessarily done well. The requirements may not be analysed in sufficient detail, for example failing to consider product variants or to define integration testing alongside the unit testing requirements. There may also be ambiguities in individual requirements, inconsistencies between requirements, and the handling of error cases may not be fully addressed. All this leads to problems later in the life cycle.
The key is actually to start with a verification plan, which may seem to be putting the last things first. But by knowing how you are going to verify what you have specified and designed, you build an effective framework for the development of the code. Thinking through how you will test the source code, what test vectors and stimuli you will use, from which tools and in what environments, all helps to construct a coherent and efficient design and verification flow.
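As a sketch of this idea, requirements-based test vectors can be captured as data from the start, so the same table drives reviews, host runs and target runs. The function, values and requirement identifiers below are invented for illustration, not taken from any real project:

```c
#include <assert.h>
#include <stdint.h>

/* Hypothetical unit under test: clamps a commanded control-surface
 * angle (in tenths of a degree) to its allowed travel. */
static int16_t clamp_surface_angle(int16_t tenths_deg)
{
    if (tenths_deg > 300)  return 300;   /* +30.0 degree stop */
    if (tenths_deg < -300) return -300;  /* -30.0 degree stop */
    return tenths_deg;
}

/* Requirements-based test vectors: input, expected output, and the
 * (made-up) requirement each case traces to. */
struct vector { int16_t in; int16_t expect; const char *req_id; };

static const struct vector vectors[] = {
    {    0,    0, "REQ-CTL-001" },  /* nominal value passes through  */
    {  300,  300, "REQ-CTL-002" },  /* upper boundary                */
    {  301,  300, "REQ-CTL-002" },  /* above travel: clamped         */
    { -301, -300, "REQ-CTL-003" },  /* below travel: clamped         */
};

/* Run every vector; returns the number of failing cases. */
static int run_vectors(void)
{
    int failures = 0;
    for (unsigned i = 0; i < sizeof vectors / sizeof vectors[0]; i++) {
        if (clamp_surface_angle(vectors[i].in) != vectors[i].expect)
            failures++;
    }
    return failures;
}
```

Because the vectors are plain data with traceability attached, the same table can later feed the compliance evidence for requirements-based testing.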

One important opportunity to enhance the verification process that is often overlooked is to involve experienced testers right at the beginning of the project at the requirements stage. While domain-specific experts will put together the requirements, testers can assess the requirements attributes, identifying how testable and traceable the requirements are and, critically, highlighting areas of ambiguity and inconsistency in the requirements. They will also be looking at whether failure conditions are fully addressed, an issue that is not usually well covered and which causes significant problems further down the line when an unexpected failure mode emerges in a corner case.
Involving experienced testers in the design stages ensures that the software is designed with testability in mind, features to simplify testing are designed in, and traceability between requirements and design is assessed.

All this provides consistency and traceability in the development process so that the compliance artefacts can be tracked all the way through from requirements to final product. By having experienced testers involved at the requirements and design stages, they can also start working on specifying the software system test and the integration tests.
Testers involved during the design stage can also advise on designing the software so that the appropriate level of structural coverage can be achieved more easily for each unit's tests, and so that units can, depending on their function, be tested on a host platform rather than only on the target. Developing and executing tests on a host platform can significantly speed up test development and the resolution of test failures.
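One common way to make a unit host-testable, sketched here with invented names and thresholds, is to pass the target-specific I/O in as a small interface rather than reading hardware registers directly, so a host build can supply a stub:

```c
#include <assert.h>
#include <stdint.h>

/* The unit reads pressure through an injected function pointer instead
 * of touching target registers, so it compiles and runs unchanged on a
 * host. The threshold and names are illustrative only. */
typedef uint32_t (*read_pressure_fn)(void);

/* Returns 1 if cabin pressure (Pa) is below a warning threshold. */
static int pressure_warning(read_pressure_fn read_pressure)
{
    return read_pressure() < 70000u;
}

/* Host-side stubs standing in for the target sensor driver. */
static uint32_t stub_low_pressure(void)    { return 65000u; }
static uint32_t stub_normal_pressure(void) { return 101325u; }
```

On the target, the real driver function is passed in instead of the stubs; the unit's logic is identical in both builds, which is what makes host-side results meaningful.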

A verification strategy encompasses reviews, analyses and of course testing. Testing small units of code in isolation provides confidence that they work internally and creates a set of pre-tested building blocks that go together quickly, ensuring that integration goes more smoothly. A unit of code may be about 50 to 100 lines, and perhaps 20 to 30 units can be integrated at a time, avoiding the difficulties of integrating large amounts of software in one go. This requires a clear test plan for the units and for software integration. Building the software system by stepwise integration becomes dramatically easier with pre-tested components, minimising the need for time-consuming analysis of faults found at integration and software system level testing.
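The unit-then-integration pattern described above can be sketched in miniature. Both units below are invented examples; the point is that once each has been tested in isolation, the integration test only has to confirm the interface between them, not re-prove each unit's internal behaviour:

```c
#include <assert.h>
#include <stdint.h>

/* Unit 1 (pre-tested in isolation): converts a raw 12-bit ADC count
 * to millivolts against a 3.3 V reference. */
static uint32_t adc_to_millivolts(uint32_t raw)
{
    return (raw * 3300u) / 4095u;
}

/* Unit 2 (pre-tested in isolation): maps a voltage to a discrete
 * fuel-level band. Thresholds are illustrative. */
static uint8_t fuel_band(uint32_t millivolts)
{
    if (millivolts < 500u)  return 0u;  /* empty   */
    if (millivolts < 2500u) return 1u;  /* nominal */
    return 2u;                          /* full    */
}

/* Integration step: compose the two pre-tested units. A fault found
 * here points at the interface, not at either unit's internals. */
static uint8_t fuel_band_from_raw(uint32_t raw)
{
    return fuel_band(adc_to_millivolts(raw));
}
```

Scaling this up, each integration step combines a couple of dozen such pre-tested units, so failures at integration time are rare and easy to localise.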

All this creates software with a lower level of defects as defects are avoided or detected earlier on in the life cycle, rather than removed at a later stage. The ‘shift left’ strategy produces higher quality code earlier in the process, de-risking the integration phase and improving the overall quality of the code.

Quality assurance

Existing quality assurance cycles with their focus on testing after development are sometimes seen as the way to naturally refine the software by identifying and removing defects introduced during development. The shift left approach places the emphasis on removing defects at the earliest opportunity, starting with the requirements stage and by involving the test team from the start of a project.

The execution of a quality process can also be variable. Requirements and design reviews can help improve the quality of the development process, but these can be handled well or ineffectively. If the same people who developed the requirements are developing the code, there is the risk of a lack of independence and of ‘common mode’ errors. If the developers can be challenged by independent parties and professional testers, the quality of the requirements, design and ultimately the code is enhanced, saving time and costs in the integration phase. A second pair of independent eyes helps to highlight potential problems that can be avoided or mitigated.

At higher levels of compliance this independence needs to be clearly demonstrated (see Table below), and this can be provided through experienced testers involved across the development life cycle.

Level   Failure condition   Objectives   With independence
A       Catastrophic        71           30
B       Hazardous           69           18
C       Major               62           5
D       Minor               26           2
E       No Safety Effect    0            0

Table 1: The number of objectives needed for different failure conditions in DO-178C, with the number of independent objectives required. Source: RTCA/DO-178C “Software Considerations in Airborne Systems and Equipment Certification”

The future

Software development does not stand still, and different approaches to verification and compliance are constantly being developed. This is highlighted in the move from DO-178B to DO-178C which was approved by the FAA in the US in July 2013.

DO-178C now includes guidance on formal methods and provides clarification of the definitions and boundaries between the key DO-178B concepts of High Level Requirements, Low Level Requirements and Derived Requirements. It also provides a better definition of the exit/entry criteria between systems requirements and system design, allowing high level models to be used.
The emerging tools for model-based engineering use high level models to generate code, and even RTL for silicon, automatically. This has a number of benefits, but there are issues with verifying a model-based software development and with the complexities of integrating and testing multiple models. Can test cases be reliably generated from the model, and what level of independence does this give? There is a clear parallel here with hardware verification, where the question of whether the verification model should be independent of the design model is currently a topic of debate.


Conclusion

Using a ‘shift left’ verification process can significantly enhance the quality, reliability and efficiency of code for avionics systems. Involving independent, experienced testers at the requirements and high level design stages improves the quality of the requirements and design, and allows the testers to start early on producing effective system and integration tests. Developing small units that can be tested thoroughly in isolation, and integrating them into an overall framework, simplifies the integration challenges of complex systems. As a result, verification becomes less about defect removal during testing and more about defect avoidance at all stages of the development lifecycle. All of this naturally produces the compliance artefacts needed by an assurance process such as DO-178C, without having to go back and develop them at the end of the project. The resulting savings in integration time and the improvement in quality reduce costs and enhance reliability. These are both key considerations for any software project, but particularly for avionics, and they can be delivered effectively by involving those independent, external test engineers early in the process.

Published 13th August, 2014 | Compliance, Thought Leadership