A deep look into impact analysis in industry
Developing safety-critical systems is very costly, so such systems typically evolve incrementally. However, changing the software of such a system might introduce unexpected “ripple effects” – a well-known phenomenon in safety engineering. To address this problem, safety standards mandate formal change impact analysis to determine how the system is affected by a change.
Many software systems are complex. Very complex. Understanding how a change will affect the rest of the system is really hard, whether the change is a bug fix or an increment in planned software evolution. In non-critical applications, the solution is typically to run a battery of automated tests and check if anything breaks. When human lives are at stake this is not enough, and domain-specific safety standards require formal change impact analysis BEFORE any changes are made. But how is this activity conducted across industry sectors? What triggers a change impact analysis? What are the major challenges? Do practitioners use any tool support?
These questions and many more are explored in this study. The initial ideas grew out of a collaboration between Simula and Lund University, but by the time of publication three out of four authors had changed affiliations: José Luis is now at Carlos III University of Madrid, I’m with SICS Swedish ICT AB, and Krzysztof is at Blekinge Institute of Technology… At least Leon is still at Simula! I think all of us worked on the study as a background activity, but we did so meticulously – and the paper ended up in IEEE TSE.
Most comprehensive survey available
We conducted a survey with engineers developing safety-critical systems. The survey required quite some effort from the respondents, but we managed to get 97 complete responses representing 16 application domains, 28 countries, and 47 safety standards – by far the broadest view of change impact analysis in industry published so far. Our study particularly explored: 1) when change impact analysis is done, 2) what tool support is used, and 3) what the major challenges are.
Our results suggest that the most common trigger for a formal change impact analysis is a change to some specification, rather than a change to source code. In general, we show that change impact analysis goes far beyond the source code – numerous artifact types are involved in the process. Change impact analysis is a fairly manual process, and if any automated tool support is available, it typically operates only at the source code level. There is clearly room for more holistic support for change impact analysis! Moreover, the most commonly reported challenge to change impact analysis is lack of proper tool support. However, this is not a surprising finding when asking engineers about challenges… On the other hand, plenty of other challenges are reported as well, such as inadequate system traceability, difficulties in cost estimation, and incomplete processes.
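To make the call for holistic support a bit more concrete, here is a minimal sketch (my own illustration, not something from the paper) of how tool support could propagate impact across artifact types via trace links. All artifact names and links below are hypothetical:

```python
# Toy change impact analysis over a traceability graph.
# Illustrative sketch only; artifact names and trace links are made up.
from collections import deque

# trace links: artifact -> artifacts that depend on it
TRACE_LINKS = {
    "REQ-12": ["DES-03", "TEST-41"],
    "DES-03": ["SRC-ctrl.c", "SAFETY-CASE-2"],
    "SRC-ctrl.c": ["TEST-41"],
    "TEST-41": [],
    "SAFETY-CASE-2": [],
}

def impacted(changed):
    """Return every artifact reachable from the changed one via trace links."""
    seen, queue = set(), deque([changed])
    while queue:
        for succ in TRACE_LINKS.get(queue.popleft(), []):
            if succ not in seen:
                seen.add(succ)
                queue.append(succ)
    return seen

# A requirements change touches design, code, tests, and the safety case:
print(impacted("REQ-12"))
# {'DES-03', 'SRC-ctrl.c', 'TEST-41', 'SAFETY-CASE-2'}
```

A real tool would of course need trace links of decent quality to begin with, which is exactly where the reported traceability challenges bite.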
There is a lot of meat in this paper. The survey covered roughly 25 questions, and some of the answers contain a lot of information. One of the more interesting analyses we did was to check correlations between answers to different questions. The featured image of this blog post shows how a change impact analysis triggered by changes to one software artifact type impacts others – a truly heavy figure to digest! Among other things, we show just how tightly connected the requirements specifications and the source code are – in all aspects studied in the survey.
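As a hedged illustration of this kind of analysis (not the paper’s actual statistics), correlating two yes/no survey answers boils down to computing an association coefficient such as phi. The respondent data below is made up:

```python
# Phi coefficient between two binary survey answers.
# Illustrative only; the respondent data is fabricated for the example.
import math

def phi(xs, ys):
    """Phi coefficient (Pearson correlation for two 0/1 variables)."""
    n11 = sum(1 for x, y in zip(xs, ys) if x and y)
    n10 = sum(1 for x, y in zip(xs, ys) if x and not y)
    n01 = sum(1 for x, y in zip(xs, ys) if not x and y)
    n00 = len(xs) - n11 - n10 - n01
    denom = math.sqrt((n11 + n10) * (n01 + n00) * (n11 + n01) * (n10 + n00))
    return (n11 * n00 - n10 * n01) / denom if denom else 0.0

# Hypothetical answers from five respondents: does a requirements change
# trigger impact analysis? Does a source code change?
req_triggers = [1, 1, 0, 1, 0]
code_triggers = [1, 0, 0, 1, 0]
print(f"phi = {phi(req_triggers, code_triggers):.2f}")  # phi = 0.67
```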
Implications for Research
- A cross-domain survey that offers deep insights into state-of-practice change impact analysis.
- Empirical evidence of major challenges involved in change impact analysis, thus motivating further research on e.g. non-code impact, tool support, traceability, safety case management, and process improvements.
- A report of what tool support is currently used for change impact analysis – and what level of automation it offers.
Implications for Practice
- A benchmark enabling comparison of an organization’s change impact analysis to the broader state of practice.
- A discussion of which pairs of software artifacts co-evolve more distinctly than others, an aspect not mentioned in safety standards.
- A list of the most common challenges to change impact analysis and twenty matching improvement areas – with process improvement & guidance being the most promising area.
José Luis de la Vara, Markus Borg, Krzysztof Wnuk, and Leon Moonen. An Industrial Survey of Safety Evidence Change Impact Analysis Practice, IEEE Transactions on Software Engineering, 42(12), pp. 1095-1117, 2016. (link, preprint)
Abstract
Context. In many application domains, critical systems must comply with safety standards. This involves gathering safety evidence in the form of artifacts such as safety analyses, system specifications, and testing results. These artifacts can evolve during a system's lifecycle, creating a need for change impact analysis to guarantee that system safety and compliance are not jeopardized.

Objective. We aim to provide new insights into how safety evidence change impact analysis is addressed in practice. The knowledge about this activity is limited despite the extensive research that has been conducted on change impact analysis and on safety evidence management.

Method. We conducted an industrial survey on the circumstances under which safety evidence change impact analysis is addressed, the tool support used, and the challenges faced.

Results. We obtained 97 valid responses representing 16 application domains, 28 countries, and 47 safety standards. The respondents had most often performed safety evidence change impact analysis during system development, from system specifications, and fully manually. No commercial change impact analysis tool was reported as used for all artifact types and insufficient tool support was the most frequent challenge.

Conclusion. The results suggest that the different artifact types used as safety evidence co-evolve. In addition, the evolution of safety cases should probably be better managed, the level of automation in safety evidence change impact analysis is low, and the state of the practice can benefit from over 20 improvement areas.