Lessons from the Aviation Industry: What Can We Learn for Humanitarian Security Risk Management?

Published: February 2, 2016 | By Christina Wille

Air traffic volume has expanded dramatically in recent years, yet the number of plane crashes has steadily declined since 1980. Air accidents peaked in the 1940s, prompting aviation experts to develop a new safety approach, and today, aviation is one of the leading industries in risk management. This article discusses the aviation industry’s safety concept and considers what the growing humanitarian community may learn for its own security and safety management.

[Figure: Aviation accidents and incidents]

Beginning in the 1940s, research into the high-risk industries of the time – aviation and nuclear power generation – developed the ‘systems approach’, which treats error as evidence of system failure and avoids blaming the individual who committed a human error.

Central to the aviation concept is the belief that human error is inevitable. The purpose of safety systems is therefore to absorb errors: they should be designed to detect errors, allow for their interception, and mitigate the consequences of errors that are not intercepted. Most air traffic accidents still occur because of human error, but many pilot errors no longer lead to catastrophic accidents because safety systems mitigate them.

Many aspects of this safety approach are of interest to the humanitarian community. Particularly relevant is its adaptation to medicine, where working practices have many similarities to humanitarian work.

How health services use aviation safety concepts – the example of ambulatory care at Maccabi Healthcare Services, Israel

Many of the practices and attitudes common to health services seem very similar to the way aid workers and aid agencies function. Incidents reported to senior management tend to be the severe ones rather than near misses, and many are caused by human error or failure to follow established practices. In addition, institutions view incidents and errors as failure. Staff and managers do not feel able to discuss how their mistakes contributed to a chain of events in a way that allows institutional learning to take place.

In order to improve patient safety, Maccabi Healthcare Services, one of the largest non-profit Health Maintenance Organisations (HMOs) in Israel, adapted the aviation safety concept to its own working practices in 1996. The approach centred on four main points:

  • Errors inevitably occur, and usually derive from faulty system design, not from negligence;
  • Accident prevention should be an ongoing process based on open and full reporting;
  • Major accidents are only the ‘tip of the iceberg’ of processes that indicate possibilities for organisational learning;
  • Incident prevention is a long term ongoing process rather than an episodic effort.

At Maccabi, an interdisciplinary work group and hotline were set up, and the reporting of near misses was encouraged. Rooted in the key principle that event analysis should serve learning, not blaming, official immunity from disciplinary action was granted to those reporting events. The analysis was used to formulate specific recommendations to rectify processes conducive to error.

After five years, the Maccabi team summarised the key steps they had taken. Suggestions for how the humanitarian community might adapt these steps for humanitarian security management, in order to protect staff and assets, have been added to stimulate discussion among humanitarian actors.


The key principles developed in Maccabi and suggestions for humanitarian security management

Each key principle of the Maccabi safety approach is paired below with a possibility for its adaptation to humanitarian safety and security management.

Caring for the caregiver (Maccabi)

People do not err maliciously, and they need support from their organisation when they have committed an error.

Care for staff (humanitarian adaptation)

People do not willingly end up in dangerous situations, and they need support from their organisation when they experience a threatening incident.

Event debriefing methodology (Maccabi)

Using a broad definition of adverse events, focusing on the event rather than the resulting injury. The definition of an adverse event changed from an ‘unexpected occurrence or variation involving death or serious physical or psychological injury, or the risk thereof’ to an ‘unexpected occurrence during medical care, involving physical or emotional injury, or the risk thereof’ (near miss).

Event debriefing policy (humanitarian adaptation)

Encouraging event debriefing for any adverse event, defined as an ‘unexpected occurrence perceived as threatening or with the potential to cause harm to staff or agency operations, including near misses’, so that staff report any incident for the purpose of organisational learning.

Single event analysis (Maccabi)

Allows the identification of localised errors; learning takes place within short reaction times.

Single event analysis (humanitarian adaptation)

A conversation between affected staff, security managers and counsellors to support the victim and identify mitigation strategies.

Multiple event analysis (Maccabi)

Allows the mapping and integrative analysis of risk factors, encourages systemic thinking and supports medical as well as managerial decision-making.

Multiple event analysis (humanitarian adaptation)

Systematic reporting of categories of events into a pooled database to map patterns in system failures.

Introduction of changes (Maccabi)

Creation of a strong connection between event analysis and the introduction of changes to work processes.

Introduction of changes (humanitarian adaptation)

Implementing changes signals to staff that their reports and experiences are taken seriously.

Each individual agency knows to what extent it already follows such principles, or may wish to follow them in the future. Multiple event analysis, however, is not usually realistic for a single agency: the number of events in any individual category thankfully remains too low for meaningful analysis. Multiple event analysis is therefore a task for the humanitarian community as a whole, and pooled data from multiple agencies provides the right material for it. It is in this spirit that Insecurity Insight runs the Security in Numbers Database as part of the Aid in Danger project. Contributing to this or other databases, including the Aid Worker Security Database or initiatives managed by in-country coordination mechanisms, builds an essential knowledge base through which trends can be explored.
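The value of pooling can be sketched in a few lines of code. The record fields, categories and threshold below are purely illustrative assumptions, not the actual schema of the Security in Numbers Database or any other system:

```python
from collections import Counter

# Hypothetical pooled incident records contributed by several agencies.
# Field names and category labels are illustrative only.
incidents = [
    {"agency": "A", "country": "X", "category": "road ambush", "severity": "near miss"},
    {"agency": "B", "country": "X", "category": "road ambush", "severity": "severe"},
    {"agency": "C", "country": "X", "category": "road ambush", "severity": "near miss"},
    {"agency": "A", "country": "Y", "category": "office break-in", "severity": "near miss"},
    {"agency": "B", "country": "X", "category": "road ambush", "severity": "severe"},
]

def pattern_counts(records):
    """Count events per (country, category) across all contributing agencies."""
    return Counter((r["country"], r["category"]) for r in records)

def systemic_patterns(records, threshold=3):
    """Flag (country, category) pairs frequent enough to suggest a systemic risk.

    The threshold is an arbitrary illustration; real analysis would use
    context-appropriate statistics.
    """
    return {key: n for key, n in pattern_counts(records).items() if n >= threshold}

print(systemic_patterns(incidents))  # {('X', 'road ambush'): 4}
```

In this toy data, no single agency reports more than two road ambushes in country X, so none of them alone would see a pattern; the pooled records show four, which is exactly the kind of cross-agency signal that multiple event analysis is meant to surface.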

The Aid in Danger project advocates multiple event analysis as an essential part of improving aid security. However, improved near-miss reporting is also needed, to increase learning about the measures that help prevent a near miss from turning into a severe incident. Moreover, more of the insight gained from event analysis needs to be systematically translated into changed procedures, so that the number of severe incidents begins to decline. By learning from the successful systems and tools developed in the aviation sector, approaches to humanitarian security risk management could be meaningfully adjusted. It will, however, take time before we reach a turning point and severe incidents start to decline despite the growth of the aid industry.

Sources and Background Reading

Humanitarian Outcomes, Aid Worker Security Report 2014 – Unsafe Passage: Road attacks and their impact on humanitarian operations, August 2014, https://aidworkersecurity.org/sites/default/files/Aid Worker Security Report 2014.pdf

Wilf-Miron, R., I. Lewenhoff, Z. Benyamini and A. Aviram, ‘From Aviation to Medicine: Applying Concepts of Aviation Safety to Risk Management in Ambulatory Care’, Quality and Safety in Health Care 12, no. 1 (2003): 35–39, http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1743670/pdf/v012p00035.pdf

‘How Many Planes Crash Every Year, And How Many People Die In Plane Crashes’, International Business Times, 3 October 2014, http://www.ibtimes.com/how-many-planes-crash-every-year-how-many-people-die-plane-crashes-chart-1560554

Helmreich, R., ‘On error management: lessons from aviation’, BMJ 320 (18 March 2000): 781–785, http://www.bmj.com/content/320/7237/781