Five-point summary:
- Flying is incredibly safe; why it is so safe is something few people pay attention to.
- It’s safe because of a combination of many factors, one of which is the set of supporting structures and practices that enable broad-based, global learning.
- One of these supporting systems, built on a foundation of the right culture, is ASRS, the Aviation Safety Reporting System.
- It’s a concept that enables efficient industry-wide learning, and one that could readily be reproduced in many, many other industries that would benefit from it.
- Are those other industries learning from this valuable lesson? Of course not! Other industries need to turn the current process of STFU into something more like an ASRS.
When you travel by flying, what are you worried about?
It could be the less-than-optimal airport experience with long lines at security and other inconveniences; having above-average-size people sitting on both sides of you because you unfortunately landed a middle seat; you might be dreading the food; or you could be anxious about the prospect of being near lots of people for hours during a pandemic.
It could be a lot of things, but you rarely worry whether you’ll get to your destination safely.
Yes, quite a few people are scared of flying, but even for the 2.5-5% of people who are scared of flying to the point of having a phobia, any safety concerns are irrational – flying is just about the safest means of travel, as catalogued year after year by Boeing’s Statistical Summary of Commercial Jet Airplane Accidents.
Why it’s so safe, however, is something ‘normal’ people pay little attention to.
It is a product of decades of hard work, and a combination of regulation, engineering advancements, technological progress, the development of and improvements in CRM (crew resource management) and other training, changes in culture, and many other things.
One of those things is our topic today; in particular, the NASA-managed ASRS – Aviation Safety Reporting System – and the culture of open reporting and knowledge-sharing across the entire industry, globally, which makes systems like ASRS possible and extremely useful.
ASRS is a voluntary, confidential reporting system that allows anyone involved in the aviation industry – pilots, air traffic controllers, cabin crew, rampers, maintenance technicians, dispatchers, and so on – to report near misses, close calls or incidents in confidence, in the interest of improving aviation safety. The regulations prohibit the reports from being used for any enforcement or penalty purposes.
ASRS collects, analyses and responds to the reports, and collates them into the highly informative and interesting CALLBACK newsletter – free for anyone to subscribe to.
ASRS isn’t the only such game in town in aviation, either; the UK runs CHIRP (Confidential Human Factors Incident Reporting Programme), Australia has REPCON, and so on. Australia’s CASA also hosts an interesting collection of Close Calls and publishes them in the great Flight Safety magazine.
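To make the concept a little more concrete, here is a minimal sketch in Python of the kind of record a confidential, ASRS-style reporting system might capture, and of how its lessons can be shared widely without exposing the reporter. The field names and the digest function are illustrative assumptions of mine, not the actual ASRS form or workflow.

```python
# Illustrative sketch only: not the real ASRS schema or process.
from dataclasses import dataclass, field
from typing import List


@dataclass
class ConfidentialSafetyReport:
    role: str                       # e.g. "pilot", "dispatcher"; no names, no employer
    event_summary: str              # what happened, or what almost happened
    contributing_factors: List[str] = field(default_factory=list)
    how_it_was_caught: str = ""     # how the problem was detected before it caused harm
    suggested_fixes: List[str] = field(default_factory=list)


def anonymised_digest(reports: List[ConfidentialSafetyReport]) -> List[str]:
    """Aggregate lessons for wide sharing (think CALLBACK) without identifying reporters."""
    return [f"{r.role}: {r.event_summary}" for r in reports]


if __name__ == "__main__":
    report = ConfidentialSafetyReport(
        role="maintenance technician",
        event_summary="Out-of-calibration torque wrench caught during a cross-check",
        contributing_factors=["gap in tool tracking"],
        how_it_was_caught="colleague's cross-check before sign-off",
        suggested_fixes=["add a calibration-date check to the sign-off sheet"],
    )
    print(anonymised_digest([report]))
```

The two halves matter equally: the report is confidential and protected from punishment, and the lesson still reaches everyone who could repeat the mistake.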
This is normal – except it’s not, really
Given how well the system has worked for decades, one would think similar practices would have cropped up everywhere.
But no.
I encourage people working in other industries to stop and think about what is happening here.
If there’s a mishap or something bad almost happens, you are encouraged to openly report it for everyone to learn from, and you’re guaranteed to not be punished for doing so.
Not only are aviation accidents investigated with more rigour and openness than in any other industry – the accident reports make for fascinating reading – but even potential incidents and close calls are often reported, documented, and learned from.
And not just confided to a colleague with a warning not to do something; not just raised within a team’s retro; not just presented in the unit’s Fuckup Fridays session; not just written up for an internal safety newsletter; but publicly, globally, among peers and competitors alike.
That’s unheard of for most industries and organizations.
I would argue that in most industries and organizations the gut reaction to something bad almost happening is very, very different:
Unless forced to come clean by e.g. regulated mandatory disclosure, the natural reaction is not filing an ASRS report, but another four-letter acronym: STFU.
If nobody finds out about what almost happened, that’s perceived as a good thing. No need to report an incident either, if there’s a chance nobody would find out if we didn’t – people are, after all, all too often incentivized to have no incidents or injuries or the like, which only leads to hiding them.
You don’t need to look far to see this is the case; even for severe incidents like the recent Optus data breach, the organizational gut reaction is to tell as little as possible to as few people as possible.
Or when an IoT or ICS (industrial control system) fails or is compromised: if an organization can keep a lid on it, that’s exactly what it will seek to do the vast majority of the time.
Tell no-one.
That approach might buy an individual, or an organization, some (usually temporary) peace and quiet; uncomfortable questions don’t get asked, you escape negative organizational outcomes (which happen thanks to a retributive workplace culture), and business as usual can continue.
Even if nothing bad immediately happens, there are humongous negative consequences from this.
What also doesn’t happen is learning.
Few even in the organization will learn anything from the incident.
Nobody outside the organization learns from the incidents or close calls.
And when the ‘invisible’ failures inevitably lead to a very visible failure, trust is lost – and lives can be, too.
Urgently needed: ASRS for everything else
It should be clear I’m a strong advocate for creating ASRS-style systems in other industries, along with the necessary supporting elements: a just culture, avoiding moral crumple zones, and reconfiguring legal liability concerns and other disciplinary systems.
There are some attempts at creating ASRS-like systems elsewhere; systems like the AI Incident Database (AIID) have explicitly been inspired by ASRS, and I applaud their existence. They need a much broader take-up, but for that to happen, there also needs to be a wholesale culture change.
Something that the tech industry in particular needs to get its head around is that ASRS is not just a system, or a technology, or something you can drop in and have it magically fix things. It requires culture change, and in my experience, the dominant culture in most industries is very far from the kind of restorative just culture that would be required.
As the creators of AIID have stated,
“intelligent systems require a repository of problems experienced in the real world so that future researchers and developers may mitigate or avoid repeated bad outcomes”.
While I believe every industry would benefit from open learning systems, there is one area that really needs to get on board with this soon: IoT and related fields, because they are in the business of making systems more systemically risky.
Some questions to ask
To get an idea of where your organization sits, imagine a scenario where your team of developers forgets to, say, secure a sensitive API because of poor or unclear practices. By a serendipitous coincidence, a team member catches it just before it goes into production, and it’s fixed.
What happens now?
- Will you ever find out about what happened, or will the developer just stay quiet because they fear you or another manager would be angry at them for almost letting a security disaster happen?
- If you find out about what happened, how will you react?
- Will you discuss it in a team retro, reminding everyone not to do that?
- Will you modify your team’s processes, tools, or checklists so that it doesn’t happen again? (A sketch of what one such check could look like follows below.)
- Will you fix the organization-level processes that led to the error in the first place? Will your process organization help with that, or just brush you off and tell you to RTFM more carefully?
- Will you disseminate an incident analysis to all developers and process people in the organization?
- Will you submit it to a system where you can report it anonymously but publicly, so everyone in the world can learn from it?
If you answered yes to the last two questions, I want to hear from you and learn everything that’s out there.
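As for the tools-and-checklists question above: here is a minimal sketch of what a pre-production check for this hypothetical scenario could look like – a test that fails the build if a sensitive endpoint answers an unauthenticated request. The base URL, the endpoint list, and the choice of pytest and requests are assumptions for illustration, not a prescription.

```python
# Illustrative sketch: a CI gate that catches an unsecured sensitive endpoint
# before it reaches production. BASE_URL and the endpoint names are hypothetical.
import os

import pytest
import requests

BASE_URL = os.environ.get("BASE_URL", "http://localhost:8000")

# Hypothetical list of endpoints that must never answer without credentials.
SENSITIVE_ENDPOINTS = [
    "/admin/users",
    "/internal/export",
]


@pytest.mark.parametrize("path", SENSITIVE_ENDPOINTS)
def test_sensitive_endpoint_rejects_anonymous_requests(path):
    """Fail the build if a sensitive endpoint serves an unauthenticated request."""
    response = requests.get(f"{BASE_URL}{path}", timeout=5)
    assert response.status_code in (401, 403), (
        f"{path} returned {response.status_code} to a request with no credentials"
    )
```

Run against a staging deployment in CI, something like this turns “we almost shipped an open endpoint” from a private near miss into a visible, recorded event – which is exactly the raw material an ASRS-style reporting system needs.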
PS. While I’ve used them as a model example here, all is not well even in the aviation domain; voluntary reporting systems are under all kinds of pressure, and operator focus on profits continues to conflict with the goals of safe operation. Reporting to ASRS is voluntary and can be motivated by the fear of a violation; many other data-sharing systems are also voluntary, and take-up is not as strong as it should be. The supporting scaffolds and structures that have allowed aviation to be such an amazing success story in the safety space are under stress, and we need to work to maintain and strengthen them – not allow them to be sacrificed on the altar of commercial efficiency or fear of legal liability.