The Altitude of Truth

In aviation, there is a critical boundary known as the Transition Level.

Below it, aircraft fly based on local pressure – the immediate, messy reality of the weather around them. Above it, they switch to a standard reference pressure (1013.2 hPa). That switch turns a patchwork of local settings into a shared frame of reference: flight levels.

This common reference is what allows high-speed traffic at high altitude to share a common language, a shared “truth,” and a unified vertical separation standard. It’s the foundation that makes RVSM (Reduced Vertical Separation Minimum) possible – a layer of airspace where jets can safely fly only 1,000 feet apart, supported by accurate instruments and, most of the time, automation.

Without this transition from local to standard, high-velocity traffic would be flying blind relative to one another, relying on fragmented local data in a globalized airspace. The result would be chaos – and eventually, catastrophe.

The Automating Society

Our society is currently approaching its own Transition Level.

We are ascending from a world of manual, local control into a high-velocity layer of automation, algorithmic governance, and AI – generative and pattern-recognition systems alike. Decisions are being made faster, at greater scale, and further away from the humans who live with the consequences.

At low “altitudes,” humans can still see most of what’s going on and correct course by hand. But as we climb, something changes:

  • Automation becomes the default, not the exception. In high-altitude airspace, autopilots, flight management systems, and automated protections do most of the flying.
  • Tightly coupled systems mean small errors can propagate quickly.
  • Shared standards – the equivalent of standard pressure and RVSM – become non-negotiable if you want to avoid collisions.

Right now, many organizations are rushing to the flight levels – deploying powerful AI agents, integrating opaque models into critical workflows, and automating decisions that used to be taken by humans – without the necessary coordination, governance, or shared “reference pressure.”

We are effectively flying on local settings into a high-altitude environment.

When these systems fail, they don’t just glitch; they collide. And too often, it is the humans “in the loop” who become the moral crumple zones – absorbing the blame and the impact of systemic design failures.

The Mission

Transition Level exists to engineer this ascent.

We do not believe in stopping progress. We believe in calibrating it.

We help organizations with:

  • Safe transition: Move to higher levels of automation and AI assistance without losing situational awareness, accountability, or control.
  • Manual reversion: Know exactly when to disconnect the autopilot – when to override the automated decision, bring humans back to the centre, and hand-fly the aircraft.
  • System resilience: Build governance structures that can ride out turbulence and fail gracefully – not brittle systems that operate beautifully in a PowerPoint, then snap under real-world pressure.

We draw on the hard-won lessons of aviation safety – one of the very few domains that operates tightly coupled, complex systems safely – and apply those principles to the chaotic frontier of digital transformation and AI.

Where aviation learned to make RVSM and cockpit automation safe through standards, training, and disciplined governance, we help organizations do the same as they climb into the high-altitude layer of automated decision-making.


The Navigator

Sami Mäkeläinen Principal & Founder

You cannot navigate a storm if you haven’t studied the weather.

I have spent over 30 years in the engine rooms of digital disruption, working at the intersection of human systems and machine logic. From the early days of online commerce and banking, to conducting research for Nokia, to leading Strategic Foresight at Telstra, my career has been defined by looking beyond the horizon.

I am not just a theorist. As a Senior Research Affiliate at the Institute for the Future (IFTF) and a Senior Industry Fellow at RMIT FORWARD, I work on the leading edge of strategic foresight. But I also stay grounded in the operational realities of high-stakes environments as a member of the Royal Aeronautical Society’s Flight Operations Group.

My work has been featured at the RAeS AI in Civil Aviation Summit and the RAeS Flight Operations Conference 2025, where I challenge the industry to think about how human operators and autonomous systems will coexist in the cockpit and beyond.

I blend deep technical expertise (MSc in Computer Science) with the pragmatic, safety-critical mindset of an aviation enthusiast. I understand that while “move fast and break things” works for a startup app, it is a disastrous strategy for critical infrastructure or complex organizational change.

I act as the Tenth Man or the Devil’s Advocate in the room – the voice that asks the uncomfortable question, traces the third-order consequence, and ensures that when your organization climbs to the flight levels, it doesn’t just get there, but gets there safely, and knows how to descend back if need be.


Ready to Calibrate?

The storm is not optional.

The crash is. 

Get in touch