
About Me — Philip D. Mann

The Guy Who Sees the System Behind the Story

Photo of the Author

I am an aviation safety expert and systems analyst who spent 17 years inside the Federal Aviation Administration learning how the world’s most complex airspace actually works and, more importantly, how it fails. When 67 people died in the skies over Washington, D.C., in January 2025, news organizations worldwide turned to me to explain not just what happened, but why our systems allowed it to happen.

Since that tragic night, my analysis has reached over 17.5 million people through NewsNation Now, LiveNOW Fox, USA Today, the Associated Press, and Britain’s Morning Glory with Mike Graham. But I don’t chase cameras or craft soundbites. I translate the language of complex systems into insights that help people understand the invisible infrastructure that shapes our world.

What I Actually Do

I serve as Assistant Professor of Project Management at Embry-Riddle Aeronautical University, where I teach graduate students to see beyond project charts and timelines to the human systems that determine whether ambitious plans succeed or fail spectacularly. My students learn risk management not as an academic exercise, but as a discipline where getting it wrong can cost lives.

My expertise lies at the intersection of three critical domains: aviation safety systems built on nearly 30 years of operational experience, organizational behavior that drives real-world decisions, and emerging technology integration in safety-critical infrastructure. This unique perspective allows me to see patterns others miss — like how workforce reductions labeled as “efficiency gains” create cascading safety vulnerabilities that only reveal themselves in tragedy.

The SCAR Framework: My Answer to AI Chaos

I’m currently completing my first book: The SCAR Framework: A Systematic Approach to AI Decision-Making in Critical Systems. This isn’t another breathless celebration of artificial intelligence or a doomsday warning about robot overlords. It’s a practical decision-making tool for leaders who need to evaluate whether AI is the right solution for their transportation and infrastructure challenges — or whether they’re being sold expensive technology that will create more problems than it solves.

The framework emerged from watching organization after organization rush to implement AI without understanding how it would interact with existing human systems, safety protocols, and organizational cultures. SCAR provides the systematic evaluation criteria that have been missing from these conversations, balancing innovation potential with operational realities in environments where failure isn’t an option.
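To make that concrete, here is a minimal, purely illustrative sketch of what a SCAR-style evaluation could look like. The four dimensions come from the book’s subtitle (Safety, Complexity, Accountability, Resilience); the rating scales, weights, and decision thresholds below are placeholder assumptions for illustration, not the published framework.

```python
# Hypothetical illustration only: the book is still in progress, so the
# scoring logic below is an editor's sketch of a SCAR-style checklist,
# not Mann's actual framework. Dimensions follow the book's subtitle.

from dataclasses import dataclass


@dataclass
class ScarAssessment:
    safety: int          # 1-5: consequence severity if the AI component fails
    complexity: int      # 1-5: coupling with existing human and technical systems
    accountability: int  # 1-5: clarity of responsibility when outcomes go wrong
    resilience: int      # 1-5: ability to degrade gracefully to human fallback


def recommend(a: ScarAssessment) -> str:
    """Toy decision rule: high-consequence, tightly coupled systems demand
    strong accountability and resilience before AI adoption proceeds."""
    if a.safety >= 4 and (a.accountability < 3 or a.resilience < 3):
        return "Do not adopt yet: safety-critical gaps in accountability or resilience."
    if a.complexity >= 4 and a.resilience < 4:
        return "Pilot only, with documented human fallback procedures."
    return "Proceed with a staged rollout and continuous monitoring."


# Example: a safety-critical, tightly coupled system with murky accountability.
print(recommend(ScarAssessment(safety=5, complexity=4, accountability=2, resilience=3)))
```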

The Path That Led Here

My aviation journey began in Navy repair shops and on Army helicopters, where I learned that perfect procedures on paper mean nothing if they don’t work under pressure at 2 AM in bad weather. During my 17 years with the FAA, I witnessed the National Airspace System from every angle — as a Technical Operations specialist at Denver International Airport, an FAA Academy instructor training the next generation, a program analyst managing multi-million-dollar modernization efforts, and a business case specialist determining which safety investments would actually save lives.

This wasn’t a linear career path. I collected degrees like tools in a toolkit — a PhD in Organization and Management to understand why smart organizations make dangerous decisions, an MBA to speak the language of executives who control resources, an MPA to navigate government bureaucracy, and certifications in project and risk management because good intentions without proper execution kill people just as surely as bad intentions.

What Drives Me

I’m obsessed with understanding why things that shouldn’t happen keep happening anyway. Why do air traffic control systems designed with multiple redundancies still fail? Why do organizations ignore 15,214 close calls until 67 people die? Why do we implement AI solutions without asking whether AI is even the right tool for the problem?

The Reagan National collision wasn’t a bolt from the blue — it was the predictable result of systemic decisions that prioritized efficiency over safety margins. When the NTSB revealed that helicopters and commercial jets had only 75 feet of vertical separation, that controllers were working alone during complex operations because that’s how they had to “make it work,” that critical safety positions had been eliminated in the name of streamlining — I wasn’t surprised. I was angry.

My mission is to make these invisible systemic failures visible before they kill people. Every media appearance, every article, every classroom discussion is an opportunity to help someone see the system behind the story.

The Uncomfortable Truth Teller

I’ve learned that being right doesn’t make you popular, especially when being right means telling people their expensive solution won’t work or that their cost-cutting measures will eventually kill someone. But after investigating too many preventable tragedies, I’ve stopped caring about being liked and started caring about being useful.

When organizations hire me to evaluate their operations or when media outlets call for analysis, they don’t get feel-good narratives or corporate-speak. They get the unvarnished truth about what’s actually happening in their systems, why their current trajectory leads where it leads, and what it will really take to change course.

Beyond Aviation

While aviation safety remains my core expertise, the principles of systems thinking apply everywhere. I’m equally fascinated by why organizational change initiatives fail, why educational technologies don’t improve learning outcomes, and why risk management frameworks look perfect in PowerPoint but collapse in practice.

My approach to everything stems from a simple question: “What makes this position, choice, or belief make sense to those it makes sense to?” This isn’t relativism — it’s recognition that every dysfunction has an internal logic, and you can’t fix what you don’t understand.

The Academic and the Practitioner

Students often expect professors to live in ivory towers, but I’ve spent too many years in technical operations centers and accident sites to pretend that elegant theories matter if they don’t work in reality. My research focuses on the gaps between how we think systems work and how they actually behave under stress.

Current projects include developing better frameworks for AI adoption in critical infrastructure, analyzing how workforce expertise erosion affects safety culture, and documenting the hidden costs of treating human operators as liabilities rather than assets. This isn’t abstract academic work — it’s research driven by bodies I’ve helped investigate and disasters I’ve watched unfold in slow motion while decision-makers ignored warning signs.

What You Can Expect From Me

If you’re reading my work, attending my talks, or considering my perspective, here’s what you’ll get: systems-level analysis that reveals root causes rather than symptoms, uncomfortable questions about accepted practices, evidence-based insights rather than opinion, and practical frameworks for making better decisions in complex environments.

What you won’t get: simple answers to complex problems, validation for decisions you’ve already made, buzzword-filled presentations that sound smart but say nothing, or recommendations that prioritize appearance over effectiveness.

The Future I’m Building Toward

I envision a world where we stop being surprised by “unprecedented” failures that were actually quite precedented if anyone had been looking at the system holistically. Where AI enhances human decision-making instead of replacing human judgment. Where safety isn’t sacrificed for efficiency, only for that efficiency to be sacrificed to lawyers after the inevitable tragedy. Where organizations invest in understanding their true risk landscape before that landscape becomes a crater.

This isn’t utopian thinking — it’s entirely achievable with the tools and knowledge we already possess. What’s missing is the will to see systems as they actually are rather than as we wish they were.

Let’s Connect

I’m always interested in connecting with people who share my passion for understanding complex systems, improving safety culture, or just figuring out why things work the way they do (or don’t). Whether you’re dealing with an aviation challenge, wrestling with AI implementation decisions, or trying to understand why your organization keeps making the same mistakes, I’m happy to exchange ideas.

You can find me on LinkedIn, where I regularly share insights on aviation safety, organizational behavior, and the fascinating ways systems surprise us. If you’re interested in having me speak at your event, provide expert analysis, or help your organization see its blind spots, reach out.

Just know that I’m more interested in solving real problems than maintaining comfortable illusions. If you want someone to tell you everything is fine when it isn’t, I’m not your guy. But if you want someone who will help you understand what’s actually happening and what to do about it, let’s talk.

Dr. Philip D. Mann is currently completing “The SCAR Framework: Safety, Complexity, Accountability, and Resilience in Critical Systems,” a practical guide for evaluating AI adoption in critical infrastructure. He lives in Pennsylvania with two cats who demonstrate daily that complex systems (even small fuzzy ones) rarely behave as designed.


Published in About Me Stories

A publication dedicated to bringing out the stories behind the writers themselves. A place of autobiographies. Types of personal stories include introductions, memoirs, self-reflections, and self-love.


Written by Philip Mann

Aviation Safety Expert | National Airspace System Specialist | Aviation Education Leader
