Involo CRMSON

Role

UX Researcher & Project Lead

Team

17-person interdisciplinary UX team

Domain

Aviation Mental Health

Year & Duration

September 2025 – Present

Methods

Semi-structured interviews · Large-scale survey design · Qualitative thematic analysis · Statistical testing

Tools

NVivo · Miro · Figma · Qualtrics · SPSS · Tableau

Collaborators

Southwest Airlines · Delta Airlines · IFATCA · University of Washington · University of Melbourne · Pilot Mental Health Campaign

Pilot Mental Health

Impact

Researched

6,000+ survey responses and 13 pilot interviews

Uncovered

a system-level trust problem, not individual stigma

Defined

AI design principles for safe disclosure and confidentiality

Influenced

the design of CRMSON, a pilot AI mental health platform

How Might We

...design AI-mediated mental health support that pilots are willing to adopt, given the risks associated with seeking help?

mental health challenges and barriers

Understand the mental health challenges and barriers faced by aviation professionals

regulatory systems and disclosure risk

Map how regulatory systems and disclosure risk shape help-seeking behavior

safe, private mental health support

Identify how AI-mediated tools can provide safe, private mental health support in aviation

How this was done

01

Secondary Research

Literature review culminating in a published paper at the AHFE Conference, establishing known gaps in aviation mental health.

02

Stakeholder Interviews

Mapped how mental health reporting systems operate and how the existing CRMSON AI model was built.

03

Survey Data Analysis

Analyzed 6,000+ responses from pilots and ATCs globally, visualizing patterns of mental health avoidance.

04

Pilot Interviews

Semi-structured interviews with commercial pilots on blockers to help-seeking and openness to AI support.

The Spark

A Pattern Too Clear to Ignore

I began examining mental health in aviation through a literature review at the University of Washington's Directed Research Group. Study after study indicated the same pattern: aviation professionals experience significant psychological strain, yet avoid seeking support due to professional and regulatory consequences.

The industry commonly attributes the problem to stigma — but the research suggested something more nuanced: concerns around trust, disclosure, and career risk.

Research question: Why do aviation professionals avoid seeking mental health support when they need it most?

To investigate, I:

  • Analyzed 6,000+ survey responses from pilots and air traffic controllers worldwide, revealing patterns of healthcare avoidance, withheld disclosure, and reliance on informal peer advice
  • Conducted semi-structured interviews with commercial pilots to explore how trust, regulation, and professional culture shape mental health disclosure decisions

"Concerns about certification and career consequences often discourage aviation professionals from seeking mental health support." — Chawla et al., CHI 2026


The Challenges

A System Designed Around Disclosure Risk

Mental health support in aviation exists within a regulatory environment where disclosure can carry professional consequences. While support resources are available, concerns about certification, career impact, and privacy often determine whether aviation professionals feel safe seeking help.

1. Certification risk

"If something goes on your medical record, it can follow you for the rest of your career."

Professional consequences shape every support decision. The prospect of losing certification means pilots routinely weigh personal wellbeing against career survival.

2. Informal support preference

"Most pilots talk to other pilots and don't even bother to talk to a doctor."

Formal medical systems are avoided. Informal peer networks fill the gap — but lack structure and expertise.

3. Constant trade-off weighing

Aviation professionals frequently described weighing the value of mental health support against potential certification consequences before seeking any help.


The Research Process

Secondary Research

Purpose: Develop a foundational understanding of pilot mental health, reporting barriers, and emerging AI-supported interventions in aviation.

Research questions explored:

  • What are the major mental health challenges faced by pilots?
  • How do regulatory reporting systems influence disclosure and help-seeking?
  • What role could AI-supported tools play in providing mental health support in aviation?

Key findings:

  • Pilots face significant stigma and career risk concerns when reporting mental health issues within existing certification systems
  • Mental health support often operates through peer support programs and unions, running alongside — but separate from — formal medical systems
  • Emerging research suggests AI-mediated support tools may offer private, accessible pathways for reflection and early intervention

Findings from this phase informed a peer-reviewed publication at the AHFE Conference.


Stakeholder Interviews

Purpose: Understand how mental health support operates within aviation; map the regulatory environment, available support resources, and how pilots navigate reporting requirements.

Key discovery: Aviation Medical Examiners (AMEs) have broad authority to question and evaluate pilots during medical certification, making pilots reluctant to seek advice or disclose emerging mental health concerns — even informally.

Three consistent patterns emerged:

  1. Mental health disclosure is tied to medical certification — any disclosure during a routine exam can have immediate certification implications
  2. Disclosures are oversimplified through standardized forms — nuanced mental health experiences get flattened into checkbox responses
  3. Pilots avoid AME advice due to certification risk — even seeking informal guidance from an AME carries perceived risk

Mixed-Methods Surveys

Purpose: Understand broader perceptions of mental health in aviation through a combination of Likert-scale questions and open-ended responses.

Distributed across aviation communities and public channels, the survey gathered 6,000+ responses from pilots and air traffic controllers worldwide.

  • 98% say mental health is a concern within the industry
  • 76% don't trust regulator mental health policies (FAA, EASA, etc.)
  • 63% wanted to seek support but felt they couldn't
  • 13% actually use peer support and union-based resources

These numbers revealed a stark gap: nearly everyone acknowledges the problem, but very few feel able to access meaningful support.


Semi-Structured Interviews

Purpose: Understand why pilots make the mental health decisions they do within aviation systems.

I conducted 13 semi-structured interviews with pilots across different experience levels. Interviews used vignette-based scenarios — realistic narratives designed to surface how pilots think through mental health challenges and whether AI-mediated support could realistically help.

Three major findings emerged:

1. Certification risk dominates every decision

Pilots consistently weigh certification risk before any support decision. The uncertainty alone — not just the certainty of consequences — is enough to push them toward silence.

2. AI as a private thinking space

Pilots found AI most acceptable as a private "thinking space" for sense-making before speaking to another person. The ability to process thoughts without creating a record was seen as uniquely valuable.

3. Data control determines trust

"It's not the tool I distrust. It's who controls the data behind it."

Trust in AI-mediated support depended less on the tool itself and more on clear answers to: who owns the data, where it's stored, and whether it can be accessed by regulators or employers.


Insights & Triangulation

Insights were validated across all three methods — secondary research, large-scale surveys, and semi-structured interviews. Triangulating findings identified consistent patterns shaping how pilots evaluate mental health concerns and navigate support decisions.

3 methods → 5 cross-validated insights

A key behavioral finding: pilots rarely move linearly through support decisions. Instead, certification risk, system distrust, and uncertainty about support pathways create feedback loops that delay — or entirely prevent — help-seeking.

"It's not that pilots don't want help. It's that the system makes you think twice about asking."


Design Principles

Interview insights informed design principles for AI-mediated mental health support in aviation, shared with stakeholders developing tools in this space.

  • Privacy-first architecture — no data accessible to regulators, employers, or certification bodies without explicit pilot consent
  • Low-risk engagement — entry points that don't require disclosure or commitment; easy to exit without leaving a record
  • Reflection before disclosure — tools designed to help pilots think through experiences privately before deciding whether to involve another person
  • Transparent data control — clear, plain-language explanations of who owns data, where it lives, and what happens to it
  • Aviation-specific context — interfaces and language that reflect the realities of aviation culture and regulatory environment, not generic wellness framing