Friday / May 17.

Flight Simulators for Mental Health: How AI Is Helping Train Crisis Counselors

ReflexAI's AI conversational simulator

Stephanie Hepburn is a writer in New Orleans. She is the editor in chief of #CrisisTalk. You can reach her at .

Blake, a 35-year-old veteran from Colorado Springs, is struggling. His friend Ashton, who served with him in the Marine Corps, just died. “I’m kinda bummed and not really wanting to be around anymore,” he writes. Blake isn’t a person; he’s an AI simulation helping military veterans practice navigating tough conversations about mental health and suicide with their peers. People can chat with him on the HomeTeam app, which launched in November and is free for anyone to use. 

Sam Dorison, CEO and co-founder of ReflexAI, likens conversing with Blake and the company’s other AI personas to flight simulator training for pilots: users role-play and practice skills until they become muscle memory, or reflex. “That’s why we named the company Reflex,” he said, adding that effective, timely crisis response is lifesaving. 

The HomeTeam modules include communication, talking about suicide and establishing safety. After each module, the user chats with Blake. The first module teaches communication strategies like empathy, openness about uncertainty and open-ended questions. Empathy, says the voice-over, is a skill that can be developed and practiced. 

While the user is interacting with Blake, the app provides a conversation checklist in case they get stuck or don’t know where to begin. It gives examples of how to start a conversation, ask open-ended questions, use empathetic statements and follow up about support networks. As the user works their way through the modules, conversations with Blake get more complex. 

ReflexAI received a Google.org Fellowship, resulting in $1 million in funding and help from a team of the tech giant’s employees, including software engineers, linguists, mental health counselors and content strategists. Among those who worked on the HomeTeam app are military veterans and family members, including linguist and researcher Erin MacMurray van Liemt, whose father, a Navy veteran, died by suicide. “I have pretty high hopes for this project because I lost my dad to suicide,” she said in a YouTube video for Google. “So for me — just to try and help and be part of that solution — if it’s useful, even at least to one person… it will all be worth it.”

The company’s AI simulator personas are also being used to facilitate role-play in training crisis counselors. In fact, that’s how the technology began in 2019. At the time, Dorison and his ReflexAI co-founder John Callery were working at the Trevor Project, a nonprofit that provides crisis intervention and suicide prevention services for LGBTQ youth.

Dorison — who spent four years at the Trevor Project as a volunteer crisis counselor and eventually became the company’s chief strategy and innovation officer — said calls and texts from LGBTQ youth were on the rise, and the nonprofit needed to address the increasing demand. That meant training more volunteer crisis counselors. “Over the years, trainees expressed wanting to engage in realistic conversations earlier in training,” he said. Google provided $2.7 million in funding and nearly 30 employees for six months, supporting the Trevor Project in developing Drew and Riley, AI conversational personas that simulated LGBTQ youth in crisis. 

The Crisis Contact Simulator, named one of Time’s best inventions for social good of 2021, launched early in the pandemic — a time when the Trevor Project’s crisis counselors, like many people throughout the United States, were at home balancing multiple roles at once. Trainees liked that they could train at their own pace and schedule, while trainers felt the AI chatbots helped the trainees to be more prepared. “By the time they got to the human-led role-play later in training, their skills were that much better,” he said, noting that conversations with the simulations helped them become more comfortable asking about suicide directly and neutrally. 

Word about the simulator caught on quickly within the Trevor Project, and soon, active crisis counselors were asking for access. “They wanted to try it,” he said, adding that feeling prepared and having the opportunity to practice a wide range of calls is vital for crisis counselors. 

Dorison and Callery founded ReflexAI in 2022, when crisis organizations began asking for the technology. Among their customers are Lines for Life, which operates 988 in Oregon, and Volunteers of America Western Washington, a large statewide 988 contact center that supports the Native and Strong Lifeline. Next month, the Veterans Crisis Line will roll out a large suite of the tech company’s simulations. “The veteran personas will be used to train new responders and also as skill refreshers for the line’s 1,200-plus crisis responders,” said Dorison.

The simulations are tailored to each crisis line and the populations they serve — “If you’re building simulations for particular communities, those communities must be involved in the development,” he said — while also including core principles of crisis line work like suicide risk assessment, harm reduction, lethal means safety and effective empathy. “Some scenarios apply across organizations; others are more specific to age, geographics, lived experiences and levels of suicidality.”

The AI simulations are also designed to exhibit complexity and display emotional valence and nuanced emotion. “Someone calling a crisis line can be both angry and scared or uncertain but optimistic,” said Dorison. They can also have intersectional identities, which he says is critical for helping crisis counselors understand what someone is experiencing in the world and how that might affect their mental health. “What those particular intersectionalities are can vary by center,” he said.

In addition to core principles of crisis line work, he believes crisis lines should incorporate simulations tailored to supporting vulnerable populations like military veterans. “There are 18 million military veterans in the United States,” said Dorison. “Every crisis line should have at least one simulation on supporting a veteran.” Feedback from the simulations is automatic, allowing trainees to see where they need more work. They can also repeat them right away if they want. 

How a crisis line center incorporates the AI simulations is up to them. Some, like Lines for Life, train in cohorts and come back together after trainees converse independently with the AI personas. “They’ll learn a lesson, do the simulation and then come back together as a group,” said Dorison. Smaller organizations might start with two simulations and expand from there. 

Dorison and his colleagues also develop simulations to help people navigate nationwide stressors like the pandemic or elections. For example, among their latest simulations are personas designed to help crisis counselors support people in the months leading up to the upcoming presidential election. “Election stress went through the roof in 2016 and 2020,” he said, adding that the simulations feature multiple personas experiencing election stress.

He hopes that the AI conversational personas will help enhance crisis line operations and improve training access, allowing both big and small organizations to role-play nuanced conversations with diverse personas facing a multitude of challenges. “If someone in my family were calling a crisis line on a given day, what would I want the training to include at the organization supporting them? That’s the core question we are trying to answer.”

