Before Dr. Shiri Sharvit, chief clinical officer of Eleos Health, started adopting artificial intelligence in her work as a practicing mental health clinician, her constant companion was the Post-it Note. But those notes would sometimes get lost or have water spilled on them.
Then came Eleos, an AI solution designed to improve workflows for behavioral health clinicians, she said. Although she has adopted it in her own part-time clinical work, she has faced challenges getting others to use it.
“I felt that when I tried to communicate it to my colleagues or when I provided training on how to incorporate technology in clinical practice on behalf of the American Psychological Association, people don’t get it yet,” said Sharvit, who has been invited to speak for the association several times.
This led Sharvit and Katherine C. Kellogg, professor of management and innovation at the MIT Sloan School of Management, to write a paper on how AI can be integrated into mental health clinicians’ workflows, as well as the challenges that arise and how to address them.
They felt that clinicians have been historically left out of the conversation on AI technologies in mental health, Kellogg said.
“We think there’s been a lot of research done on the societal implications of AI and how organizations can adopt AI to improve quality, reduce costs and increase revenues,” Kellogg said. “But there’s been very little research done on the benefits of AI technologies for frontline providers, what the challenges are that arise for them in day-to-day work and how healthcare leaders and technology developers can help address these challenges.”
The peer-reviewed paper, published in Frontiers in Psychology on September 6, focuses on three types of AI solutions: automation technologies, patient engagement technologies and clinical decision support technologies. Automation technologies computerize structured or semi-structured tasks, like screening or administering digital surveys. Patient engagement technologies are solutions like conversational agents or chatbots that can provide coaching or scripted therapy. Clinical decision support technologies use machine learning algorithms to make predictions based on past data, for conditions including depression and anxiety, strokes and suicidal ideation.
But these AI technologies aren’t without challenges, the researchers acknowledged. Difficulties in implementing AI include clinicians’ fear of being replaced, a lack of evidence-based evaluation studies and concerns about a negative impact on the relationship between clinicians and patients.
“Clinicians right now don’t have many use cases or role models who use AI in their practice,” Sharvit said. “We wanted to give them a little more confidence about exploring digital tools, explaining that they are meant to augment and improve their practice and complement it. They are not going to be replaced.”
There are solutions to some of these challenges as well, such as exposing clinicians to the technologies’ possible failure modes, training them in computational thinking and engaging them in the development process. If some aspects of an AI technology aren’t accurate, developers should work with clinicians to determine how they can be improved.
But further research needs to be done, Kellogg said. She is identifying leading organizations that use AI in mental health to determine best practices, though because these studies are just getting underway, she can’t yet name the organizations. She added that while clinicians should be the focal point in developing AI solutions, there needs to be collaboration among others in the field, including technology developers, C-suite leaders and researchers.
“When everybody’s working together, just don’t forget these frontline clinicians,” Kellogg said. “They are the crucial linchpin in making these technologies work in the real world.”
For Sharvit, the difference between using AI and Post-it Notes in her work is like the “difference between navigating from place to place with maps versus having Google Maps or Waze on my phone.” AI decreases her stress during sessions with clients, improves her note-taking and helps her remember conversations from previous sessions, she said.
“It’s that stark of a difference,” Sharvit said. “Because first of all, I have an AI companion. I’m not alone in the session.”
Photo: metamorworks, Getty Images