
How AI is being used to treat mental illness


  • AI is being used in the underserved mental health sector to assist healthcare providers.
  • AI-powered software can suggest treatments through mobile apps and analyze therapy sessions.
  • This article is part of ‘Build IT’, a series on digital technology trends disrupting industries.

The convergence of human ingenuity and machine intelligence offers an innovative approach to personalized mental health care. By using AI technology, doctors and behavioral health care providers can deliver customized treatments for people with conditions such as depression and addiction. They can also use AI to assess the quality of their services and identify ways to improve as mental health providers.

These developments also raise important ethical and privacy considerations. As technology increasingly plays a role in mental health care, ensuring data security, confidentiality and equal access to services must be top priorities.

How an AI-powered mobile app provides treatment

Dr. Christopher Romig, director of innovation at the Stella psychiatric clinic, said he saw great potential in AI “helping with early diagnosis, personalized treatment plans and monitoring patient progress.”

There is a reason to expect this momentum, he added: “Because there is such a massive shortage of mental health providers in this country, AI will become a key component of progress in support and interventions.”

Click Therapeutics, a biotechnology company that develops AI-powered software for medical treatments and interventions, helps patients through a mobile app. The software can work independently or in combination with pharmacotherapies to treat conditions such as depression, migraines and obesity.

The company’s algorithm collects and analyzes patient data, including symptom severity and sleep-wake cycles, from the app. It uses this information to identify patterns and correlations and provide tailored treatment strategies.

Click Therapeutics’ mobile app provides a personalized overview of a user’s health journey.
Click Therapeutics

It also uses digital biomarkers gathered from smartphone sensors. For example, the sensors can monitor a patient’s heart rate to detect high stress; the algorithm can then recommend mindfulness exercises, relaxation techniques, or cognitive behavioral therapy modules in the app. “They are bona fide therapies that change the brain,” Shaheen Lakhan, chief medical officer of Click Therapeutics, told Business Insider.
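Click Therapeutics has not published its algorithm, but the trigger logic described above can be illustrated with a simplified, hypothetical sketch. The baseline, threshold, and module names here are invented for illustration and are not the company's actual code:

```python
# Hypothetical sketch of sensor-triggered intervention logic.
RESTING_BASELINE_BPM = 65   # illustrative per-user resting heart rate
STRESS_MULTIPLIER = 1.3     # flag readings more than 30% above baseline

def recommend_module(heart_rate_bpm: float) -> str:
    """Map a heart-rate reading to an in-app module (illustrative only)."""
    if heart_rate_bpm > RESTING_BASELINE_BPM * STRESS_MULTIPLIER:
        # Elevated reading: suggest a calming intervention.
        return "mindfulness_exercise"
    # Otherwise, continue with the scheduled therapy content.
    return "cbt_module"

print(recommend_module(90))  # elevated -> mindfulness_exercise
print(recommend_module(70))  # within range -> cbt_module
```

A real system would of course use per-patient baselines learned over time and combine multiple signals (sleep-wake cycles, symptom reports) rather than a single threshold.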

Patients can share these insights with their healthcare providers to better understand their conditions and behaviors. The measurement data can support treatment decisions and improve care outcomes. “You are the active ingredient, which means you have to engage with it,” said Daniel Rimm, the company’s head of product.

In January, Click Therapeutics announced that the Food and Drug Administration would help accelerate the development of the company’s software for treating schizophrenia. Research shows that this use case could significantly benefit from digital therapeutics.

Dr. Haig Goenjian, the lead researcher and medical director of CenExel CNS, told BI that patients who used the prescription digital therapeutic in a schizophrenia-focused study said the approach “changed the way they socialize” and helped them manage their symptoms well enough to function in the real world.

“At the end of our studies, many patients asked how they could continue using this digital therapeutic,” he added.

How an AI platform helps mental health providers improve their services

The AI platform Lyssn is another technology-driven mental health tool. It offers on-demand training modules for customers, such as behavioral healthcare providers, who want to improve engagement and sessions with their patients.

With their patients’ consent, providers can record therapy sessions and use Lyssn’s AI technology to evaluate factors such as speech patterns and tone of voice from both parties, helping them understand how to converse effectively and improve their approach to sessions.

“There is a need for more, and there is a need for better,” said Zac Imel, co-founder and chief psychotherapy scientist at Lyssn, referring to the nationwide shortage of mental health workers.

Michael Tanana, Lyssn’s chief technology officer, said it is difficult to assess quality of service because sessions between mental health professionals and patients are private and therefore hard to monitor. Lyssn wants to hold providers accountable for improved care, especially because “the quality of mental health care is highly variable,” Imel said.

Lyssn’s dashboard shows quantified insights for qualitative factors such as showing empathy towards a client during a therapy session.
Lyssn

Tanana, who also co-founded Lyssn, added that “we need ways to ensure quality” as more people seek access to mental health care. The developers at Lyssn keep this in mind as they train their AI technology to recognize both problematic and successful conversation styles, Imel said.

For example, Lyssn can analyze a provider’s responses during conversations that require cultural sensitivity, measuring how curious the provider is about the client’s experience and whether they seem anxious when discussing such topics. Based on the evaluation, the platform can give providers immediate feedback on their skills and suggest training and tools to help them learn and improve.
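Lyssn's models are proprietary, but the feedback loop described above can be sketched in simplified form. Everything here — the utterance labels, the empathy ratio, the threshold, and the suggested module name — is a hypothetical stand-in for whatever the real classifier produces:

```python
# Hypothetical sketch of session-feedback scoring (not Lyssn's actual model).
def score_session(utterance_labels: list[str]) -> dict:
    """Given per-utterance labels from a (hypothetical) classifier,
    compute a simple empathy ratio and, if it is low, a training suggestion."""
    reflections = utterance_labels.count("reflection")
    open_questions = utterance_labels.count("open_question")
    total = len(utterance_labels) or 1  # avoid division by zero
    empathy_ratio = (reflections + open_questions) / total

    suggestion = None
    if empathy_ratio < 0.4:  # illustrative threshold for flagging a session
        suggestion = "reflective-listening training module"
    return {"empathy_ratio": round(empathy_ratio, 2), "suggestion": suggestion}

# A session dominated by advice-giving gets flagged for training.
print(score_session(["advice", "advice", "reflection", "advice", "advice"]))
```

In practice the per-utterance labels would come from a speech-to-text pipeline and a trained classifier rather than being supplied by hand, but the shape of the feedback — a quantified score plus a targeted training suggestion — is the same.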

Darin Carver, a licensed therapist and assistant clinical director at Weber Human Services, uses Lyssn to improve patient outcomes. “Physicians have almost instantaneous access to session-specific information on how to improve their clinical work,” he told BI.

He added that supervisors also have access to skills-based feedback generated from session reports, which they use to turn doctors’ vague memories into hard facts about which skills they used and which need improvement.

Carver said feedback and advanced analytics are essential to treatment decisions. “We can drill down into what our real training needs are and which doctors and areas need help,” he said. “It’s been a game changer.”

Concerns about AI in mental health care

There is still a need for human-led regulation in the use of AI in mental health care. AI algorithms can perpetuate biases and stereotypes based on the data they are trained on.

To address these issues, Lyssn produces a detailed annual report evaluating how well its training and quality-assurance models perform when serving people from historically marginalized communities. The company is also working with leading universities to assess the technology’s multicultural competency.

Strict compliance rules are also needed to protect patient privacy and confidentiality. For example, Lyssn uses encrypted data transfer and storage, two-factor authentication, and regular third-party compliance audits to help prevent data breaches. As technology-driven care evolves, Carver says, mental health professionals have an obligation to use AI ethically to improve people’s health and well-being.
