By Hannah Zhihan Jiang

Shay Richard, 16, was not expecting much the first time she used Vive Teens, an AI-driven wellness app. Richard was diagnosed with anxiety and depression in 2019 when she was 13 years old. Today she is constantly stressed about schoolwork. 

“The first thing that came up was, ‘You’re not alone.’ Just reading that at first, I had such a sigh of relief. I felt like it might work,” she says, describing her introduction to the app.

Richard says that before using the app, she tried talking to friends, but that did not help much. Her school provides counselling services, but she’s concerned about confidentiality: she says a friend visited the school counsellor, and the friend’s parents and other students found out. Richard attends King David High School Victory Park, an independent school under the auspices of the South African Board of Jewish Education (SABJE).

“Mental health issues are something I deal with on a daily basis. For us, silence has become the language that we are fluent in.”

Shay Richard, 16.

The app has an AI chatbot called AIMY. It asks Richard what she needs help with, what she’s feeling and what tools could be helpful. The chatbot can’t engage in open-ended conversations, but Richard finds the mental health tips it provides, like breathing exercises, beneficial. “There’s endless, endless information. Just keeping up with those self-care activities has boosted my mental health completely. This app is like my safety net.”

Affordable, confidential AI service

Richard says she’d like the app to include more information to help her understand her feelings rather than simply providing a solution. 

Vive Teens, launched in 2021, provides its services at R30 per person at six independent schools. Vive Teens’ head of business development and products, Evan Kagan, says eight more schools are about to sign up.

The app also has a “red flag” system, triggered when a user shows particular interest in suicide, self-harm or depression. When this happens, the AI chatbot asks the user if they want to speak with a professional, and a “confidant” chosen by the user is notified at the same time. The app has pre-empted six suicides so far this year, says Kagan.

Kagan says partnering with public schools is more difficult, but it is a long-term goal. He says Vive Teens’ tech team is exploring ways to make the app accessible in a low-data environment.

A supplement, not a replacement

The 2023 Mental Health Conference, held in Pretoria in April, also highlighted the potential of AI in mental health. Researchers and healthcare workers discussed innovative solutions to tackle the severe shortage of mental health resources in South Africa. The World Health Organisation’s Global Health Observatory recorded only 850 psychiatrists in South Africa in April 2019, which works out to 1.52 psychiatrists for every 100 000 people, based on the 2016 population estimate of 55.6 million.

Researchers say AI therapy chatbots could serve as an additional mental health resource that is more confidential, accessible and cost-effective. Other advantages of AI include multilingual support, 24/7 availability and no need to physically travel to meet a therapist.

Nevertheless, researchers note that AI is a supplement to professional mental health care, not a replacement. Some experts say most AI-driven tools cannot yet take over psychiatrists’ or psychologists’ roles in crisis intervention and in qualitative interaction, such as reacting to non-verbal cues.

“If someone is severely suicidal, I don’t think AI is equipped right now to deal with that. For someone who’s feeling, today, they need social support but they don’t feel like they can talk to someone because of stigma or shame, that’s the potential of AI.”

Stephan Rabie, senior research officer in the HIV Mental Health Research Unit of the University of Cape Town. 

Safety & ethical concerns

At the same time, experts warn there have not been enough clinical trials to assess the effectiveness, safety and ethical risks of AI therapy, especially in the South African context.

A 2019 study found that of 293 apps offering therapeutic treatment for reducing depression and/or anxiety symptoms, only 3.41% had research to justify their claims of effectiveness.

Most large language models that power AI tools come from Western countries, especially the U.S., “representing a largely left-leaning philosophy,” says Japie Greeff, deputy director of the School of Computer Science and Information Systems at North-West University.

“We’ve got a culturally diverse, ethnically diverse country. We really need to adapt the algorithm of robots to be applicable to our setting. We can’t expect what works in the U.S. to work here,” says Rabie. 

Some experts also raised privacy and ethics concerns. A 2019 study of popular apps for depression and smoking cessation available in app stores found that 81% of the apps transferred users’ private data to Facebook or Google, but only 41% disclosed this in their privacy policies. Ethical issues have also been raised after some users reportedly “fell in love” with chatbots as the AI learnt about them through conversation and developed an emergent personality.

It might be safer to use “older AI”, which is more quantitative in nature and can provide more focused care by recognising certain terms and giving a direct response, says Greeff.

“One thing concerning is the hype around artificial intelligence. At the moment, people seem to think that AI can do just anything. It could be more dangerous than useful because there are very hard limits to what it can usefully do,” says Greeff.

AI as first-line mental health support 

Following the 2023 Mental Health Conference, Greeff says he and other researchers are setting up a team to evaluate existing AI-driven and digital platforms with the potential to serve as first-line mental health screening tools in a telehealth-driven public-sector system.

An AI-driven tiered approach would free up mental health resources. At the first level, AI-driven tools would provide mental health information and “screen out” people who might simply be going through a “grieving period” and have not necessarily developed a mental illness. “Artificial intelligence could free up the time of the clinicians to see those people that are actually at a point where they do need intervention,” says Greeff.

The second tier would be telehealth: booking an appointment to see a healthcare worker online to assess whether an in-person intervention is necessary. The last tier would be in-person appointments.

However, an obstacle to such innovation is the lack of access to the internet and smartphones in South Africa. About one-third of the country’s population uses smartphones, and 28% of people do not have access to the internet.

“They often share a smartphone in a family in a household, so we have a woman enrolled in our study, but the contact number we have for her is a mother’s number. If you are talking about personal things on the app, what does that mean for the privacy of your immediate family?” says Rabie.

Rabie says a potential solution could be to build apps that are data-free or integrate the chatbot into WhatsApp, the primary mode of communication in South Africa. – Health-e News
