AI to aid young Aussies in crisis, but kid gloves still needed

Kids Helpline investigating how AI might help provide a more targeted, personalised experience to young people who are looking for help.


The mental health of young people in Australia took a big hit in 2020 as the COVID-19 pandemic ravaged the economy and brought job insecurity to many thousands of parents across the nation.

Brisbane-based charity yourtown’s Kids Helpline phone and online counselling service responded to 176,012 contacts from young people between five and 25 years old during 2020, an increase of 21 per cent on 2019. Duty of care interventions related to child abuse were also up by 62 per cent over the same period.

Kids Helpline was established in 1991, and in the early 2000s it became the first counselling service worldwide to introduce real-time webchat capabilities.

yourtown’s chief information officer, Helen Vahdat, tells CIO Australia that it’s now time to consider how the service best evolves to meet the needs of children and young people into the future. Vahdat and her team are investigating how artificial intelligence technologies might help provide a more targeted, personalised experience to young people across Australia who are looking for help.

It’s work like this that sits at the heart of yourtown’s digital transformation, Vahdat says, stressing that young people want different experiences, for instance preferring to talk in ‘chat’ or to access ‘self-help’ information and resources as and when they need them.

Vahdat admits that the service, as it stands at the moment, is not yet fully ‘connected and seamless.’

“Young people might want to interact with our social media content and then speak to a counsellor. Our aim is to ensure that there is a seamless interaction, which currently is not easily undertaken,” she says.

AI in a crisis

Vahdat says AI can potentially determine whether someone is distressed by the tone of their voice, ensuring they move more quickly through to a counsellor.

But she and the team are still working out how these technologies can augment counsellors’ work and be used ethically. In other words, within yourtown’s service model, how do counsellors determine that one young person’s needs are more acute than another’s?

Kids Helpline counsellors are highly skilled in supporting the needs of young people, but with demand exceeding capacity, there is currently no way of identifying a young person in crisis who may not be getting straight through to a counsellor.

yourtown has already implemented Amazon Connect, a group of cloud-based AI services that helps contact centre agents transcribe, translate and analyse customer interactions for its employment services, which are also offered to young people.

Vahdat and her team have completed a proof-of-concept (POC) to determine if this AI technology could be used as part of the Kids Helpline service.

"With a POC, we need to make sure that we don't have loss of service, loss of capability and things like that. For instance, we need to make sure the product can provide our counsellors with direct access to emergency services without compromising the connection with the young person," Vahdat says.

"So, we need to make sure that with this technology that the reality of our operations not only informs the POC but that there are no impacts to the critical nature of our work."

yourtown must now decide between two options: deploy the AI technology across the Kids Helpline call centre using the existing principle of answering each call as it comes up in the queue, or use AI technologies to detect distress in a caller’s voice, or to pick up certain words over the phone or in a chat box, and alert counsellors to an immediate issue.
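To make the second option concrete: at its simplest, text-based triage means scanning an incoming chat message for high-risk phrases and flagging it for priority. The sketch below is purely illustrative — Kids Helpline’s actual models, phrase lists, and thresholds are not public, and every name here is a hypothetical stand-in, not yourtown’s implementation.

```python
# Illustrative sketch only: a minimal keyword-based triage filter of the
# kind described above. The phrase list and function are hypothetical.

# Hypothetical examples of phrases a service might treat as high-risk.
CRISIS_PHRASES = {"hurt myself", "can't go on", "end it all", "no one can help"}

def flag_for_priority(chat_message: str) -> bool:
    """Return True if the message contains a phrase that should alert a
    counsellor to an immediate issue, rather than waiting in the queue."""
    text = chat_message.lower()
    return any(phrase in text for phrase in CRISIS_PHRASES)
```

A real deployment would rely on trained speech and language models with human oversight rather than a static word list — and the ethical concern Vahdat raises is precisely that any automated ranking can miss, or mis-rank, a young person in crisis.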

This is where ethics come into the equation, says Vahdat.

She stresses that AI technologies are clearly very useful at a transactional level in the context of a traditional call centre. But Kids Helpline is unique because it’s staffed by counsellors who are often supporting young people in crisis.

“The topic, event or issue range that we cover is all-encompassing. The most important thing is that we never cause harm, so the question is, ‘how do we make sure that technology is adding to our no-harm mandate?’” she says.

“Technology can do quite a lot to give these young kids what they want, but it’s about finding a balance.”

Copyright © 2021 IDG Communications, Inc.