A mother whose son killed himself was horrified to find his Google search history contained multiple sources detailing how to take his own life.
Stina Jensen says she is appalled that such information was so easily available to her son Keelian, who took his own life at the age of 19 after viewing several websites and a video on YouTube, the video-sharing site owned by Google.
Mrs Jensen, who was taken completely by surprise by the death of her “loving and gentle” son, told 9News that internet search giant Google should be doing more to prevent people finding such horrendous suicide material.
She said: “The night before he died, he started out searching very normal things.
“He was right into Xbox and he had searched when this new game was coming out. He searched the moon and stuff like that, very normal things.
“I clicked on what he had clicked on. I was shocked because I thought any child out there could do this.
“The search terms he typed in were very specific and should have raised red flags.”
The tragic story of Keelian, from Coominya in south east Queensland, has echoes in the UK case of Molly Russell, whose dad Ian Russell has campaigned for tech giants to take urgent steps to protect young people and stop hosting harmful content.
Molly was just 14 when she killed herself, days before her 15th birthday in 2017, after viewing suicide material on Instagram.
Mr Russell, from Harrow, north London, who runs the Molly Rose Foundation in memory of his daughter, has urged: “For the safety of young people, the platforms need to do something quickly to make the internet safer.”
He said last year: “It’s still all too easy to find such dangerous content.”
He added: “In the hours between us saying ‘sleep well’ and the terrible dawning of next day, Molly’s only other influence must’ve come from beyond our house, beyond our love and protection – from the internet.”
In May last year we reported how Health Secretary Matt Hancock met Facebook, Snapchat, Google and Instagram, and the companies agreed to fund the Samaritans to help identify dangerous content and create a best practice guide to tackling it.
The Government will appoint an independent regulator to hand out fines and hold tech bosses personally liable for harmful content.
But legislation might take two years.
Speaking alongside Mr Russell, Andy Burrows, associate head of child safety online with the NSPCC, said: “Until we have legislation passed the Government should monitor whether platforms are playing ball with this interim code of practice [and] name and shame those that drag their heels.”
In this latest case, Keelian had begun searching for information on how to take his own life at around 1am.
He had struggled with mental health issues and his family said he wore his heart on his sleeve, was sensitive and affectionate, but appeared to be in a good place before his death last month.
His death took everyone by surprise, with no warning signs.
When the words Keelian typed that morning are searched, an ad for a free, 24-hour telephone crisis support service does appear.
But Mrs Jensen says the sites her son found – and any relating to self-harm – should be blocked to stall people in the depths of despair, giving them time to rethink, or giving their loved ones a window of time in which to help them.
She told 9News it did not seem her son had put much thought into what he was doing, with his Google searches going from normal topics straight to the suicide content.
A spokesperson for Google said in a statement: “Our hearts go out to Keelian’s family. When people search for queries relating to suicide, we show Lifeline’s 24-hour helpline number to connect vulnerable people with the help and advice they need.
“Suicide and mental health are societal challenges that government, health experts, individuals, and organisations across many industries need to come together to solve, and we’re committed to finding more ways to help people get support and care.”
An online petition started by Keelian’s aunt Camilla Jensen has attracted thousands of signatures. It calls on Google to do more to block content which gives people information on how to end their life.
– Samaritans (116 123) operates a 24-hour service available every day of the year. If you prefer to write down how you’re feeling, or if you’re worried about being overheard on the phone, you can email Samaritans at jo@samaritans.org