Frankie took her own life after months of being served up self-harm content. Her parents pleaded with the platform for information; for months all they got were automated responses.
First published: Dec 2022.
The hardest thing Judy Thomas has ever had to endure is losing her daughter Frankie to suicide.
“I’m not just saying this because she’s our daughter, but she had a really wacky sense of humour,” she said.
Frankie was five years old when the family was told she had autism, and the diagnosis did not diminish her spirit.
When she wasn’t riding rollercoasters, collecting badges in judo, or scaling climbing walls, she spent her time playing music – her passion was the bass guitar.
Yet in September 2018, on a day seemingly like any other, Frankie took her own life at the age of 15 without warning.
It was only after their daughter’s death that Judy and her husband Andy learned she had spent months looking at disturbing content online at school, content that depicted self-harm and glorified suicide.
They had carefully monitored Frankie’s access to the internet at home; they both had password-protected laptops, kept an eye on her when she went online, and tracked her phone habits.
But when Frankie enrolled at a specialist education school, she was given her own computer and an iPad to help with her schoolwork, and her parents told the school to regularly monitor her online activity.
The school’s e-safety filter – which would have blocked access to harmful content – was later found not to have been working properly at the time of the teen’s death. The system failure allowed Frankie to spend two and a half hours browsing harmful content that day. Neither the school nor Ofsted, despite its inspections, had any idea students were able to freely access such sites.
“Frankie hadn’t been accessing the lesson content at all, she’d been online looking at horrendous things… not just that morning but at many other times we discovered,” Judy said.
“There were a whole range of sites about self-harm, suicide and ‘fun ways’ to do it. On and on and on. Hours of it… she was living this life, and not feeling like she was able to tell us. We had no idea.”
Four out of the five stories Frankie read in the hours leading up to her death glorified suicide, and it was one of these user-published stories that Frankie copied later in her bedroom.
In the days after her daughter’s death, Judy and Andy began writing letters to the websites demanding answers about how she had been able to access such content, and why it was still online.
They pleaded with a social media platform to let them see the material Frankie viewed on a secret account, and for months all they got were automated responses, only finally receiving a substantial letter over a year later.
“I cannot tell you how many letters we’ve written,” Judy said. “We would be happy to find nothing, but it would help us get closure… Andy and I have always said we would not be bitter nor vengeful, as it would not bring Frankie back.”
The parents penned letters to various strands of the Government, including the Department for Education and the Draft Online Safety Bill Committee. The Online Safety Bill would force websites to protect users from harmful content. Ofcom would serve as the independent regulator – yet the new measures have remained in limbo since they were promised in the Conservative Party’s 2019 general election manifesto.
CREDIT: GOOD LAW PROJECT/JUDY THOMAS
It is finally due back in Parliament with a view to being implemented in summer; however, it has been accused of being “watered down” following the removal of vital provisions.
“What I find really difficult is that it seems like there is a complacency or lack of interest or urgency,” Judy said. “I feel if somebody involved in the Bill, God forbid, lost a child to something they’d seen online, they’d sort it straight away.”
Frankie’s parents are clear about what they’re calling for.
“There must be age verification, and it’s really important it’s not just the well-known sites,” said Judy, noting Frankie was able to access ‘mature’ content for over 17s on online literature platform Wattpad simply via a search engine.
“Online content, even if legal, needs to be carefully checked because it could be harmful like Frankie’s was. The last stories Frankie accessed did not have the word ‘suicide’ in their title, they did not have the word ‘suicide’ in their content, but they glorified suicide. We also need regulation of algorithms which recommend similar material, which Frankie then went on to access.”
Judy and Andy also believe that schools must be legally required to check their e-safety filters are working as they should by blocking harmful content and alerting teachers, and keep records to show Ofsted during inspections.
The parents have written to each and every Secretary of State for Education since the 2021 inquest into Frankie’s death – there have been five – and want more than warm words from ministers in return: they want to see policy change.
“We don’t just want a voluntary scheme… social media companies can’t be left to regulate themselves. And we need something now – even if it’s just in the interim because the Bill will take years – that covers all sites that children can access.”
“They can’t be casual about this. We don’t have a daughter anymore, she’s dead, as is Molly Russell, and how many others.”
In recent weeks, Judy and Andy have been invited to a meeting with the deputy director for Independent Education and School Safeguarding, officials from the Department for Education’s Digital, Data and Technology team, the children’s commissioner, and their local MP Michael Gove, whom they praise for his continued support.
Good Law Project believes the death of Frankie Thomas, Molly Russell and other children in similar circumstances must be a turning point in online safety. Through our Online Harms campaign, we want to force social media companies to stop using recommender algorithms, which can further expose users to harmful content, and ensure the Government makes the internet a safer space through its long-promised Online Safety Bill.
— AUTHORS —
▫ Good Law Project, a not-for-profit campaign organisation that uses the law to protect the interests of the public.
Good Law Project only exists thanks to donations from people across the UK. If you’re in a position to support their work, you can do so here.