Parents could be alerted when children access disturbing content on phone
Parents would receive alerts if their children search for disturbing content on phones, under plans being considered by ministers.
The proposal is expected to be included in the government’s consultation on smartphones and social media, which will be published this month.
A ban on social media and sales of smartphones for under-16s will form part of the consultation, which aims to build on measures in the Online Safety Act to protect children. Esther Ghey, the mother of Brianna, 16, who was murdered by two teenagers last year, proposed the parental alert system this year.
She said: “If any words are being searched like the words that were searched during the run-up [to Brianna’s murder] … it could be flagged up on a parent’s phone, and then parents are aware of any concerning things that children are looking at.”
The alerts could form part of “family centre” parental controls already available in social media apps such as TikTok, Instagram, and Snapchat.
Parents can already control settings, restrict types of content, view friend lists and see with whom their children have been communicating. Ministers will also look at whether the alert system could be built into the device itself, rather than on certain apps.
However, the plan faces opposition from some campaigners, experts and the industry for being too intrusive.
Ian Russell, whose daughter Molly took her life aged 14 after viewing disturbing content on social media, said the alert system may end up doing more harm than good.
Russell, who chairs the Molly Rose Foundation, said: “It is only right that the prime minister is listening to the groundswell of concern about online safety, but by rushing to introduce new measures there is a real risk of introducing unintended consequences that do more harm than good.
“In the six years since Molly died, I’ve always supported online safety measures that are grounded in evidence which shows they will work. Hurried proposals are unlikely to meet that crucial test.
“We already know the evidence shows that overly intrusive parental controls can result in worse safety outcomes, weaken trust between children and parents, and make it less likely for children to disclose harm and receive the support they need.
“It’s vital that we respond to the appalling harm faced by children online with cool heads and by doing what will work as intended. To that end, politicians should remain focused on delivering a strengthened Online Safety Act, and keeping this as a priority will best protect children from online harms.”
Those in the industry have raised concerns about privacy and the practicality of the measure. If a child does not want their parents to know about their sexuality, or if they are victims of abuse, the alerts could be problematic, they say.
Industry sources also question who would define the search terms, which could be evaded or could throw up too many benign alerts. A Whitehall source admitted the measure was “technically difficult but easier than the tech industry would like to admit”.
Earlier plans to make the changes part of a voluntary charter for tech companies have been shelved, because it is believed action needs to be mandatory.
The proposed ban on social media and smartphone sales for under-16s has been opposed by the NSPCC. Sir Peter Wanless, its chief executive, called the plan a “blunt proposal”.
Writing in The Times, he said: “Blanket bans on smart phones or social media for everyone under-16 is an unnecessarily blunt proposal that would punish young people for the failures of tech companies to properly embed child safety into the services they design. It would take away any imperative for them to fix the problem they’ve created. We’d be letting them off the hook.”
The importance of addressing child safety online was highlighted by new Ofcom research that showed that about a quarter of children aged five to seven own a smartphone. The use of social media sites has also risen, with 38 per cent of that age group using apps such as WhatsApp, TikTok and Instagram, despite terms of service requiring users to be at least 13.
Jason Elsom, chief executive of Parentkind, said: “These ideas are welcome but simply don’t go far enough to protect children. There is huge support among parents for technology companies to do more but in the end we need to restrict the sale and use of smartphones by children. We urgently need the political parties to commit to putting a smartphone ban in their manifestos. It’s like asking smokers not to smoke.”