Ofcom has warned that social media sites which fail to comply with new online safety rules could be “named and shamed” and even banned for under-18s.
The media regulator has published a draft code of practice that will require tech companies to put in place stronger age verification measures and redesign their algorithms to keep children away from “harmful” content.
But parents of children who died after exposure to harmful online content said the proposed rules were “insufficient”, with one telling the BBC that change was happening “at a snail's pace”.
Meta and Snapchat said they have extra safeguards in place for users under 18 and offer tools that allow parents to control what their children can see on their platforms.
By Tom Singleton, Technology reporter, 8 May 2024
The other companies did not respond to BBC requests for comment.
Ofcom chief executive Dame Melanie Dawes said companies that breached the draft code would be “named and shamed”, and that tougher measures, such as banning social media sites for under-18s, were being considered.
Esther Ghey, whose 16-year-old daughter Brianna was murdered by two teenagers in February 2023, told BBC Breakfast she was “really concerned” that Ofcom should get its regulations right.
But she said the full extent of the problem remained unclear.
Lisa Kenevan, whose son Isaac died aged 13 after taking part in an online “blackout” challenge, said the pace of change was not fast enough.
“The sad thing is that it's happening at a snail's pace with Ofcom and social media platforms taking responsibility. The reality is there will be many more incidents,” she told BBC Breakfast.
Video caption: “How many children do we not know about?” – Brianna Ghey's mother
Under the Online Safety Act, Ofcom is responsible for enforcing tough new rules that set out what technology companies must do to comply with the law.
Ofcom said they included more than 40 “practical measures”.
At the core are requirements around the algorithms used to decide what appears on people's social media feeds.
Ofcom said tech companies should set up algorithms to filter out the most harmful content from children's feeds and reduce the visibility and prominence of other harmful content.
Other proposed measures include requiring companies to carry out stricter age checks before showing harmful content, and to strengthen content moderation, including introducing so-called “safe search” features on search engines.
Speaking on BBC Radio 4's Today programme, Dame Melanie described the new rules as a “big moment”.
“Young people are fed harmful content on their feeds over and over again, and this has become normalized, but that needs to change,” she said.
These new measures are expected to come into force in late 2025, according to Ofcom's timeline.
Dame Melanie added: “We will publish a league table so the public can see which companies have made changes and which have not.”
Image caption: Ian Russell, Dame Melanie Dawes and Esther Ghey spoke for nearly 30 minutes about Ofcom's new measures
Dame Melanie met Ms Ghey and Ian Russell, whose daughter Molly took her own life at the age of 14 in 2017.
In 2022, a coroner concluded that Molly died from self-harm while suffering from depression and the negative effects of online content.
They are part of a group of bereaved families who signed an open letter to Prime Minister Rishi Sunak and opposition leader Sir Keir Starmer.
In it, they call on politicians to do more to keep children safe online, including “committing to strengthening online safety laws in the first half of the next parliament”.
They are also calling for mental health and suicide prevention to be integrated into school curricula.
“We will be studying Ofcom's latest proposals carefully and are disappointed by their lack of ambition so far,” they added in the letter.
“Step up”
The government says the measures announced by Ofcom will “bring about fundamental changes to the way children in the UK experience the online world”.
Technology Secretary Michelle Donelan called on major technology companies to take the new standards seriously.
“My message to the platforms is to work with us and be prepared,” she said.
“Instead of waiting for enforcement or hefty fines, take action now to meet your responsibilities.”
Bruce Daisley, a former senior executive at Twitter and YouTube, told BBC Radio 5 Live Breakfast that age-verification technology would need to improve for the proposals to work.
“The challenge, of course, is identifying who the young users are, so the implication for all of us is that age verification is taken up a notch,” he said.
Most tech companies contacted by the BBC did not respond or declined to comment on the record.
“As a popular platform for young people, we know we have an additional responsibility to create a safe and positive experience,” a Snapchat spokesperson said in a statement.
“We support the aims of the Online Safety Act and are working with experts to inform our approach to safety on Snapchat.”
A Meta spokesperson also said the company wants young people to “connect with others in an environment where they feel safe.”
“Content that incites violence or encourages suicide, self-harm, or eating disorders is against our rules and we remove content as soon as we become aware of it.”