UK Technology Secretary Peter Kyle has said the UK's internet safety laws are "very unbalanced" and "unsatisfactory," following calls from activists for stronger regulation. His comments come after Ian Russell, the father of Molly Russell, publicly criticized the existing laws. Molly took her own life at age 14 after being exposed to harmful online content, and her father believes the UK is "going backwards" on online safety.
In a letter to the Prime Minister, Ian Russell argued that the Online Safety Act, designed to force tech giants to take more responsibility for content on their sites, needs amendments and should impose a "duty of care" on companies. In a BBC interview, Kyle expressed his "disappointment" in the Act, which was passed by the previous Conservative government in 2023.
The Conservative government initially planned for the legislation to compel social media companies to remove some "legal but harmful" content, such as posts promoting eating disorders. However, this proposal sparked strong opposition from critics, including current Conservative leader Kemi Badenoch, who feared it could lead to censorship. Badenoch said in July 2022 that the bill was "not fit to be law," adding "we shouldn't be legislating for hurt feelings." Another Conservative MP, David Davis, suggested it could lead to "the largest accidental restriction of free speech in modern history."
Ultimately, the plans for adult social media users were scrapped, replaced with a requirement for companies to give users more control to filter out content they do not want to see. However, the law still requires companies to protect children from legal but harmful content. Kyle stated that the section on legal but harmful content had been removed from the bill, "so I've inherited a very unbalanced, unsatisfactory legislative solution." While he did not promise to amend the existing legislation, he said he was "very open-minded" on the issue.
Kyle also said that the Act contains some "very good powers" that he is using "actively" to tackle new safety issues, and that in the coming months ministers will gain the power to ensure online platforms provide age-appropriate content. He emphasized that companies that fail to comply with the law will face "very tough" sanctions. A Whitehall source told the BBC that the government has no plans to scrap the Online Safety Act or pass a second bill, but instead wants to work within what it sees as the bill's limitations. The source added that ministers are not ruling out further legislation but want to be "flexible and fast-moving" to keep up with rapidly changing trends.
In his letter, Ian Russell pointed out that "ominous" changes in the tech industry are placing greater pressure on the government to act. He stated that Meta boss Mark Zuckerberg, who owns Facebook and Instagram, and Elon Musk, owner of social media site X, "are at the forefront of a complete industry reset." He accused Zuckerberg of moving from safety to a "laissez-faire, anything-goes" model and "going back to the harmful content that Molly was exposed to."
Earlier this week, Zuckerberg said Meta would be scrapping fact-checkers and moving to a system already introduced by X, which allows users to add "community notes" to social media posts they believe are untrue. This marks a change from Meta's previous approach, launched in 2016, of having third-party fact-checkers review false or misleading posts appearing on Facebook and Instagram. Content flagged as inaccurate would be ranked lower in users' feeds and labeled with additional information for viewers about the topic.
Zuckerberg defended the new system, saying the fact-checkers were "too politically biased" and it was time to "get back to our roots around free expression." The move comes as Meta seeks to improve relations with incoming US President Donald Trump, who has previously accused the company of censoring right-wing voices. Zuckerberg said the change, which applies only in the US, would mean content moderators would "catch less bad stuff" but would also reduce the number of "innocent" posts that are removed.
In response to Russell's criticism, a Meta spokesperson told the BBC that "there is no change to how we treat content that encourages suicide, self-harm and eating disorders," and said the company will "continue to use our automated systems to scan for high-risk content." When asked about the change, Kyle said the statement was "an American statement for American service users," adding: "One thing that hasn't changed is the law in this country." He emphasized, "If you come and operate in this country, you have to obey the law, and the law says that illegal content has to be taken down."
The provisions of the Online Safety Act, which come into force later this year, compel social media companies to prove they are removing illegal content, such as child sexual abuse material, material that incites violence, and posts that promote or facilitate suicide. The law also requires companies to protect children from harmful material, including pornography, material that promotes self-harm, bullying, and content that encourages dangerous stunts. Platforms are expected to adopt "age-assurance technology" to prevent children from seeing harmful content. Companies must also act against illegal, state-sponsored disinformation and take steps to protect users from disinformation if their services can be accessed by children.