Amendments to the online safety bill will require platforms such as TikTok to keep young users safe from content likely to cause harm or injury
Under revisions to the online safety bill, social media companies will be compelled to safeguard children from encountering hazardous stunts and challenges on their platforms. The legislation will specifically address content that encourages or promotes dangerous challenges or stunts likely to cause serious injury, emphasizing the need to protect individuals under the age of 18 from such material.
TikTok has faced criticism for hosting risky dares such as the blackout challenge, which encouraged users to choke themselves until they lost consciousness, and a challenge that involved climbing unstable stacks of milk crates.
The app has since banned such stunts, and its guidelines explicitly prohibit displaying or promoting dangerous activities and challenges.
Additionally, the online safety bill will mandate social media firms to take proactive measures in preventing children from accessing the most dangerous forms of content, including materials that encourage self-harm and suicide. Tech companies may be obligated to utilize age verification measures to prevent individuals under the age of 18 from encountering such content.
Under further changes to the legislation, which is expected to become law this year, social media platforms will be required to use tougher age-verification measures to stop children accessing pornography, bringing them into line with the measures the bill already sets out for dedicated adult sites such as Pornhub.
Platforms that host or permit the publication of adult content will need to introduce “highly effective” age-checking mechanisms, including tools like age estimation based on a selfie, to ensure accurate age verification.
Further amendments involve the obligation for the communications regulator Ofcom to create guidelines for technology companies regarding the protection of women and girls in the online space. Once the act is enacted, Ofcom will oversee its implementation and will be mandated to seek input from the domestic abuse commissioner and victims commissioner during the development of these guidelines. This collaboration aims to ensure that the guidance accurately represents the experiences and perspectives of victims.
The revised bill will also criminalise the sharing of deepfake explicit images in England and Wales. In addition, platforms will be required to ask adult users whether they wish to see content that promotes self-harm or eating disorders, or racist content.
Once the law is in force, breaches will carry fines of up to £18 million or 10% of global turnover. In the most serious cases, Ofcom will have the power to block platforms.
Lady Kidron, a non-affiliated member of the House of Lords and a campaigner for children’s online safety, welcomed the amendments as a “positive development for children.” The government has also confirmed changes that will give bereaved families access to the social media histories of children who have died.
Richard Collard, NSPCC’s Associate Head of Child Safety Online Policy, expressed satisfaction, stating, “We are pleased that the government has acknowledged the necessity for stronger safeguards for children within this critical legislation, and we will closely examine these amendments to ensure their practical effectiveness.”
Paul Scully, the Technology Minister, said the government was committed to making the online safety bill the “global standard” for protecting children on the internet. He added that the government would not tolerate situations in which children’s lives are endangered, whether through abuse or through exposure to harmful content that can have a profoundly damaging effect on their wellbeing.