Tuesday, October 15, 2024

Child Safety Advocates Urge Stronger Regulations for Online Platforms Following Roblox Allegations


Child safety campaigners are calling on the UK's communications watchdog to enhance its enforcement of new online safety laws after a troubling report accused the gaming platform Roblox of failing to protect its young users.

Roblox, which boasts 80 million daily users, has been criticized by the US investment firm Hindenburg Research. The firm claims that the platform exposes children to serious risks, including grooming, pornography, violent content, and abusive language. Hindenburg, which has taken a "short" position on Roblox’s stock, described the platform as an "X-rated paedophile hellscape," citing alarming instances such as accounts named after convicted sex offenders and the presence of child pornography.

In response, Roblox has strongly rejected these allegations, stating that safety and civility are central to its operations. The company emphasized that millions of users enjoy safe experiences on the platform daily and that it takes any incidents seriously. Roblox added that it regularly reviews and removes harmful content and utilizes text chat filters to block inappropriate communication.

One in five Roblox users is under the age of nine, and the majority are under 16. The platform enables players to create and share their own games while interacting with others in chat rooms. Although Roblox offers parental controls and age recommendations for certain content, critics argue that these measures are insufficient.

Child safety advocates stress that the findings highlight the urgent need for the UK communications regulator, Ofcom, to rigorously implement the Online Safety Act. This legislation mandates that platforms protect children from harmful content and is backed by codes of practice currently being developed by Ofcom.

The Molly Rose Foundation, created by the parents of Molly Russell, who tragically took her life after encountering harmful online material, stated that Ofcom’s response to risks posed by platforms like Roblox will be critical. Chief Executive Andy Burrows emphasized that the report reveals a systemic failure in online safety measures, urging Ofcom to take decisive action.

Beeban Kidron, a campaigner for child internet safety, echoed these sentiments, calling for a more stringent implementation of safety measures within tech platforms. She highlighted the need for consumer-facing products like Roblox to have built-in safety features to prevent predators from targeting children.

An Ofcom spokesperson affirmed that the Online Safety Act would significantly enhance online safety in the UK, granting the regulator a range of enforcement powers to protect users. Under the act, platforms like Roblox must implement measures to shield children from harmful content, prevent grooming, remove child abuse images, and conduct robust age checks.

Roblox has stated its commitment to complying with the Online Safety Act and has been actively engaging with Ofcom's consultations on the new regulations. The company says it looks forward to the final codes of practice that will guide its efforts to ensure a safer environment for its young users.
