Australia demands gaming platforms specify grooming, extremism prevention measures
The Australian Government's eSafety office has issued legally enforceable transparency notices to Microsoft, Roblox, Epic, and Valve, demanding that they outline in detail how their platforms prevent child grooming and the spread of extremism.
Concerns over online risks
The eSafety office, an independent agency established in 2015 to combat youth cyberbullying and online child sexual abuse material, has expanded its focus to protect all Australians from a wide range of online risks.
Commissioner Julie Inman Grant warned that platforms like Roblox, Minecraft, Fortnite, and Steam are being used by sexual predators to groom children and by extremist groups to spread violent propaganda and radicalize young people.

Gaming platforms a magnet for predators
Inman Grant noted that predatory adults are well aware that gaming platforms are among the online spaces most heavily used by Australian children, serving not just as places to play but also as environments for socializing and communicating.
According to eSafety's research, around 9 in 10 children aged 8 to 17 in Australia engage in online gaming.
Inman Grant cited numerous media reports of grooming taking place on the four targeted platforms, as well as terrorist and violent extremist-themed gameplay, including Islamic State-inspired games, recreations of mass shootings, and adaptations of World War II concentration camps.
She also noted that Steam has been identified as a hub for extreme-right communities, although specific examples were not provided.
The eSafety office stressed that compliance with the transparency notices is mandatory, with companies that fail to respond facing daily penalties of up to AUD$825,000.
Roblox responds to eSafety's demands
In a statement to IGN, Roblox outlined several measures it currently employs to prevent grooming and extremism, including policies prohibiting content that promotes or glorifies terrorist or extremist organizations, swift removal of such content, and the use of AI technology to review images and text before publication.
Roblox also announced plans to introduce age-based accounts for children under 16, with more restrictive content access, communication settings, and parental controls.
The company emphasized its commitment to safety and collaboration with eSafety to ensure Australian children are protected online.
