Australian regulator asks Steam, Fortnite, Minecraft and Roblox to explain protections

Australia's digital safety regulator has sent legally enforceable notices to Steam, Fortnite, Minecraft and Roblox, seeking documentation on the systems, staff and technical measures aimed at preventing grooming and extremist influence among young players.

The Australian government’s online safety regulator has taken a formal step aimed at the major interactive gaming services used by children and teenagers. The eSafety Commissioner has issued legally enforceable notices to four large platforms—Steam, Fortnite, Minecraft and Roblox—demanding detailed disclosure about how each service detects, prevents and responds to serious online harms. The notices seek information about both technological safeguards and operational resources, making clear that the regulator expects transparent, documented procedures rather than informal assurances.

The companies targeted must provide the requested material within a specific timeframe, and the consequences for ignoring the demand are severe. Under the notice regime, the platforms typically have 30 days to respond, and failure to comply can trigger fines of up to AU$825,000 for each day of noncompliance. These measures form part of a broader push by Canberra to hold digital services accountable for the safety of children who interact in what have become large-scale social environments.

Why the regulator stepped in

The action stems from recurring reports that gaming spaces can be exploited for a range of serious abuses, including grooming, sextortion and the spread of extremist narratives. Commissioner Julie Inman Grant has emphasized that online games and their ancillary messaging systems are often the initial point of contact between adults with malicious intent and underage users. The pattern frequently observed involves an initial approach inside a game, followed by migration of the conversation to platforms or channels that are harder to monitor. Because so many young Australians play multiplayer games, these environments have reach comparable to that of mainstream social networks and warrant similar scrutiny.

What the notices require

The regulator’s notices ask the platforms to document their internal systems and the personnel dedicated to protecting minors, as well as the technical measures they deploy in line with prevailing safety expectations. The regulator wants clarity on automated detection tools, moderation workflows and escalation paths when potential abuse is identified. In addition to operational detail, the notices seek evidence of policy design and implementation that addresses radicalization vectors, in-game recruitment tactics and the transfer of conversations to private or encrypted channels.

Company responses and legal backdrop

Some platform owners have acknowledged receiving the notices and said they are reviewing the regulator’s requests while reaffirming their commitment to user safety. Microsoft, for example, has said it is reviewing the notice in relation to Minecraft, while Roblox has already faced intense legal pressure in other jurisdictions: in the United States it reached settlements with states over past failures to protect minors, resolving matters for a combined total exceeding US$23 million. The Australian notices sit alongside potential civil actions and highlight how regulators in different countries are converging on similar accountability demands.

Specific industry changes and implications

The scrutiny has already prompted some platforms to revise product features aimed at younger users. Roblox, for instance, announced plans to roll out age-tiered accounts starting in June 2026, splitting profiles into categories with different chat and access controls to better match age-related risks. More broadly, the push underscores the technical and organizational challenges companies face when monitoring real-time interactions, protecting users in cross-platform conversations and balancing privacy with safety. Regulators are increasingly demanding a mix of automated systems and human moderation teams to close gaps that purely algorithmic approaches may miss.

Broader regulatory context

Canberra’s move is part of a larger regulatory trend: Australia has legislated a minimum age for social media, with rules barring under-16s from social network accounts coming into force in December 2025, and the eSafety Commissioner has been granted stronger enforcement powers in recent years. The gaming notices therefore reflect both a domestic strategy to protect minors and an international shift toward more rigorous oversight of digital platforms. For companies that operate globally, this creates a dynamic regulatory environment in which compliance demands vary but the expectation of demonstrable child protection remains constant.

Looking ahead, the notices will test how quickly platform operators can produce verifiable evidence of protective measures and whether regulators will press for further changes. With large parts of childhood social life now taking place inside interactive games, the scrutiny over moderation, design and accountability is likely to intensify. Companies that can show robust systems, clear staffing and effective technical controls will be better placed to navigate this evolving landscape, while those that do not respond risk significant penalties and follow-on legal exposure.

Written by Roberto Marini
