Roblox says “everyone is responsible for kids’ online safety” as it faces lawsuits over child protection

The platform introduces age checks and chat limits while defending itself in court.

(Image via Roblox)
TL;DR
  • Roblox's parental advocacy executive emphasized shared responsibility for child safety while the company faces lawsuits alleging it failed to protect minors.
  • The platform recently added age verification and chat restrictions between minors and adults, but parents say more controls are needed.
  • Lawsuits claim children were exposed to predatory behavior due to inadequate moderation and design choices that enable dangerous contact.

Roblox’s head of parental advocacy made a public statement to Vulture framing children’s online safety as a shared responsibility across parents, platforms, and society. The timing raised questions because the company is currently defending itself against multiple lawsuits from parents who claim Roblox failed to protect their children from harm.

The lawsuits allege that children were exposed to predatory behavior, inappropriate contact, and exploitation on the platform. Parents claim Roblox’s moderation systems were inadequate and that the platform’s design choices enabled dangerous interactions between minors and adults.

Roblox has introduced new safety measures in response to mounting criticism. The platform now requires age verification for users and has implemented restrictions on direct messaging between minors and adults. For younger users, chat features are often limited or disabled entirely.

The platform offers parental controls that let parents manage who their children can communicate with and what content they can access. But some parents argue these controls don’t go far enough. They want options to restrict interactions to approved friends only, disable in-game purchases completely, and set time limits directly within Roblox rather than relying on device-level controls.

Roblox faces a particularly difficult safety challenge because it combines three risk factors: a massive youth audience, robust social features including chat and friend systems, and user-generated content at enormous scale. That combination makes consistent moderation difficult.

Critics point to gaps in enforcement. Some predatory accounts reportedly receive only temporary bans after being reported, or no ban at all. Others note that bad actors can use Roblox to make initial contact with children before moving conversations to other messaging apps.

The platform uses Robux, a virtual currency, for in-game purchases. Many experiences heavily monetize through microtransactions. Parents have called for better controls to hide or disable these purchase prompts for underage users.

Safety advocates are pushing for stronger measures. They want mandatory human review of serious reports rather than automated systems alone. Some propose a friends-only interaction mode that completely blocks communication with strangers. Others want purchase approval systems that require parent passwords before any Robux transaction.
