Texas attorney general sues Roblox for failing to protect children from grooming and sexual exploitation

State claims the platform prioritized growth over safety while misleading families about protections.

Person browsing Roblox games on tablet.
(Image via Roblox)
TL;DR
  • Texas AG Ken Paxton sued Roblox claiming it exposed children to grooming and sexualized content while misrepresenting platform safety to families.
  • The state alleges Roblox failed to enforce age-gating and allowed sexualized "condo" experiences to repeatedly evade moderation despite internal awareness of risks.
  • Texas seeks court orders forcing stronger safety measures and civil penalties while Roblox maintains it has zero tolerance for exploitation and invests heavily in trust and safety systems.

Texas Attorney General Ken Paxton filed a consumer protection lawsuit against Roblox Corporation alleging the company knowingly exposed children to grooming and sexualized content while misrepresenting the platform’s safety measures.

The suit claims Roblox prioritized growth and profits over the wellbeing of its young users. Texas is seeking court-ordered changes to the platform’s safety practices and civil penalties under state consumer protection laws.

The core allegations center on Roblox’s failure to protect minors from predatory behavior despite long-standing internal awareness of the risks. The complaint points to sexualized user-generated experiences known as “condos” that repeatedly resurface despite takedowns in a whack-a-mole pattern.

According to the filing, adults have posed as minors to initiate contact in-game before moving conversations to other messaging apps for grooming purposes. The suit also alleges that Roblox’s in-game currency Robux has been used as leverage in harmful interactions with minors.

Texas claims the platform's age-gating for chat features and mature content was weak and inconsistently enforced. This allegedly allowed adults to interact freely with underage users while the company profited from minors' participation.

The state argues Roblox represented its service as safe for kids and families while failing to adequately moderate content or stop repeat offenders from evading bans. The company allegedly violated Texas consumer protection statutes through these misrepresentations.

Texas wants the court to order Roblox to strengthen safety controls, age verification, and moderation systems. The state also seeks to prohibit the company from making misleading claims about child safety or the effectiveness of its safeguards.

Roblox has publicly maintained it has zero tolerance for child exploitation and invests heavily in trust and safety. The company says it uses AI and human moderation, partners with law enforcement and child-safety organizations, and has deployed age verification for voice chat and 17+ experiences.

The platform requires photo ID and liveness checks for certain features and says it actively removes prohibited content and bans offending accounts. Despite these systems, investigative reporting and safety researchers have documented recurring enforcement challenges with sexualized content and grooming attempts.

Roblox is a user-generated gaming and social platform with tens of millions of daily active users. A significant portion of the user base consists of minors, though the company has stated its audience has aged up in recent years.

The platform allows users to create and monetize experiences through Robux, its in-game currency. Many of these creators are teenagers themselves.

Prior investigative reporting by outlets like People Make Games has alleged the platform’s economy can exploit young creators and that safety gaps persist. Those reports date back to 2021, highlighting how long these issues have been documented.

State attorneys general have increasingly used consumer protection statutes to challenge tech companies over youth safety and alleged deceptive practices. Texas and other states have pursued similar litigation against platforms like Meta and TikTok over claims they misled families about safety measures.

The legal strategy often relies on state consumer protection law rather than liability for user-generated content, which is typically shielded by Section 230 of the Communications Decency Act. By framing cases around deceptive trade practices, states can sidestep those federal protections.

What happens next

The case was filed in Texas state court. Roblox will need to respond to the allegations and may move to dismiss or challenge the suit on legal grounds.

Texas could seek a temporary restraining order or preliminary injunction to force immediate changes to the platform. Any such order would require Roblox to modify its safety systems while the case proceeds.

Civil penalties would be determined by the court if Texas prevails. An injunction could force costly product and policy changes including stricter onboarding, slower content publishing, or broader age verification requirements.

A win for Texas could prompt other states to file similar actions or influence national policy debates on youth safety and age verification for online platforms.
