February 25, 2025

Family from N.J. sues Roblox and Discord, claiming platforms are 'hunting ground' for child predators

A 13-year-old was allegedly groomed last summer by a man who offered the boy gaming gift cards in exchange for nude photos.


A lawsuit against online video game platform Roblox and chat app Discord claims the two companies have misled parents about weak safeguards against sexual exploitation of children after a 13-year-old was targeted by a predator in New Jersey last summer, prompting the family to move out of state.

Roblox, which allows users to create their own virtual environments and amateur video games, is free to download and play but has premium features that are unlocked with digital currency that can be purchased with real money and electronic gift cards. The game exploded in popularity during the COVID-19 pandemic.


Discord enables users to connect with one another using text, voice and video features.

The couple who filed the lawsuit is represented by Philadelphia-based firm Anapol Weiss, which separately filed a class-action suit against Roblox on behalf of multiple families in 2023. Roblox's safety profile has come under increased scrutiny in recent years as its daily active user base surpassed 79 million people, roughly 40% of whom are preteens, according to data compiled by Bloomberg Businessweek. At least two dozen people have been arrested for grooming or abducting children they met on Roblox since 2018, Bloomberg reported.

Lawsuit claims 'reckless indifference' to child exploitation

The parents who filed the lawsuit this month said their 13-year-old son — an avid user of Roblox and Discord — was targeted by a man who took advantage of lax protections that make both platforms a "hunting ground" for child predators.

In August, Burlington County prosecutors charged 27-year-old Sebastian Romero, of Somerset, with multiple offenses stemming from his alleged contact with the boy. Romero allegedly met the child in a Roblox chatroom and then convinced him over Discord to send nude photos and videos in exchange for $10 electronic gift cards that could be used to buy Robux, the game's digital currency.

Romero quickly escalated his conversations with the boy, offering him a $100 gift card if he shared his address and agreed to meet in person in July, prosecutors said. But when the boy didn't follow through with the plan, Romero allegedly made threatening statements against his family. The parents contacted Cinnaminson Township police, who executed a search warrant on Romero's home. Romero was charged with production of child sexual abuse material, luring, aggravated sexual extortion and related offenses.

After Romero's arrest in August, the boy's parents sold their home in New Jersey and moved across the country, the complaint said.

The lawsuit was filed Feb. 12 in San Mateo County Superior Court in California. Roblox operates out of San Mateo County and Discord is based in San Francisco. The two companies have not commented publicly on the lawsuit.

The boy's parents allege Roblox and Discord have intentionally misrepresented their safety features for years to give parents a false sense of security about the platforms. Roblox updated its default settings in November to prevent adult users from contacting children under 13, but the lawsuit contends these changes are "woefully inadequate" and the game still does nothing to verify whether users' accounts have been created with accurate birthdates.

The suit claims Roblox gives predators an "initial access point" to connect with children and move their conversations to other apps like Discord. The complaint further alleges Discord has failed to address vulnerabilities that would protect children from "rampant and thriving" sexual exploitation on the platform.

After Romero's arrest, prosecutors in New Jersey found evidence that he had been in contact with as many as 25 other children, the family's lawsuit claims.

“Roblox and Discord know exactly what’s happening on their platforms,” Kristen Gibbons Feden, a partner at Anapol Weiss, said in a statement. “They have the resources to protect children but choose profits over safety. Our client’s life has been changed forever because of their reckless indifference.”

The lawsuit also claims Roblox turns a blind eye to games made on the platform with sexually explicit content that can be easily accessed by children. One Roblox game called "Escape To Epstein's Island" references convicted sex offender Jeffrey Epstein, while others found on the platform feature virtual strip clubs and pornographic images of users' avatars simulating sex acts.

The suit also reviews cases of child exploitation connected to Roblox and Discord, including one in which another child in New Jersey was targeted two years ago. In that case, an 11-year-old girl from Passaic County was kidnapped by a 27-year-old Delaware man who had met her on Roblox and convinced her to send him nude photos and videos, federal prosecutors said. The man was arrested after he allegedly picked the girl up and drove her to his home to have sex with her. 

The parents of the 13-year-old boy are seeking judgment against the companies for unspecified damages that reflect the "unimaginable harm" done to their son, according to the suit, which alleges multiple counts of negligence, fraudulent concealment, and misrepresentation by Roblox and Discord. 

Online exploitation of kids is 'growing rapidly,' study says

An estimated 1 in 12 children worldwide have been subjected to online sexual exploitation and abuse, according to a recently published study from the Georgia State University School of Public Health and the Childlight Global Child Safety Institute. 

“The risk of online child sexual exploitation and abuse is growing rapidly in tandem with increased access to the internet and smartphones,” said Xiangming Fang, one of the researchers involved in the study.

The allegations against Roblox and Discord are among a growing set of legal challenges to tech companies whose apps have been linked to crimes against children. A lawsuit filed in New Mexico last year describes Snapchat as a "breeding ground" for child predators, and the CEO of the messaging app Telegram was arrested in France last year over claims he was complicit in the distribution of child sexual abuse material on the platform. 

Law enforcement officials also have seen child predators migrate between preferred platforms over the years, especially as police direct resources to other apps. The messenger service Kik was once considered the "de facto app" for child predators, according to a report by Forbes and British investigative media company Point, but even mainstream platforms like Meta-owned Instagram and Facebook have increasingly been identified as places where children are exploited. 

The National Center for Missing and Exploited Children said reports of online enticement of minors increased by more than 300% between 2021 and 2023. Childlight estimates more than 300 million kids worldwide are now affected by online exploitation each year, a number that has grown because of the global reach that apps provide.

"We urgently need the online world to have safety built in by design,” Deborah Fry, global director of data at Childlight, said of the group's recent research. “This must be supported by much more robust regulation of online environments in every country, with improved education for young people and those who care for them."
