In 2025, multiple high-profile lawsuits have been filed against the gaming platform Roblox and other digital apps, with parents of minors alleging that the platform has become a haven for child predators and child sexual exploitation.
These include:
- A September 2025 lawsuit filed against Roblox and Discord by the parents of a 15-year-old boy who took his own life after experiencing sexual abuse and grooming
- An August 22, 2025 lawsuit filed by a mother and her 10-year-old child in North Carolina
- An August 2025 lawsuit filed in New York by the family of a now 12-year-old child
- An August 2025 lawsuit filed in Missouri by a 13-year-old child and their family
- Two separate lawsuits filed in July 2025 by minors against Roblox and chat app Discord in California
Beginning of Lawsuits Against Roblox and Chat App Platforms
Two of the earliest lawsuits to draw widespread attention were filed by two 14-year-old girls in July 2025. The minors accuse the popular gaming platform Roblox and the chat app Discord of facilitating sexual abuse and grooming. Both suits, filed in California courts on July 9 and 17, claim that the platforms provided a “hunting ground for child-sex predators” by failing to implement adequate safety measures to protect minors.
Allegations of Grooming and Attempted Rape
The July lawsuits, filed under “Jane Doe” pseudonyms, provide chilling accounts of how both girls were targeted by predators while using the platforms. According to the filings, the girls met men who pretended to be teenagers and began building trust through interactions on both Roblox and Discord. The predators reportedly used tactics like offering virtual currency (Robux) and manipulating the teens into meeting in person.
One case details an attempted rape in 2024, during which law enforcement intervened and arrested the perpetrator mid-assault. The other lawsuit recounts a brutal assault in 2022; the predator was later convicted of soliciting a child and committing lewd acts.
Similarly, a lawsuit filed in September 2025 against Roblox and Discord detailed how a young victim thought he was communicating with a peer before eventually being groomed into turning off parental controls on Roblox and sending sexually explicit photos and videos via Discord.
Roblox Accused of Prioritizing Growth Over Safety
Many of the lawsuits accuse Roblox of prioritizing growth and profit over child safety. In the July 2025 lawsuits, Roblox and Discord are accused of failing to take sufficient precautions to prevent minors from becoming victims of grooming, sexual abuse, or exploitation. The plaintiffs argue that these companies have long been aware of the dangers on their platforms but have failed to implement effective safety measures to protect children.
Roblox, which allows users to create and play games, and Discord, a popular app for text and voice communication, are criticized for lacking robust screening systems and age verification processes. The lawsuits claim that had these basic safety features been in place, the predators would not have been able to reach the minors, and the harm could have been prevented.
Legal Challenges and Tech Company Defenses
While the lawsuits raise important concerns about child safety, it remains unclear how they will proceed in light of legal protections that tech companies often enjoy. Section 230 of the Communications Decency Act generally shields platforms like Roblox and Discord from liability for content posted by their users. This protection could complicate the litigation and may limit the accountability of the companies involved.
The Impact of Online Grooming and Sexual Exploitation
The July lawsuits filed by two young survivors are part of a broader wave of litigation targeting platforms that may not adequately protect children from online sexual predators. The issue of online grooming and abuse has garnered increased attention in recent years, with many advocates calling for stronger regulations to protect vulnerable users from harm.
As the world becomes more digitally connected, tech companies have a responsibility to create safer spaces for users, especially minors. With the rise of online gaming and social media platforms, it is essential that these platforms implement rigorous safety protocols to prevent exploitation and ensure the well-being of their users.
How Survivors Can Seek Help
If you or a loved one has experienced harm through online platforms, it is crucial to understand your legal rights and options. At Helping Survivors, we are committed to supporting survivors of sexual abuse, assault, and harassment. We can connect you with trusted legal resources to help you navigate the complexities of seeking justice.
If you have experienced harm or if you have questions about your rights as a survivor, reach out to Helping Survivors today. Our team is here to empower you with the information and support you need to take the next step in your healing and justice journey.