A disturbing new lawsuit has emerged, accusing online gaming platform Roblox and social media app Snapchat of failing to protect a young girl from sexual predators. The lawsuit, filed on March 31, 2026, in the U.S. District Court for the Northern District of California, alleges that these platforms played a role in grooming and exploiting an 11-year-old Louisiana girl.
Grooming and Exploitation on Roblox and Snapchat
The victim, identified as AW-0002, was just 10 years old when she began using Roblox. Known for its child-friendly design and appeal to younger users, Roblox has long been marketed as a safe platform for children. However, according to the lawsuit, AW-0002 was targeted by sexual predators within months of starting to use the platform. The predators initially made contact with the girl through Roblox’s chat system, and once they gained her trust, they moved their communication to Snapchat.
One predator, a 19-year-old man who went by “Fidel,” allegedly manipulated the child into sending explicit photos. Even after the girl’s mother confronted him, the predator continued reaching out through fake accounts, leading to further exploitation. The images of AW-0002 were eventually shared with her classmates, causing severe emotional distress, bullying, and trauma.
Legal Action Against Roblox and Snapchat
The mother of the victim, referred to as AW-GAL-0002 in the complaint, has filed a lawsuit seeking to hold both Roblox Corporation and Snap Inc. accountable. The suit accuses the platforms of negligence, design defects, and a “reckless disregard” for the safety of young users, despite being aware of the dangers posed by predators on their platforms. According to the complaint, the failure to implement basic safety features, such as age verification and identity screening, led directly to the abuse her daughter endured.
The lawsuit claims that the emotional and psychological trauma caused by the incident has had a lasting effect on AW-0002, including anxiety, shame, and a profound loss of trust in others. The mother argues that had the platforms taken appropriate steps to safeguard children, her daughter would have never encountered these predators.
The complaint will be consolidated with other similar lawsuits currently pending in the Northern District of California. U.S. District Judge Richard Seeborg is overseeing the coordination of these cases, and the litigation is expected to expand in the coming months. Early “bellwether trials” may provide insight into how juries will respond to the evidence presented in these cases.
Roblox Faces Increasing Pressure Over Child Safety
This lawsuit is not an isolated case. Roblox has faced numerous child exploitation lawsuits over the years, with many parents and survivors alleging that the platform’s lack of adequate safety measures allowed predators to target children. In response to mounting pressure, Roblox rolled out facial recognition software to verify user ages and prevent adults from contacting children without parental consent. However, it remains to be seen whether these new measures will be enough to prevent future exploitation.
Protect Your Child – Get Legal Help Today
If your child has been harmed due to unsafe online platforms, it’s crucial to understand your legal rights. Helping Survivors works with experienced attorneys who specialize in representing victims of child exploitation and can help you seek justice. Contact us today for a free case evaluation.