Fargo Snapchat Image Investigation Involves 65 Teen Victims

North Dakota officials say 27 minors have been referred to juvenile court following an investigation into the alleged creation and distribution of sexually explicit images involving Fargo-area students.

According to reports, North Dakota Attorney General Drew Wrigley described the case as “unprecedented” and said investigators identified about 65 victims connected to the alleged Snapchat activity. The Cass County State’s Attorney’s Office confirmed that 27 juveniles were referred to juvenile court for potential charges.

The investigation reportedly involves images shared through Snapchat, including some allegedly created with artificial intelligence and others involving real photos. Because the case involves minors, prosecutors have said they cannot discuss individual juvenile matters publicly.

Investigation Began After Image of Middle School Student Was Reported

The case began in April 2025, when a West Fargo school resource officer learned of an explicit image allegedly created using the face of a middle school student. The investigation expanded months later when authorities executed search warrants at Fargo Davies High School and seized more than 50 cell phones, according to reports.

It was previously reported that students at Davies High School were under investigation by the North Dakota Bureau of Criminal Investigation for allegedly creating and distributing sexually explicit images on Snapchat. Fargo Public Schools said at the time that it was cooperating with investigators.

By February 2026, the North Dakota Attorney General’s Office and the BCI had completed their investigative work and submitted materials to the Cass County State’s Attorney’s Office for charging review.

Officials Say the Conduct Should Not Be Dismissed as a Mistake

Wrigley said the alleged conduct should not be minimized as young people making a “silly” mistake. He emphasized that creating or sharing sexually explicit images of minors can carry serious legal consequences.

According to the report, authorities are reviewing potential charges under North Dakota law, including provisions related to disseminating obscene material. Wrigley also warned that images can spread quickly through digital platforms, making the harm difficult to contain once content is shared.

For victims and families, that rapid spread can create fear, humiliation, privacy concerns, and school-related stress. When images are created or circulated without consent, the impact may continue long after the first share.

AI-Generated Explicit Images Raise New Safety Concerns

One especially concerning part of the investigation is the alleged use of artificial intelligence. Officials said some images were AI-generated, meaning a student’s face or likeness may have been used to create explicit content without their consent.

This kind of digital exploitation can be deeply damaging even when an image is manipulated or fake. Victims may still face bullying, harassment, social isolation, or anxiety because classmates or others believe the content is real.

For parents, educators, and students, the case is a reminder that technology-related sexual harm is not limited to physical contact. Sharing, creating, requesting, saving, or threatening to distribute explicit images can all cause serious harm to victims and carry legal consequences.

What Victims and Families Can Do After Digital Sexual Exploitation

Victims and families impacted by nonconsensual explicit images may have several options. These can include reporting the content to a school, contacting law enforcement, asking platforms to remove images, preserving evidence, and speaking with a trauma-informed advocate or attorney.

It is important not to blame victims for what happened. A student whose image, face, or body was used without consent did not cause the harm. The responsibility belongs with the people who allegedly created, shared, saved, or encouraged the spread of the content.

Families may want to document messages, screenshots, usernames, dates, and school reports before content disappears. Snapchat and other platforms may delete or hide messages quickly, so preserving available evidence can help schools, law enforcement, or attorneys better understand what happened.

Get Legal Help From Helping Survivors

If you or your child was impacted by nonconsensual explicit images, AI-generated sexual content, or school-related sexual exploitation, Helping Survivors may be able to connect you with resources and legal support.

Helping Survivors helps victims and survivors understand their rights and legal resources after sexual abuse, assault, harassment, and digital sexual harm. 

Contact Helping Survivors today to learn more about your legal rights and options.


"*" indicates required fields

This field is for validation purposes and should be left unchanged.
