
Helping Survivors

Meta’s AI Policies Allegedly Allowed Chatbots to Flirt with Children

An internal Meta Platforms document has revealed disturbing guidelines concerning the behavior of the company’s artificial intelligence (AI) chatbots.

According to a Reuters review of the document, the policies permitted the company’s generative AI assistants, such as those used on Facebook, WhatsApp, and Instagram, to engage in behavior deemed unethical or controversial. These included allowing chatbots to engage in romantic or sensual conversations with children, generate false medical information, and help users make derogatory claims about racial groups.

Meta confirmed the authenticity of the document but stated that it has since removed portions that permitted such troubling chatbot behavior after receiving questions from Reuters.

Flirting and Sensual Conversations with Minors

One of the most concerning aspects of the document involves the standards governing chatbot interactions with minors. The guidelines stated that it was acceptable for chatbots to flirt with or engage in romantic roleplay with children. The document specifically allowed chatbots to describe a child in flattering and sensual terms, such as calling a child’s physical appearance “a work of art” or a “treasure.” Though it placed some limits on sexualized language, stating that descriptions such as “soft rounded curves invite my touch” were unacceptable for children under 13, the broader permissions raised significant concerns.

Meta spokesperson Andy Stone confirmed that these portions were erroneous and inconsistent with the company’s policies and had been removed following the inquiries. Stone also acknowledged that enforcement of these policies had been inconsistent but stressed that Meta has clear policies prohibiting sexualized content involving minors.

Sexualized and Violent Content

Meta’s document further outlined policies surrounding the creation of sexualized or violent content involving public figures. The guidelines included specific instructions on handling requests related to sexualized depictions of celebrities, such as Taylor Swift. For example, if a user requested an image of Swift in a sexually suggestive pose, the AI was instructed to reject the request or produce a less suggestive image, such as Swift holding a large fish instead of a topless depiction.

Similarly, when it came to violent content, the standards allowed the creation of images depicting violence in certain contexts, such as showing adults fighting or threats of violence, while prohibiting depictions of explicit gore or death.

Meta’s Response and Revisions

Following the exposure of these troubling policies, Meta has committed to revising its guidelines. However, the company declined to provide an updated version of the document, and it remains unclear how fully these policies have been implemented or whether Meta will address other potential ethical concerns within its AI systems.

While the company has removed portions of the document that allowed chatbot interactions with minors, questions remain about how Meta plans to ensure its AI systems abide by ethical standards moving forward.

Contact Helping Survivors Today

If you or a loved one have been affected by inappropriate AI behavior or harmful online content involving children, Helping Survivors is here to help you understand your options and connect you with the legal support you need.

Contact Helping Survivors today for expert legal guidance and representation.

Have you experienced sexual assault or abuse?
Helping Survivors can connect you with an attorney if you may have a case. While we cannot report a crime on your behalf, your safety is important. Please contact your local authorities for further assistance.

"*" indicates required fields

This field is for validation purposes and should be left unchanged.

Want To Speak With A Lawyer?

Understand your legal rights and options as a survivor of sexual assault and abuse.