UNICEF Warns Over 1 Million Children Exposed to AI Deepfakes

The UN agency urges governments to expand child pornography laws to cover AI-generated content, following reports of widespread sexualized deepfake images affecting children globally

Over 1.2 million children worldwide reported seeing manipulated sexualized images of themselves in the past year, according to a recent Interpol study across 11 countries. The UN Children’s Fund (UNICEF) is now calling for urgent action to protect children from AI-generated sexual abuse images, known as deepfakes.

The warning comes in the wake of the scandal surrounding Grok, an AI chatbot on Elon Musk's platform X that circulated semi-nude and sexually explicit images of children and women without consent. These deepfakes, hyper-realistic images and videos created with artificial intelligence, pose a growing threat to minors online.

UNICEF has urged governments to update the legal definition of child pornography to include AI-generated content. “The harm from the abuse of deepfakes is real and urgent. Children cannot wait for the legal gap to close,” the organization stated.

The agency also called on AI companies to build safeguards into their products and on social media platforms to strengthen content monitoring with automated detection tools. On Sunday, the United Kingdom became the first country to announce plans to criminalize the creation of synthetic child abuse images.

The Grok scandal prompted an investigation by the European Commission, and French authorities raided the platform's Paris offices under a judicial order. Although xAI, Elon Musk's company behind Grok and X, pledged to halt the generation of images of real people where prohibited, recent Reuters reporting revealed that the chatbot still produces non-consensual nude images.

UNICEF emphasizes that children remain highly vulnerable: in some countries, one in 25 minors, roughly one student per classroom, has been affected by sexualized deepfake content.
