White House extracts voluntary commitments from AI vendors to combat deepfake nudes

Written by
Kyle Wiggers
Published on
Sept. 12, 2024, 3:20 p.m.

The White House has announced that several major AI vendors, including OpenAI and Microsoft, have committed to taking steps to combat nonconsensual deepfakes and child sexual abuse material.

Adobe, Cohere, Microsoft, OpenAI and data provider Common Crawl said that they’ll “responsibly” source the datasets they create and use to train AI, and safeguard those datasets from image-based sexual abuse. These organizations — minus Common Crawl — also said that they’ll incorporate “feedback loops” and strategies into their development processes to guard against AI generating sexual abuse images. And Adobe, Microsoft and OpenAI (but not Cohere) said they’ll commit to removing nude images from AI training datasets “when appropriate and depending on the purpose of the model.”

It’s worth noting that the commitments are voluntary. Many AI vendors opted not to participate, Anthropic and Midjourney among them. And OpenAI’s pledges in particular are suspect, given that CEO Sam Altman said in May that the company would explore how to “responsibly” generate AI porn.

The White House nonetheless touted the commitments as a win in its broader effort to identify and reduce the harm of deepfake nudes.
