YouTube will soon require creators to disclose AI-altered videos.



Key Points:

  • YouTube is rolling out a new requirement: content creators must disclose when their videos contain AI-generated content.
  • The disclosure will be used to power an “altered content” warning on videos.

YouTube is rolling out a new requirement for content creators: You must disclose when you’re using AI-generated content in your videos. The disclosure appears in the video upload UI and will be used to power an “altered content” warning on videos. Google previewed the “misleading AI content” policy in November, but the questionnaire is now going live. Google is mostly concerned with altered depictions of real people or events, which sounds like yet another election-season worry about how AI can mislead people.

Google gives examples of when a disclosure is necessary, and the new video upload questionnaire walks content creators through these requirements. The labels will start rolling out “across all YouTube surfaces and formats in the weeks ahead, beginning with the YouTube app on your phone, and soon on your desktop and TV.” The company says it’s also working on a process for people who are the subject of an AI-manipulated video to request its removal, but it hasn’t shared details on that process yet.