YouTube Disclosure Rules for AI-Generated Likeness and Synthetic Media
AI tools can create voices, faces, scenes, music, images, translations, edits, and realistic video effects faster than ever. That creates a trust problem for viewers. They need to know when something that looks or sounds real has actually been meaningfully altered or synthetically generated.
YouTube addresses this with altered or synthetic content disclosure rules. Creators must disclose content that is meaningfully altered or synthetically generated when it appears realistic. This is especially important when the content makes a real person appear to say or do something they did not say or do, alters footage of a real event or place, or creates a realistic-looking scene that did not actually happen.
The rule is not that every use of AI needs a disclosure. Using AI to brainstorm titles, write an outline, clean up audio, create captions, repair footage, or generate a non-realistic fantasy scene may not require the altered content disclosure. The rule is focused on realistic and meaningful changes that could mislead viewers about what is real.
This guide explains YouTube disclosure rules for AI-generated likeness, synthetic voices, altered footage, realistic scenes, Dream Screen, Dream Track, labels, privacy complaints, and how creators should document AI use internally.
The Short Answer
YouTube requires creators to disclose meaningfully altered or synthetically generated content when it seems realistic. You must disclose when content makes a real person appear to say or do something they did not say or do, alters footage of a real event or place, or generates a realistic-looking scene that did not actually happen.
You can disclose this during upload using the Altered content setting in YouTube Studio. If you disclose, YouTube may add a label in the expanded video description or player experience.
Content made with YouTube generative AI tools, such as Dream Screen or Dream Track where available, may be automatically disclosed by YouTube, so creators do not need to take extra disclosure steps for those tools.
What Counts as Altered or Synthetic Content?
Altered or synthetic content can be fully or partially created or changed using audio, video, or image tools. This includes AI tools, editing software, voice cloning, generative video, synthetic music, image generation, face swapping, and deepfake-style technology.
The key questions are:
- Does it look or sound realistic?
- Is the change meaningful?
- Could viewers believe the event, scene, voice, or action really happened?
- Does it involve a real person, real place, or real event?
If the answer to these questions is yes, disclosure is likely needed. The short sketch below shows one way to encode that check before publishing.
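As an illustration, here is a minimal pre-publish helper that encodes the four questions above. The class and function names are hypothetical, not part of any YouTube tool or API, and the result is a prompt for human review, not a policy determination.

```python
from dataclasses import dataclass

@dataclass
class ContentReview:
    """One reviewer's answers to the four key questions."""
    looks_or_sounds_realistic: bool
    change_is_meaningful: bool
    viewers_could_believe_it: bool   # could viewers believe it really happened?
    involves_real_subject: bool      # real person, place, or event?

def disclosure_likely_needed(review: ContentReview) -> bool:
    # Mirrors the guidance above: when the answer to the key
    # questions is yes, the Altered content disclosure is likely needed.
    return all([
        review.looks_or_sounds_realistic,
        review.change_is_meaningful,
        review.viewers_could_believe_it,
        review.involves_real_subject,
    ])
```

In practice, a yes to any single question is already a good reason to review the upload manually rather than rely on a strict all-four rule.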
When You Must Disclose
YouTube requires disclosure when altered or synthetic content appears realistic and meaningful.
Examples include:
- Making a real person appear to say something they never said.
- Cloning someone else's voice for narration or dialogue.
- Making a public figure appear to do something they did not do.
- Generating realistic footage of a real city during an event that did not happen.
- Altering footage of a real protest, disaster, crime, or news event.
- Creating a realistic scene that viewers may believe occurred.
- Generating realistic music or performance content where viewers may misunderstand the origin.
If the content could change how viewers understand reality, disclose it.
When Disclosure May Not Be Needed
YouTube gives examples of uses that generally do not require creator disclosure because they are not realistic, or not meaningful in the relevant way.
Examples include:
- Colour correction
- Lighting filters
- Background blur
- Vintage effects
- Video sharpening or upscaling
- Audio repair
- Caption creation
- Idea generation
- Using AI to create a video outline, title, thumbnail, or infographic
- Gameplay footage
- Obviously fantastical scenes, such as someone riding a unicorn
These uses may still call for ordinary honesty with your audience in some contexts, but they may not trigger the YouTube altered content setting.
AI-Generated Likeness of Real People
AI-generated likeness is one of the highest-risk areas. If you create or alter content so a real person appears to say, do, endorse, confess, perform, or experience something they did not, you need to think about disclosure, consent, privacy, defamation, impersonation, and harm.
High-risk examples include:
- A fake celebrity endorsement
- A synthetic political speech
- A cloned voice of a real person
- A fake arrest or crime confession
- A realistic fake medical or financial recommendation
- A synthetic scene involving a private individual
Disclosure may not be enough if the content is harmful, deceptive, harassing, or violates other policies.
Voice Cloning
According to YouTube's examples, cloning your own voice to create voiceovers or dubs may not require disclosure in the same way. But cloning someone else's voice to create voiceovers or dubs can require disclosure if it sounds realistic.
Consent matters. Do not clone another person's voice for commercial, political, medical, financial, or reputational content without clear permission and review.
Even when disclosure is present, a realistic cloned voice can mislead viewers if the video packaging is unclear.
AI Music and Synthetic Performance
YouTube's examples list synthetically generated music as content that may require disclosure. Synthetic music can also raise separate rights, likeness, monetisation, and Content ID questions.
If viewers may believe a real artist, singer, public figure, or musician performed something they did not perform, disclosure is important and other policies may apply.
Do not rely only on the altered content checkbox if the content could also be impersonation, copyright infringement, or misleading metadata.
How to Disclose in YouTube Studio
During upload, YouTube Studio includes an Altered content setting for creators using a computer or mobile device.
The basic process is:
- Upload your content in YouTube Studio.
- Go to the Details section.
- Find Altered content.
- Select Yes if the content meets the disclosure requirements.
- Continue the rest of the upload process.
If you select Yes, YouTube can add a disclosure label to help viewers understand how the content was made.
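For teams that upload programmatically rather than through the Studio interface, the same disclosure can be set through the YouTube Data API. A minimal sketch follows, assuming an already authorised API client and that the videos resource still exposes the status.containsSyntheticMedia flag; verify the field name against the current API reference before depending on it.

```python
# Requires: pip install google-api-python-client
from googleapiclient.discovery import build

def disclose_altered_content(youtube, video_id: str) -> dict:
    """Mark an existing upload as containing realistic altered or synthetic media."""
    # status.containsSyntheticMedia is the Data API counterpart of the
    # Altered content setting in YouTube Studio (check the current API
    # reference, as field names can change).
    return youtube.videos().update(
        part="status",
        body={
            "id": video_id,
            "status": {"containsSyntheticMedia": True},
        },
    ).execute()

# Usage (creds is an authorised OAuth2 credentials object):
# youtube = build("youtube", "v3", credentials=creds)
# disclose_altered_content(youtube, "VIDEO_ID")
```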
What Happens After You Disclose
When you disclose altered or synthetic content, YouTube can add information in the expanded video description or player area. YouTube also has a How this content was made section that may explain whether content was meaningfully altered or synthetically generated.
In some cases, YouTube may proactively add a label if content is undisclosed and there is risk of harm to viewers. Creators may not be able to remove some proactive labels.
This means hiding disclosure is not a safe strategy.
YouTube AI Tools and Automatic Disclosure
If a creator makes a post or Short using YouTube generative AI tools such as Dream Screen or Dream Track where available, YouTube says creators do not need to take extra steps to disclose. The tool automatically discloses the use of AI for creators.
For other AI tools, creators need to disclose during the upload flow when the content meets the altered or synthetic disclosure requirements.
Do not assume every AI tool is handled automatically. Automatic disclosure applies to YouTube's own tools, not to every third-party app.
Privacy Removal for Synthetic Likeness
If someone uses AI or synthetic tools to create content that realistically looks or sounds like another person, that person may be able to ask YouTube to remove it under the privacy complaint process.
This matters for creators using real people in synthetic content. A person may object even if the video was intended as parody, commentary, or entertainment.
Use consent, context, and clear labelling before publishing realistic synthetic likeness.
Disclosure Does Not Make Everything Allowed
Disclosing altered content does not automatically make it safe or allowed. Other YouTube policies still apply.
Synthetic media can still violate rules around:
- Impersonation
- Harassment
- Misinformation
- Medical misinformation
- Scams
- Privacy
- Copyright
- Sexual content
- Child safety
Disclosure is a transparency requirement. It is not a permission slip.
Business and Agency Workflow
Businesses and agencies should document AI use before upload.
Checklist:
- Which AI tools were used?
- Was any real person likeness used?
- Was any real voice cloned?
- Was a real event or place altered?
- Could viewers think the scene really happened?
- Is consent documented?
- Is disclosure needed in YouTube Studio?
- Is additional on-screen or description disclosure needed?
- Could the content trigger policy, legal, or trust issues?
Do not let AI disclosure decisions happen casually at the final upload screen.
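One way to avoid that is to capture the checklist as a structured record in the production pipeline, completed before the upload step. The sketch below is illustrative; the record type and field names are hypothetical, and the flags map directly to the questions above.

```python
from dataclasses import dataclass, field

@dataclass
class AIUseRecord:
    """Internal documentation of AI use for one video."""
    video_title: str
    ai_tools_used: list[str] = field(default_factory=list)
    real_likeness_used: bool = False
    voice_cloned: bool = False
    real_event_or_place_altered: bool = False
    realistic_scene_generated: bool = False
    consent_documented: bool = False

    def needs_studio_disclosure(self) -> bool:
        # Flag uploads that should get the Altered content setting.
        return any([
            self.real_likeness_used,
            self.voice_cloned,
            self.real_event_or_place_altered,
            self.realistic_scene_generated,
        ])

    def open_issues(self) -> list[str]:
        # Items to resolve before upload, not at the final screen.
        issues = []
        if (self.real_likeness_used or self.voice_cloned) and not self.consent_documented:
            issues.append("Likeness or voice used without documented consent")
        if self.needs_studio_disclosure():
            issues.append("Set Altered content to Yes in YouTube Studio")
        return issues
```

A record like this makes the disclosure decision reviewable: the person who edited the video answers the questions, and the person who uploads it clears the open issues first.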
Common Mistakes to Avoid
Avoid these mistakes:
- Thinking every AI use needs the altered content setting.
- Thinking no AI use needs disclosure.
- Cloning someone else's voice without consent.
- Creating realistic fake public events without disclosure.
- Using fake likeness in medical, financial, or political advice.
- Assuming a disclosure label prevents all policy risk.
- Forgetting third-party AI tools are not automatically disclosed by YouTube.
- Failing to document AI use for client work.
FAQ
When do I need to disclose AI-generated content on YouTube?
You need to disclose meaningfully altered or synthetic content when it appears realistic, especially when it involves a real person, place, or event, or a realistic scene that did not happen.
Do I need to disclose AI used for scripts or titles?
Usually not under the altered content setting if it was only production assistance, such as outlines, scripts, thumbnails, titles, or captions.
Do YouTube AI tools disclose automatically?
YouTube says Shorts or posts made with YouTube generative AI tools such as Dream Screen or Dream Track can be automatically disclosed.
Does disclosure make deepfakes allowed?
No. Other policies around privacy, impersonation, harassment, misinformation, scams, and copyright still apply.
Can someone request removal of AI content that looks like them?
Yes. YouTube allows privacy complaints for realistic altered or synthetic likeness that looks or sounds like a person.
Final Thoughts
YouTube AI disclosure rules are about viewer trust. If content looks or sounds realistic and has been meaningfully altered or generated, viewers need to know.
Use the Altered content setting when required, especially for realistic synthetic likeness, cloned voices, altered real events, and realistic scenes that did not happen. But remember that disclosure is only one layer. Consent, safety, policy compliance, and viewer trust still matter.
The safest creator habit is simple: if viewers could mistake synthetic media for reality, disclose it clearly and think twice before publishing.