YouTube Launches AI Tool to Spot and Remove Deepfakes of Popular Creators
With its new likeness-detection feature, YouTube invites creators to verify who they are and who they aren’t.
YouTube has begun rolling out a tool that helps content creators hunt for copies of themselves. The AI-driven “likeness detection” system is designed to spot and report videos that use a creator’s face or voice without permission. For now, only a small group of creators in YouTube’s Partner Program have access to the tool, which is part of Google’s broader effort to address deepfakes circulating online.
As shown in a video published by YouTube, the feature requires creators to verify their identity by submitting a government ID and a short video of their own face. Once verified, YouTube’s algorithms begin scanning the platform for videos that might feature them — altered, imitated, or entirely fabricated. If a match looks suspicious, the creator can request its removal.
YouTube says the tool functions in a manner similar to its long-running Content ID system, which automatically detects copyrighted material in uploads. But unlike Content ID, this new tool focuses on protecting a creator’s visual identity — their face and other personal attributes that could be replicated by generative AI systems.
The first batch of creators received access today through email invitations, and YouTube plans to extend the rollout to others in the coming months. Early testers were warned, however, that the system is still experimental. YouTube acknowledged that some flagged videos might include the creator’s real face rather than an AI-generated one, calling this an “expected limitation” during the testing phase.
The development of the feature dates back to late 2024, when YouTube first announced plans to build AI likeness detection tools. The company began pilot testing in December with talent represented by the American talent agency Creative Artists Agency (CAA). According to YouTube, this early collaboration was meant to give high-profile creators access to technology that could help them track and manage AI-generated imitations of their likeness.
The timing is apt. Generative AI tools are now capable of producing uncannily lifelike videos. Google’s own Veo 3.1 model can render cinema-like portrait and landscape scenes, and the company has hinted at future integrations with YouTube. Meanwhile, OpenAI’s video generators Sora and Sora 2 have gained traction, showing how easy it has become to produce lifelike videos featuring real people’s faces and voices.