Key Highlights
- OpenAI strengthened guardrails for its video generator Sora 2 after actor Bryan Cranston raised concerns.
- The new version of the AI tool will prohibit users from replicating real people’s likenesses without explicit permission.
- SAG-AFTRA and talent agencies are working with OpenAI to ensure voice and likeness protections in Sora 2.
- OpenAI CEO Sam Altman expressed commitment to protecting performers’ rights against misappropriation of their voice and likeness.
Background on AI Video Generation
The rapid advancement of artificial intelligence (AI), particularly in video generation, has raised significant concerns among entertainment-industry professionals. AI tools like Sora 2, developed by OpenAI, can create realistic videos that replicate the likenesses of public figures and copyrighted characters without explicit consent, posing a major threat to performers’ rights and intellectual property.
New Guardrails for Sora 2
Following widespread criticism of Sora 2’s ability to generate unauthorized video content, OpenAI moved quickly to strengthen its guardrails. The company faced backlash when AI-generated videos of actor Bryan Cranston appeared on the app alongside clips of other celebrities and copyrighted characters. Cranston brought his concerns to SAG-AFTRA, the union representing more than 150,000 film and TV performers, highlighting the need for stronger protections.
Collaboration with SAG-AFTRA
In response to these concerns, OpenAI collaborated with SAG-AFTRA and several talent agencies. The joint statement released on October 16, 2025, outlined a new opt-in protocol designed to prevent unauthorized use of performers’ likenesses. OpenAI CEO Sam Altman emphasized the company’s commitment to safeguarding performers’ rights by stating, “We are deeply committed to protecting performers from the misappropriation of their voice and likeness.”
Impact on Entertainment Industry
The incident with Sora 2 highlights the broader challenges facing the entertainment industry as AI tools become more prevalent. Many professionals have expressed reservations about these technologies, fearing they could undermine artists’ control over their own work. For instance, before signing on to the statement, talent agency CAA had criticized OpenAI for exposing its clients and their intellectual property to significant risk.
Actor Bryan Cranston also underscored the importance of addressing the issue: “I brought up my concerns with Sora 2 to SAG-AFTRA after feeling deeply concerned not just for myself, but for all performers whose work and identity can be misused in this way.” His efforts led to a positive resolution, and Cranston expressed gratitude to OpenAI for its improved policies.
Future Implications
The collaboration between SAG-AFTRA, talent agencies, and OpenAI sets an important precedent for the future of AI video generation tools. As these technologies continue to evolve, it is crucial that they incorporate robust safeguards for performers’ rights and intellectual property. The NO FAKES Act, which aims to hold individuals and platforms accountable for producing or hosting unauthorized deepfakes, has gained support from industry players including OpenAI.
OpenAI’s commitment to improving its AI tools demonstrates a growing awareness of the ethical implications associated with such technologies. However, as Bryan Cranston stated, “We must ensure that we respect our personal and professional right to manage replication of our voice and likeness.” This ongoing dialogue between technology developers and industry professionals will be essential in shaping the responsible use of AI in the entertainment sector.