Using and Authenticating Celebrity AI

Note: This article relates to the December 2024 VIP+ special report “Generative AI: Deepfakes & Digital Replicas,” available to subscribers only.

AI-powered digital replicas are becoming available for talent to exploit for creative and commercial use.

Not all talent will. Even with the protections established under SAG-AFTRA, some actors are wholly rejecting the use of AI, stipulating “no AI” in contracts. Yet some transactions are beginning to happen, and more are expected. One VIP+ source anticipated talent digital replicas would at some point become “ubiquitous,” while another felt most talent would have digital replicas within the next decade.

Used correctly, with consent and compensation, AI conceivably scales opportunities for talent beyond ordinary limitations. As some argue, AI versions of talent allow a celebrity to be in many places at once or perform “impossible jobs,” such as personalized interactions with fans at scale through a chatbot, all without requiring the talent to physically do the work.

Use case ideation is still early, as is thinking about how talent should be compensated and how particular uses of a talent digital replica should be valued. But the potential is broad. Projects could originate from film, TV, gaming or animation studios, sports leagues and major or minor brands, but they could also be initiated by AI companies or by talent themselves. New uses will emerge as talent reckon with their own options and interests amid evolving tech capabilities.

As talent digital replicas begin to be utilized, entertainment industries across film, TV, gaming, music and more will need to build or integrate technical mechanisms into workflows to ensure that use provides the so-called 4 C’s for talent: consent, control, credit and compensation.

Development of solutions that meet these requirements is still at an early stage. However, there is general agreement that a standards-based approach with industrywide and cross-industry adoption will be needed, even as proprietary services are developed and used.

“It’s very important that individuals are able to identify where their name, image, likeness has been utilized. The first problem we need to solve is who owns the rights to the name, image and likeness, who has the right to actually utilize it and give permission for its use. If someone creates a scanned image of themselves, they should own and control that,” Renard Jenkins, president of the Society of Motion Picture and Television Engineers, told VIP+.

“Then we need to create a pathway to traceability, to be able to track and trace what’s happening to an asset you own from the point it’s created, to have the ability to audit every action and see who’s utilizing those assets,” he continued. “And then how do we provide a way for improper use to be swiftly taken down or payment requested for its usage? All of that is going to take a much larger effort across the industry, with everyone saying this is something important they want to work on together.”

From project origination to content distribution, several technical components are starting to come together to allow talent to make, own, control and monetize their digital replica for authorized, employment-based use and content creation. How exactly different technologies and providers should be assembled is now being carefully considered.

Processes that can enable authenticated employment-based use of talent digital replicas would need to include methods for the following:

  1. Consent: Ensuring talent has a way to approve or decline the creation and any use of their digital replica asset
  2. Data Capture: Creating a digital asset of talent likeness
  3. Data Storage and Management: Securely housing, transferring and/or tracking digital replica assets
  4. Content Creation: Using talent data to create content or experiences
  5. Provenance: Applying hard-to-break mechanisms to enable real-time traceability and verification of name, image, likeness and voice assets and derivative content for their entire lifecycle, such as with embedded watermarks, cryptographic metadata, hashing or blockchain records (an illustrative sketch of one such mechanism follows this list)
  6. Compensation: Establishing payment models and triggers to ensure talent is fairly paid for the use of their digital replicas, including residuals for ongoing use or AI training

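For the provenance step, one common building block is a content hash bound to cryptographically signed metadata. The short Python sketch below is illustrative only: the field names (talent_id, consent_record_id) are hypothetical, and a standard-library HMAC stands in for whatever signature scheme or industry standard (such as C2PA content credentials) a production workflow would actually adopt.

```python
# Illustrative provenance sketch. Field names and the HMAC-based "signature"
# are assumptions, not an industry standard; they show the general idea of
# binding an asset's hash and a consent reference into tamper-evident metadata.
import hashlib
import hmac
import json
from datetime import datetime, timezone
from pathlib import Path


def hash_asset(path: Path) -> str:
    """Compute a SHA-256 fingerprint of a digital replica asset file."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def build_manifest(asset_path: Path, talent_id: str, consent_record_id: str) -> dict:
    """Tie the asset hash to the talent and the consent record that authorized it."""
    return {
        "asset_sha256": hash_asset(asset_path),
        "talent_id": talent_id,                  # hypothetical identifier
        "consent_record_id": consent_record_id,  # hypothetical consent-ledger entry
        "created_utc": datetime.now(timezone.utc).isoformat(),
    }


def sign_manifest(manifest: dict, secret_key: bytes) -> str:
    """Sign the manifest so later edits to it (or to the hash) are detectable.

    A production system would more likely use public-key signatures or a
    standard such as C2PA; HMAC keeps this sketch dependency-free.
    """
    payload = json.dumps(manifest, sort_keys=True).encode()
    return hmac.new(secret_key, payload, hashlib.sha256).hexdigest()


def verify_manifest(manifest: dict, signature: str, secret_key: bytes) -> bool:
    """Recompute the signature and compare in constant time."""
    return hmac.compare_digest(sign_manifest(manifest, secret_key), signature)
```

In such a scheme, the signed manifest would travel with the asset (or be anchored to a ledger or blockchain record) so any downstream user could verify that the replica being used matches the asset the talent actually consented to.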
Get more critical data and analysis in VIP+’s subscriber report …
