Brands and agencies, take note: Whether or not your commercials are subject to SAG-AFTRA, with these new New York bills, additional disclosures and consents are now required when using synthetic performers and digital replicas in ads.
A Double Feature
On December 11, New York Gov. Kathy Hochul performed a double feature when she signed into law S.8420-A/A.8887-B and S.8391/A.8882, “first-in-the-nation legislation” designed to (a) enhance transparency with respect to the use of artificial intelligence (AI) in advertising and (b) increase individual protections for those in front of, and behind, the cameras.
Behind the Scenes: What These Bills Mean
S.8420-A/A.8887-B: No more guessing whether that advertisement popping up on your social media feed contains a real human or one that is machine-made. New York wants audiences to know when avatars created by generative AI are in the mix. Accordingly, S.8420-A/A.8887-B amends New York’s general business law to require advertisers to include clear and conspicuous disclosures when an ad contains an AI-generated synthetic performer (an asset digitally created using generative AI that appears as a real person but is not based on or recognizable as an identified natural performer). These disclosure obligations do not apply to advertisements or promotional material for expressive works, such as films or television, provided that the use of a synthetic performer in the ads is consistent with its use in the expressive work.
S.8391/A.8882: Earlier this year, we wrote about New York State’s Fashion Workers Act, which requires advertisers to obtain written consent from a model prior to the creation or use of the model’s digital replica. Well, S.8391/A.8882 extends these consent rights to the estates of deceased performers, providing additional safeguards against the unauthorized use of an individual’s name, image, or likeness. Specifically, the bill updates New York’s post-mortem right of publicity law (NY CLS Civ R §50-f) to prohibit the commercial use of a vocal and/or visual digital replica of a deceased individual without obtaining prior written consent from the individual’s heirs or executors. Digital replicas are defined as “computer-generated, highly realistic, electronic performances” that are “readily identifiable as the voice or visual likeness of an individual” but in which the individual did not actually perform, or did perform but the fundamental character of the performance or appearance has been materially (and digitally) altered.
In an era where deepfakes are more common than sequels, these laws are intended to balance innovation with integrity, ensuring that technology enhances creativity while simultaneously protecting personal and consumer rights.
The Closing Credits
- Introduce Your Avatars: Disclose when using AI-generated synthetic performers in your ads.
- Consent: Get clear consent from performers (or their estates when recycling historic ads) if you plan to use them in, or as inspiration for, a digital replica. Don’t bury this in your forms. The consent should be clear and detailed as to the term and purpose of the proposed use.
- Historic Ads and Post-Mortem Rights: The consent requirement applies post-mortem. If a performer is now deceased, you need consent from their estate before running an ad that uses that performer’s digital replica.
- Penalties: The bills are effective immediately and impose civil penalties of $1,000 for the first violation and $5,000 for each subsequent violation.
- Production Guidelines: To avoid disruptions to current campaigns or impacts on future promotions, brands and agencies should build these requirements into their workflows: obtain consent for the use of digital replicas and include disclosures when ads contain AI-generated synthetic performers. Add these points to your talent and production guidelines for work that features New York-domiciled performers or that will be disseminated in New York. Other states may follow suit in the coming years.
Post-Credit Scene for the Kids
The Federal Trade Commission and state regulators are very focused on how AI-generated content impacts children. We recommend extra caution when (a) using children as performers or models for digital replicas and (b) showcasing these AI-generated synthetic performers in children’s advertising.