The Problems Lurking in Hollywood’s Historic AI Deal


Actors can rely on the right of publicity, also known as likeness rights, to protect them if a studio clearly infringes on their image. But what about a synthetic performer that displays, say, the gravitas of Denzel Washington but is not, technically, Denzel Washington? Could that be claimed as a “digital replica,” which the contract states requires consent to use? How easily will an actor be able to defend more nebulous traits? With some legal weight, a studio might argue that its AI performer is simply trained on the performances of great actors, like any budding thespian, in much the same way a large language model “digests” great works of literature to influence the writing it churns out. (Whether LLMs should be allowed to do this is a matter of ongoing debate.)

“Where does that line lie between a digital replica and a derived look-alike that’s close, but not exactly a replica?” says David Gunkel, a professor in the Department of Communications at Northern Illinois University who focuses on AI in media and entertainment. “This is something that’s going to be litigated in the future, as we see lawsuits brought by various groups, as people start testing that boundary, because it’s not well defined within the terms of the contract.”

There are more worries concerning the vagueness of some of the contract’s language. Take, for instance, the stipulation that studios do not need to seek consent “if they would be protected by the First Amendment (e.g., comment, criticism, scholarship, satire or parody, use in a docudrama, or historical or biographical work).” It’s not hard to imagine studios, if they were so inclined, bypassing consent by classifying a use as satirical and using the US Constitution as cover.

Or take the discussion around digital alterations, specifically that there is no need to seek consent for a digital replica if “the photography or sound track remains substantially as scripted, performed and/or recorded.” This could include changes to hair and wardrobe, says Glick, or notably, a gesture or facial expression. That in turn raises the question of AI’s effect on the craft of acting: Will artists and actors begin to watermark AI-free performances or push anti-AI movements, Dogme 95-style? (These worries begin to rehash older industry arguments about CGI.)

The precarity of performers makes them vulnerable. If an actor needs to pay the bills, AI consent, and possible replication, may one day be a condition of employment. Inequality between actors is also likely to deepen—those who can afford to push back on AI projects may get more protection; big-name actors who agree to be digitally recreated can “appear” in multiple projects at once.

There is a limit to what can be achieved in negotiations between guilds and studios, as actor and director Alex Winter explained in a recent article for WIRED. Much like he noted for the WGA agreement, the deal “puts a lot of trust in studios to do the right thing.” Its overriding accomplishment, he argues, is continuing the conversation between labor and capital. “It’s a step in the right direction regarding worker protection; it does shift some of the control out of the hands of the studio and into the hands of the workers who are unionized under SAG-AFTRA,” says Gunkel. “I do think, though, because it is limited to one contract for a very precise period of time, that it isn’t something we should just celebrate and be done with.”
