AI explained: AI in film and television
Continuing our new series on artificial intelligence, Christian Simonds and Henry Birkbeck discuss the use of AI in film and television. AI features in every stage of production – from pre-production, through production, to post-production – and reliance on AI will continue to increase as it evolves. The discussion centers around the legalities that management in the industry should be aware of, as well as the recurring questions and issues raised by clients in both the UK and U.S.
Transcript:
Intro: Hello, and welcome to Tech Law Talks, a podcast brought to you by Reed Smith's Emerging Technologies group. In each episode of this podcast, we will discuss cutting edge issues on technology, data, and the law. We will provide practical observations on a wide variety of technology and data topics to give you quick and actionable tips to address the issues you are dealing with every day.
Henry: Welcome to our new series on AI. Over the coming months, we will explore the key challenges and opportunities within the rapidly evolving AI landscape. Today, we will focus on AI in film and TV. My name is Henry Birkbeck. I'm a senior associate in the London office at Reed Smith, and I'm speaking today with Christian Simonds, who is a partner in the New York office. Christian and I have previously written an article on AI in the film and TV space, and there's probably quite a lot to mention and not a huge amount of time to cover it. But I guess I'll start maybe Christian just by asking, you know, have you seen recurring themes coming up with clients asking about AI in the film and TV space so far?
Christian: Yeah, in terms of the film and TV industry as a whole, it's always been at the forefront of technology, particularly in terms of how to properly utilize it, not only from a budgetary perspective, but also from a creative perspective, right? And obviously, AI has been such a hot topic, particularly with respect to the guilds during the strikes of 2023. So there is a lot to unpack in terms of how it's being integrated into film and TV production. I think the general consensus is that about two-thirds of every budget for an AV project in the film and TV space is made up of labor, right? And particularly now, given where the economy is, there's been heightened scrutiny of each line item in a particular budget. As a result, that's driven the need for, or reliance on, AI as a potential solution to mitigating certain costs, labor costs in particular. And again, I know it's not ideal from an individual employment perspective, but from an overall budget perspective, it is something that I see the studios, and production companies on the independent level, embracing as it relates to trying to drive down the costs of a particular budget. AI plays into each stage of production: development, pre-production, production, and post. And when you're navigating that as it relates to the legalities of how it's used, there are certain issues that come into play vis-a-vis what the guilds agreed to in their most recent ratification at the end of the strikes, and how to ensure that you're adhering to those requirements, or at least aware of what they are. In addition to that, there are copyright considerations and other considerations in terms of how AI is used in the development stage.
So there's a lot to unpack within the lifespan of a production as it relates to AI and how it's being used. And the reality is it's going to be used, and it's going to continue to evolve and be relied on to a greater degree, particularly with respect to certain elements of the production process: on the VFX side in particular, and certainly in the development stage and the editing stage. There are certain things where it really has tremendous value from a timing perspective, cutting down production timelines, and other elements that I think will benefit the production as a whole. But yeah, in terms of the legalities, there's a lot to unpack there, and I'm happy to touch on each of those in the time that we have.
Henry: Well, I think the first one, which you did touch on, obviously, is the guild strikes of 2023. Clients and those people in the industry in the US were probably a lot closer to it, but for the benefit of those of us who weren't in the US, do you want to give a very quick recap of where they ended up?
Christian: Absolutely. Yeah. In terms of the Screen Actors Guild, SAG, there are really four main components to how they regulate artificial intelligence. So you've got your employment-based digital replica, which is basically a replication of a performer's employment or participation in the production. It's literally taking the performer himself and portraying him in a scene that he didn't actually shoot, or a performance that he didn't actually shoot. A good example is if you see someone in a movie talking to themselves: that second digital replication is an employment-based digital replica. Right. What does that mean in terms of what you need to do vis-a-vis SAG? It means you need to get mandatory consent from the performers when you're going to do that. So it needs to be disclosed at the outset, with consent coming either from the performer himself or, if the performer is no longer around, from an authorized representative of his estate or the union itself. The contracts need to be reasonably specific as to the description of how it's going to be used. And if you're going to use it outside of the project, i.e. beyond a one-picture license, you're going to need additional consent to do that. And the performers will need to be compensated for that digital creation as though it was themselves, right? So you need to take into consideration that residuals will obviously need to be paid on whatever amounts are paid in connection with that. And then you've got the independently created digital replica, which is basically a digital replica created using materials that the performer has provided, used in a scene that he didn't actually shoot. So you're not actually using a previous performance in the movie itself, but you're literally digitally replicating the performer in a scene that he didn't otherwise participate in.
So again, you need consent from the performer when you do that. You need to be reasonably specific in terms of how you're describing that use in the contract. Again, obviously, he's entitled to compensation for that use, and that compensation is subject to payment of fringes.
Henry: Has this been met positively, I guess, in the industry since? Because I know that in 2023 there was a big reputational issue around AI, and there was a lot of speculation about whether these protections of performers' rights were being overblown a little bit. Or do you think people are comfortable with where they landed?
Christian: I think with respect to those two elements, yes, right? I think the general consensus is that they do adequately protect the actors when you're exploring those types of digital replica usages. I think the real wildcard here, or the area of real concern, is around generative AI, right? So basically, taking data, materials, prior performances, facial expressions, an actor's image and likeness, and actually creating new content from that material. I think that's really where we start to enter into an area where people aren't necessarily comfortable, particularly on the talent side. And again, the guild is clear that if you're going to do that, you need to get consent from the actor, right? So if you're going to use his image or likeness or prior performances to feed a gen AI tool and create a new piece of content from those materials, you're going to need consent from that actor. You've got to notify the union when you're going to do such a thing, and the compensation around that usage is usually specifically negotiated with the talent themselves. And you basically need to continue to keep that talent informed about how those materials are being used vis-a-vis the gen AI. So that really is a touchy area. I mean, I think a lot of people are familiar with what happened with Scarlett Johansson, when she basically said that her voice was being utilized for purposes of ChatGPT. They claimed that they had brought in an actress whose voice sounds similar to hers, but that it wasn't her voice. So it just shows you the heightened sensitivity on the talent side in terms of how their image, likeness, and voice are being used within the gen AI space. So yeah, there are a lot of sensitivities around that usage.
Henry: Okay. So it's interesting that there's an ongoing obligation as well, which I guess makes sense, but it's a burden on the production company. And the other question I had relating to guilds was about the Writers Guild. I think the other thing that seemed to make the headlines internationally was about how, in particular, large language models and generative AI tools can be used for screenwriting, and at what point. There's always a question with AI around copyright and ownership. And I was wondering if you could speak a little on where they landed in terms of who, if anyone, can own a script that has been written by an AI tool.
Christian: Within the US, the copyright law is clear in the sense that anything that's gen AI created is not copyrightable, right? And if you are utilizing elements of gen AI created materials in your copyrightable material, you have to disclose what those materials are, and those materials will not be protected within the overall piece of IP. So you could copyright the script, but if certain elements of it were pulled from gen AI sources, those elements of your script will not be protectable. The copyright law is fairly clear on that. In terms of the WGA and how they're trying to protect their writers from being replaced by AI, they're saying, hey, with respect to signatory companies, you have to be clear that gen AI produced material will not be considered literary material under the WGA, right? So what does that mean? It means you can't give an AI generated screenplay to a writer and say, hey, go rewrite this, with the result that the writer potentially won't be entitled to separated rights because he's writing something based on underlying materials. The WGA says absolutely not. We're basically going to exclude that gen AI created screenplay or treatment or bible from the separated rights discussion and say, hey, the writer who contributes those first writing services, for purposes of taking that gen AI material and either polishing or rewriting it or developing it further into the screenplay, will be considered the first step of the writing process. And that writer will be the writer entitled to the writing credits around the material moving forward.
So yeah, it's fine for studios to provide gen AI materials for purposes of writers developing a screenplay, or to assist them with their writing services. And again, if they're going to do that, they need to fully disclose it to the writer when they're doing it. But those materials will basically be excluded from the chain when the WGA is considering what makes up the overall literary material for purposes of writing credits, residuals, and so forth. And I think it's a good place to land for writers. It doesn't necessarily solve the AI issue as a whole, because it's still going to be utilized for purposes of coming up with ideas, potentially on the studio side, because it's something they can do quickly. And it also has added benefits: it can analyze a screenplay or an idea and say, here's the likelihood of the success of this idea or screenplay based on the track record of screenplays like it in the past. Or, hey, I think the second chapter of this story should be changed this way because it's going to be better received because of X, Y, or Z, right? I think that tool will be beneficial. So it's not totally carving out the usage of AI as it relates to the development of literary material. And I think it could be utilized in a positive way, which is great. I just don't think it will ever be able to fully remove human writers from the process. I think the WGA did a sufficient job ensuring that.
Henry: Yeah. It's interesting, because obviously we're talking mostly about the US here, but I think most other markets in the film and TV industry were watching very closely what happened in 2023. And certainly in the UK, there's largely been a following suit. We haven't had the same kind of high-profile developments, but PACT, which is the Producers Alliance in the UK, has since issued guidance on the use of AI in film and TV productions. They stop short of taking a hard stance on anything, but they do talk about being very mindful and aware of protecting the rights of all the various people who might be involved, and of how to integrate AI into production. So I think the US position is really the market leader for this. And there's a slight nuance in that the copyright law is slightly different in the US and the UK, and in how AI relates to that, and in lots of other jurisdictions, of course, there are implications there. But I think so far, the UK market seems to be broadly following what's happening in the States.
Christian: It's interesting, too, because the states here are starting to take a position on AI. At least in New York, there are a few bills being considered currently. There are three bills, one of which I think probably has a likelihood of getting passed, which deals with contracts around the creation of digital replicas. And it kind of tracks what SAG has already said. Basically, any contract between an individual and an entity or individual for the performance of personal or professional services, as it relates to a new performance by digital replication, is contrary to public policy and will be deemed null and void unless it satisfies three conditions, one of which is a reasonably specific description of the intended use of the digital replica. But it adds an interesting element, which is that the person on the other side, whose performance is potentially being digitally replicated, needs to have been represented by counsel or be a member of a guild, which is interesting, right? So it adds an extra level of protection. And again, this is going to be state specific, so it'll be interesting to see how this impacts other states, or what other states are potentially considering. And then there are two other bills currently in play. One is on the advertising side, in connection with disclosing synthetic media as it relates to advertising. But the other one that's interesting, just from a film financing perspective, is that they're taking the position that productions that spend money on AI digital replication or AI usage might not qualify for the New York tax rebate, which is very interesting.
Henry: Oh, really? Wow.
Christian: I think that one won't necessarily pass. Certainly the first one will, because it's already in line with what SAG has said. But yeah, that last one, I think, will probably get shelved. It's just an interesting one to consider.
Henry: Yeah, and I guess both of them show that there is this protective element being added in. It's almost like holding these AI tools slightly at a distance to stop them becoming a source of, I don't want to say evil in the industry, but from going too far and overstepping the mark.
Christian: Absolutely. And it's interesting too, because obviously you've got your defined protections within SAG and copyright law, and now what's being considered by legislation. But the reality is that a lot of this is being driven by public policy, in terms of the public's rejection of AI to a degree, right? I think people are generally scared of it, so the knee-jerk reaction is to say no, let's continue to promote the employment of real people, right?
Henry: Yeah.
Christian: And I think this also plays into how AI is used in films. Again, I think it's going to evolve to a point where it'll be tough to distinguish between AI and real, right? But a good example is The Irishman, or some other films that have used significant AI to basically de-age people. People see it and say, that looks ridiculous, right? I think there's a general knee-jerk reaction against doing it. And whether that changes over time, based on how the AI technology evolves, particularly from a visual perspective, is TBD. But yeah, I think a lot of it is driven by public perception of AI, right?
Henry: Yeah, yeah. And it is interesting what you were saying before, and we've seen this in the UK as well. Initially it was, okay, how is this going to affect producers, and there are efficiencies there. But actually, we're seeing studios and commissioners really embrace it and look for ways to cut the cost of production as well. Like you say, it's just going to touch basically every aspect of film and TV production and streamline things.
Christian: Yeah, I think there are real positives in terms of how it can be integrated into the production process. I know we touched on a few, but also dubbing, right? Localization of content, basically extending the reach of a piece of AV IP. There's always been a visceral reaction to seeing something that is dubbed really poorly, but if it can evolve to a point where it's almost seamless, it may have a better impact in terms of the reach of content. Again, there are different schools of thought as to whether investing in that makes sense in certain places, or whether it's just easier to dub it and release it, and how much of an impact it's actually having. But I do think that in certain jurisdictions or certain areas, seamless localization could add value to the reach of a particular piece of AV IP.
Henry: Yeah, yeah, absolutely. And it could move things geographically as well. We see this a lot in the UK, where the cost of post-production is a lot cheaper in countries close to the UK. As a result, you quite often see productions where 80% of the money will be spent in the UK, and the final 20%, which often doesn't qualify for the tax credit over here anyway, gets outsourced to somewhere in Europe, where it is much cheaper to have the post-production carried out remotely, or even to Canada, or somewhere with a better incentive package. And if you could streamline that whole process, and you've got an AI tool that can do all the grading and the cutting and these previously quite labor-intensive activities, then there's no reason why you couldn't bring a lot of that production cost back onshore and reduce it. So it may shift where people are located in the market as well.
Christian: Yeah, and it's not necessarily solely an issue for scripted projects. I think in the US it's also carried into the doc space, right? The APA in the US, which is the Archival Producers Alliance, an alliance of documentarians, has basically said, hey, just be careful how you use AI in this space, even though it's not necessarily governed by a guild. The same goes for how it's used in the non-scripted space, which doesn't really have guild coverage depending on what it is. They're at least saying, be careful how you use it, because what we're doing, particularly in terms of how we're depicting history, historical events or things that have happened, you don't want AI to change it to a point where you're basically changing the perception of history, or generating things with AI that are oblivious to the realities of what actually happened. That might have a negative effect. It's more of an ethical line that documentarians are trying to be mindful of as AI is integrated into that space as well. Because look, when you don't have primary source material to use to depict something in a documentary, yeah, it may be easy to recreate it using AI. But at the same time, that may have implications you might not have thought about, namely: how is this telling a story that might not actually be accurate to the underlying facts?
Henry: I mean, I think that's a pretty good starting point. Obviously, these are issues and discussion points that have a lot of depth to them and have already been discussed a lot in the industry internationally. And we're now at a point where it's, let's wait and see how this stuff shakes out in practice. And certainly, as we've discussed, the US has landed in a lot of areas, so there is now a bit of a direction. It'll be interesting to see how things unfold. I know that, for the most part, the production industry in the UK follows what's happening in the States, and in respect of international productions, they have to be aligned to a degree. So it will be really interesting to see how this develops in the coming months and years.
Christian: Yeah, for sure. And like we said at the beginning, there is no question that it is a part of the filmmaking process and will continue to be so to an ever-increasing degree. So it's unavoidable. And I truly think it could be utilized in a way that's beneficial to filmmaking, from a budgetary perspective, but also: if you can reallocate a bunch of the spend from what were otherwise labor-intensive, time-consuming elements of production, and reallocate it to talent or other elements of the process, you can compensate certain individuals in a way that they weren't before. I think that can be beneficial as well. And I think there will be a point where it'll really help independent filmmaking in particular, right? Because budgetary constraints are always paramount in that space. And if you can do things that previously cost a ton of money for cents on the dollar using AI, then moving forward, to the extent it evolves quality-wise, I think you'll see an uptick in really high-quality independent filmmaking. Again, in my opinion there's never going to be a universe that totally circumvents the utilization of real people, but AI can certainly be utilized in a beneficial way to help the process.
Henry: Great. Thanks very much, Christian. And thanks everyone for listening. Tune in for the next episode on our series on AI.
Outro: Tech Law Talks is a Reed Smith production. Our producers are Ali McCardell and Shannon Ryan. For more information about Reed Smith’s Emerging Technologies practice, please email techlawtalks@reedsmith.com. You can find our podcasts on Spotify, Apple Podcasts, Google Podcasts, reedsmith.com, and our social media accounts.
Disclaimer: This podcast is provided for educational purposes. It does not constitute legal advice and is not intended to establish an attorney-client relationship, nor is it intended to suggest or establish standards of care applicable to particular lawyers in any given situation. Prior results do not guarantee a similar outcome. Any views, opinions, or comments made by any external guest speaker are not to be attributed to Reed Smith LLP or its individual lawyers.
All rights reserved.
Transcript is auto-generated.