What are the legal risks of using generative AI in games development?

Andrew Velzen of law firm MBHB explores the potential issues that can arise regarding IP law, privacy law and more

Generative artificial intelligence, or GenAI, presents many opportunities in the gaming industry.

Many of this year's biggest developer events, including GDC, have been awash with companies touting the use of GenAI to create dozens of maps/levels, improve development workflows, perform QA tasks, or even respond directly to in-game actions from a player. Additionally, big industry players are actively developing hardware to support the use of GenAI, such as NVIDIA's unveiling of the SUPER series of GPUs at the start of the year.

Given all this, it is easy to imagine a not-so-distant future where GenAI plays a substantial role in the development and/or gameplay of most games. Among all the excitement, though, there are also legal risks posed when using GenAI in gaming. Publishers will need to address these stumbling blocks before fully integrating GenAI.

The potential legal risks are wide-ranging across many different areas of the law. Intellectual property law (e.g., trademarks, copyrights and patents), privacy law, and tort law/contract law, among others, could all be implicated when using GenAI.

Some of these legal risks relate to the use of GenAI purely in the development of a game before it is released into the wild (what I will call 'DevAI' herein), while others relate to the use of GenAI during gameplay (what I will call 'LiveAI' herein), and many are present in both.

Intellectual Property Law

Perhaps the biggest elephant in the room, and the one most people are talking about (both inside and outside of gaming), is how GenAI intertwines with IP law. There are clearly many important questions to tackle, such as: are there any IP issues if the model was trained on protected content? What if the model outputs content that is, itself, protected IP? Can you obtain IP protection on the output of your GenAI model? Let's dig a little deeper.

Avoiding IP Infringement based on AI Inputs and Outputs

The discourse around IP infringement came up earlier this year as a central part of the Palworld debate (i.e., discussions about whether Palworld does or does not infringe any copyrights owned by Game Freak, The Pokémon Company or Nintendo).

For both DevAI and LiveAI, one salient question is whether the use of a GenAI model, which was trained on copyrighted content, inherently constitutes copyright infringement that can give rise to legal liability.

This question has percolated up to the federal courts recently outside of the context of gaming. Perhaps the case that best epitomizes this is the lawsuit lodged by The New York Times against Microsoft and OpenAI over the use of copyrighted Times' articles to train ChatGPT's large language models (LLMs).

Microsoft has argued that using the Times' articles is "fair use," a doctrine that, for broader policy reasons, permits acts that would otherwise constitute copyright infringement without giving rise to liability. Whether fair use applies here will ultimately be up to the court. This decision (and those like it in other related cases) will go a long way towards setting a precedent for how the training of GenAI models is viewed in terms of potential copyright infringement.

While the issue above relates to the input of GenAI models (e.g., how they were trained), another potential IP issue arises as it relates to the output of GenAI models. Namely, what if the output of the model is, itself, protected by one or more copyrights, trademarks or patents?

For example, if you used DevAI to produce a town level and the town had the Coca-Cola logo plastered on various signs, this could potentially constitute trademark infringement. Likewise, if you use DevAI to generate a new 3D character model for your game, and the character is identical to Mario, Nintendo would have a plausible cause of action for copyright infringement.

The risk of a DevAI outputting protected content (or content that is only negligibly different from protected content) is substantially increased when the DevAI was trained on the protected content itself. For example, if you set out to make a JRPG using DevAI, and you train the DevAI on only Square Enix games, it would not be overly surprising if, when prompted to output a male protagonist with blonde hair and a sword, the DevAI spits out Cloud Strife, or something eerily similar.

There are ways to mitigate this risk (e.g., have a human, or a separate AI trained on every game ever made, review all the outputs from the DevAI to ensure they are not overly close to content protected by copyright, trademark or patent). However, none of these techniques are foolproof.
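To make the automated-review idea above concrete, the sketch below (in Python) gates DevAI outputs by comparing an embedding of each generated asset against embeddings of known protected works and routing near matches to a human reviewer. The embedding function, asset names, and the 0.92 threshold are hypothetical placeholders for illustration only; a similarity score is a triage tool, not a legal test for infringement.

import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Standard cosine similarity between two embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def flag_for_legal_review(asset_embedding: np.ndarray,
                          protected_library: dict[str, np.ndarray],
                          threshold: float = 0.92) -> list[tuple[str, float]]:
    # Return protected works whose embeddings are suspiciously close to a
    # DevAI output, sorted with the closest matches first. The threshold is
    # an arbitrary tuning knob, not a legal standard.
    scored = [(name, cosine_similarity(asset_embedding, ref))
              for name, ref in protected_library.items()]
    return sorted([s for s in scored if s[1] >= threshold],
                  key=lambda pair: pair[1], reverse=True)

# Hypothetical usage: embed_asset() stands in for whatever image/text/model
# encoder a studio already uses; it is assumed here, not provided.
# matches = flag_for_legal_review(embed_asset(new_character_model), library)
# if matches:
#     route_to_human_reviewer(new_character_model, matches)

Even with a gate like this in place, the point above stands: no automated filter is foolproof, and anything it flags (or fails to flag) still benefits from human legal review.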

Given all the above, perhaps the safest avenue is to avoid using GenAI models that are trained on copyrighted content altogether. Even when using GenAI models that are only trained on data not subject to copyright, though, other IP issues can still pop up.

Ensuring IP Protection of AI-Generated Content

When making use of DevAI, it will be important to most companies to be able to protect the output of the DevAI. For instance, if DevAI is used to design new content (e.g., a new weapon model, a new level, a new character, a new storyline) or new technology (e.g., new multiplayer communication protocols, new controller mechanisms, new console designs), the game developers would ideally be able to obtain copyright and/or patent protection on the DevAI's output so that others cannot later merely reproduce it without permission. Obtaining copyright or patent protection can prove somewhat challenging when GenAI is involved, though.

In the patent space, under the ruling in Thaler v. Vidal, it is now established that an AI cannot be listed as an inventor on a patent application (i.e., GenAI, on its own, cannot get a patent, no matter what it generates).

Likewise, in the copyright space, the U.S. Copyright Office has reiterated that it will "not register a copyright for works that are produced by a machine or mere mechanical process that operates randomly or automatically without any creative input or intervention from a human author" (i.e., GenAI, on its own, cannot get a copyright, no matter what it creates). GenAI does not sound a death knell for IP protection, so long as special precautions are taken to ensure that whatever is produced by the GenAI is protectable.

For example, the U.S. Patent and Trademark Office's Inventorship Guidance for AI-assisted inventions states that a human "providing routine or expected inputs to an AI system could be an exercise of normal skill expected of one skilled in the art that is considered insignificant in quality," which can preclude that person from qualifying as an inventor and, in turn, from obtaining a patent.

To obtain a patent, then, it needs to be demonstrated that the human game developer using the GenAI tools "design[ed], buil[t], or train[ed] an AI system in view of a specific problem to elicit a particular solution" in a not insignificant fashion, according to the Guidance.

Likewise, in order to obtain a copyright, it must be demonstrated that "the AI contributions are the result...of an author's ‘own original mental conception, to which [the author] gave visible form'" rather than the result "of ‘mechanical reproduction,'" according to the U.S. Copyright Office. Hence, when employing DevAI, game developers need to be active participants in directing (if not also in training) the GenAI model if they would like to be able to protect the outputs produced by the model.

Privacy Law

In addition to IP risks, there are some privacy law risks worth mentioning.

Privacy laws vary from state to state in the U.S., but several states have comprehensive consumer privacy laws in place. For example, California has a strong Consumer Privacy Act that prescribes how consumers' data can be collected and disseminated.

Particularly for LiveAI (although potentially for DevAI, as well), a user's personal data (e.g., name, date of birth, internet protocol address (IP address)) may be collected and fed into or used to train a GenAI model. Further, it is certainly possible that the outputs of such a GenAI model, or the GenAI model itself, would later be shown to or provided to another user (e.g., another gamer and/or another game developer).

Because the GenAI model or its outputs may inherently include personal data, sharing them with another user may implicate some consumer protection/privacy laws. Thus, game developers need to be extremely careful about how much personal data they share among various entities via GenAI, even if that data is shared only in the form of a trained model, since personally identifying information could potentially be extracted from the model itself.
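As a purely illustrative sketch of that caution in practice, a pre-processing step might drop direct identifiers and pseudonymize quasi-identifiers before gameplay telemetry ever reaches a GenAI training set. The field names and salted-hash scheme below are assumptions, and pseudonymization alone will not necessarily satisfy every state statute.

import hashlib

# Hypothetical telemetry fields; a real pipeline will differ.
DIRECT_IDENTIFIERS = {"name", "date_of_birth", "email"}
PSEUDONYMIZE = {"ip_address", "player_id"}

def scrub_record(record: dict, salt: str) -> dict:
    # Drop direct identifiers and replace quasi-identifiers with salted
    # hashes before a record is used for GenAI training or sharing.
    clean = {}
    for key, value in record.items():
        if key in DIRECT_IDENTIFIERS:
            continue  # never feed raw identity fields into the model
        elif key in PSEUDONYMIZE:
            digest = hashlib.sha256((salt + str(value)).encode()).hexdigest()
            clean[key] = digest[:16]
        else:
            clean[key] = value
    return clean

# Example: scrub_record({"name": "Jane Doe", "ip_address": "203.0.113.7",
#                        "session_length": 42}, salt="per-title-secret")
# keeps session_length, hashes the IP address, and drops the name entirely.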

Tort Law/Contract Law

A subset of tort law also provides individuals with the "right of publicity" (i.e., the right of every human being to control the commercial use of their identity). One way this right can be violated is if another entity uses someone's identity (e.g., name, image or likeness) for commercial benefit without consent.

These rights can have substantial value, as evidenced by the recent NCAA fracas regarding college athletes being able to profit from their own name, image and likeness.

It is quite possible that DevAI or LiveAI, either inadvertently or intentionally, may output a person's name, image, or likeness. As just one example, if a player is playing a game with a webcam and their facial reactions are used to train a GenAI, it is certainly possible that the resulting GenAI could then later produce that person's image or likeness (either partially or entirely) as an output (either in future games and/or to other players).

Arguably, if it is used as an output and it is a game for which the developer makes a profit, that developer has commercially benefitted from the player's image and/or likeness (since their likeness was a feature in a profit-making venture). This, then, could give rise to liability for violating the player's right of publicity.

As a game developer, the absolute minimum step that should be taken with respect to players' rights of publicity would be to include one or more disclaimers in a game's terms of service, which indicate that a player's data may be used in development and/or execution of the present game or future games.

However, such a step should not be considered a panacea for all potential issues with GenAI. For example, hiding a blanket license for a game developer to use a person's name, image, and likeness as they please in terms of service could be a bad public relations move, at the very least.

Further, players are not the only stakeholders when it comes to rights of publicity. Perhaps more important to consider are voice actors and/or motion capture (mocap) actors. With regard to GenAI, this issue has come to a head recently with the SAG-AFTRA strikes (both last year in the film industry and this year in the games industry). Among the many issues raised during the strikes, the actors justifiably voiced concerns about their images and likenesses being imported into a GenAI model, which could result in future work no longer requiring the underlying human actor (thereby resulting in lost wages for the actors).

Cost savings clearly exist for a game developer if they can secure a license to use an actor's name, image, and/or likeness (e.g., within a GenAI model) in perpetuity after an initial recording and/or filming session. However, given the strikes, securing such a contract is unlikely. Thus, game developers need to be careful when preparing training sets and/or structuring the underlying model for DevAI (e.g., to ensure that the DevAI does not produce an output that is overly similar to an actor's image or likeness).
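One way to operationalize that care, sketched below under assumed manifest fields, is a build step that excludes any voice or mocap asset from a DevAI training set unless its contract explicitly permits GenAI training and the license term is still in force. What counts as adequate consent is ultimately a contract question, not a code question.

from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class CaptureAsset:
    # Hypothetical manifest entry for a voice or mocap recording.
    path: str
    performer: str
    genai_training_permitted: bool   # explicit contractual grant
    license_expires: Optional[date]  # None means no expiry was negotiated

def build_training_set(assets: list[CaptureAsset], today: date) -> list[CaptureAsset]:
    # Keep only assets whose contracts allow GenAI training and whose
    # license term has not lapsed as of the build date.
    return [a for a in assets
            if a.genai_training_permitted
            and (a.license_expires is None or a.license_expires >= today)]

Excluding unlicensed performances from training does not guarantee that a model cannot approximate a performer's voice or likeness, so output review along the lines sketched earlier still has a role to play.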

On top of the standard tort law remedies for misappropriation of the right of publicity, the U.S. Congress has also proposed further individual protections as name, image, and likeness rights become increasingly important in light of the proliferation of AI.

Notably, multiple senators have promulgated legislation that would give rise to legal damages should someone be engaged in "producing, hosting, or sharing a digital replica of an individual performing in an audiovisual work, image, or sound recording that the individual never actually appeared in or otherwise approved – including digital replicas created by generative artificial intelligence (AI)."

This bill – called the Nurture Originals, Foster Art, and Keep Entertainment Safe (NO FAKES) Act – has received significant bipartisan and industry support thus far. Should this bill be ultimately signed into law, it would represent yet another minefield when using GenAI (particularly LiveAI, as it would be harder to catch inadvertent reproduction of someone's image or likeness).

There are many potential legal traps for the unwary game developer who haphazardly employs GenAI during development. Unless and until someone produces a GenAI model that, off the shelf, accounts for all of the above issues, publishers, developers and game studios should put AI use policies in place to ensure that there is sufficient internal policing to avoid potential liability.

Assuming proper precautions are put in place, though, a game developer should be able to look back at what they produced using GenAI and say, in the words of gaming's arguably most famous AI, "This was a triumph. I'm making a note here: huge success."

Andrew Velzen is an intellectual property expert and associate at law firm MBHB, who counsels and supports clients on IP matters related to a variety of technologies, including machine learning and artificial intelligence.
