

Understanding IP risks in the age of AI

Generative AI has become a transformative force in many industries, offering unprecedented opportunities for innovation and efficiency. Those opportunities, however, come with risk.

This article explores some of those risks, with a focus on UK IP laws.

Subsistence and ownership of IP in AI-generated works

A typical output of generative AI might be written content, images, audio or video, all of which can be commercially valuable to a business. Under UK law, copyright arises automatically for the human creator of such works, provided they are sufficiently original, giving legal protection against third-party copying.

But what if the work is instead generated by an AI model? For example, if a journalist uses a large language model to produce a comment piece, or a fashion designer uses an image generator to create a new clothing design, does copyright subsist in such outputs to prevent copying, and if it does, who owns it?

Few IP laws explicitly deal with content generated by AI, but the UK is one of the few jurisdictions to provide for the subsistence of copyright and design rights in computer-generated works with no human author. However, the legal concept of originality (which is necessary for copyright to subsist in a work) has been developed entirely with reference to human traits such as personality, judgement and skill.

It is therefore unclear whether copyright can subsist in an AI-generated work at all, which creates the risk that there may be no legal recourse to prevent copying of commercially valuable AI-generated works in the way there would be had they been authored by a human.

If copyright can subsist in an AI-generated work, the legal owner of that work is the person who made the arrangements necessary for its creation. The law does not, however, clearly define who that person would be in the context of generative AI models. There is therefore a risk that ownership of commercially valuable works could be claimed by the developer of the AI model, rather than by the person who wrote the prompts used to generate the work.

Infringement of existing IP rights

Generative AI learns from existing data, which often includes works that are subject to IP protection. If an AI model generates content that is substantially similar to these works, it could infringe existing IP rights. For example, recent lawsuits filed by Getty Images against the owners of the AI image generator Stable Diffusion allege that aspects of Getty’s images, on which Stable Diffusion was trained, appear in some of the model’s output. In particular, images generated by Stable Diffusion are alleged to include a version of Getty’s watermark, prompting accusations of trademark infringement.

The process of training an AI model may also include making copies of protected works. Some jurisdictions provide so-called text and data mining provisions that exclude computational analysis of works from copyright infringement. However, in the UK, this exclusion only applies to research for non-commercial purposes. The training of AI models on copyrighted works for commercial purposes could therefore risk claims of copyright infringement.

Confidential information and trade secrets

Generative AI models are often implemented by third parties on cloud-based servers, and using or training them typically involves providing some form of potentially business-critical information to a third party. While this can be done under confidentiality arrangements, without careful consideration there is a risk of unwittingly disclosing confidential information. Such disclosure may mean losing trade secret protection for business-critical know-how and forfeiting the ability to register IP rights (such as patent and design rights) for any disclosed subject matter.

Mitigating risks

The application of IP law to the use of generative AI remains relatively untested in the UK but has the potential to be a source of legal risk. To mitigate these risks, businesses should consider developing and implementing clear policies and agreements to cover the use of AI. Given the complex and evolving nature of AI and IP law, businesses should also consider seeking legal advice to ensure they are compliant with current laws and are prepared for future changes.

For more information and advice on understanding intellectual property risks in the age of AI, contact patent director Nicholas King at nking@hgf.com.
