Although artificial intelligence (AI) systems can surely be as inventive as humans, they cannot qualify as inventors under UK patent law. Similarly, although numerous projects have shown that an AI is capable of creating fabulous artworks, it cannot qualify as the author or creator of an artistic work for copyright purposes.
At the beginning of September the UK Government sent out a “Call for Views” exploring whether the UK intellectual property (IP) framework remains fit for purpose in relation to AI and looking ahead to new challenges in this field. Key questions are who should own the rights in innovations and creative works produced by AI and who should be liable when things go wrong.
Who should own the rights?
In many instances a human inventor or creator can in fact be identified, for example where the AI acts essentially as a tool. However, questions may arise where it is the AI rather than the human that provides the essential inventive or creative input: in such cases, an extreme position would be that entitlement to a patent or to copyright should not arise at all if there is no human inventor or creator. Some would say that the whole purpose of IP rights is to incentivise people, rather than machines, to invent and create.
However, ensuring IP protection for AI-generated subject matter is, rightly in our view, seen as essential for incentivising the development of, and investment in, AI technologies. So far as copyright is concerned, the UK's Copyright, Designs and Patents Act 1988 introduced a far-sighted rule whereby the person undertaking the “arrangements necessary for the creation of the work” is regarded in many cases as the author of a computer-generated creative work. The Call for Views asks, however, whether greater clarity is now required on the ownership of copyright in works generated by AI. In the case of patents, key questions include whether the AI should be credited as inventor (thus presumably enhancing the reputation of the AI in question) and who should be entitled to own the patent in such cases. Possibilities might be the AI developer, the user of the AI or the person who constructed the datasets on which the AI is trained.
Who should be liable?
The Call for Views uses Google’s AI Quick, Draw!, which attempts to identify doodles drawn by users, to illustrate how an AI learns from other works which may be protected by copyright. This learning will usually involve copyright infringement unless an exception applies or a licence has been granted. Readers of the Call for Views are asked whether they regard the need for copyright clearance as an impediment to the development of AI, and whether the law should make it easier for AI systems to learn from protected content, for example by introducing a new copyright exception.
More fundamental is the question of who is liable if an AI system itself carries out infringing acts without the knowledge of its makers. This may arise, for example, if it uses a patented process or includes copyright material in its output. The problem is particularly acute in the case of “black box” neural network systems, where even the maker of the AI does not know exactly how it functions. Views are invited as to who should be liable in such situations.
Staying ahead
The Government wants the UK to “remain at the forefront of the AI and data revolution” and hopes to encourage growth in this sector. Ensuring clarity on IP rights and liabilities is likely to be important in securing investment in many cases. It remains to be seen, however, whether the availability of formal IP rights, particularly patents, gives AI developers the confidence they will need to protect their “crown jewels” as the technology progresses. Some commentators have seen a risk that AI developers may prefer to rely on secrecy to protect their technology where this is practicable. Trade secret protection has the twin advantages of potentially lasting longer than patent protection and of avoiding the need to describe the technology in a patent specification; indeed, some have questioned whether it will be possible to describe this technology adequately in any event. However, the Call for Views echoes concerns in some quarters that extensive reliance on secrecy could lead to duplication of research and may prevent proper ethical assessment of systems that are not in the public domain.
Responses to the Call for Views are invited by 30 November 2020. It contains detailed sections on patents, copyright, design rights, trade marks and trade secrets.
If you would like to discuss the Call for Views, do please contact Tom Lingard or Charlotte Tillett or your usual contact at Stevens & Bolton.