Is your M&A target a manufacturing company with automated production, a consumer products business with online sales and marketing, or an education company that creates content for students? The increasing use and development of artificial intelligence (“AI”) systems and products, particularly generative AI, has created risks for businesses using such tools. AI plays a role in many industries and businesses whose products and services are not themselves AI. In the context of an M&A transaction, it is important to identify and allocate responsibility for these risks. AI risks may include: infringement (including through use of training data as well as outputs), confidentiality, IP ownership and protection (including limits on protection of IP generated by AI), regulatory compliance (e.g., privacy and recent AI-related legislation), and other risks arising from use, such as indemnity obligations or managing contractor use of AI.

Even for companies not focused on AI, such as where AI is used in the business but is not itself the target's product, it is important to identify AI risks and properly address and allocate them through due diligence and appropriate terms in the definitive acquisition agreement. Businesses in many industries, ranging from healthcare to manufacturing to entertainment, are using or interacting with AI. Fortunately, many AI-related risks may be covered, in part, by commonly used IP representations and warranties. This article reviews the representations and warranties of a purchase agreement that may capture AI-related risks and issues, and offers comments on how to leverage those terms to address AI-specific risks.

In some cases, existing IP representations and warranties can be expanded to address AI technology, either as its own defined term or by incorporating it into other IP-related definitions such as “software” or “technology,” as needed. AI technology may include standalone generative AI tools such as ChatGPT, as well as tools with integrated AI, such as Microsoft Word with Copilot, and the definitions should take the target's specific uses into account. These tools use machine learning algorithms and large volumes of training data to develop models that can generate high-quality text, images and other content based upon user inputs. If the target uses such tools, it may also be necessary to include a concept of “Training Data” to flesh out AI risks in the target's business. Training Data is usually described as data, including fine-tuning data and retrieval-augmented generation (RAG) content, used to train, pre-train, validate, or otherwise evaluate, improve, modify or supplement a software algorithm or model. In general, the actual scope of the definitions will vary depending on the results of diligence on the target company. For example, diligence may reveal whether the target has an established policy for its employees on the use of AI in its business. Reviewing these policies can inform these definitions, as can whether the target uses contractors who create work product using AI.

As part of the diligence conducted, buyers should confirm, among other things, rights with respect to the use of AI tools and ownership of their output. Custom representations and warranties may be useful, such as those addressing (i) non-infringement of and rights to Training Data, (ii) ownership of AI technology, (iii) ownership and protectability of output developed using AI technology tools or software, (iv) licenses to inputs to AI technology, and (v) separate copyright and patent representations and warranties related to AI technology, as needed. The actual scope of the representations and warranties will vary depending on what AI technology or tool the company uses and for what purposes, such as customer-facing use in products or internal use. In any M&A transaction where the target company uses AI, it is important to understand the risks of AI and to draft appropriate provisions that properly allocate and mitigate these risks. To draft appropriate provisions, parties can consider a combination of: (a) due diligence, (b) disclosure representations, (c) standard, pre-existing representations, and (d) custom representations.