AI and your contracts: 10 essential clauses every UK SME needs

The way AI is being built into everyday business tools is changing faster than most contracts can keep up. If you’re an SME in the UK, it’s no longer enough for your agreements to cover only traditional risks; they also need to address the way AI systems use your data, generate outputs, and make decisions. The right clauses will protect your intellectual property, help you meet your legal obligations, and ensure suppliers are accountable.


AI has shifted from “nice-to-have” tech to a core driver of growth for many SMEs. It has become integral to everything from streamlining administrative tasks and generating marketing content to analysing business data and screening job applications. However, while the tools have evolved, many contracts haven’t kept pace. That mismatch can leave UK businesses exposed to regulatory penalties, IP disputes, or consumer complaints.

Three significant developments in the past 18 months mean it’s time to update your contracts:

1) UK data privacy reform (DUAA 2025)

The Data (Use and Access) Act 2025 has tightened the rules on automated decision-making, introduced a “not materially lower” test for international data transfers, and clarified your scope for “reasonable and proportionate” searches when responding to Data Subject Access Requests (DSARs). You can do more with AI, but you must be able to prove human oversight and keep robust records. The DUAA 2025 amends the UK GDPR and the Data Protection Act 2018, meaning its changes form part of the UK’s core data protection regime, not a standalone law. The “not materially lower” test applies to all restricted data transfers, regardless of whether they involve AI systems.

2) EU AI Act 

If your product or service reaches EU users, you’re already on the hook for compliance. Prohibitions on certain AI systems have been in effect since February 2025, while obligations for “general purpose AI” (GPAI) took effect in August 2025. Most high-risk system requirements are expected to be in place by August 2026. The EU AI Act applies extraterritorially, meaning that even non-EU businesses must comply if they place AI systems on the EU market or if those systems affect people in the EU.

3) Consumer law enforcement (DMCC 2024)

The Digital Markets, Competition and Consumers Act 2024 grants the Competition and Markets Authority (CMA) direct fining powers of up to 10% of a company’s global turnover for unfair commercial practices. If your AI tools interact with consumers (chatbots, sign-ups, personalised recommendations), you’ll need to lock down how they’re marketed and operated. The CMA’s enhanced enforcement powers under the DMCC 2024 came into effect on 6 April 2025, so the new fines and penalties are already in force.

The 10-clause AI contract portfolio

The right AI clauses protect you on multiple fronts, from data privacy and IP to performance, security, and consumer law. These 10 clauses serve as a solid starting point for an AI supplier addendum that you can send to your vendors.

1) Model transparency and change notice

  • No black boxes: Know what you’re using and when it changes.

AI models aren’t static; they’re susceptible to model drift as real-world data and environments change, and even minor updates or retraining can degrade accuracy or introduce bias and unfairness. Under the EU AI Act’s lifecycle obligations, a material change may also mean the model is reclassified into a different risk category with additional compliance steps.

Example:

“Supplier will keep a live register of AI components (model family/version, region, sub-processors) and give 30 days’ notice of material changes; on request, we can hold or roll back a model for re-validation.”

2) No training on your data (unless you opt in)

  • Keep your confidential data out of someone else’s training set.

Once your data enters a model’s training set, you lose control over how it’s used, even if it’s “anonymised.” That could leak trade secrets, customer insights, or unique creative work. This clause lets you choose if and when to share data for training, protecting both your IP and your competitive advantage.

Example:

“Supplier shall not use our data or outputs to train, refine or evaluate any model, except in a ring-fenced instance with our prior written consent.”

3) Ownership of outputs

  • Secure full rights to AI-generated outputs.

UK copyright law treats “computer-generated works” differently from works created by a human author. Under section 9(3) of the Copyright, Designs and Patents Act 1988, if a work is generated by a computer and there is no human author, the “author” is legally defined as the person who made the arrangements necessary for its creation. Without a clear contractual clause, your AI supplier could argue that they (and not you) meet that definition and therefore own the copyright. Explicit ownership terms protect your right to use, adapt, and monetise the AI-driven products and content you’ve paid for.

Example:

“All IP in the outputs vests in the Customer on creation; Supplier gets a limited licence to use outputs only to run the service.”

4) IP warranties, dataset provenance and indemnity

  • Ensure your supplier’s AI training data is lawfully sourced.

Many AI models are trained on mixed datasets scraped from the internet, not all of which are legally available for commercial use. If a rights-holder sues, you could be dragged into litigation unless your supplier is contractually responsible. This clause puts the onus on them to verify provenance and carry the legal risk.

Example:

“Supplier warrants it has lawful rights to training/validation datasets and third-party components, that our permitted use is non-infringing, and will defend and indemnify us for IP claims.”

5) Automated decision-making safeguards

  • Keep humans in the loop and stay compliant with DUAA.

Automated decision-making (ADM) can impact hiring, lending, or eligibility decisions. If you can’t explain or challenge the decision, you risk regulatory penalties and reputational damage. This clause ensures transparency and fairness, meeting both DUAA and ICO guidance.

Example:

“Supplier will enable meaningful human involvement, clear explanations and an easy appeal route wherever AI materially affects an individual.”

6) Security baseline for AI

  • Build AI-specific security into your contract.

AI introduces new attack vectors (such as prompt injection, model theft, and data poisoning) that traditional cybersecurity policies often overlook. Referencing established standards sets a measurable security bar and ensures a rapid breach response, reducing operational and legal fallout.

Example:

"Controls must align with the NCSC and CISA joint Guidelines for Secure AI System Development and Deploying AI Systems Securely, applying secure-by-design principles, thorough logging, monitoring, and incident response protocols. Where the AI system processes personal data, the Supplier must also provide prompt breach notification and supply investigation artefacts, enabling us to meet our reporting obligations under the UK GDPR, including notifying the Information Commissioner’s Office within 72 hours where required."

7) Performance, monitoring and drift

  • Keep AI systems accurate and reliable over time.

AI models degrade over time as the world and your data change. Without drift monitoring, you could be making decisions on stale, biased, or inaccurate outputs. This clause enforces proactive oversight and rapid fixes.

Example:

“Supplier will meet agreed accuracy/latency KPIs, monitor for model/data drift, alert us within 24 hours of material degradation, and expose a kill-switch or rollback if outputs become unreliable.”

8) Generative AI used by the supplier (ban or safeguards)

  • Control risks from your supplier’s own AI use.

Even if your policies are tight, a supplier could expose you by feeding your data into unsecured GenAI tools. This clause either prohibits their use entirely or sets clear, enforceable safeguards.

Example:

Ban: “No use of generative AI tools in development or support.”

Safeguards: “Only named tools with our consent; human review of AI-created artefacts; licence checks; developer training; no input of our data into public models.”

9) Marketing and consumer-law compliance

  • Avoid regulatory action for misleading AI claims.

Misleading “AI-powered” claims or unlabelled synthetic content can result in fines from the CMA or the Advertising Standards Authority (ASA). This clause ensures that suppliers stand behind their claims and comply with consumer protection rules.

Example:

“Supplier warrants AI-related claims are accurate and substantiated, uses clear disclosures for synthetic media where omission would mislead, and complies with CAP/ASA and DMCC.”

10) Audit, logs and data privacy readiness

  • Enable investigations, compliance, and accountability.

Without logs, you can’t prove what happened or meet DSAR obligations. This clause secures your right to verify compliance and maintain defensible records for regulatory purposes or in the event of disputes.

Example:

“We may audit on notice; Supplier will keep proportionate prompt/output/decision logs for [X] months, support DSARs on a ‘reasonable and proportionate’ basis, and ensure international transfers meet the DUAA’s ‘not materially lower’ protection test.”

Top 5 AI contract questions UK SMEs ask us

  1. Do I really need to change my contracts just because I’m using AI tools?

Yes. If AI is part of your operations, even indirectly through suppliers, your contracts need to reflect that. Many standard contracts were written before AI became mainstream and don’t address risks such as data leakage into public models, shifting regulatory classifications, or automated decision-making. Updating your contracts ensures you own your outputs, manage supplier risk, and can prove accountability if something goes wrong.

  2. How do I know if the AI tools I’m using are compliant with UK and EU law?

You’ll need to consider three key factors: the type of AI system, its deployment location, and the individuals it impacts. In the UK, the DUAA 2025 sets rules for automated decision-making and international data transfers. If you operate in or sell to the EU, the EU AI Act classifies systems as prohibited, high-risk, or limited-risk, with obligations already in force for general-purpose AI. The safest route is to build compliance checks into your supplier contracts, so it’s their responsibility to keep you informed and compliant.

  3. Who owns the intellectual property in AI-generated content?

In the UK, “computer-generated works” have special copyright rules, and without a clear contract, the supplier could claim some or all rights. If you’re using AI to create marketing content, designs, reports, or other commercial outputs, you want full ownership. That means your contracts need a clause assigning all IP in the outputs to you, with only a limited licence back to the supplier so they can operate the service.

  4. How can I stop my data from being used to train someone else’s AI model?

Without a “no training” clause, many suppliers can, and will, use your inputs, prompts, or outputs to train and improve their models, sometimes even when they say it’s “anonymised.” This risks leaking your trade secrets or customer insights to competitors. A properly drafted clause stops all training unless you explicitly opt in, and it should also block “aggregated analytics” unless you consent.

  5. What happens if my AI supplier gets something wrong, like making an unlawful decision or infringing copyright?

If a supplier’s system makes an unlawful automated decision, breaches data protection rules, or uses infringing training data, you could still be on the hook as the deploying business. That’s why supplier contracts should include warranties, indemnities, and audit rights. These clauses ensure the supplier takes responsibility for their tech, supports you in investigations, and covers your losses if their AI causes a problem.

Final word

AI has evolved beyond being an IT asset; it’s now a strategic, legal, and reputational one. Updating your contracts is one of the fastest ways to safeguard your business while maximising AI opportunities.

At Lawyerlink, we specialise in helping SMEs integrate AI into their contracts with confidence. We can provide a ready-to-use AI supplier addendum, tailored to your sector, and create an internal AI use policy to match.

This article is general guidance, not legal advice. For tailored advice, speak to one of our solicitors.

 

Strengthen your contracts before AI tests them

Don’t let outdated agreements leave your business exposed. At Lawyerlink, our solicitors can review your existing contracts, draft AI-ready clauses, and ensure your commercial agreements keep pace with technology and the law. Whether you need a one-off AI supplier addendum or a full contract overhaul, we’ll make sure your business is protected and ready to grow.

Explore our commercial contract services →