
AI & data protection in 2026: A practical guide to the Data (Use and Access) Act 2025 for SMEs

Written by Patricia Stelmach | Jan 27, 2026 12:36:54 PM

Most business owners and key decision-makers I speak to are in the same position: AI tools are already in use across the business (marketing, customer service, HR, analytics, and finance), but governance hasn’t kept pace. That’s understandable. Businesses move fast, but regulation moves more slowly. And yet, in 2026, the direction of travel is clear: data protection is becoming more operational (especially around complaints and access requests), and AI governance is moving from “nice to have” to “expected”.

This is not to be feared, as you do not need a complex corporate compliance programme to get this right. What you need is a simple, documented playbook that fits your size, risk profile, and technology use.

In this article, I’ll focus on the changes under the Data (Use and Access) Act 2025 (“DUAA”) and make recommendations on what SMEs can do to stay compliant while still innovating confidently. I’ve set out the key 2026 dates to watch, what the DUAA changes in practice, and a ‘minimum viable’ action plan you can implement quickly.

The 2026 dates SMEs should diarise

Now: DUAA is law, but commencement is staged

The Data (Use and Access) Act 2025 received Royal Assent on 19 June 2025. The government published a commencement plan confirming that its provisions will be brought into force in stages via commencement regulations.

Government guidance explains that data protection changes will be commenced over time after Royal Assent, with precise dates set out in regulations rather than in the Act itself.

The practical implication is that 2026 is not about reacting to a single “go live” date. It is about tracking when key measures come into effect and ensuring your day-to-day processes are ready in advance.

June 2026: Complaint-handling will become an operational requirement

Government commencement planning indicates that the measures requiring controllers to establish complaint-handling processes are expected to commence around 12 months after Royal Assent (i.e. around June 2026, subject to commencement regulations).

The ICO’s complaints guidance makes clear that organisations must have a process for handling data protection complaints. At a minimum, SMEs should be able to:

  • receive complaints via an accessible route,
  • investigate and respond consistently, and
  • log outcomes and learning.

This is one of the clearest high-impact changes for SME businesses.

March 2026: AI and copyright milestone to watch

DUAA also contains a specific AI/copyright “watch this space” obligation: the government must publish an economic impact assessment and a report on the use of copyright works in the development of AI systems before 18 March 2026. 

Even if you are not training AI models yourself, this date is worth tracking. It is likely to influence the UK’s future approach to AI development, transparency and licensing.

What DUAA means for SMEs

DUAA does not replace UK GDPR. It amends the UK GDPR, the Data Protection Act 2018 and PECR. The policy intent is to reduce friction in certain areas while maintaining core protections. In practice, businesses tend to feel the impact in a small number of very specific, operational places.

If you only do four things this year, make them the following:

1) Complaints-handling: The new “must-have”

In my experience, SMEs already handle complaints, but often informally (a customer emails someone, a manager replies, and it’s treated as customer service rather than a legal process). The DUAA shifts this. The ICO expects organisations to have an internal process for handling data protection complaints.

You need a visible route for reporting data-handling issues, and a repeatable internal process for receiving, investigating, responding to, and documenting outcomes.

This is one of the easiest 2026 projects to get right, and it pays dividends in reduced escalation to the ICO.

2) Subject Access Requests: “Stop the clock” and reasonable searches

Subject Access Requests (SARs) are disruptive largely because they arrive unexpectedly and often land with people who do not recognise them as SARs.

The ICO has updated its SAR guidance to reflect DUAA changes (while noting that some changes are not yet in force). Two practical points are particularly important:

  1. Organisations only need to carry out reasonable and proportionate searches.
  2. Organisations can “stop the clock” where clarification is required from the requester.

For most SMEs, the pain point is not the legal test; it’s knowing where data lives (email, collaboration tools, shared drives, CRM, HR systems) and getting the right people to search quickly.

The risk here is less about the law and more about the workflow. I recommend doing two things:

  • Train your team

Train your frontline teams (reception, sales, HR, customer support) to recognise SAR wording, because a SAR has no formal requirements: requests can arrive in any format, through any channel.

  • Create a playbook

Create a one-page SAR playbook. Be sure to cover who receives it, who logs it, who searches systems, how you verify identity, when you seek clarification, and who signs off the response.
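
To illustrate the timelines element of the playbook, here is a minimal sketch of SAR deadline tracking. It assumes the usual one-calendar-month response window and that a “stop the clock” pause simply pushes the deadline back by the paused days; the function names are illustrative, and you should check ICO guidance for the exact computation in your circumstances:

```python
import calendar
from datetime import date, timedelta

def add_one_month(d: date) -> date:
    # Same day in the next month, clamped to that month's last day
    # (e.g. 31 Jan -> 28 Feb), reflecting a "one calendar month" window.
    year, month = (d.year + 1, 1) if d.month == 12 else (d.year, d.month + 1)
    return date(year, month, min(d.day, calendar.monthrange(year, month)[1]))

def sar_deadline(received: date, clock_stopped_days: int = 0) -> date:
    """Baseline deadline: one calendar month from receipt, pushed back
    by any days the clock was stopped while awaiting clarification."""
    return add_one_month(received) + timedelta(days=clock_stopped_days)

# Hypothetical example: received 15 Jan 2026, five days awaiting clarification
print(sar_deadline(date(2026, 1, 15), clock_stopped_days=5))  # prints 2026-02-20
```

Even a simple helper like this, run from your SAR log, removes the most common failure mode: nobody noticing the deadline until it has passed.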

3) Automated decision-making: AI safeguards

DUAA takes a more permissive approach to decisions based solely on automated processing, including those with legal or similarly significant effects (for example, recruitment screening, credit decisions, pricing eligibility or service access).

However, that permission is conditional. Government guidance emphasises that organisations must implement safeguards, including:

  1. providing information about significant decisions,
  2. enabling individuals to make representations and challenge decisions, and
  3. enabling individuals to obtain human intervention.

Meaningful human involvement in reconsideration is not optional. If you use AI tools for HR screening, risk scoring, fraud decisions, or customer profiling, your safest approach is:

  • Document decisions

Document the decision chain (what is automated vs what is reviewed).

  • Human intervention

Ensure a human can override.

  • Plain language explanations

Make sure you can explain decisions in plain language if challenged.

4) Cookies and legitimate interests: Watch the guidance, not just the Act

DUAA aims to simplify some areas that SMEs regularly find burdensome.

Two examples to watch:

  • Recognised legitimate interests

DUAA introduces a list of recognised legitimate interests intended to give more certainty for specific purposes, but SMEs should follow commencement and ICO guidance closely before changing their approach.

  • Storage and access technologies (cookies)

Government guidance indicates the Act allows some uses without explicit consent in certain low-risk situations. Treat this as guidance-led and evidence-based: document why you think a cookie falls into a permitted category.

Your DUAA playbook 

This is the straightforward, step-by-step plan I use with clients to quickly get AI and data governance into a safe, workable shape, without overcomplicating it or creating unnecessary paperwork.

1) Make a one-page AI and Data Map

Create a simple document that gives you a clear view of your AI and data landscape: what personal data you hold (customers, employees, leads/marketing lists), where it lives (CRM, HR system, email marketing tool, shared drives), and which AI tools touch it (chatbots, meeting transcription, analytics, recruitment tools). You can do this easily in Excel, or in any platform you already use. Create a table and fill it:

  • Data type
  • Who it relates to
  • Where stored
  • Who has access
  • AI tool involved?
  • Notes/risk 
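
If you would rather start from a file than a blank sheet, here is a hypothetical sketch that writes the template (with invented example rows) to a CSV you can open directly in Excel; the column names mirror the list above:

```python
import csv

# Column names mirror the one-page map; the rows are invented examples.
columns = ["Data type", "Who it relates to", "Where stored",
           "Who has access", "AI tool involved?", "Notes/risk"]
rows = [
    ["Customer contact details", "Customers", "CRM",
     "Sales team", "Support chatbot", "Review retention period"],
    ["CVs and applications", "Job applicants", "HR system",
     "HR team", "CV screening tool", "Automated decision safeguards apply"],
]

with open("ai_data_map.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(columns)
    writer.writerows(rows)
```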

2) Put two operational playbooks in place: SARs and complaints 

These are the two day-to-day playbooks your team needs so that everyone handles requests consistently and on time.

SARs (Subject Access Requests)

A SAR playbook is your step-by-step process for when someone asks for a copy of their personal data. It should set out:

  • Who receives SARs (and where they should be sent)
  • How you confirm identity (so you don’t disclose data to the wrong person)
  • Where you search (systems, inboxes, shared drives, platforms)
  • How you respond (what you provide, what you can redact, how you package it)
  • Timelines and tracking (how you monitor deadlines and extensions)
  • Who signs off before the response is sent

Complaints

A complaints playbook explains what happens when someone raises a concern about your service or how you’ve handled their data. It should cover:

  • How complaints are logged (in one place, in one consistent format)
  • Acknowledgement steps
  • Investigation and evidence (who investigates and what forms evidence)
  • Response steps (how you reply, remedies, and learning actions)
  • Escalation (when it’s moved to a senior owner, and who that is)
  • Ownership (who is responsible internally)
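
A complaints log can be as simple as a spreadsheet, but if you track it in software, one record structure keeps everything in one place and one format. A minimal sketch, with field names that are my own assumptions mirroring the bullets above (not a prescribed format):

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class ComplaintRecord:
    received: date
    summary: str
    owner: str                            # who is responsible internally
    acknowledged: Optional[date] = None   # acknowledgement step
    investigator: Optional[str] = None    # who investigates
    evidence: list = field(default_factory=list)
    response_sent: Optional[date] = None
    remedy: Optional[str] = None          # remedy and learning actions
    escalated_to: Optional[str] = None    # senior owner, if escalated

# One log, one format (example entry is invented)
complaints_log = [
    ComplaintRecord(date(2026, 6, 1),
                    "Marketing email sent after opt-out",
                    "Office manager"),
]
```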

3) Do AI supplier due diligence

Before you roll out a tool, verify:

  • Training: Will your prompts/content be used to train models?
  • Processing location: Where is data processed/stored?
  • Security: What controls exist (access, encryption, incident response)?
  • Contract basics: Confidentiality, sub-processors, retention/deletion, audit/assurance

4) Use recognised guidance instead of reinventing the wheel

Start with the ICO’s AI and Data Protection guidance and align your approach to it.

5) Treat copyright and AI as a live issue

Plan conservatively:

  • Assume AI outputs need human review for originality and rights clearance.
  • Be careful about including third-party materials in prompts.
  • Avoid embedding copyrighted materials in customer-facing outputs unless you have clear rights to do so.