The EU AI Act is now in force. But does it apply to your small business? This guide explains who needs to comply, what's required, and what to do if you're affected.

The EU AI Act came into force in August 2024 and is being phased in over the following two years, with most obligations applying from August 2026. If you run a small business that uses, develops, or deploys artificial intelligence in any form, you may be wondering whether any of this applies to you — and if so, what you actually need to do.

The honest answer is that it depends. Most small businesses will be minimally affected. But some will face real obligations, and the worst thing you can do is assume you're exempt without checking.

What is the EU AI Act?

The EU AI Act is the world's first comprehensive legal framework for artificial intelligence. It takes a risk-based approach — the higher the risk an AI system poses to people's health, safety, or fundamental rights, the stricter the requirements.

The Act classifies AI systems into four risk categories: unacceptable risk, high risk, limited risk, and minimal risk. Where your AI use cases fall in that classification determines what you need to do.

Does it apply to UK businesses?

This is the question most UK small businesses ask first, particularly post-Brexit. The answer is yes, potentially — in the same way GDPR applies to UK businesses that process data about EU residents.

If your business offers AI-powered products or services to customers in the EU, deploys AI systems that affect people in the EU, or uses AI as part of a service delivered to EU-based clients, the EU AI Act is likely to apply to you regardless of where your business is based.

If your customer base is entirely UK-based and you have no EU operations or clients, you are likely outside the direct scope of the regulation. However, the UK government is developing its own AI governance framework, and many of the principles from the EU AI Act are expected to influence UK regulation over time. Getting ahead of it now is sensible regardless.

Which AI systems are prohibited?

The Act outright bans certain uses of AI that are considered to pose unacceptable risks. These include AI systems that manipulate people through subliminal techniques, exploit the vulnerabilities of specific groups, carry out real-time remote biometric identification in publicly accessible spaces for law enforcement purposes (subject to narrow exceptions), and social scoring systems.

For most small businesses these prohibitions are not relevant — they target very specific use cases deemed to pose unacceptable risk, not general business applications of AI.

What counts as high risk AI?

High risk AI systems face the most stringent requirements under the Act. These include AI used in critical infrastructure, education and vocational training, employment and HR decisions, access to essential services, law enforcement, migration and border control, and the administration of justice.

For small businesses the most relevant category is employment and HR. If you use AI tools to shortlist CVs, score candidates, or make decisions about employees — redundancy, performance assessment, promotions — those systems could be classified as high risk under the Act.

AI used in safety-critical products also falls into the high risk category — medical devices, vehicles, industrial machinery. If your business develops or integrates AI into any of these contexts, you need to take the Act seriously.

What about general purpose AI like ChatGPT?

Most small businesses use AI in the form of general purpose AI tools — ChatGPT, Microsoft Copilot, Google Gemini and similar. The EU AI Act has specific provisions for general purpose AI models, but these obligations fall primarily on the developers and providers of those models, not on the businesses using them.

As a user of general purpose AI tools, you are not directly regulated in the same way. However, you still have responsibilities around how you use those tools — particularly if you're using them to process personal data or make decisions that affect people. Your GDPR obligations remain fully in force regardless of whether a human or an AI makes a decision.

What do small businesses need to do right now?

Even if you conclude that the EU AI Act doesn't impose direct obligations on your business today, there are some sensible steps worth taking now.

Conduct an AI inventory

Make a list of every AI system or AI-powered tool your business currently uses or is considering using. Include everything — recruitment tools, customer service chatbots, fraud detection, content generation, analytics platforms. You cannot assess your compliance position without knowing what you're using.
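There's no required format for an inventory — a shared spreadsheet is perfectly adequate. If you prefer something scriptable, the sketch below shows one way to structure it; the field names and example entries are our illustration, not anything prescribed by the Act:

```python
# Illustrative AI inventory as structured records. The fields and example
# tools are assumptions for this sketch, not a format required by the Act.
inventory = [
    {"tool": "CV screening add-on", "vendor": "Example HR Ltd",
     "purpose": "shortlisting candidates", "personal_data": True,
     "affects_eu_people": True},
    {"tool": "Marketing copy generator", "vendor": "Example AI Inc",
     "purpose": "drafting blog posts", "personal_data": False,
     "affects_eu_people": False},
]

# Print each tool with any compliance flags worth a closer look.
for item in inventory:
    flags = []
    if item["personal_data"]:
        flags.append("GDPR")
    if item["affects_eu_people"]:
        flags.append("possible EU AI Act scope")
    print(f'{item["tool"]}: {", ".join(flags) or "no flags"}')
```

Whatever format you choose, the point is completeness: a tool you haven't listed is a tool you can't assess.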

Classify your AI use cases

For each item on your inventory, consider which risk category it falls into. Most general business applications — content creation, customer support, data analysis — will be minimal or limited risk. Flag anything that touches HR decisions, safety-critical systems, or services delivered to EU customers for closer scrutiny.
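As a first pass, you could encode the red flags from this guide as a simple triage rule. The categories and thresholds below are our illustration only — this is a sorting aid, not a legal classification, and anything it flags simply warrants proper review:

```python
# Rough first-pass triage of AI use cases against the Act's risk tiers.
# Illustrative only: the area names are assumptions based on this guide,
# and a "not flagged" result is not a finding of compliance.
HIGH_RISK_AREAS = {"hr", "recruitment", "safety-critical", "education",
                   "essential services", "law enforcement"}

def triage(use_case_areas, serves_eu_customers):
    """Return a rough scrutiny level for one AI use case."""
    if HIGH_RISK_AREAS & set(use_case_areas):
        return "flag for closer scrutiny (possible high risk)"
    if serves_eu_customers:
        return "check EU AI Act scope (limited/minimal risk likely)"
    return "likely minimal risk (keep under review)"

print(triage(["recruitment"], serves_eu_customers=False))
print(triage(["content creation"], serves_eu_customers=True))
```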

Check your contracts with AI providers

If you are building products or services on top of AI models — using the OpenAI API, for example, to build a customer-facing application — your contract with that provider should address AI Act compliance. Review what obligations your provider is taking on and what remains with you as the deployer.

Document your AI governance

Even for low risk AI use, having some basic documentation of what AI you use, why, and what oversight mechanisms you have in place is good practice. If a client, investor, or regulator asks about your AI governance — increasingly common in procurement questionnaires — you want to be able to answer confidently.
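A one-page record per tool is usually enough to answer a procurement questionnaire. The sketch below generates such a record; the fields are a suggested minimum we've chosen for illustration, not a format mandated by any regulator:

```python
# Render a basic AI governance record as plain text. The fields are a
# suggested minimum for this sketch, not a regulator-mandated format.
def governance_record(tool, purpose, owner, oversight, last_reviewed):
    """Produce a one-page governance summary for a single AI tool."""
    lines = [
        f"Tool: {tool}",
        f"Purpose: {purpose}",
        f"Accountable owner: {owner}",
        f"Human oversight: {oversight}",
        f"Last reviewed: {last_reviewed}",
    ]
    return "\n".join(lines)

print(governance_record(
    tool="Customer service chatbot",
    purpose="First-line support triage",
    owner="Operations lead",
    oversight="Agent reviews escalations; chatbot cannot issue refunds",
    last_reviewed="2025-06-01",
))
```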

Stay informed on the UK framework

The UK is developing its own approach to AI regulation. The current position is principles-based and sector-specific rather than a single overarching law, but that is likely to evolve. Signing up to updates from the ICO and the AI Security Institute (formerly the AI Safety Institute) will keep you informed as the UK framework develops.

When do the obligations kick in?

The EU AI Act is being implemented in phases. The prohibitions on unacceptable risk AI systems applied from February 2025. Obligations for general purpose AI models apply from August 2025, and most obligations for high risk AI systems apply from August 2026. Codes of practice and guidelines for specific sectors are being developed throughout 2025 and 2026.

This means small businesses have time to prepare — but the August 2026 deadline for high risk AI is approaching faster than it might seem, particularly if you need to make changes to products or services to comply.

How does this interact with ISO 27001 and GDPR?

If you're already working toward ISO 27001 certification or maintaining GDPR compliance, you have a head start on EU AI Act readiness. ISO 27001's risk management framework maps well onto the AI Act's requirements for high risk AI systems. Your existing data protection practices, privacy notices, and records of processing activities all contribute to your AI Act compliance posture.

ISO 42001, the international standard for AI management systems, is designed to complement ISO 27001 and provides a structured framework for demonstrating responsible AI governance. If the EU AI Act is relevant to your business, ISO 42001 is worth considering as part of your compliance roadmap.

Managing AI Act compliance alongside your other frameworks

For most small businesses the challenge isn't understanding what the AI Act requires — it's managing compliance across multiple overlapping frameworks simultaneously. ISO 27001, Cyber Essentials, GDPR, and now the EU AI Act all have requirements that interact with each other. Tracking all of this on spreadsheets quickly becomes unmanageable.

SnapGRC brings your compliance frameworks together in one place — risk registers, controls, evidence, supplier assessments, and policy management — so you can manage your obligations across multiple standards without the overhead of separate spreadsheets for each one. If you're starting to think about AI Act readiness alongside your existing compliance programme, book a free demo to see how SnapGRC can help, or explore our compliance knowledge base for more guides like this one.