Pausing the AI Act: SME Implications


Since its adoption in May 2024, the EU AI Act has promised a predictable framework for regulating artificial intelligence across the European Union. But in mid-2025, the European Commission signalled that it may propose a delay to some of the Act’s implementation timelines, particularly for high-risk AI systems. For SMEs, a pause in the Act’s implementation would create both uncertainty and opportunity.

The discussion around a possible pause has sparked reactions across industry, civil society, and member states. While a delay may seem like relief for smaller organisations facing new compliance obligations, it does not mean that preparation can stop. SMEs that postpone internal governance may find themselves scrambling once the law becomes fully operational.

What is under discussion?

As of July 2025, the Commission is considering proposals to delay certain key deadlines related to the AI Act. The suggested pause does not affect the entry into force of the regulation itself, which remains in place. Instead, it would affect the timeline for enforcement, particularly:

  • The requirements for high-risk AI systems (Chapter III of the Act)
  • Obligations for general-purpose AI models (GPAI) under Chapter V
  • SME exemptions and transitional support mechanisms
  • Availability of harmonised standards and conformity assessment infrastructure

The delay is partly driven by the realisation that technical guidance, notified bodies, and supporting legislation are not yet fully in place. It also reflects concerns raised by industry stakeholders about the feasibility of meeting compliance expectations, especially among startups and SMEs.

Why does this matter for SMEs?

Many SMEs have limited capacity to track EU regulatory developments in real time. The promise of a delay may appear to reduce immediate pressure. However, several risks emerge from misinterpreting what this pause would actually mean.

It is not a repeal

The AI Act remains in effect. Its scope, structure, and requirements are already law. A delay in application does not remove any of the underlying obligations. It only shifts the timeline. Companies that treat this as a reason to disengage are likely to fall behind.

Clients and partners may move faster

Large companies and public authorities are continuing their compliance programmes. Many will expect suppliers, subcontractors, and digital service providers to align with the AI Act regardless of when specific provisions come into force. SMEs that wait may be excluded from procurement processes or preferred vendor lists.

Compliance takes time

Developing an AI governance framework, training staff, updating contracts, and documenting internal processes cannot be done overnight. The AI Act introduces complex obligations that affect product development, risk management, human oversight, and technical transparency. A short delay does not remove the need for early preparation.

What benefits could a pause offer?

Despite the risks, a well-structured delay could offer several advantages, particularly if used strategically.

  • More time to understand obligations and map affected systems
  • Better access to harmonised standards and guidance materials
  • Opportunity to consult with legal and technical experts
  • More cost-effective compliance planning, especially for smaller firms
  • Potential alignment with obligations under other digital regulations (e.g. the GDPR and the DSA)

A phased approach also allows SMEs to prioritise high-risk systems or core business areas rather than trying to address everything at once.

What SMEs should do during this period

The Commission’s proposal does not change the direction of AI regulation in the EU. For SMEs, this is the moment to take practical steps that lay the groundwork for compliance, regardless of any temporary adjustments to enforcement dates.

1. Conduct an internal AI system audit

Map all AI systems in use. This includes commercial tools, in-house models, and AI features embedded in software. Identify which systems may be considered high-risk under Annex III of the AI Act.

This does not require deep technical analysis. Start by documenting:

  • What the system does
  • Which departments use it
  • Who the vendor or provider is
  • Whether it makes decisions that affect people’s rights or safety
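For teams that prefer a structured starting point, the fields above can be captured in a simple spreadsheet or script. The sketch below is one hypothetical way to do it in Python; the record fields, the `possibly_high_risk` flag, and the example entries are illustrative assumptions, not terminology from the Act itself.

```python
import csv
from dataclasses import dataclass, asdict, fields

# A minimal inventory record mirroring the fields suggested above.
# Field names are illustrative, not drawn from the AI Act's own wording.
@dataclass
class AISystemRecord:
    name: str                 # what the system is called
    purpose: str              # short description of what it does
    departments: str          # which departments use it
    vendor: str               # vendor or provider (or "in-house")
    affects_rights: bool      # does it make decisions affecting rights or safety?
    possibly_high_risk: bool  # flag for follow-up review against Annex III

def export_inventory(records, path):
    """Write the inventory to a CSV file for legal or compliance review."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(
            f, fieldnames=[fld.name for fld in fields(AISystemRecord)]
        )
        writer.writeheader()
        for rec in records:
            writer.writerow(asdict(rec))

# Hypothetical example entries
inventory = [
    AISystemRecord("CV screening tool", "Ranks job applications", "HR",
                   "ExampleVendor Ltd", True, True),
    AISystemRecord("Chat assistant", "Answers customer FAQs", "Support",
                   "in-house", False, False),
]
export_inventory(inventory, "ai_inventory.csv")
```

Even a basic list like this gives the organisation a single place to flag systems that need closer legal analysis later.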

2. Review contracts and data handling

Check whether contracts with vendors include obligations related to AI governance, transparency, and risk sharing. Look for clauses that refer to AI usage, model documentation, or legal disclaimers. SMEs may be held accountable for tools they do not fully control.

Review how AI systems use input data, especially personal data. Ensure that use aligns with GDPR principles, including purpose limitation, fairness, and accuracy.

3. Identify key roles and responsibilities

Decide who within the organisation is responsible for AI oversight. This may include:

  • A governance lead (often the DPO, CTO, or COO)
  • A project team to track AI use across departments
  • Legal or compliance staff to monitor regulatory developments

Assigning responsibility early helps avoid confusion when compliance becomes mandatory.

4. Start staff awareness and training

AI governance is not just a legal or technical function. Employees across departments need basic awareness of how AI is used and what risks it might pose. Training should cover:

  • What qualifies as AI under the EU Act
  • What high-risk systems are and how to recognise them
  • What transparency and human oversight requirements involve
  • How to raise concerns or report issues

Even short introductory sessions can help create accountability across the organisation.

5. Monitor policy updates

Follow reliable sources such as the European Commission, official journals, and legal platforms. Subscribe to updates from your national authority or trade association. Key milestones to watch include:

  • 2 August 2025 – obligations for general-purpose AI models begin to apply
  • 2 August 2026 – most provisions, including the high-risk requirements, become applicable
  • 2 August 2027 – extended transition for high-risk AI embedded in products covered by existing EU product legislation
  • Any formal Commission proposal to amend these dates, and its progress through the legislative process

A delay is not a reset

The temptation to pause compliance efforts is understandable. But the fundamentals of the AI Act will not change. Its requirements are aligned with global trends and growing demands for responsible, transparent AI use.

For SMEs that want to grow, attract funding, or enter public procurement channels, aligning with the Act’s principles is not only prudent; in many sectors, it is already expected. A pause gives businesses more time, but it should not be treated as a reason to stop preparing.

Turning uncertainty into preparation

Periods of uncertainty can create space for smart planning. With the possibility of phased implementation or limited delays, SMEs have a rare opportunity to take stock, build internal understanding, and reduce last-minute stress.

The goal is not to implement every compliance requirement immediately. It is to make sure the organisation knows what AI is in use, who is responsible for it, what the associated risks are, and how those risks will be managed.

Taking these steps now will protect SMEs from reputational harm, project delays, or financial penalties in the future.

Final thoughts

If the Commission confirms a delay to key provisions of the AI Act, SMEs will benefit most by staying engaged and using the extra time wisely. Compliance is not a checkbox. It is an evolving process that helps organisations make better use of AI while respecting legal and ethical boundaries.

The SMEs that prepare during the transition, even if certain obligations are postponed, will be better positioned to compete in an AI-driven market. Waiting may feel easier in the short term, but acting now offers long-term advantages.
