

We are at risk of regulatory failure in the absence of technical standards for assessing compliance with the AI Act

We encourage you to read this publication on the challenges of implementing the AI Act, viewed through the availability of the technical standards needed to ensure regulatory compliance. The European AI Forum, the largest association of AI companies in Europe with more than 2,200 member entities, is a partner of the publication, which was developed by our colleagues from Germany in cooperation with leading European companies and universities.


The publication argues that technical standardisation (e.g. along the lines of ISO standards) is a key mechanism for ensuring compliance with the AI Act, especially for high-risk systems, because standards make the technical requirements clear and unambiguous. However, the roughly 30 planned standards (some of which are referenced in the document) cannot be implemented in a few months; realistically, at least 12 months are needed. We therefore risk regulatory failure and high penalties for companies.

The main challenges diagnosed in the implementation of AI standards include:

  • Too short a timeframe for implementing standards for high-risk systems. Companies are likely to be left with only around 6-8 months to implement the standards, when at least 12 months are realistically needed. The original deadline for the standards was April 2025, but this is likely to be pushed back to August 2025. Auditors must then also acquire new skills to certify companies, which will further shorten the time companies have for implementation.

  • Asymmetry of influence in the standardisation process. Large technology companies, including US players, have more influence over the design of standards, which can marginalise the interests of European SMEs and startups. We therefore encourage funding for expert participation in standardisation bodies.

  • Too many standards. The current standardisation work of CEN-CENELEC JTC 21 comprises about 35 activities, combining existing international standards with new European standards.

  • Double regulation. The AI Act imposes additional requirements on companies that already have to comply with other sectoral regulations.

  • High cost of access to standards. There is a risk that standards will be paid-for, so that even consulting a standard will require payment, making access more difficult for smaller companies.

  • High compliance costs. Small companies anticipate costs of up to €200,000 per year for compliance with the AI Act alone.

Policy recommendations:

  • Changing the compliance deadlines for high-risk systems to give companies a realistic timeframe. There is a real need to move the compliance deadline for high-risk systems from summer 2026 to summer 2027.

  • Lowering the barriers to participation in the standardisation process, especially for startups and SMEs, for example by publishing the standards as part of a legal act. There are already precedents for this in the EU.

  • Providing practical support for the implementation of standards within regulatory sandboxes, including financial support and technical guidance, which is key to effective compliance with the AI Act. While regulatory sandboxes have not lived up to expectations in the financial sector, they can play an important role in AI, provided they genuinely facilitate compliance. For this to happen, sandboxes should be able to certify participants or work closely with supervisors. A representative of the Commission body responsible for overseeing the AI Act should also be involved in the teams creating regulatory sandboxes.

  • Systemic involvement of SMEs in the implementation process through dedicated advisory forums and consultation channels, especially via industry chambers and NGOs.

  • Harmonisation of AI Act standards with existing industry and international standards, while remaining compatible with European values and fundamental rights.

The publication is available on the website of the European AI Forum, of which the foundation is one of the founders.

Author: Fundacja Digital Poland