BCS members have been divided in some community discussions about how the audit and assurance of technical systems can add value, if at all. Adam Leon Smith FBCS explains.

As I wrote in my last column on AI, the EU AI Act is somewhat forcing the issue. In some cases, it will require third-party audit before AI systems can be used; in others, it will require rigorous technical quality management systems to be self-assessed.

Steep costs?

The European Commission’s impact assessment estimates a 17% overhead on all AI spending. Compliance costs for the initial deployment of a high-risk system are estimated at €400,000, with additional maintenance costs each year. There is some debate around these numbers, but compliance won’t be cheap.

While the UK’s position on the EU AI Act is not yet clear, the Centre for Data Ethics and Innovation (CDEI) published a significant report in December: The roadmap to an effective AI assurance ecosystem. It positions the UK as a potential market leader in AI assurance, creating a new multi-billion pound industry. This aligns with the UK’s National AI Strategy, which likewise seeks to establish the UK as a leader in the governance of AI.

The assurance ecosystem

The report outlines priority enablers for the development of the assurance ecosystem, which naturally include demand and market participants. It also points to professionalisation as a key enabler, something that is core to BCS’ mission and an area where work is already underway.

It places considerable weight on technical standards and highlights the leading role the UK played in the development of ISO 9001 (quality management systems) and ISO/IEC 27001 (information security). The UK is playing a similar leading role in the development of AI standards.

ISO/IEC’s AI committee (ISO/IEC JTC 1/SC 42) has four major normative standards projects at the Draft International Standard (DIS) stage, which means they are open to public consultation.

These include two foundational standards defining the landscape, as well as the first standard directly relevant to the forthcoming legislation: ISO/IEC DIS 23894, Information technology - Artificial intelligence - Risk management.

There are also several major projects not far behind, including a quality model and a management system standard. These will likely be very important in implementing the assurance ecosystem and complying with upcoming legislation.

Impact on business

It remains to be seen whether the European Commission will adopt ISO/IEC standards or take a view of its own. However, given the amount of work still to do, and the lack of visible divergence between the EU and the international community on the technical detail, it seems likely that the ISO/IEC work will largely be adopted in Europe.

Interestingly, the European standardisation organisations CEN and CENELEC have recently confirmed that BSI remains a member. This means UK industry remains at the heart of European technical standards work, and can continue to have a voice in the development of EU AI regulation.

For those of us working on the technical details, these regulations and their supporting standards may seem a long way off, but there is a huge amount of work remaining. The cost to UK SMEs will be disproportionately high if they are forced to rely on legal advice to implement the regulation, without common technical standards to interpret it.