European Union legislators approved the AI Act on 13 March. It is due to come into law in phases, beginning in six months' time.

Other countries, including the UK, cannot ignore it. As with GDPR, organisations will need to comply with the Act's rules in order to do business with the EU.

A risk-based approach

The Act takes a “risk-based approach” to AI in products or services.
Low-risk systems, like those dealing with content recommendation, will come under lighter rules – they might just have to state clearly that they are powered by AI.

At the higher-risk end, AI in everything from medical devices to toys will face extra checks as part of existing safety regimes, and its use in particular scenarios such as assessing creditworthiness, employment or training will be regulated. Serious incidents, such as malfunctions causing a death or severe damage to property, must be reported.

Some AI uses are banned – these include:

  • ‘Social scoring’ systems that classify people based on their behaviour.
  • Real-time biometric face scanning by police in public spaces, except in cases involving serious crimes such as terrorism.
  • Some emotion recognition systems – in workplaces or schools.

Scrutiny for largest AI models

The largest AI models, such as OpenAI's GPT-4 and Google's Gemini, will face additional scrutiny.

Generative AI and deepfakes

AI-created deepfakes – video or audio featuring real people, events or places – must be clearly labelled as artificially manipulated.

Makers of general purpose AI models must give details of the text, pictures, video and other internet data used to train their systems. They will also have to comply with EU copyright law.

How are BCS members involved?

Adam Leon Smith, Chair of BCS' Fellows Technical Advisory Group (F-TAG) and CTO of Dragonfly, has been at the forefront of shaping international standards in AI for five years.

He explained:

“Article 40 of the AI Act outlines a radical role for technical standards. While technical standards have been part of product safety legislation for several decades, they have never before been used to explicitly underpin fundamental rights. This is a new direction for digital legislation that will soon be followed in the EU's Cyber Resilience Act.

“The UK remains an active member of the European Standardisation Organisations, and a number of senior BCS volunteers are providing leadership and technical guidance in establishing the state of the art both in Europe and on the international stage.”