Liability for Artificial Intelligence: explanations on the Expert Group’s report

During a meeting of the Legal Affairs Committee (JURI) on 9 January, a representative from the European Commission gave explanations on the report on “Liability for Artificial Intelligence and other emerging technologies”. Dirk Staudenmayer, Head of the Contract Law Unit at DG JUST, underlined that this report does not represent the official line of the Commission, as it is an Expert Group’s report. The Commission does not yet have a position on this issue; however, an initiative on AI is expected in the coming months.

The experts recommend distinguishing between high-risk applications and low-risk applications.

– High-risk applications: applications where 1) the public at large is exposed to a risk, and 2) a legal interest of high value is at stake (for instance, citizens’ lives). The AI also needs to be fully autonomous. Typically, autonomous vehicles and drones fall into this category.

– Low-risk applications: these cover most AI uses, such as smart homes. Medical surgery devices would also fall into this category, as they do not expose the public at large but “only” the patient.

Further analysis is available to Eurosmart members upon request.