Report on CEPANI40-CIArb YMG webinar on the impact of the AI Act on international arbitration

On 26 November 2024, CEPANI40 and the CIArb Young Members Group (YMG) organised a topical webinar, moderated by Lauren Rasking and Guillaume Croisant (CEPANI40) and Earvin Delgado (CIArb YMG), dedicated to exploring the implications, challenges, and opportunities that the EU AI Act presents for arbitral tribunals, arbitral institutions and, ultimately, their users.

Guillaume Croisant
Counsel, Linklaters

Dan Nechita, former lead technical negotiator for the EU AI Act on behalf of the European Parliament, outlined the Act's objectives, which aim to create uniform regulations across the European Union to foster public trust in AI technologies. He explained the technical framework, categorizing AI systems by risk levels and specifying obligations for high-risk applications, including safety measures and transparency requirements.

Ole Jensen, Managing Counsel at ArbBoutique, then discussed the relevance of the AI Act for international arbitration practice. In a nutshell, the AI Act will not impose specific obligations where an AI system is used merely as a “digital assistant” to the arbitrator and/or arbitral institution, provided its output does not materially influence the outcome of the decision-making. Arbitrators and arbitral institutions would then only be subject to the general obligation, applicable to all deployers, to have a sufficient level of “AI literacy”.

However, where an AI system is used irresponsibly, i.e. its output is relied upon to an extent that equates to a delegation of decision-making authority to the AI system, the EU AI Act’s rules on “high-risk” systems would be engaged. An arbitrator or arbitral institution deploying an AI system in such a high-risk manner incurs a number of obligations under the EU AI Act: (i) deployers must take sufficient technical and organisational measures to ensure that the system is used appropriately and that its operation is monitored accordingly; (ii) deployers of high-risk AI systems must also keep the logs generated by those systems for at least six months (to the extent that such logs are under their control); and (iii) deployers of high-risk AI systems that make, or assist in making, decisions relating to natural persons must inform those persons that they are subject to the use of the high-risk AI system.

Claire Morel de Westgaver, Partner at BCLP, presented recent survey findings on AI's use in arbitration, revealing a mix of opportunities and challenges, including concerns about cybersecurity and the integrity of evidence. The study underscored the importance of transparency and guidelines for AI use in arbitration to ensure compliance with relevant regulations and ethical duties.

Leonora Riesenburg, independent arbitrator, addressed the ethical implications of AI in legal practice, emphasizing that practitioners need both an understanding of ethical guidelines and sufficient technical knowledge to mitigate biases and ensure fairness. She flagged concerns about the reliability of AI tools and the potential for manipulation by sophisticated parties, as well as the corresponding need for adequate transparency.
