Enabling AI Safety Information Sharing: UK Competition Law Block Exemptions and Institutional Design

Published: 03 Nov 2025, Last Modified: 03 Dec 2025, EurIPS 2025 Workshop PAIG Poster, CC BY 4.0
Keywords: AI safety, information sharing, competition law, institutional design, antitrust
TL;DR: We classify AI safety information and design competition law block exemptions enabling labs to share critical information without antitrust violations.
Abstract: Frontier AI labs face a coordination failure: sharing safety-critical information could prevent systemic failures, but competition law designed to prevent collusion creates legal barriers to collaboration. This paper addresses this coordination failure through comparative institutional analysis and legal framework redesign for the UK ecosystem. Drawing on cybersecurity (ISACs/ISAOs, CISA) and pharmaceutical (EudraVigilance) precedents from the UK, EU, and US, we demonstrate how sector-specific legal exemptions paired with neutral clearinghouse institutions resolve tensions between competition enforcement and safety-critical information exchanges. We develop a two-dimensional taxonomy that maps technical AI information by commercial sensitivity and safety relevance, enabling clearinghouses and competition authorities to weigh antitrust risk against safety value. Applying UK Competition Act Chapter I analysis reveals that safety-critical information exchanges currently lack legal clarity; most crucially, the existing R&D Block Exemption Order (2022) does not protect post-deployment disclosures. Our analysis demonstrates that effective block exemptions require three design principles: (1) FRAND access, (2) anonymisation through neutral intermediaries, and (3) transparency requirements. We propose establishing the UK AI Security Institute (AISI) as a neutral clearinghouse and systematically evaluate nine institutional mechanisms to incentivise AI lab information sharing.
Submission Number: 26