<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Autonomous Systems Archives - Blakistons</title>
	<atom:link href="https://blakistons.co.uk/category/autonomous-systems/feed/" rel="self" type="application/rss+xml" />
	<link>https://blakistons.co.uk/category/autonomous-systems/</link>
	<description>Drone Law</description>
	<lastBuildDate>Thu, 06 Nov 2025 18:28:20 +0000</lastBuildDate>
	<language>en-GB</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>
	<item>
		<title>How Europe’s new AI rulebook would (and wouldn’t) touch autonomous combat aircraft—and what the defence carve-outs really mean</title>
		<link>https://blakistons.co.uk/how-europes-new-ai-rulebook-would-and-wouldnt-touch-autonomous-combat-aircraft-and-what-the-defence-carveouts-really-mean/</link>
		
		<dc:creator><![CDATA[admin.richard]]></dc:creator>
		<pubDate>Thu, 06 Nov 2025 18:28:20 +0000</pubDate>
				<category><![CDATA[AI and Drone Technology]]></category>
		<category><![CDATA[AI Governance and Ethics]]></category>
		<category><![CDATA[AI Regulation]]></category>
		<category><![CDATA[AI Technology]]></category>
		<category><![CDATA[Airspace Legislation]]></category>
		<category><![CDATA[Airspace Management]]></category>
		<category><![CDATA[Airspace Management and UTM Systems]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Autonomous Systems]]></category>
		<category><![CDATA[Autonomous Systems in Aviation]]></category>
		<category><![CDATA[Aviation Innovation]]></category>
		<category><![CDATA[Aviation Law and Regulations]]></category>
		<category><![CDATA[Aviation Regulation]]></category>
		<category><![CDATA[Aviation Regulations]]></category>
		<category><![CDATA[Aviation Safety]]></category>
		<category><![CDATA[Aviation Security]]></category>
		<category><![CDATA[Aviation Technology]]></category>
		<category><![CDATA[Civil Aviation]]></category>
		<category><![CDATA[Defence]]></category>
		<category><![CDATA[Defence Procurement]]></category>
		<category><![CDATA[Defence Technology]]></category>
		<category><![CDATA[Defense Innovation]]></category>
		<category><![CDATA[EASA]]></category>
		<category><![CDATA[Emerging Technologies]]></category>
		<category><![CDATA[Emerging Technologies in Logistics]]></category>
		<category><![CDATA[EU AI Act Compliance]]></category>
		<category><![CDATA[EU Regulations and Compliance]]></category>
		<category><![CDATA[European Union Policy Updates]]></category>
		<category><![CDATA[Future Trends]]></category>
		<category><![CDATA[Government Reports]]></category>
		<category><![CDATA[High-Risk AI Applications]]></category>
		<category><![CDATA[Law Enforcement]]></category>
		<category><![CDATA[Legal Analysis and Recommendations]]></category>
		<category><![CDATA[Legal Challenges]]></category>
		<category><![CDATA[Legal Conflicts]]></category>
		<category><![CDATA[Legal Frameworks]]></category>
		<category><![CDATA[Legal Implications of AI]]></category>
		<category><![CDATA[Legal Insights]]></category>
		<category><![CDATA[Legal Insights for Drone Industry]]></category>
		<category><![CDATA[Legal Updates]]></category>
		<category><![CDATA[Local Government Policies]]></category>
		<category><![CDATA[Local Government Policy]]></category>
		<category><![CDATA[Military Law]]></category>
		<category><![CDATA[Military Procurement]]></category>
		<category><![CDATA[Military Technology]]></category>
		<category><![CDATA[National Security]]></category>
		<category><![CDATA[Public Safety]]></category>
		<category><![CDATA[Regulations and Compliance]]></category>
		<category><![CDATA[Regulatory and Legal Compliance]]></category>
		<category><![CDATA[Regulatory Compliance Strategies]]></category>
		<category><![CDATA[Regulatory Oversight]]></category>
		<category><![CDATA[Risk Assessment & Mitigation]]></category>
		<category><![CDATA[Risk Management and Insurance]]></category>
		<category><![CDATA[Safety and Risk Management - Emphasizes safety protocols]]></category>
		<category><![CDATA[Safety and Security in Aviation]]></category>
		<category><![CDATA[Tech Law and Regulation]]></category>
		<category><![CDATA[Technological Innovations in Drones]]></category>
		<category><![CDATA[Technology and Innovation]]></category>
		<category><![CDATA[UK Aviation Law]]></category>
		<category><![CDATA[UK Defence Policy]]></category>
		<category><![CDATA[UK Government Policy]]></category>
		<category><![CDATA[autonomous fighters]]></category>
		<category><![CDATA[biometric identification]]></category>
		<category><![CDATA[CE marking]]></category>
		<category><![CDATA[defence law]]></category>
		<category><![CDATA[drone law]]></category>
		<category><![CDATA[EU AI Act]]></category>
		<category><![CDATA[EU aviation law]]></category>
		<category><![CDATA[GA-ASI]]></category>
		<category><![CDATA[GPAI]]></category>
		<category><![CDATA[high-risk AI]]></category>
		<category><![CDATA[International Fighter Conference]]></category>
		<category><![CDATA[manned-unmanned teaming]]></category>
		<category><![CDATA[real-world testing]]></category>
		<category><![CDATA[Rome]]></category>
		<category><![CDATA[UAS]]></category>
		<guid isPermaLink="false">https://blakistons.co.uk/?p=2624</guid>

					<description><![CDATA[<p>By Richard Ryan, barrister and drone lawyer How Europe’s new AI rulebook would (and wouldn’t) touch autonomous combat aircraft — and what the defence carve-outs really mean. In Brief&#8230; Purely military AI systems are out of scope of the EU AI Act. If an AI system is developed or used exclusively for military/defence or national-security [&#8230;]</p>
<p>The post <a href="https://blakistons.co.uk/how-europes-new-ai-rulebook-would-and-wouldnt-touch-autonomous-combat-aircraft-and-what-the-defence-carveouts-really-mean/">How Europe’s new AI rulebook would (and wouldn’t) touch autonomous combat aircraft—and what the defence carve-outs really mean</a> appeared first on <a href="https://blakistons.co.uk">Blakistons</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><!-- Begin blog content (no title; WordPress provides its own) --></p>
<div>
<p>By Richard Ryan, barrister and drone lawyer</p>
<p><em>How Europe’s new AI rulebook would (and wouldn’t) touch autonomous combat aircraft — and what the defence carve-outs really mean.</em></p>
<hr />
<h3>In Brief&#8230;</h3>
<ul>
<li><strong>Purely military AI systems are out of scope</strong> of the EU AI Act. If an AI system is <strong>developed or used exclusively for military/defence or national-security purposes</strong>, the Act does not apply. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
<li><strong>Dual-use is different.</strong> If the same autonomy stack, sensors or models are marketed or used for <strong>civilian</strong> purposes in the EU (for example, civil UAS, border or law-enforcement tasks), the Act can apply — with stringent duties for “high-risk” systems. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
<li><strong>Real-world testing is regulated.</strong> Pre-market R&amp;D is generally excluded, <strong>but real-world testing isn’t</strong> — it requires specific safeguards and registration. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
<li><strong>Foundation models (GPAI)</strong> have their own rules from <strong>2 Aug 2025</strong>; the defence carve-out in the Act is written for <strong>AI systems</strong>, not explicitly for <strong>models</strong>. If a model is placed on the EU market generally, the provider’s GPAI obligations can still bite. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
</ul>
<blockquote>
<p><strong>Context:</strong> sUAS News reports that GA-ASI is showcasing its autonomous fighter portfolio (for example, YFQ-42A CCA, MQ-20 Avenger) at the International Fighter Conference in Rome, 4–6 Nov 2025. This post overlays that scenario with the EU AI Act’s rules.</p>
</blockquote>
<hr />
<h2>1) First principles: When does the EU AI Act apply?</h2>
<p>The Act has <strong>extraterritorial reach</strong>. It covers (i) providers and deployers in the EU, (ii) providers placing on the EU market or putting systems into service in the EU — even if they are not established here — and (iii) providers/deployers in third countries <strong>where the AI system’s output is used in the EU</strong>. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</p>
<p>However, <strong>Article 2(3)</strong> draws a bright line: the Act <strong>does not apply</strong> to <strong>AI systems used exclusively</strong> for <strong>military, defence or national security</strong>. It also does not apply where a system is <strong>not</strong> placed on the EU market but its <strong>output is used in the EU exclusively</strong> for those purposes. Recital 24 reiterates this and clarifies that <strong>non-defence use falls back under the Act</strong>. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</p>
<p><strong>What this means in Rome:</strong></p>
<ul>
<li>A <strong>closed, defence-only</strong> showcase for European militaries: <strong>out of scope</strong>.</li>
<li>A <strong>civil-use pitch</strong>, civil flight trials, or plans to sell autonomy modules to <strong>EU civilian buyers</strong>: <strong>in scope</strong> (see the high-risk section below). (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
</ul>
<hr />
<h2>2) The key defence carve-outs (and their limits)</h2>
<p><strong>Carve-out #1 — Defence/military:</strong></p>
<blockquote>
<p>“This Regulation shall not apply to AI systems … used exclusively for military, defence or national security purposes.” (Article 2(3))</p>
</blockquote>
<p>Two important nuances:</p>
<ul>
<li><strong>Exclusivity matters.</strong> The moment an autonomy stack or sensor suite is also <strong>marketed or used for civilian</strong> or law-enforcement tasks, the <strong>defence exclusion no longer shields those non-defence uses</strong>. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
<li><strong>Models vs systems.</strong> The text explicitly excludes <strong>AI systems</strong> for defence; it <strong>does not create an explicit defence exclusion for general-purpose AI models</strong>. If a <strong>GPAI model</strong> is <strong>placed on the EU market</strong>, Chapter V obligations for model providers can still apply — even if one downstream customer is a defence user. (More on GPAI below.) (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
</ul>
<p><strong>Carve-out #2 — Pre-market R&amp;D:</strong><br />
  R&amp;D <strong>before</strong> placing on the market is generally outside scope, <strong>but real-world testing is not</strong>. Testing in real-world conditions triggers a dedicated regime (for example, registration, time limits, informed consent or special conditions for law enforcement, incident reporting). (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</p>
<p><strong>Carve-out #3 — Emergency derogations (non-defence):</strong><br />
  For <strong>exceptional public-security reasons</strong> (or imminent threats to life/health), <strong>market surveillance authorities</strong> can authorise <strong>temporary use</strong> of a high-risk AI system <strong>before</strong> full conformity assessment — subject to strict conditions. Law-enforcement or civil-protection bodies can also use in urgent cases, then seek authorisation without undue delay. This is <strong>not</strong> a defence-specific carve-out, but it explains emergency deployments outside the military context. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</p>
<hr />
<h2>3) If the defence exclusion doesn’t apply, would autonomous-fighter tech be “high-risk”?</h2>
<p>Very likely <strong>yes</strong> — for <strong>civil</strong> variants or dual-use spin-outs:</p>
<ul>
<li><strong>Annex I (product-safety route).</strong> AI that is a <strong>safety component</strong> of products covered by sectoral EU safety laws is <strong>high-risk</strong> where those products need <strong>third-party conformity assessment</strong>. That list <strong>explicitly includes EU civil aviation law (Reg. 2018/1139)</strong> — covering <strong>unmanned aircraft</strong> and their remotely controllable equipment. In a civil-UAS configuration, an autonomy stack acting as a safety component would be regulated as <strong>high-risk</strong>. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
<li><strong>Annex III (stand-alone uses).</strong> Separate “high-risk” buckets also capture, for example, <strong>remote biometric identification</strong> and other sensitive functions (if and where permitted by Union/national law), <strong>critical infrastructure</strong> safety components, and more. If a fighter-borne sensing suite were repurposed for <strong>civil border surveillance</strong> or <strong>public-space identification</strong>, you quickly hit these Annex III categories. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
</ul>
<p><strong>What “high-risk” demands in practice</strong><br />
  Providers must implement a <strong>risk-management system</strong>, <strong>data governance</strong>, <strong>technical documentation</strong>, <strong>logging</strong>, <strong>transparency/instructions</strong>, <strong>human oversight</strong>, and <strong>accuracy/robustness/cybersecurity</strong> — then pass <strong>conformity assessment</strong>, issue an <strong>EU Declaration of Conformity</strong>, and affix <strong>CE marking</strong>. Deployers also carry duties (for example, monitoring, data relevance, user notification in some cases). (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</p>
<hr />
<h2>4) Sensors on show: what about face recognition and other “red lines”?</h2>
<p>The <strong>EU bans</strong> several AI practices outright (from <strong>2 Feb 2025</strong>), including:</p>
<ul>
<li><strong>Untargeted scraping</strong> of facial images to build recognition databases.</li>
<li><strong>Biometric categorisation</strong> inferring sensitive traits (for example, race, political opinions, religion).</li>
<li><strong>Emotion recognition</strong> in workplaces or schools (with narrow safety/medical exceptions).</li>
<li><strong>Predictive “risk assessments”</strong> of criminality based solely on personality traits/profiling.</li>
<li><strong>Real-time remote biometric identification (RBI) in public spaces for law enforcement</strong> — <strong>unless</strong> strictly authorised and necessary for narrowly defined objectives (for example, locating a specific suspect in serious crimes, preventing a specific imminent threat, finding missing persons), with prior judicial/independent approval and registration. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
</ul>
<p><strong>Implication for a trade-show demo:</strong> training a camera on attendees to test <strong>real-time RBI</strong> in a public venue would <strong>likely be unlawful</strong> unless those strict law-enforcement exceptions and procedural safeguards apply — which they typically <strong>will not</strong> at a commercial defence conference. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</p>
<hr />
<h2>5) Real-world testing in the EU (civil or dual-use variants)</h2>
<p>If a provider runs <strong>real-world flight tests</strong> in the EU (outside the defence exclusion), the Act requires — among other things — <strong>registration</strong>, an EU-established entity or <strong>EU legal representative</strong>, limits on <strong>duration</strong> (normally up to six months, extendable once), rules on <strong>informed consent</strong> (with special handling for law-enforcement tests), <strong>qualified oversight</strong>, and the ability to <strong>reverse/ignore</strong> the system’s outputs. <strong>Serious incidents</strong> must be reported promptly. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</p>
<hr />
<h2>6) Foundation models (GPAI): obligations can still attach</h2>
<p>From <strong>2 Aug 2025</strong>, <strong>Chapter V</strong> sets <strong>baseline transparency and copyright-policy duties</strong> for <strong>providers of general-purpose AI models</strong> (with extra obligations if the model presents <strong>systemic risks</strong>). The defence exclusion in Article 2(3) is framed for <strong>AI systems</strong>, not <strong>models</strong>. So, if a foundation model is <strong>placed on the EU market</strong>, the <strong>model provider</strong> can have obligations even if a downstream customer is a defence prime. (Open-source specifics and systemic-risk thresholds also apply.) (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</p>
<hr />
<h2>7) Timelines you need in Rome (as of 6 Nov 2025)</h2>
<ul>
<li><strong>Entry into force:</strong> 1 Aug 2024 (20 days after OJ publication).</li>
<li><strong>Prohibited practices + core chapters (I–II):</strong> apply from <strong>2 Feb 2025</strong>.</li>
<li><strong>GPAI rules (Chapter V), plus other chapters (III §4, VII, XII, and Article 78):</strong> apply from <strong>2 Aug 2025</strong>.</li>
<li><strong>General application:</strong> <strong>2 Aug 2026</strong> (high-risk regime starts to bite broadly).</li>
<li><strong>Article 6(1) Annex III classification trigger &amp; related obligations:</strong> <strong>2 Aug 2027</strong>. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
</ul>
<hr />
<h2>8) Enforcement and penalties</h2>
<ul>
<li>Violating <strong>prohibited practices</strong> (Article 5) can draw fines up to <strong>€35m or 7%</strong> of worldwide annual turnover, whichever is higher.</li>
<li>Other operator obligations can reach <strong>€15m or 3%</strong>; supplying <strong>misleading information</strong> can reach <strong>€7.5m or 1%</strong> (for SMEs, the lower of each pair of amounts applies). Separate fine scales apply to EU institutions. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
</ul>
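<p>As a rough illustration only, the headline caps above all share one shape: the higher of a fixed amount or a percentage of worldwide annual turnover. A minimal sketch with illustrative figures (actual fines are set case by case under Article 99, weighing many factors this sketch ignores):</p>

```python
def max_fine(fixed_cap_eur: float, turnover_fraction: float,
             worldwide_turnover_eur: float) -> float:
    # Headline cap: the higher of the fixed amount or the
    # turnover-based amount (illustrative only; real fines are
    # set case by case under Article 99).
    return max(fixed_cap_eur, turnover_fraction * worldwide_turnover_eur)

# Prohibited-practice tier for a firm with EUR 2bn worldwide turnover:
# 7% of turnover (EUR 140m) exceeds the EUR 35m fixed cap.
prohibited_cap = max_fine(35_000_000, 0.07, 2_000_000_000)  # 140_000_000.0
```

<p>For a smaller firm with, say, €100m turnover, 7% is only €7m, so the €35m fixed cap is the operative ceiling instead.</p>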
<hr />
<h2>9) Practical playbook for IFC attendees</h2>
<p><strong>If you are a defence OEM showing autonomy stacks:</strong></p>
<ol>
<li><strong>Map uses</strong>: Defence-only (excluded) vs <strong>any civil or law-enforcement</strong> pathways (potentially in scope). Document the <strong>exclusivity</strong> of defence deployments if you rely on the carve-out. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
<li><strong>GPAI suppliers</strong>: If you place a <strong>foundation model</strong> on the EU market, expect <strong>Chapter V</strong> duties regardless of defence customers. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
<li><strong>No RBI demos</strong> on the show floor. Those prohibitions already apply in 2025. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
<li><strong>Planning EU flight tests</strong> for civil variants? Prepare for <strong>real-world testing</strong> conditions (registration, oversight, incident reporting). (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
<li>For <strong>civil UAS commercialisation</strong>, treat your autonomy as <strong>high-risk</strong> (EASA product-safety route), budget time for <strong>conformity assessment</strong> and <strong>CE marking</strong>. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
</ol>
<p><strong>If you are a European ministry or agency:</strong></p>
<ul>
<li>Distinguish <strong>military operations</strong> (out of scope) from <strong>law-enforcement or border</strong> uses (in scope; watch <strong>RBI</strong> limits and high-risk duties). Consider <strong>Article 46</strong> emergency derogations only in <strong>exceptional</strong> and <strong>documented</strong> cases. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
</ul>
<p><strong>If you are a civil UAS integrator:</strong></p>
<ul>
<li>Expect the full <strong>high-risk</strong> package (risk management, data governance, human oversight, cybersecurity, logs, conformity assessment, CE). Build compliance into your <strong>system architecture</strong>, <strong>ML pipelines</strong>, <strong>safety cases</strong>, and <strong>ops manuals</strong> from day one. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
</ul>
<hr />
<h2>10) Quick decision pathway</h2>
<ol>
<li><strong>Is the use exclusively defence or national security?</strong><br />
      Yes: AI <strong>system</strong> is <strong>out of scope</strong>.<br />
      No: continue. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)
    </li>
<li><strong>Is it a civil product or law-enforcement/border use?</strong><br />
      Civil product with safety function (for example, civil UAS): <strong>High-risk</strong> via <strong>Annex I</strong> → conformity assessment + CE. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)<br />
      Stand-alone sensitive use (for example, RBI, critical infrastructure): <strong>Annex III</strong> high-risk or <strong>Article 5</strong> prohibition applies. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)
    </li>
<li><strong>Is there a GPAI model being placed on the EU market?</strong><br />
      Yes: <strong>Chapter V</strong> duties for <strong>model providers</strong> from <strong>2 Aug 2025</strong>, separate from the defence carve-out for systems. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)
    </li>
<li><strong>Is this pre-market testing?</strong><br />
      <strong>Real-world testing</strong> rules apply (registration, oversight, incident reporting). (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)
    </li>
</ol>
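<p>The four-step pathway above can be expressed as a simple triage function. This is an illustrative sketch, not legal advice: the flags and returned labels are my own shorthand, and real classification turns on the detailed wording of Articles 2, 5, 6 and the Annexes.</p>

```python
def ai_act_triage(exclusively_defence: bool,
                  civil_safety_component: bool = False,
                  annex_iii_use: bool = False,
                  gpai_on_eu_market: bool = False,
                  real_world_testing: bool = False) -> list[str]:
    """Return the EU AI Act workstreams engaged, mirroring the pathway above."""
    if exclusively_defence:
        # Step 1: Art. 2(3) carve-out for the AI system itself.
        return ["out of scope (Art. 2(3))"]
    duties = []
    if civil_safety_component:
        # Step 2a: Annex I product-safety route (e.g. civil UAS).
        duties.append("high-risk via Annex I: conformity assessment + CE marking")
    if annex_iii_use:
        # Step 2b: stand-alone sensitive uses; check Art. 5 prohibitions first.
        duties.append("Annex III high-risk duties (after an Art. 5 prohibition check)")
    if gpai_on_eu_market:
        # Step 3: Chapter V attaches to the model provider, not the system.
        duties.append("Chapter V GPAI provider obligations (from 2 Aug 2025)")
    if real_world_testing:
        # Step 4: real-world testing regime (registration, oversight, reporting).
        duties.append("real-world testing conditions (Arts. 57-61)")
    return duties
```

<p>For example, a civil UAS variant undergoing EU flight trials (<code>civil_safety_component=True, real_world_testing=True</code>) collects the Annex I and testing workstreams, while any exclusively defence configuration short-circuits at step 1.</p>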
<hr />
<h3>Bottom line for “Autonomous Fighters in Rome”</h3>
<ul>
<li>A <strong>military-only</strong> display of GA-ASI’s autonomous fighters is <strong>outside</strong> the AI Act.</li>
<li>Any <strong>civil</strong> spin-off (cargo drones, civil surveillance, airport ops) or <strong>law-enforcement</strong> application in the EU will trigger the Act — often at the <strong>high-risk</strong> level — together with <strong>tight prohibitions</strong> around biometric uses in public spaces. Plan your <strong>compliance architecture</strong> accordingly. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
</ul>
<p><em>This article is informational and not legal advice. Citations are to the Official Journal text of the <strong>Artificial Intelligence Act (Regulation (EU) 2024/1689)</strong> for scope (Art. 2), prohibitions (Art. 5), high-risk regime (Ch. III), real-world testing (Arts. 57–61), GPAI (Ch. V incl. Art. 53), timelines (Art. 113), and penalties (Art. 99–101).</em></p>
<hr />
<section aria-label="Author bio">
<p><strong>About the author — Richard Ryan</strong></p>
<p>Richard Ryan is a UK barrister (Direct Access), mediator and Chartered Arbitrator (FCIArb), and a Bencher of Gray’s Inn. He practises across defence, aerospace, construction, engineering and commodities, with a leading specialism in drone and counter-drone law, unmanned aviation regulation, and AI-enabled safety and compliance. Richard advises government, primes and operators on EU/UK UAS frameworks, BVLOS, U-space/UTM and the EU AI Act. He leads Blakiston’s Chambers and contributes regularly to industry guidance and policy consultations.</p>
</section>
</div>
<p><!-- End blog content --></p>
<p>The post <a href="https://blakistons.co.uk/how-europes-new-ai-rulebook-would-and-wouldnt-touch-autonomous-combat-aircraft-and-what-the-defence-carveouts-really-mean/">How Europe’s new AI rulebook would (and wouldn’t) touch autonomous combat aircraft—and what the defence carve-outs really mean</a> appeared first on <a href="https://blakistons.co.uk">Blakistons</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>AI Drone Swarms and the EU AI Act: A Game-Changer in Modern Warfare?</title>
		<link>https://blakistons.co.uk/ai-drone-swarms-and-the-eu-ai-act-a-game-changer-in-modern-warfare/</link>
		
		<dc:creator><![CDATA[admin.richard]]></dc:creator>
		<pubDate>Sat, 16 Nov 2024 17:44:52 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[AUKUS Partnership]]></category>
		<category><![CDATA[Autonomous Systems]]></category>
		<category><![CDATA[Defense Innovation]]></category>
		<category><![CDATA[Drone Swarms]]></category>
		<category><![CDATA[Ethical Considerations in AI]]></category>
		<category><![CDATA[EU AI Act Compliance]]></category>
		<category><![CDATA[Intellectual Property]]></category>
		<category><![CDATA[International Security]]></category>
		<category><![CDATA[Legal Implications of AI]]></category>
		<category><![CDATA[Military Law]]></category>
		<category><![CDATA[Regulatory Oversight]]></category>
		<category><![CDATA[Simulation and Sandboxing]]></category>
		<category><![CDATA[Technology and Warfare]]></category>
		<category><![CDATA[AI Drone Swarms]]></category>
		<category><![CDATA[AI Ethics]]></category>
		<category><![CDATA[Artificial Intelligence in Warfare]]></category>
		<category><![CDATA[AUKUS Trials]]></category>
		<category><![CDATA[Autonomous Military Technology]]></category>
		<category><![CDATA[Autonomous Systems Oversight]]></category>
		<category><![CDATA[Defence Innovation]]></category>
		<category><![CDATA[drone law]]></category>
		<category><![CDATA[Drone Swarm Risks]]></category>
		<category><![CDATA[EU AI Act]]></category>
		<category><![CDATA[Intellectual Property Risks]]></category>
		<category><![CDATA[Military AI Regulations]]></category>
		<category><![CDATA[regulatory compliance]]></category>
		<category><![CDATA[Richard Ryan]]></category>
		<category><![CDATA[Simulation Sandbox]]></category>
		<guid isPermaLink="false">https://blakistons.co.uk/?p=2504</guid>

					<description><![CDATA[<p>AI Drone Swarms and the EU AI Act: A Game-Changer in Modern Warfare? By Richard Ryan, Drone Lawyer The recent trials conducted by the AUKUS nations—Australia, the United Kingdom, and the United States—mark a significant milestone in the integration of artificial intelligence (AI) and autonomy within military operations. The deployment of AI-enabled uncrewed aerial vehicles [&#8230;]</p>
<p>The post <a href="https://blakistons.co.uk/ai-drone-swarms-and-the-eu-ai-act-a-game-changer-in-modern-warfare/">AI Drone Swarms and the EU AI Act: A Game-Changer in Modern Warfare?</a> appeared first on <a href="https://blakistons.co.uk">Blakistons</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><img fetchpriority="high" decoding="async" src="https://blakistons.co.uk/wp-content/uploads/2024/11/241116_AI-Drone-Swarms-and-the-EU-AI-Act-A-Game-Changer-in-Modern-Warfare-300x171.webp" alt="" width="300" height="171" class="alignnone size-medium wp-image-2505" srcset="https://blakistons.co.uk/wp-content/uploads/2024/11/241116_AI-Drone-Swarms-and-the-EU-AI-Act-A-Game-Changer-in-Modern-Warfare-300x171.webp 300w, https://blakistons.co.uk/wp-content/uploads/2024/11/241116_AI-Drone-Swarms-and-the-EU-AI-Act-A-Game-Changer-in-Modern-Warfare-1024x585.webp 1024w, https://blakistons.co.uk/wp-content/uploads/2024/11/241116_AI-Drone-Swarms-and-the-EU-AI-Act-A-Game-Changer-in-Modern-Warfare-768x439.webp 768w, https://blakistons.co.uk/wp-content/uploads/2024/11/241116_AI-Drone-Swarms-and-the-EU-AI-Act-A-Game-Changer-in-Modern-Warfare-1536x878.webp 1536w, https://blakistons.co.uk/wp-content/uploads/2024/11/241116_AI-Drone-Swarms-and-the-EU-AI-Act-A-Game-Changer-in-Modern-Warfare-600x343.webp 600w, https://blakistons.co.uk/wp-content/uploads/2024/11/241116_AI-Drone-Swarms-and-the-EU-AI-Act-A-Game-Changer-in-Modern-Warfare.webp 1792w" sizes="(max-width: 300px) 100vw, 300px" /></p>
<p><strong>AI Drone Swarms and the EU AI Act: A Game-Changer in Modern Warfare?</strong></p>
<p><strong>By Richard Ryan, Drone Lawyer</strong></p>
<p>The recent trials conducted by the AUKUS nations—Australia, the United Kingdom, and the United States—mark a significant milestone in the integration of artificial intelligence (AI) and autonomy within military operations. The deployment of AI-enabled uncrewed aerial vehicles (UAVs) capable of locating, disabling, and destroying ground targets presents both remarkable advancements and complex legal challenges, particularly in the context of the European Union&#8217;s AI Act.</p>
<p>As a drone lawyer with over 20 years of experience in the UK, I find it imperative to dissect the interaction between these groundbreaking trials and the regulatory landscape shaped by the EU AI Act. This discussion aims to highlight the risks, oversight issues, and intellectual property considerations that arise when integrating AI algorithms into military UAV swarms.</p>
<p><strong>Understanding the EU AI Act&#8217;s Impact</strong></p>
<p>The EU AI Act seeks to establish a comprehensive regulatory framework for AI technologies, focusing on transparency, accountability, and human oversight. High-risk AI systems, which include those used in critical infrastructure and law enforcement, are subject to stringent requirements. Military applications, while often exempt from certain civilian regulations, still operate under international humanitarian laws and ethical guidelines that resonate with the Act&#8217;s principles.</p>
<p>The AUKUS trials demonstrate the use of AI in autonomous systems for military purposes. The AI-enabled UAVs operated collaboratively, sharing data seamlessly across nations. While the Act primarily governs civilian AI use within the EU, the ethical considerations it embodies cannot be ignored in military contexts, especially when such technologies might eventually influence civilian sectors.</p>
<p><strong>Risks and Oversight Challenges</strong></p>
<p>One of the foremost risks is the potential for AI algorithms to make autonomous decisions without adequate human oversight. The EU AI Act emphasises the necessity of meaningful human control over AI systems, particularly those capable of impacting human lives. In the AUKUS trials, although a human operator was involved, the level of autonomy granted to the UAVs raises questions about compliance with the Act&#8217;s standards if similar technologies were deployed within the EU.</p>
<p>Data exchange and interoperability between the three nations introduce another layer of complexity. The seamless sharing of information enhances operational efficiency but also raises concerns about data protection and cybersecurity. Ensuring that sensitive data transmitted between UAVs and control systems is secure aligns with the Act&#8217;s requirements for robust data governance.</p>
<p><strong>The Case for a Simulation Sandbox</strong></p>
<p>To address compliance with the EU AI Act, conducting such trials within a simulation sandbox could be a prudent approach. A sandbox environment allows for the testing and validation of AI algorithms in a controlled setting, mitigating risks associated with real-world deployment. It enables developers to assess the AI&#8217;s decision-making processes, identify potential flaws, and ensure adherence to ethical and legal standards before actual implementation.</p>
<p>Moreover, a sandbox can facilitate transparency and accountability, key tenets of the EU AI Act. By documenting the AI&#8217;s performance and decision rationale within simulations, stakeholders can provide evidence of compliance and readiness for safe deployment.</p>
<p><strong>Intellectual Property Considerations</strong></p>
<p>Introducing AI algorithms into a regulatory sandbox presents intellectual property (IP) risks that must be carefully managed. Proprietary algorithms and technologies shared within the sandbox could be exposed to unauthorized access or misuse. Protecting IP rights is crucial to encourage innovation and maintain competitive advantages.</p>
<p>To mitigate these risks, clear agreements outlining the ownership, usage rights, and confidentiality obligations related to the AI algorithms are essential. Collaborative efforts, such as those seen in the AUKUS trials, require robust legal frameworks to safeguard each party&#8217;s IP while promoting shared development goals.</p>
<p><strong>Conclusion</strong></p>
<p>The integration of AI and autonomous systems in military applications is an evolving frontier that necessitates careful navigation of legal and ethical landscapes. The EU AI Act, while primarily focused on civilian applications, provides valuable guidance on managing high-risk AI systems.</p>
<p>By recognising the risks and oversight challenges presented by the AUKUS AI-enabled UAV trials, stakeholders can proactively address compliance issues. Utilising simulation sandboxes offers a viable pathway to refine these technologies within the bounds of regulatory requirements.</p>
<p>Intellectual property considerations remain a critical aspect of this process. Ensuring that AI algorithms are protected within collaborative environments will foster innovation while maintaining legal integrity.</p>
<p>As we advance into this new era of AI-driven military capabilities, a balanced approach that harmonises technological potential with regulatory compliance will be essential. The lessons learned from these trials will undoubtedly shape the future of AI in both military and civilian spheres.</p>
<p>&#8212;</p>
<p><strong>About Richard Ryan</strong></p>
<p>Richard Ryan is a leading drone lawyer based in the United Kingdom, with over 20 years of legal experience as a direct access barrister. Specialising in the legal aspects of unmanned aerial systems and AI technologies, Richard has advised government agencies, defence contractors, and private enterprises on compliance, intellectual property, and regulatory matters. His extensive expertise bridges the gap between cutting-edge technological advancements and the complex legal frameworks that govern them.</p>
<p>The post <a href="https://blakistons.co.uk/ai-drone-swarms-and-the-eu-ai-act-a-game-changer-in-modern-warfare/">AI Drone Swarms and the EU AI Act: A Game-Changer in Modern Warfare?</a> appeared first on <a href="https://blakistons.co.uk">Blakistons</a>.</p>
]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
