<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>National Security Archives - Blakistons</title>
	<atom:link href="https://blakistons.co.uk/category/national-security/feed/" rel="self" type="application/rss+xml" />
	<link>https://blakistons.co.uk/category/national-security/</link>
	<description>Drone Law</description>
	<lastBuildDate>Thu, 06 Nov 2025 18:28:20 +0000</lastBuildDate>
	<language>en-GB</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>
	<item>
		<title>How Europe’s new AI rulebook would (and wouldn’t) touch autonomous combat aircraft—and what the defence carve-outs really mean</title>
		<link>https://blakistons.co.uk/how-europes-new-ai-rulebook-would-and-wouldnt-touch-autonomous-combat-aircraft-and-what-the-defence-carveouts-really-mean/</link>
		
		<dc:creator><![CDATA[admin.richard]]></dc:creator>
		<pubDate>Thu, 06 Nov 2025 18:28:20 +0000</pubDate>
				<category><![CDATA[AI and Drone Technology]]></category>
		<category><![CDATA[AI Governance and Ethics]]></category>
		<category><![CDATA[AI Regulation]]></category>
		<category><![CDATA[AI Technology]]></category>
		<category><![CDATA[Airspace Legislation]]></category>
		<category><![CDATA[Airspace Management]]></category>
		<category><![CDATA[Airspace Management and UTM Systems]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Autonomous Systems]]></category>
		<category><![CDATA[Autonomous Systems in Aviation]]></category>
		<category><![CDATA[Aviation Innovation]]></category>
		<category><![CDATA[Aviation Law and Regulations]]></category>
		<category><![CDATA[Aviation Regulation]]></category>
		<category><![CDATA[Aviation Regulations]]></category>
		<category><![CDATA[Aviation Safety]]></category>
		<category><![CDATA[Aviation Security]]></category>
		<category><![CDATA[Aviation Technology]]></category>
		<category><![CDATA[Civil Aviation]]></category>
		<category><![CDATA[Defence]]></category>
		<category><![CDATA[Defence Procurement]]></category>
		<category><![CDATA[Defence Technology]]></category>
		<category><![CDATA[Defense Innovation]]></category>
		<category><![CDATA[EASA]]></category>
		<category><![CDATA[Emerging Technologies]]></category>
		<category><![CDATA[Emerging Technologies in Logistics]]></category>
		<category><![CDATA[EU AI Act Compliance]]></category>
		<category><![CDATA[EU Regulations and Compliance]]></category>
		<category><![CDATA[European Union Policy Updates]]></category>
		<category><![CDATA[Future Trends]]></category>
		<category><![CDATA[Government Reports]]></category>
		<category><![CDATA[High-Risk AI Applications]]></category>
		<category><![CDATA[Law Enforcement]]></category>
		<category><![CDATA[Legal Analysis and Recommendations]]></category>
		<category><![CDATA[Legal Challenges]]></category>
		<category><![CDATA[Legal Conflicts]]></category>
		<category><![CDATA[Legal Frameworks]]></category>
		<category><![CDATA[Legal Implications of AI]]></category>
		<category><![CDATA[Legal Insights]]></category>
		<category><![CDATA[Legal Insights for Drone Industry]]></category>
		<category><![CDATA[Legal Updates]]></category>
		<category><![CDATA[Local Government Policies]]></category>
		<category><![CDATA[Local Government Policy]]></category>
		<category><![CDATA[Military Law]]></category>
		<category><![CDATA[Military Procurement]]></category>
		<category><![CDATA[Military Technology]]></category>
		<category><![CDATA[National Security]]></category>
		<category><![CDATA[Public Safety]]></category>
		<category><![CDATA[Regulations and Compliance]]></category>
		<category><![CDATA[Regulatory and Legal Compliance]]></category>
		<category><![CDATA[Regulatory Compliance Strategies]]></category>
		<category><![CDATA[Regulatory Oversight]]></category>
		<category><![CDATA[Risk Assessment & Mitigation]]></category>
		<category><![CDATA[Risk Management and Insurance]]></category>
		<category><![CDATA[Safety and Risk Management]]></category>
		<category><![CDATA[Safety and Security in Aviation]]></category>
		<category><![CDATA[Tech Law and Regulation]]></category>
		<category><![CDATA[Technological Innovations in Drones]]></category>
		<category><![CDATA[Technology and Innovation]]></category>
		<category><![CDATA[UK Aviation Law]]></category>
		<category><![CDATA[UK Defence Policy]]></category>
		<category><![CDATA[UK Government Policy]]></category>
		<category><![CDATA[autonomous fighters]]></category>
		<category><![CDATA[biometric identification]]></category>
		<category><![CDATA[CE marking]]></category>
		<category><![CDATA[defence law]]></category>
		<category><![CDATA[drone law]]></category>
		<category><![CDATA[EU AI Act]]></category>
		<category><![CDATA[EU aviation law]]></category>
		<category><![CDATA[GA-ASI]]></category>
		<category><![CDATA[GPAI]]></category>
		<category><![CDATA[high-risk AI]]></category>
		<category><![CDATA[International Fighter Conference]]></category>
		<category><![CDATA[manned-unmanned teaming]]></category>
		<category><![CDATA[real-world testing]]></category>
		<category><![CDATA[Rome]]></category>
		<category><![CDATA[UAS]]></category>
		<guid isPermaLink="false">https://blakistons.co.uk/?p=2624</guid>

					<description><![CDATA[<p>By Richard Ryan, barrister and drone lawyer How Europe’s new AI rulebook would (and wouldn’t) touch autonomous combat aircraft — and what the defence carve-outs really mean. In Brief&#8230; Purely military AI systems are out of scope of the EU AI Act. If an AI system is developed or used exclusively for military/defence or national-security [&#8230;]</p>
<p>The post <a href="https://blakistons.co.uk/how-europes-new-ai-rulebook-would-and-wouldnt-touch-autonomous-combat-aircraft-and-what-the-defence-carveouts-really-mean/">How Europe’s new AI rulebook would (and wouldn’t) touch autonomous combat aircraft—and what the defence carve-outs really mean</a> appeared first on <a href="https://blakistons.co.uk">Blakistons</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<div>
<p>By Richard Ryan, barrister and drone lawyer</p>
<p><em>How Europe’s new AI rulebook would (and wouldn’t) touch autonomous combat aircraft — and what the defence carve-outs really mean.</em></p>
<hr />
<h3>In Brief&#8230;</h3>
<ul>
<li><strong>Purely military AI systems are out of scope</strong> of the EU AI Act. If an AI system is <strong>developed or used exclusively for military/defence or national-security purposes</strong>, the Act does not apply. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
<li><strong>Dual-use is different.</strong> If the same autonomy stack, sensors or models are marketed or used for <strong>civilian</strong> purposes in the EU (for example, civil UAS, border or law-enforcement tasks), the Act can apply — with stringent duties for “high-risk” systems. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
<li><strong>Real-world testing is regulated.</strong> Pre-market R&amp;D is generally excluded, <strong>but real-world testing isn’t</strong> — it requires specific safeguards and registration. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
<li><strong>Foundation models (GPAI)</strong> have their own rules from <strong>2 Aug 2025</strong>; the defence carve-out in the Act is written for <strong>AI systems</strong>, not explicitly for <strong>models</strong>. If a model is placed on the EU market generally, the provider’s GPAI obligations can still bite. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
</ul>
<blockquote>
<p><strong>Context:</strong> sUAS News reports that GA-ASI is showcasing its autonomous fighter portfolio (for example, YFQ-42A CCA, MQ-20 Avenger) at the International Fighter Conference in Rome, 4–6 Nov 2025. This post overlays that scenario with the EU AI Act’s rules.</p>
</blockquote>
<hr />
<h2>1) First principles: When does the EU AI Act apply?</h2>
<p>The Act has <strong>extraterritorial reach</strong>. It covers (i) providers and deployers in the EU, (ii) providers placing on the EU market or putting systems into service in the EU — even if they are not established here — and (iii) providers/deployers in third countries <strong>where the AI system’s output is used in the EU</strong>. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</p>
<p>However, <strong>Article 2(3)</strong> draws a bright line: the Act <strong>does not apply</strong> to <strong>AI systems used exclusively</strong> for <strong>military, defence or national security</strong>. It also does not apply where a system is <strong>not</strong> placed on the EU market but its <strong>output is used in the EU exclusively</strong> for those purposes. Recital 24 reiterates this and clarifies that <strong>non-defence use falls back under the Act</strong>. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</p>
<p><strong>What this means in Rome:</strong></p>
<ul>
<li>A <strong>closed, defence-only</strong> showcase for European militaries: <strong>out of scope</strong>.</li>
<li>A <strong>civil-use pitch</strong>, civil flight trials, or plans to sell autonomy modules to <strong>EU civilian buyers</strong>: <strong>in scope</strong> (see the high-risk section below). (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
</ul>
<hr />
<h2>2) The key defence carve-outs (and their limits)</h2>
<p><strong>Carve-out #1 — Defence/military:</strong></p>
<blockquote>
<p>“This Regulation shall not apply to AI systems … used exclusively for military, defence or national security purposes.” (Article 2(3))</p>
</blockquote>
<p>Two important nuances:</p>
<ul>
<li><strong>Exclusivity matters.</strong> The moment an autonomy stack or sensor suite is also <strong>marketed or used for civilian</strong> or law-enforcement tasks, the <strong>defence exclusion no longer shields those non-defence uses</strong>. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
<li><strong>Models vs systems.</strong> The text explicitly excludes <strong>AI systems</strong> for defence; it <strong>does not create an explicit defence exclusion for general-purpose AI models</strong>. If a <strong>GPAI model</strong> is <strong>placed on the EU market</strong>, Chapter V obligations for model providers can still apply — even if one downstream customer is a defence user. (More on GPAI below.) (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
</ul>
<p><strong>Carve-out #2 — Pre-market R&amp;D:</strong><br />
  R&amp;D <strong>before</strong> placing on the market is generally outside scope, <strong>but real-world testing is not</strong>. Testing in real-world conditions triggers a dedicated regime (for example, registration, time limits, informed consent or special conditions for law enforcement, incident reporting). (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</p>
<p><strong>Carve-out #3 — Emergency derogations (non-defence):</strong><br />
  For <strong>exceptional public-security reasons</strong> (or imminent threats to life/health), <strong>market surveillance authorities</strong> can authorise <strong>temporary use</strong> of a high-risk AI system <strong>before</strong> full conformity assessment — subject to strict conditions. Law-enforcement or civil-protection bodies can also use in urgent cases, then seek authorisation without undue delay. This is <strong>not</strong> a defence-specific carve-out, but it explains emergency deployments outside the military context. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</p>
<hr />
<h2>3) If the defence exclusion doesn’t apply, would autonomous-fighter tech be “high-risk”?</h2>
<p>Very likely <strong>yes</strong> — for <strong>civil</strong> variants or dual-use spin-outs:</p>
<ul>
<li><strong>Annex I (product-safety route).</strong> AI that is a <strong>safety component</strong> of products covered by sectoral EU safety laws is <strong>high-risk</strong> where those products need <strong>third-party conformity assessment</strong>. That list <strong>explicitly includes EU civil aviation law (Reg. 2018/1139)</strong> — covering <strong>unmanned aircraft</strong> and their remotely controllable equipment. In a civil-UAS configuration, an autonomy stack acting as a safety component would be regulated as <strong>high-risk</strong>. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
<li><strong>Annex III (stand-alone uses).</strong> Separate “high-risk” buckets also capture, for example, <strong>remote biometric identification</strong> and other sensitive functions (if and where permitted by Union/national law), <strong>critical infrastructure</strong> safety components, and more. If a fighter-borne sensing suite were repurposed for <strong>civil border surveillance</strong> or <strong>public-space identification</strong>, you quickly hit these Annex III categories. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
</ul>
<p><strong>What “high-risk” demands in practice</strong><br />
  Providers must implement a <strong>risk-management system</strong>, <strong>data governance</strong>, <strong>technical documentation</strong>, <strong>logging</strong>, <strong>transparency/instructions</strong>, <strong>human oversight</strong>, and <strong>accuracy/robustness/cybersecurity</strong> — then pass <strong>conformity assessment</strong>, issue an <strong>EU Declaration of Conformity</strong>, and affix <strong>CE marking</strong>. Deployers also carry duties (for example, monitoring, data relevance, user notification in some cases). (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</p>
<hr />
<h2>4) Sensors on show: what about face recognition and other “red lines”?</h2>
<p>The <strong>EU bans</strong> several AI practices outright (from <strong>2 Feb 2025</strong>), including:</p>
<ul>
<li><strong>Untargeted scraping</strong> of facial images to build recognition databases.</li>
<li><strong>Biometric categorisation</strong> inferring sensitive traits (for example, race, political opinions, religion).</li>
<li><strong>Emotion recognition</strong> in workplaces or schools (with narrow safety/medical exceptions).</li>
<li><strong>Predictive “risk assessments”</strong> of criminality based solely on personality traits/profiling.</li>
<li><strong>Real-time remote biometric identification (RBI) in public spaces for law enforcement</strong> — <strong>unless</strong> strictly authorised and necessary for narrowly defined objectives (for example, locating a specific suspect in serious crimes, preventing a specific imminent threat, finding missing persons), with prior judicial/independent approval and registration. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
</ul>
<p><strong>Implication for a trade-show demo:</strong> training a camera on attendees to test <strong>real-time RBI</strong> in a public venue would <strong>likely be unlawful</strong> unless those strict law-enforcement exceptions and procedural safeguards apply — which they typically <strong>will not</strong> at a commercial defence conference. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</p>
<hr />
<h2>5) Real-world testing in the EU (civil or dual-use variants)</h2>
<p>If a provider runs <strong>real-world flight tests</strong> in the EU (outside the defence exclusion), the Act requires — among other things — <strong>registration</strong>, an EU-established entity or <strong>EU legal representative</strong>, limits on <strong>duration</strong> (normally up to six months, extendable once), rules on <strong>informed consent</strong> (with special handling for law-enforcement tests), <strong>qualified oversight</strong>, and the ability to <strong>reverse/ignore</strong> the system’s outputs. <strong>Serious incidents</strong> must be reported promptly. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</p>
<hr />
<h2>6) Foundation models (GPAI): obligations can still attach</h2>
<p>From <strong>2 Aug 2025</strong>, <strong>Chapter V</strong> sets <strong>baseline transparency and copyright-policy duties</strong> for <strong>providers of general-purpose AI models</strong> (with extra obligations if the model presents <strong>systemic risks</strong>). The defence exclusion in Article 2(3) is framed for <strong>AI systems</strong>, not <strong>models</strong>. So, if a foundation model is <strong>placed on the EU market</strong>, the <strong>model provider</strong> can have obligations even if a downstream customer is a defence prime. (Open-source specifics and systemic-risk thresholds also apply.) (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</p>
<hr />
<h2>7) Timelines you need in Rome (as of 6 Nov 2025)</h2>
<ul>
<li><strong>Entry into force:</strong> 1 Aug 2024 (20 days after OJ publication).</li>
<li><strong>Prohibited practices + core chapters (I–II):</strong> apply from <strong>2 Feb 2025</strong>.</li>
<li><strong>GPAI rules (Chapter V), plus other chapters (III §4, VII, XII, and Article 78):</strong> apply from <strong>2 Aug 2025</strong>.</li>
<li><strong>General application:</strong> <strong>2 Aug 2026</strong> (high-risk regime starts to bite broadly).</li>
<li><strong>Article 6(1) (Annex I product route) classification trigger &amp; related obligations:</strong> <strong>2 Aug 2027</strong>. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
</ul>
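<p>For planning purposes, the staged application dates above can be captured in a small lookup. This is an illustrative sketch only: the milestone labels are my shorthand for the Article 113 dates summarised in the list, not the Regulation’s wording.</p>

```python
from datetime import date

# Shorthand for the Art. 113 application dates listed above;
# labels are the author's summary, not statutory text.
MILESTONES = [
    (date(2024, 8, 1), "entry into force"),
    (date(2025, 2, 2), "prohibited practices and Chapters I-II apply"),
    (date(2025, 8, 2), "GPAI rules (Chapter V) and related chapters apply"),
    (date(2026, 8, 2), "general application (high-risk regime)"),
    (date(2027, 8, 2), "Article 6(1) classification trigger and obligations"),
]

def provisions_in_effect(on: date) -> list[str]:
    """Return the milestones already applicable on a given date."""
    return [label for d, label in MILESTONES if d <= on]
```

<p>On 6 November 2025, the conference dates, the first three milestones have passed; the general high-risk regime has not yet begun to apply.</p>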
<hr />
<h2>8) Enforcement and penalties</h2>
<ul>
<li>Violating <strong>prohibited practices</strong> (Article 5) can draw fines up to <strong>€35m or 7%</strong> of worldwide annual turnover, whichever is higher.</li>
<li>Other operator obligations can reach <strong>€15m or 3%</strong>; supplying <strong>misleading information</strong> can reach <strong>€7.5m or 1%</strong> (SMEs benefit from lower caps). Separate fine scales apply to EU institutions. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
</ul>
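<p>The “whichever is higher” structure of these ceilings matters in practice: for a large group the turnover percentage, not the fixed sum, sets the cap. A minimal sketch with the figures stated above (SME caps and the separate EU-institution scale are not modelled):</p>

```python
# Maximum-fine ceilings under Art. 99, as summarised above:
# the HIGHER of a fixed amount or a share of worldwide annual turnover.
FINE_TIERS = {
    "prohibited_practice": (35_000_000, 0.07),     # Art. 5 violations
    "other_obligations": (15_000_000, 0.03),
    "misleading_information": (7_500_000, 0.01),
}

def fine_cap(tier: str, worldwide_turnover_eur: float) -> float:
    """Return the applicable ceiling for a given tier and turnover."""
    fixed, share = FINE_TIERS[tier]
    return max(fixed, share * worldwide_turnover_eur)
```

<p>For a group with €1bn worldwide turnover, a prohibited-practice breach caps at €70m (7% exceeds the €35m floor); at €100m turnover, the €35m fixed sum governs.</p>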
<hr />
<h2>9) Practical playbook for IFC attendees</h2>
<p><strong>If you are a defence OEM showing autonomy stacks:</strong></p>
<ol>
<li><strong>Map uses</strong>: Defence-only (excluded) vs <strong>any civil or law-enforcement</strong> pathways (potentially in scope). Document the <strong>exclusivity</strong> of defence deployments if you rely on the carve-out. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
<li><strong>GPAI suppliers</strong>: If you place a <strong>foundation model</strong> on the EU market, expect <strong>Chapter V</strong> duties regardless of defence customers. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
<li><strong>No RBI demos</strong> on the show floor. Those prohibitions already apply in 2025. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
<li><strong>Planning EU flight tests</strong> for civil variants? Prepare for <strong>real-world testing</strong> conditions (registration, oversight, incident reporting). (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
<li>For <strong>civil UAS commercialisation</strong>, treat your autonomy as <strong>high-risk</strong> (EASA product-safety route), budget time for <strong>conformity assessment</strong> and <strong>CE marking</strong>. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
</ol>
<p><strong>If you are a European ministry or agency:</strong></p>
<ul>
<li>Distinguish <strong>military operations</strong> (out of scope) from <strong>law-enforcement or border</strong> uses (in scope; watch <strong>RBI</strong> limits and high-risk duties). Consider <strong>Article 46</strong> emergency derogations only in <strong>exceptional</strong> and <strong>documented</strong> cases. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
</ul>
<p><strong>If you are a civil UAS integrator:</strong></p>
<ul>
<li>Expect the full <strong>high-risk</strong> package (risk management, data governance, human oversight, cybersecurity, logs, conformity assessment, CE). Build compliance into your <strong>system architecture</strong>, <strong>ML pipelines</strong>, <strong>safety cases</strong>, and <strong>ops manuals</strong> from day one. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
</ul>
<hr />
<h2>10) Quick decision pathway</h2>
<ol>
<li><strong>Is the use exclusively defence or national security?</strong><br />
      Yes: AI <strong>system</strong> is <strong>out of scope</strong>.<br />
      No: continue. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)
    </li>
<li><strong>Is it a civil product or law-enforcement/border use?</strong><br />
      Civil product with safety function (for example, civil UAS): <strong>High-risk</strong> via <strong>Annex I</strong> → conformity assessment + CE. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)<br />
      Stand-alone sensitive use (for example, RBI, critical infrastructure): <strong>Annex III</strong> high-risk or <strong>Article 5</strong> prohibition applies. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)
    </li>
<li><strong>Is there a GPAI model being placed on the EU market?</strong><br />
      Yes: <strong>Chapter V</strong> duties for <strong>model providers</strong> from <strong>2 Aug 2025</strong>, separate from the defence carve-out for systems. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)
    </li>
<li><strong>Is this pre-market testing?</strong><br />
      <strong>Real-world testing</strong> rules apply (registration, oversight, incident reporting). (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)
    </li>
</ol>
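<p>The four-step triage above can be sketched as a simple routine. Function and flag names here are illustrative shorthand for the questions in the pathway, not statutory terms, and real scoping turns on the facts and legal advice:</p>

```python
def ai_act_triage(exclusively_defence: bool,
                  civil_safety_component: bool = False,
                  annex_iii_sensitive_use: bool = False,
                  gpai_model_on_eu_market: bool = False,
                  real_world_testing: bool = False) -> list[str]:
    """Walk the decision pathway above and return the applicable outcomes."""
    outcomes = []
    if exclusively_defence:
        # Art. 2(3): the system-level defence carve-out.
        outcomes.append("AI system out of scope (Art. 2(3))")
    else:
        if civil_safety_component:
            outcomes.append("High-risk via Annex I: conformity assessment + CE")
        if annex_iii_sensitive_use:
            outcomes.append("Annex III high-risk (or Art. 5 prohibition)")
        if real_world_testing:
            outcomes.append("Real-world testing regime applies")
    # The GPAI question sits OUTSIDE the defence branch: the carve-out
    # covers systems, not models placed on the EU market generally.
    if gpai_model_on_eu_market:
        outcomes.append("Chapter V GPAI provider duties (from 2 Aug 2025)")
    return outcomes
```

<p>Note the last step deliberately falls outside the defence branch: a defence-only system built on a generally marketed foundation model still leaves the model provider with Chapter V duties.</p>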
<hr />
<h3>Bottom line for “Autonomous Fighters in Rome”</h3>
<ul>
<li>A <strong>military-only</strong> display of GA-ASI’s autonomous fighters is <strong>outside</strong> the AI Act.</li>
<li>Any <strong>civil</strong> spin-off (cargo drones, civil surveillance, airport ops) or <strong>law-enforcement</strong> application in the EU will trigger the Act — often at the <strong>high-risk</strong> level — together with <strong>tight prohibitions</strong> around biometric uses in public spaces. Plan your <strong>compliance architecture</strong> accordingly. (<a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ%3AL_202401689" target="_blank" rel="noopener">EUR-Lex</a>)</li>
</ul>
<p><em>This article is informational and not legal advice. Citations are to the Official Journal text of the <strong>Artificial Intelligence Act (Regulation (EU) 2024/1689)</strong> for scope (Art. 2), prohibitions (Art. 5), high-risk regime (Ch. III), real-world testing (Arts. 57–61), GPAI (Ch. V incl. Art. 53), timelines (Art. 113), and penalties (Arts. 99–101).</em></p>
<hr />
<section aria-label="Author bio">
<p><strong>About the author — Richard Ryan</strong></p>
<p>Richard Ryan is a UK barrister (Direct Access), mediator and Chartered Arbitrator (FCIArb), and a Bencher of Gray’s Inn. He practises across defence, aerospace, construction, engineering and commodities, with a leading specialism in drone and counter-drone law, unmanned aviation regulation, and AI-enabled safety and compliance. Richard advises government, primes and operators on EU/UK UAS frameworks, BVLOS, U-space/UTM and the EU AI Act. He leads Blakiston’s Chambers and contributes regularly to industry guidance and policy consultations.</p>
</section>
</div>
<p>The post <a href="https://blakistons.co.uk/how-europes-new-ai-rulebook-would-and-wouldnt-touch-autonomous-combat-aircraft-and-what-the-defence-carveouts-really-mean/">How Europe’s new AI rulebook would (and wouldn’t) touch autonomous combat aircraft—and what the defence carve-outs really mean</a> appeared first on <a href="https://blakistons.co.uk">Blakistons</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Military Drones and AI Regulation: A UK Drone Lawyer’s Perspective</title>
		<link>https://blakistons.co.uk/military-drones-and-ai-regulation-a-uk-drone-lawyers-perspective/</link>
		
		<dc:creator><![CDATA[admin.richard]]></dc:creator>
		<pubDate>Tue, 07 Jan 2025 13:29:44 +0000</pubDate>
				<category><![CDATA[AI Regulation]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Drone Law]]></category>
		<category><![CDATA[Future of Drone Regulations]]></category>
		<category><![CDATA[High-Risk AI Applications]]></category>
		<category><![CDATA[International Drone Regulations]]></category>
		<category><![CDATA[International Humanitarian Law]]></category>
		<category><![CDATA[Legal Implications of AI]]></category>
		<category><![CDATA[Military Law]]></category>
		<category><![CDATA[Military Procurement]]></category>
		<category><![CDATA[Military Technology]]></category>
		<category><![CDATA[National Security]]></category>
		<category><![CDATA[Regulations and Compliance]]></category>
		<category><![CDATA[Regulatory Oversight]]></category>
		<category><![CDATA[UK Defence Policy]]></category>
		<category><![CDATA[UK drone policy]]></category>
		<category><![CDATA[UK Drone Regulations]]></category>
		<category><![CDATA[UK Government Policy]]></category>
		<category><![CDATA[UK Law]]></category>
		<category><![CDATA[UK Legislation]]></category>
		<category><![CDATA[UK Policy]]></category>
		<category><![CDATA[AI Ethics]]></category>
		<category><![CDATA[AI regulation]]></category>
		<category><![CDATA[artificial intelligence]]></category>
		<category><![CDATA[drone law]]></category>
		<category><![CDATA[EU AI Act]]></category>
		<category><![CDATA[Faculty AI]]></category>
		<category><![CDATA[frontier AI]]></category>
		<category><![CDATA[lethal autonomous weapons]]></category>
		<category><![CDATA[military drones]]></category>
		<category><![CDATA[UK drone lawyer]]></category>
		<guid isPermaLink="false">https://blakistons.co.uk/?p=2540</guid>

					<description><![CDATA[<p>Military Drones and AI Regulation: A UK Drone Lawyer’s Perspective By Richard Ryan, UK Drone Lawyer On 7 January 2025, The Guardian published an article highlighting the British AI consultancy Faculty AI’s involvement in the development of drone technology for defence clients, prompting renewed questions about where legal, ethical, and regulatory boundaries should lie for [&#8230;]</p>
<p>The post <a href="https://blakistons.co.uk/military-drones-and-ai-regulation-a-uk-drone-lawyers-perspective/">Military Drones and AI Regulation: A UK Drone Lawyer’s Perspective</a> appeared first on <a href="https://blakistons.co.uk">Blakistons</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><img fetchpriority="high" decoding="async" src="https://blakistons.co.uk/wp-content/uploads/2025/01/250107_Military-Drones-and-AI-Regulation_A-UK-Drone-Lawyers-Perspective_-300x300.webp" alt="" width="300" height="300" class="alignnone size-medium wp-image-2541" srcset="https://blakistons.co.uk/wp-content/uploads/2025/01/250107_Military-Drones-and-AI-Regulation_A-UK-Drone-Lawyers-Perspective_-300x300.webp 300w, https://blakistons.co.uk/wp-content/uploads/2025/01/250107_Military-Drones-and-AI-Regulation_A-UK-Drone-Lawyers-Perspective_-150x150.webp 150w, https://blakistons.co.uk/wp-content/uploads/2025/01/250107_Military-Drones-and-AI-Regulation_A-UK-Drone-Lawyers-Perspective_-768x768.webp 768w, https://blakistons.co.uk/wp-content/uploads/2025/01/250107_Military-Drones-and-AI-Regulation_A-UK-Drone-Lawyers-Perspective_-600x600.webp 600w, https://blakistons.co.uk/wp-content/uploads/2025/01/250107_Military-Drones-and-AI-Regulation_A-UK-Drone-Lawyers-Perspective_-100x100.webp 100w, https://blakistons.co.uk/wp-content/uploads/2025/01/250107_Military-Drones-and-AI-Regulation_A-UK-Drone-Lawyers-Perspective_.webp 1024w" sizes="(max-width: 300px) 100vw, 300px" /></p>
<p><strong>Military Drones and AI Regulation: A UK Drone Lawyer’s Perspective</strong></p>
<p><strong>By Richard Ryan, UK Drone Lawyer</strong></p>
<p>On 7 January 2025, The Guardian published an article highlighting the British AI consultancy Faculty AI’s involvement in the development of drone technology for defence clients, prompting renewed questions about where legal, ethical, and regulatory boundaries should lie for AI-driven military applications.<br />
Faculty AI, already prominent for its work with various UK government departments (including the NHS and the Department for Education) and advisory services for the AI Safety Institute (AISI), has reportedly developed and deployed AI models on unmanned aerial vehicles (UAVs) for military purposes. Although it remains unclear whether these drones are intended for lethal operations, the revelations have amplified concerns about how best to regulate or restrict the use of AI in weapon systems.<br />
Below, I explore the key legal issues and examine how the recently adopted <strong>EU AI Act</strong>—as well as the evolving UK regulatory framework—may shape the future of this sector.<br />
<hr />
<strong>1. Faculty AI’s Defence Work: A Brief Overview</strong><br />
<strong>1.1 Government and Public Sector Ties</strong><br />
Faculty AI, known for its work with the Vote Leave campaign in 2016, was later engaged by Dominic Cummings to provide data analytics during the pandemic. Since then, it has won multiple government contracts worth at least £26.6m, extending its work into healthcare (via the NHS), education, and policy consulting with the AISI on frontier AI safety.<br />
<strong>1.2 UAV Development</strong><br />
The Guardian reports that Faculty AI has experience in deploying AI models on UAVs. Its partner firm, Hadean, indicated that the two companies collaborated on subject identification, tracking moving objects, and exploring swarm deployment. While Faculty states that it aims to create “safer, more robust solutions”, details on whether these drones might be capable of lethal autonomous targeting remain undisclosed.<br />
________________________________________<br />
<strong>2. The EU AI Act: A New Regulatory Milestone</strong><br />
<strong>2.1 Status of the EU AI Act</strong><br />
Introduced by the European Commission in 2021 as a proposed regulation, the EU AI Act has since been adopted via the EU’s legislative process. As of early 2025, it is recognised as a binding regulation designed to harmonise AI rules across all EU Member States. Although the UK is no longer part of the EU, any UK-based company offering AI products or services within the EU must ensure compliance with the regulation’s requirements.<br />
<strong>2.2 Risk-Tiered Framework</strong><br />
The EU AI Act operates on a tiered risk basis:<br />
•	<strong>Unacceptable risk</strong>: Certain AI applications (e.g., social scoring) are outright banned.<br />
•	<strong>High risk</strong>: This category includes critical infrastructure, healthcare, and—potentially—defence-related AI systems that could significantly affect people’s safety or fundamental rights. Such systems must meet strict transparency, oversight, and data governance requirements.<br />
•	<strong>Limited or minimal risk</strong>: These uses are subject to fewer obligations, generally focused on transparency (e.g., disclosing AI usage to end users).<br />
For <strong>high-risk</strong> AI in military contexts, the EU AI Act demands robust <strong>human oversight</strong>, thorough documentation, and strict compliance obligations, particularly around accountability and the prevention of harm.<br />
<strong>2.3 Potential Impact on Military Drones</strong><br />
While national security and defence largely remain the prerogative of individual EU Member States, the EU AI Act’s principles can still influence how companies and governments view the development of autonomous or semi-autonomous drones. Key considerations include:<br />
•	<strong>Transparent Data and Design</strong>: Documenting data sets, development processes, and operational parameters.<br />
•	<strong>Human in the Loop</strong>: Ensuring a human operator is always able to override or intervene in the AI’s decision-making. Related concepts, such as “human on the loop” and “human out of the loop”, are also used in this debate.<br />
•	<strong>Liability and Penalties</strong>: Breaches can incur hefty fines—up to 7% of global annual turnover for the most serious infringements—thus acting as a significant deterrent against unethical or unlawful AI deployment.<br />
________________________________________<br />
<strong>3. The UK’s Approach to AI Regulation and Military Drones</strong><br />
<strong>3.1 Divergence from the EU?</strong><br />
Post-Brexit, the UK has chosen a “pro-innovation” approach to AI regulation. Rather than adopting a single, all-encompassing statute akin to the EU AI Act, the UK is implementing a sector-by-sector and risk-based strategy, guided by existing regulators such as the Information Commissioner’s Office and the Competition and Markets Authority.<br />
<strong>3.2 AI Safety Institute (AISI)</strong><br />
Established under former Prime Minister Rishi Sunak in 2023, the AISI focuses on frontier AI safety research. Faculty AI’s role in testing large language models and advising the AISI on threats like disinformation and system security places the company in a key position to influence UK policy. Critics argue that this may create potential conflicts of interest if the same organisation is also developing AI for military use.<br />
<strong>3.3 House of Lords Recommendations</strong><br />
In 2023, a House of Lords committee urged the UK Government to clarify the application of International Humanitarian Law (IHL) to lethal drone strikes and to work towards an international agreement limiting or banning fully autonomous weapons systems. The Government response acknowledged the importance of maintaining “human control” in critical decisions but did not enact binding legislation banning lethal autonomous drones outright.<br />
________________________________________<br />
<strong>4. Legal and Ethical Concerns for AI-Enabled Drones</strong><br />
<strong>4.1 International Humanitarian Law (IHL)</strong><br />
IHL principles—<strong>distinction</strong> (separating combatants from civilians) and <strong>proportionality</strong> (limiting harm relative to military objectives)—are central to discussions on AI-driven drones. Fully autonomous UAVs, capable of selecting and engaging targets without human intervention, raise profound legal questions on accountability, particularly if biases or system errors result in wrongful casualties.<br />
<strong>4.2 Allocation of Liability</strong><br />
Traditionally, accountability in military operations lies with commanders and operators. With increasingly autonomous systems, however, liability could extend to technology developers, programmers, or even the purchaser of the system. Clarifying how legal responsibilities are distributed may become a focal point for future litigation and regulatory reform.<br />
<strong>4.3 Export Controls</strong><br />
Companies like Faculty AI must also comply with arms-export rules when providing AI-targeting systems or related software to foreign entities. In the UK, export licences for military-grade technology are subject to domestic legislation and international protocols, such as the Wassenaar Arrangement on dual-use goods.<br />
________________________________________<br />
<strong>5. Looking Ahead: Balancing Innovation, Safety, and Accountability</strong><br />
<strong>5.1	Stronger National Frameworks</strong><br />
Although the UK favours a pro-innovation stance, there is growing pressure from Parliament and civil society for more rigorous, enforceable rules on potentially lethal AI applications. The EU AI Act may serve as a reference point for the UK to consider stricter domestic regulations.<br />
<strong>5.2	International Collaboration</strong><br />
Calls for global agreements—treaties or non-binding accords—to prohibit fully autonomous weapons continue to gain momentum. The House of Lords committee specifically recommended international engagement to ensure that lethal force remains under human control.<br />
<strong>5.3	Corporate Accountability</strong><br />
Organisations operating at the intersection of commercial defence contracts and government policy—such as Faculty AI—need transparent internal processes and robust ethics boards to mitigate conflicts of interest. Demonstrating genuine corporate responsibility will be vital for maintaining public trust.<br />
<strong>5.4	Ethical and Safety Audits</strong><br />
As AI becomes more embedded in defence, mandatory ethical and safety audits may become standard practice. These would scrutinise algorithmic fairness, training data, and how effectively systems can identify and mitigate unintended harms.<br />
________________________________________<br />
<strong>6. Conclusion</strong><br />
Faculty AI’s role in developing AI for military drones underscores how high the stakes are when cutting-edge technology meets defence applications. With the EU AI Act now in force as a binding regulation, Europe has provided a blueprint for tighter control over “high-risk” AI systems. In contrast, the UK’s approach still offers substantial flexibility for companies, potentially raising both legal and ethical concerns around autonomy, accountability, and conflicts of interest.<br />
From an IHL standpoint, keeping a human responsible for any life-and-death decision is imperative. As a UK drone lawyer, I urge policymakers, regulators, and industry stakeholders to keep asking: <strong>Where do we draw the line between legitimate defensive innovation and an unacceptable risk to civilians?</strong> Only by establishing clear, enforceable legal standards—anchored in international law and ethical scrutiny—can we ensure AI-powered drones serve to protect rather than endanger fundamental human values.</p>
<p><strong>Bio – Richard Ryan, UK Drone Lawyer</strong></p>
<p>Richard Ryan is a UK-based drone lawyer specialising in the regulatory, ethical, and commercial aspects of unmanned aerial vehicles (UAVs) and artificial intelligence (AI). Through a series of blogs, Richard Ryan has explored critical issues such as the EU AI Act, the UK’s evolving “pro-innovation” regulatory landscape, and the legal considerations surrounding military drones and lethal autonomous weapons systems.</p>
<p>Drawing on extensive experience in advising government bodies, technology companies, and public institutions, Richard Ryan brings a deep understanding of how international humanitarian law (IHL), export controls, and data protection obligations intersect in modern drone operations. His writing emphasises the importance of maintaining human oversight in AI-driven systems, championing ethical development and transparent accountability mechanisms.</p>
<p>A trusted voice in the field, Richard Ryan regularly comments on emerging case law, parliamentary recommendations, and global discussions around frontier AI safety. His mission is to help stakeholders—from hobbyist drone operators to established aerospace firms—navigate the complexities of regulation, risk management, and innovation.</p>
<p>The post <a href="https://blakistons.co.uk/military-drones-and-ai-regulation-a-uk-drone-lawyers-perspective/">Military Drones and AI Regulation: A UK Drone Lawyer’s Perspective</a> appeared first on <a href="https://blakistons.co.uk">Blakistons</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Government’s Drone Resist Strategy is launched… October 2019</title>
		<link>https://blakistons.co.uk/uk-unveils-counter-drone-strategy-new-measures-to-combat-unmanned-aircraft-threats/</link>
		
		<dc:creator><![CDATA[zeroabove]]></dc:creator>
		<pubDate>Mon, 11 Nov 2019 15:48:18 +0000</pubDate>
				<category><![CDATA[Drone Legislation]]></category>
		<category><![CDATA[Law Enforcement]]></category>
		<category><![CDATA[National Security]]></category>
		<category><![CDATA[UK Government Policy]]></category>
		<category><![CDATA[Air Traffic Management and Unmanned Aircraft Bill]]></category>
		<category><![CDATA[Counter-Unmanned Aircraft Strategy]]></category>
		<category><![CDATA[Drone Detection Systems]]></category>
		<category><![CDATA[Drone Industry Standards]]></category>
		<category><![CDATA[drone law]]></category>
		<category><![CDATA[Drone Registration]]></category>
		<category><![CDATA[Drone Safety]]></category>
		<category><![CDATA[Drone Threats]]></category>
		<category><![CDATA[Operational Responders]]></category>
		<category><![CDATA[Police Powers]]></category>
		<category><![CDATA[Unmanned Traffic Management System]]></category>
		<guid isPermaLink="false">https://blakistons.co.uk/?p=139</guid>

					<description><![CDATA[<p>The government finally published the UK’s counter-unmanned aircraft strategy.  Ultimately directed at dealing with drone issues that relate to: Organised crime; Disruption to national infrastructure; Acts of terrorism; Threats to the UK’s national security. Interestingly the report refers to the incident at Gatwick airport during the Christmas period in 2018, but has failed to refer [&#8230;]</p>
<p>The post <a href="https://blakistons.co.uk/uk-unveils-counter-drone-strategy-new-measures-to-combat-unmanned-aircraft-threats/">Government’s Drone Resist Strategy is launched… October 2019</a> appeared first on <a href="https://blakistons.co.uk">Blakistons</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>The government finally published the UK’s counter-unmanned aircraft strategy.  Ultimately directed at dealing with drone issues that relate to:</p>
<ol>
<li>Organised crime;</li>
<li>Disruption to national infrastructure;</li>
<li>Acts of terrorism;</li>
<li>Threats to the UK’s national security.</li>
</ol>
<p>Interestingly, the report refers to the incident at Gatwick Airport over the Christmas period in 2018, but provides no evidence that a drone in fact caused so much disruption to so many. To date, there is still no such evidence, and Sussex Police have conceded that their investigation, which cost the taxpayer hundreds of thousands of pounds, produced nothing. The government’s strategy for promoting legitimate drone use in the UK will be set out in the forthcoming Aviation Strategy (date to be announced).</p>
<p>The Strategy paper focuses on the following with my comments:</p>
<ul>
<li>The Air Traffic Management and Unmanned Aircraft Bill will give police wide powers to deal with illegal drone use.</li>
</ul>
<p>COMMENT: this will be covered in more detail in another blog;</p>
<ul>
<li>A new industry action group, including drone manufacturers, will be formed to implement international design standards for integrated safety features across their technology pipelines;</li>
</ul>
<p>COMMENT: This is great, but realistically, is a Chinese manufacturer going to disclose its technology pipeline? How are different manufacturers going to agree on international design standards unless mandated to do so?</p>
<ul>
<li>A mobile <a href="http://droneresist.com/" target="_blank" rel="noopener noreferrer">Drone Resist</a> unit containing detection and disruption equipment will be developed for deployment to drone-related incidents and major events across the UK;</li>
</ul>
<p>COMMENT: This is for the most part reactive. How will the government mitigate a live drone attack at multiple locations simultaneously, e.g. three major airports being attacked by hostile drones at the same time?</p>
<ul>
<li>Police will have to log and record incidents of illegal and/or hostile drone activity to further understand the drone threat;</li>
</ul>
<p>COMMENT: Police should already be doing something similar, as their CAA-approved operations manual will most likely contain an obligation to record all flights; this should therefore not be an undue additional burden.</p>
<ul>
<li>Policymakers and regulators will engage with manufacturers of drone components;</li>
</ul>
<p>COMMENT: This is great, but what about a company’s intellectual property rights?  What if the company is in the Far East?</p>
<ul>
<li>By 30 November 2019, operators of drones weighing between 250g and 20kg will have to register them with the CAA for a fee, and drone remote pilots will have to pass an online competency test. This will make <em>“it easier to identify a drone that is being misused.”</em></li>
</ul>
<p>COMMENT: The Irish Government has had registration since 2017, and it is a bureaucratic burden that has apparently done nothing for safety. It is also notable that the Strategy asserts registration will make it easier to identify a drone that is being misused without stating how this will be achieved or why; it reads almost as another justification for registration. The registration fee has recently been reduced by the CAA from £16 to £9, for a registration platform that costs millions of pounds, at a time when the CAA’s own data security has recently been shown to be wanting.</p>
<ul>
<li>The government is developing concepts for future implementation of an unmanned traffic management (UTM) system.</li>
</ul>
<p>COMMENT: Many governments are developing such systems, but ultimately data from each drone operator will be required to provide the necessary visibility, and the same applies to some general aviation aircraft. The scope is being developed by a number of jurisdictions on both sides of the Atlantic.</p>
<ul>
<li>A new definition of <em>“operational responders”</em> who must have counter-drone knowledge, covering not just the Police but also:
<ul>
<li>Other public sector employees such as prison officers;</li>
<li>Private sector employees responsible for safety and security (prisons, critical national infrastructure (CNI) and crowded places);</li>
</ul>
</li>
</ul>
<p>COMMENT: Many will need training. Will that be left to the NQEs (National Qualified Entities) to pick up? There’s a potential business opportunity!</p>
<ul>
<li><em>“The police are able to legally deploy a range of DTI (detect, track and identify) and counter-drone effector systems”</em> versus <em>“current police powers need to be built upon to meet the evolving threat, and some of the processes that underpin these powers were not designed with counter-drone capability in mind.”</em></li>
</ul>
<p>COMMENT: There is a clear gap in the law here: are the Police acting illegally as the law is currently written? More analysis will follow in another blog.</p>
<p>It will come as no surprise that the paper raises many questions, but it is a good start. The government is committed to working with various stakeholders, but the law must change so that all stakeholders can benefit accordingly…</p>
<p>The post <a href="https://blakistons.co.uk/uk-unveils-counter-drone-strategy-new-measures-to-combat-unmanned-aircraft-threats/">Government’s Drone Resist Strategy is launched… October 2019</a> appeared first on <a href="https://blakistons.co.uk">Blakistons</a>.</p>
]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
