The Lessons from the Drone Wars: The Podcast

Picture a Russian soldier in a fortified trench, overwhelmed by an enemy he cannot see. In desperation, he holds up a piece of cardboard reading “We want to surrender” and points it not at a human officer but at the camera of a ground robot rolling toward his position. This is not speculative fiction; it is the new routine of the Ukrainian battlefield as described in Robbin F. Laird’s 2026 book, Lessons from the Drone Wars: Maritime Autonomous Systems and Maritime Operations. The book and a recent podcast discussion about it portray a world in which a 400‑dollar commercial drone can destroy a multimillion‑dollar strategic bomber, museum aircraft train AI systems to strike active warships, and a spool of fiber‑optic microfilament can neutralize high‑end electronic warfare domes. The math of modern conflict has been broken and rebuilt on entirely different foundations.

The Autonomy Illusion

Laird begins by dismantling one of the most persistent myths surrounding drone warfare: that autonomous weapons are already roaming battlefields as fully independent, Terminator‑style killers. In military doctrine, autonomy is defined along a strict five‑level scale, with level five representing a fully independent system making strategic decisions without human intervention, a capability that remains firmly theoretical. Even the most advanced systems in service today, such as the MQ‑9 Reaper, the MQ‑4C Triton, and Australia’s Ghost Bat, operate in the far more constrained world of levels two and three.

At level two, a machine can handle basic functions such as steering and acceleration, but a human operator still guides the mission and remains responsible for its conduct. At level three, the system may make immediate tactical decisions, adjusting course, avoiding obstacles, refining a firing solution, but a human must be ready to intervene at any moment. The Reaper’s cockpit still has a physical stick and throttle; they simply sit in a ground control station thousands of miles away from the aircraft itself. Laird warns that calling such systems “autonomous” obscures the real issues. Sensational debates about rogue AI crowd out the more urgent operational questions of what current systems can truly do, where they are vulnerable, and how commanders should employ them in combat.

Intelligent Mass vs. Exquisite Scarcity

At the core of Laird’s analysis lies a stark tension between two competing philosophies of power: intelligent mass and exquisite scarcity. Exquisite scarcity is the traditional Western and Russian model: invest heavily in a small number of technologically supreme platforms, from stealth fighters and strategic bombers to billion‑dollar destroyers. These systems are extraordinary but also irreplaceable; the loss of a single platform can constitute a strategic disaster rather than a mere tactical setback.

Intelligent mass offers a radically different logic: saturate the battlespace with large numbers of cheap, networked, “good‑enough” systems that are individually expendable but collectively decisive. In Ukraine, first‑person‑view drones costing between 400 and 500 dollars have, in some engagements, generated casualty rates of 70 to 80 percent among Russian units, destroying armored vehicles and logistics hubs worth millions. The cost‑exchange ratio is devastating to traditional doctrine.

Nowhere is this imbalance more obvious than in the Red Sea. Houthi forces have used drones priced from roughly 2,000 to 50,000 dollars to harass and threaten international shipping lanes. Coalition forces have often responded with advanced interceptor missiles costing between 2 million and 27 million dollars per shot. No economist is required to see that this equation is unsustainable: a determined adversary with industrial capacity and access to inexpensive drones can bleed a superpower’s defense budget simply by forcing it into a defensive posture.

Israel’s Iron Beam program points toward a way out of this trap. By using directed‑energy weapons, Israel has reduced the cost of intercepting an incoming drone to roughly 3 dollars per engagement, with an effectively bottomless “magazine” limited mainly by power supply. Laird frames such systems as one of the few credible long‑term answers to the economics of intelligent mass.
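The economics here can be made concrete with back‑of‑the‑envelope arithmetic. The sketch below uses illustrative figures drawn from the ranges quoted above (a 2,000‑dollar drone, a 2‑million‑dollar interceptor, a 3‑dollar directed‑energy shot); the specific numbers chosen are assumptions for the calculation, not data from the book:

```python
# Illustrative cost-exchange arithmetic using figures cited in the article.
# The chosen values are low-end picks from the quoted ranges, not official data.

def cost_exchange_ratio(attacker_cost: float, defender_cost: float) -> float:
    """Dollars the defender spends for each dollar the attacker spends."""
    return defender_cost / attacker_cost

# Red Sea scenario: a $2,000 drone forcing a $2,000,000 interceptor shot.
missile_defense = cost_exchange_ratio(2_000, 2_000_000)   # 1,000x against the defender

# Directed energy: the same $2,000 drone met with a ~$3 laser engagement.
directed_energy = cost_exchange_ratio(2_000, 3)           # 0.0015x, favoring the defender

print(f"Missile intercept: defender pays {missile_defense:,.0f}x the attacker's cost")
print(f"Directed energy:   defender pays {directed_energy:.4f}x the attacker's cost")
```

The asymmetry is the whole argument: with missiles, the defender loses the economic exchange by three orders of magnitude; with directed energy, the ratio inverts entirely.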

Operation Spider Web: Museum‑Trained Strike

One of the most striking case studies Laird explores is Operation Spider Web, an audacious Ukrainian strike conducted on June 1, 2025 against five Russian air bases spanning five time zones. Rather than sending stealth bombers, Ukraine relied on 117 OSA quadcopters, each hidden inside an ordinary wooden cabin mounted on the flatbed of a commercial truck. Russian civilian drivers, unaware of their cargo, transported these vehicles along standard shipping routes, using commercial 4G networks and parking near the perimeter of supposedly secure installations.

The targeting problem was formidable. Once airborne, the drones would have to operate in heavily jammed environments where radio‑based remote control was impossible. The engineering solution was disarmingly simple and cheap: Ukrainian teams trained their AI targeting algorithms using old Soviet aircraft displayed at the Poltava Museum of Long Range Aviation. Unable to collect data from active bombers, they instead mapped the geometry of museum exhibits, feeding thousands of images and sensor readings into the system until it could recognize 90‑centimeter aim points such as fuel tanks and wing roots. These signatures translated directly to active aircraft in Russia’s bomber force.

The AI was then coupled with open‑source autopilot software, enabling the drones to navigate without radio links, relying entirely on internal cameras and museum‑derived pattern recognition. The operation’s outcome was staggering: those low‑cost systems destroyed 41 aircraft worth roughly 7 billion dollars—around a third of Russia’s strategic bomber fleet—in a single strike. Geographic depth and rear‑area basing, long seen as a sanctuary for scarce, exquisite assets, no longer offered reliable protection.

The “Subsea Baby” and Multi‑Domain Operations

On December 15, 2025, an unmanned underwater vehicle nicknamed the “Subsea Baby” disabled a Kilo‑class Russian submarine at its pier in Novorossiysk, a vessel equipped to launch Kalibr cruise missiles and designed to epitomize stealth. The symbolism was blunt: submarines, the classic silent predators of the sea, could now be hunted in their home ports. Laird shows that this was not a one‑off improvisation but the culmination of careful multi‑domain planning.

Days before the Subsea Baby entered the harbor, Ukrainian forces launched a separate aerial drone strike aimed at destroying Russia’s sole IL‑38 maritime patrol aircraft in the region—the airborne sensor system that functioned as the fleet’s eyes. Eliminating that single aircraft created a precise gap in Russian maritime surveillance. Within that blind spot, the underwater vehicle slipped through undetected, executing its mission against the submarine at the pier. The operational logic was methodical: blind the watchtower, then send in the underwater assassin.

Russia’s reaction to this new vulnerability reinforces one of Laird’s broader themes: mutual vulnerability at sea. Unable to operate surface warships safely in the western Black Sea, where Ukrainian maritime drones made billion‑dollar vessels effectively indefensible, Russia shifted to sustained aerial bombardment of Odessa’s ports and energy infrastructure in late 2025 and early 2026. Ukraine could contest the sea and sink ships but could not fully shield its coastal cities from missile and drone strikes. Neither side could achieve uncontested dominance; both could inflict significant economic damage. Laird describes this as a grinding contest of mutual vulnerability, a technologically enabled stalemate that challenges traditional concepts of sea control.

Fiber Optics and Fast Followers

On land, Laird argues that Ukraine now hosts perhaps the most complex electromagnetic environment in history, as both sides blanket the battlespace with overlapping jamming fields that sever radio links between operators and drones. A sophisticated drone can be turned into inert metal the instant it crosses into the wrong frequency band. The technical response is paradoxically simple: fiber‑optic control cables.

Instead of relying on jammable radio waves, engineers equip drones with ultra‑thin microfilament cables that unspool behind the aircraft, kept under zero tension as the drone advances. The operator retains an unjammable high‑definition video feed and control link, effectively immune to expensive electronic warfare systems thanks to a spool of wire. It is a quintessential example of the low‑cost inversion Laird sees throughout the conflict: a modest technological adjustment that outmaneuvers multimillion‑dollar systems.

But Laird is careful not to turn this into a triumphalist story about Ukrainian ingenuity alone. The fast‑follower dynamic cuts both ways. Russia captures Ukrainian and Western systems, reverse‑engineers them, and fields its own variants, compressing adaptation cycles from years to months or even days. Moscow is now targeting annual production of roughly 1.4 million drones, a figure that underlines Laird’s contention that rapid iteration has become the ultimate weapon of modern war. Having the best idea first matters less if your adversary can replicate and mass‑produce it within weeks.

Hedgehog States and a Distributed Defense

Laird ultimately expands his operational analysis into a structural critique of traditional defense models, which he frames as the “hedgehog state logic”. Twentieth‑century deterrence relied on large standing armies and centralized installations: major bases, massive depots, visible concentrations of combat power. In his view, this architecture has become fundamentally vulnerable in an age of intelligent mass and precision strike. The alternative is a highly distributed, resilient system that draws on the entire fabric of society.

Ukraine’s wartime industrial strategy offers a proof of concept. Rather than concentrating drone production in a small number of major factories, the country deliberately dispersed manufacturing across more than 500 private producers: garage workshops, startup hubs, and small‑batch fabrication shops embedded within the civilian economy. The result is a defense ecosystem that behaves more like the Internet than a traditional industrial base: a mesh of nodes rather than a single mainframe. Such a network is far harder to cripple with precision strikes because there is no singular “center” to destroy.

This logic poses uncomfortable questions for established military bureaucracies that can take two decades to field a new fighter aircraft. How can organizations built around slow, linear procurement cycles survive in an environment where frontline units iterate tactics and technologies in weeks?

Laird suggests that the problem is not limited to defense ministries. Any institution anchored in slow legacy systems and rigid hierarchies now faces similar competitive pressure from cheaper, faster, more adaptive rivals. The multimillion‑dollar bomber on the tarmac is also a metaphor for legacy corporations, regulatory systems, and infrastructure.

A Final Provocation: Democratizing Lethality

The podcast closes on a question that Laird’s book raises but does not fully resolve: what happens when this level of low‑cost, AI‑enabled lethality migrates beyond the battlefield? If militaries can use 400‑dollar FPV drones, museum‑trained AI, and fiber‑optic‑guided underwater systems with such precision and effect, what will non‑state actors, organized crime, or commercial rivals be able to do when similar capabilities become widely accessible? How do cities, ports, and civilian infrastructure defend themselves when autonomous or semi‑autonomous strike systems can be acquired for the price of a laptop and a few components ordered online?

Laird is explicit that his book is a work of strategic analysis rather than prophecy. Yet the trajectory he documents, from garage‑built FPV drones to museum‑trained strike packages to multi‑domain operations against submarines at their piers, suggests that these questions are arriving faster than most institutions can adapt. The rules of conflict have been rewritten; the more urgent test is whether our thinking, our organizations, and our societies can keep pace.