To understand how the U.S. Navy will operate its enhanced safety corridors out of Hormuz, it is worth reviewing the state of the art, bringing public perception, so often conditioned by images of the past, into line with current practice.
The U.S. military says it is deploying “multidomain unmanned platforms” as part of ongoing operations under “Project Freedom,” aimed at restoring freedom of navigation in the Strait of Hormuz. In plain English, this means hundreds of robots are now operating in the air, on the surface of the Strait, and under the water. These systems, managed primarily through 5th Fleet’s Task Force 59, handle continuous surveillance, mine countermeasures, and other tasks, creating a kind of awareness-and-action bubble over the whole area.
This bubble will be mostly invisible to observers accustomed to World War II-style convoy operations, with their ordered lines of merchant ships flanked by doughty escorts and surmounted by overhead air cover, perhaps most recently typified by the 2020 movie Greyhound. But times have changed. In September 2021, the U.S. Navy’s 5th Fleet set up Task Force 59 in Bahrain with the goal of integrating artificial intelligence into unmanned systems.
The task force is believed to have hundreds of unmanned surface vessels, or USVs, on patrol in the narrow waters. One such type is the GARC, “a 4.8-meter craft with a 1.75-meter beam, 4,800-pound full-load displacement, 1,000-pound payload, 163 gallons of diesel, and a 200-horsepower engine. Claimed performance is 22 knots at cruise, 40 knots at sprint, and up to 700 nautical miles at cruise speed or 1,600 nautical miles at 5 knots, with operation in Sea State 5.” These can easily keep up with Iranian speedboats.
A more specialized form of robotic speedboat probably used in Hormuz is the CUSV minesweeping drone. Built by Textron, these sea drones can tow sonar arrays to locate underwater mines, and upon finding a suspected contact, can direct long-range unmanned underwater vehicles, or UUVs, specifically the General Dynamics Mk 18 Mod 2 Kingfish, to investigate and neutralize the suspected mine.
Flying over this array of sea robots are unmanned aerial vehicles at high, medium, and low altitudes: the familiar MQ-4C Triton, MQ-9 Reaper, and a plethora of smaller disposable drones under the control of Task Force 59. Soaring above them all, watching the Gulf and Strait from outer space, are the synthetic aperture radar satellites. These can look through smoke, clouds, and darkness and pick out objects ten inches in size on the surface of the earth. All this data joins that already swept up by electronic- and signals-intelligence platforms.
But none of these systems would be of much use if the data could not be processed, analyzed, and integrated into the command system in near real time. AI plays a central, enabling role in managing the overwhelming volume of data generated by CENTCOM's multidomain robot armada. At an exercise called Digital Horizon in 2022, “Task Force 59 leveraged artificial intelligence to create an interface on one screen, also called a ‘single pane of glass.’ The screen displayed relevant data from multiple unmanned systems for watchstanders in Task Force 59’s Robotics Operations Center.”
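The “single pane of glass” idea can be sketched in a few lines of code. The fusion step is, at heart, collapsing many per-platform position reports into one row per contact. The sketch below is purely illustrative; the platform names, report fields, and track identifiers are invented for the example and bear no relation to the actual Task Force 59 software.

```python
from dataclasses import dataclass

# Hypothetical sensor reports from different unmanned platforms;
# field names and platform callsigns are illustrative only.
@dataclass
class Report:
    platform: str      # e.g. "GARC-07", "MQ-9-12" (invented callsigns)
    track_id: str      # shared identifier for the observed vessel
    lat: float
    lon: float
    timestamp: float   # seconds since some epoch

def fuse(reports):
    """Collapse many per-platform reports into one track picture:
    for each track, keep only the freshest position while noting
    which platforms contributed. This is the essence of a 'single
    pane of glass': one row per contact, not one feed per robot."""
    picture = {}
    for r in reports:
        entry = picture.setdefault(
            r.track_id, {"latest": r, "platforms": set()})
        entry["platforms"].add(r.platform)
        if r.timestamp > entry["latest"].timestamp:
            entry["latest"] = r
    return picture

reports = [
    Report("GARC-07", "dhow-114", 26.60, 56.25, 1000.0),
    Report("MQ-9-12", "dhow-114", 26.61, 56.26, 1060.0),
    Report("CUSV-03", "tanker-9", 26.50, 56.10, 1005.0),
]
picture = fuse(reports)
# The watchstander's screen shows one entry per contact:
for tid, e in sorted(picture.items()):
    r = e["latest"]
    print(f"{tid}: ({r.lat}, {r.lon}) seen by {sorted(e['platforms'])}")
```

The point of the design is that a watchstander sees two contacts, not three feeds: the surface vessel and the Reaper both reporting the same dhow are merged into a single, most-recent position.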
The machine intelligence is always learning. One of the most useful abilities of the Navy’s AI system is its capacity to sense when the enemy is up to something. The AI builds baselines of “normal” maritime activity (vessel patterns, traffic flows) and flags deviations, details that might elude a human operator.
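The baseline-and-deviation idea is a classic statistical pattern. A minimal sketch, assuming the metric being watched is something like hourly small-boat transit counts (the metric, numbers, and threshold here are all invented for illustration):

```python
import statistics

def flag_anomalies(history, recent, z_threshold=3.0):
    """Toy anomaly detector in the spirit described above.
    history: past observations that define 'normal';
    recent:  (label, value) observations to screen.
    Returns labels whose value deviates beyond the threshold
    in standard-deviation units (a simple z-score test)."""
    mean = statistics.fmean(history)
    stdev = statistics.pstdev(history) or 1.0  # guard divide-by-zero
    return [label for label, value in recent
            if abs(value - mean) / stdev > z_threshold]

# Baseline: typical hourly counts of small-boat transits (invented).
baseline = [12, 14, 11, 13, 12, 15, 13, 12, 14, 13]
observed = [("02:00", 13), ("03:00", 41), ("04:00", 12)]
print(flag_anomalies(baseline, observed))  # the 03:00 surge stands out
```

A real system would model far richer features (tracks, headings, AIS behavior) with learned models rather than a z-score, but the logic is the same: characterize normal, then surface the exceptions for a human.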
For all of its synoptic powers, the AI does not hoard decisions at the center; through “edge computing” it delegates them outward more rapidly than any human chain of command could. The result is unprecedented operational speed.
Edge computing allows raw sensor data to be processed locally at the source of collection—such as directly onboard a surveillance drone, naval vessel, or tactical ground vehicle—rather than transmitting it to a centralized cloud server. This drastically reduces latency, conserves critical bandwidth, and ensures that multi-modal sensors can continuously fuse data and generate insights even in highly contested environments where communications are jammed or degraded…
For example, a high-altitude surveillance drone processing its own live video feed can utilize onboard computer vision to identify a specific hostile vehicle classification, extract the precise geolocation, and transmit only those relevant coordinates back to the command center. This edge-processing methodology dramatically reduces the bandwidth required for transmission and provides combatant commanders with immediate, uncluttered situational awareness.
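The pattern in that example can be made concrete. The sketch below stands in for the onboard pipeline: instead of downlinking raw frames, the platform runs detection locally and transmits only compact detection records. The frame sizes, record format, and the detector stub are all assumptions for illustration, not any real system’s interface.

```python
# Illustrative sizes: a raw video frame vs. a compact detection record.
FRAME_BYTES = 2_000_000
DETECTION_BYTES = 64

def detect_onboard(frame):
    """Stand-in stub for onboard computer vision: returns any
    detections present in the frame as (vehicle_class, lat, lon).
    A real model would run inference here, at the edge."""
    return frame.get("targets", [])

def edge_downlink(frames):
    """Process frames at the edge; downlink only the detections."""
    sent = []
    for frame in frames:
        sent.extend(detect_onboard(frame))
    return sent

frames = [
    {"targets": []},                                  # empty sea: nothing sent
    {"targets": [("fast-attack-craft", 26.57, 56.25)]},
    {"targets": []},
]
detections = edge_downlink(frames)
raw_cost = len(frames) * FRAME_BYTES        # streaming every frame
edge_cost = len(detections) * DETECTION_BYTES  # streaming only hits
print(f"raw downlink: {raw_cost} B, edge downlink: {edge_cost} B")
```

Even in this toy version, three frames of raw video would cost megabytes of contested bandwidth, while the edge approach sends tens of bytes, and the command center receives exactly the uncluttered coordinates the passage describes.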
Advanced US AI combat/targeting systems can nominate, prioritize, and support the prosecution of hundreds of targets per hour. This means that the digital convoy or enhanced security corridors of Project Freedom are potentially orders of magnitude faster and better than the Greyhound-era system. While the system may not work entirely as advertised (few things ever do) the AI controlled robotic overwatch has the potential to be extremely effective.
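Nominating and prioritizing targets at that rate is, mechanically, a scored-queue problem. A minimal sketch follows; the scoring function and its weights are invented purely for the example and bear no relation to any real targeting doctrine.

```python
import heapq

def threat_score(contact):
    """Hypothetical priority score: faster-closing, nearer contacts
    rank higher. The weights are arbitrary, for illustration only."""
    return contact["closing_speed_kts"] * 2 - contact["range_nm"]

def prioritize(contacts):
    """Return contact ids ordered highest-threat-first, using a heap
    so that re-ranking stays cheap even at hundreds of contacts/hour."""
    heap = [(-threat_score(c), c["id"]) for c in contacts]
    heapq.heapify(heap)
    order = []
    while heap:
        _, cid = heapq.heappop(heap)
        order.append(cid)
    return order

contacts = [
    {"id": "skiff-2", "closing_speed_kts": 35, "range_nm": 4},
    {"id": "dhow-7",  "closing_speed_kts": 6,  "range_nm": 12},
    {"id": "skiff-9", "closing_speed_kts": 40, "range_nm": 15},
]
print(prioritize(contacts))
```

The interesting part is not the arithmetic but the throughput: a priority queue re-ranks hundreds of contacts in microseconds, which is the kind of speed advantage over a Greyhound-era plotting table that the passage is pointing at.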
Actually, the public has already witnessed a demonstration of a similar system in action. An F-15E, call sign DUDE44, was shot down over southwestern Iran, in the Zagros Mountains. Both crew members ejected. The pilot was rescued within about seven hours; the weapon systems officer, or WSO, a colonel, evaded capture for about 50 hours, hiding injured in rugged terrain before a high-risk special operations extraction.
MQ-9 Reaper drones provided critical overwatch, surveillance, and fire support. They struck Iranian forces, IRGC and others, closing in on the WSO’s position, keeping threats at bay, sometimes within a few kilometers of him. Reapers flew protective perimeters and engaged “military-aged males” approaching the airman. AI ensured the airman was never “lost” to CENTCOM, nor subject to accidental fire from the A-10s and Reapers wheeling overhead. Considering that over 155 aircraft and multiple special operations teams were involved, the coordination was prodigious.
If all of this seems marvelous to the reader, it is well to remember the words of Robert E. Lee: “It is well that war is so terrible, otherwise we should grow too fond of it.” The AI systems and robotic hordes that Task Force 59 will employ in what is doubtless the good cause of reopening the arteries of global trade will, within a few short years, migrate to terror gangs and barbaric pirates. Today’s wonders will be tomorrow’s terror commodity products. Ask not for whom the robot drones. It drones for thee.






