In the 1950s, American strategists warned of a bomber gap. By the end of that decade, the fear had shifted to a missile gap. In both cases, the perception of a gap drove enormous defense investments that ultimately formed the backbone of a credible nuclear deterrent. Today, the joint force faces a different kind of gap. It is not a deficit in capability relative to an adversary. It is a failure to connect the capabilities it already owns. A true modular open systems approach is itself a strategic deterrent—a force that can reconfigure, integrate, and adapt faster than an adversary can target it is a force that deters conflict. But the Department of Defense has not yet closed this integration gap.
The money is flowing. The tools are arriving. Integration, however, is stuck several decades in the past. Unless the department makes integration a shared government-commercial responsibility, it faces an integration gap that risks reducing today's commercial modernization push to mere experimentation. Warfighters require fielded and sustained combat capability, not science projects.
Having spent fourteen years building, testing, and fielding command-and-control technologies across the defense enterprise, I believe the joint force’s greatest risk in realizing the Joint Warfighting Concept is not a funding gap but an integration gap. Three foundational technologies—enterprise ontology management, conflict-free replicated data types, and zero-trust network architecture—are already proliferating through military formations. This is occurring through major investments like the Army’s Next Generation Command and Control, the combatant commands–focused Global Information Dominance Experiment, and the Marine Corps’s nascent Project Dynamis, among other significant joint force initiatives.
While the department is making the necessary data infrastructure investments with high urgency, it is missing three key ingredients required to sustain the present rate of innovation. First, the department needs to establish and maintain a central, accessible, machine-readable body of data models on which the Joint Warfighting Concept depends; second, it needs department-wide leadership on integration standards across commercial and government systems; and third, it needs a modernized operational test and evaluation process that can keep pace with commercial AI-enabling technologies. The argument is no longer about whether to invest. It is about whether soldiers, Marines, sailors, airmen, and guardians can use these tools together for strategic, operational, and tactical effects.
I am a flight test engineer at the Air National Guard Air Force Reserve Command Test Center and a major in the Air Force Reserve. As a former federal civil servant and civilian highly qualified expert, I oversaw the department-wide adoption of the Tactical Assault Kit software suite and led the Maven Smart System program as it scaled to the combatant commands. From these vantage points, I have watched the military’s data infrastructure strain under the weight of its own ambitions—and I have seen what works.
The Tools Are Arriving
The Joint Warfighting Concept, now codified as Joint Publication 1, Volume 1, demands an observe-orient-decide-act cycle that is faster, more resilient, and more expansive than any prior generation of warfare. To deliver that, the department is making substantial investments in three categories of enabling technology.
The first, ontology management, solves the entity, relationship, and hierarchy resolution problem at scale. The Chief Digital and AI Office (CDAO) funds the Maven Smart System, which has scaled from a small prototype to a $1.3 billion contract ceiling serving most, if not all, combatant commands, the Joint Staff, and the Marine Corps. Separately, the Army awarded Palantir a $10 billion enterprise contract consolidating seventy-five existing contracts under a single data and software platform.
The CDAO also awarded the Open DAGIR other transaction agreement to onboard third-party capabilities into a government-owned data environment. Projects like Open DAGIR deliver enterprise ontologies to an ecosystem of modular capabilities, systematically combining data streams across multiple dimensions to produce consistent entities and relationships for those modular capabilities to use for targeting, fires, logistics, personnel, and other doctrinal warfighting functions. Think of an ontology as a shared vocabulary and rulebook that tells machines how to interpret and connect disparate data: a radar track, a satellite image, and a signals intelligence report all referring to the same ship.
To be sure, Palantir is not the only ontology provider. Companies like Technergetics have produced government-owned ontology management capabilities and fielded them on operational networks. Ontologies keep the vast data of today’s battlespace organized and accessible.
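To make the ship example concrete, here is a deliberately minimal sketch of how an ontology lets machines resolve disparate reports to one entity. Everything in it is hypothetical: the source names, field names, and the `ALIASES` cross-reference table are illustrative stand-ins, not drawn from any real system or standard.

```python
# Hypothetical illustration: three reports from different sensors resolve to
# a single entity because an ontology maps each source's native identifier
# field to the same shared concept.

ONTOLOGY = {
    # Maps each source's native field name to the shared "vessel identifier".
    "radar": "track_id",
    "imagery": "hull_number",
    "sigint": "emitter_id",
}

# Cross-reference table an entity-resolution service might maintain:
# different native identifiers known to refer to the same real-world ship.
ALIASES = {"TRK-0042": "SHIP-7", "HULL-171": "SHIP-7", "EMIT-93A": "SHIP-7"}

def resolve(source: str, report: dict) -> dict:
    """Normalize a source-specific report into the shared entity model."""
    native_key = ONTOLOGY[source]
    entity_id = ALIASES[report[native_key]]
    return {"entity_id": entity_id, "source": source, **report}

reports = [
    resolve("radar",   {"track_id": "TRK-0042", "speed_kts": 18}),
    resolve("imagery", {"hull_number": "HULL-171", "length_m": 154}),
    resolve("sigint",  {"emitter_id": "EMIT-93A", "band": "X"}),
]

# All three reports now reference the same entity: one ship, three sensors.
assert {r["entity_id"] for r in reports} == {"SHIP-7"}
```

A production ontology carries far more than a field mapping (type hierarchies, relationship constraints, provenance), but the payoff is the same: downstream targeting, fires, and logistics applications consume one consistent entity instead of three disconnected reports.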
The second enabling technology, conflict-free replicated data types (CRDTs), efficiently distributes that connected and converged data across environments with intermittent or degraded connectivity. CRDTs are common throughout modern technology offerings, and the military is gaining immediate advantage by fielding them. In December 2024, the CDAO funded a three-year, $100 million Anduril Edge Data Mesh pilot, while the Army's Next Generation Command and Control is funding two data mesh prototypes with Anduril and Lockheed Martin valued together at nearly $130 million. CRDTs are key to these future-focused projects. They allow many users to operate on the same data simultaneously with minimal risk of divergence, even when network connections are intermittent: precisely the conditions the joint force should expect in a fight against a peer adversary. Ditto and Lattice are two prominent products bringing CRDTs into military formations.
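The convergence property that makes CRDTs valuable at the disconnected edge can be shown with one of the simplest examples, a state-based grow-only counter. This is a textbook CRDT sketch, not code from any fielded product: each node increments only its own slot, and merging takes the per-node maximum, so replicas reach the same state no matter how often or in what order they sync.

```python
# Minimal state-based G-Counter (grow-only counter), one of the simplest CRDTs.
# State is a dict mapping node name -> that node's local increment count.

def increment(state: dict, node: str, n: int = 1) -> dict:
    """A node increments only its own slot; other slots are never touched."""
    new = dict(state)
    new[node] = new.get(node, 0) + n
    return new

def merge(a: dict, b: dict) -> dict:
    """Take the per-node maximum. Commutative, associative, idempotent."""
    return {k: max(a.get(k, 0), b.get(k, 0)) for k in a.keys() | b.keys()}

def value(state: dict) -> int:
    """The counter's value is the sum of every node's contribution."""
    return sum(state.values())

# Two nodes operate independently while disconnected...
alpha = increment({}, "alpha", 3)
bravo = increment({}, "bravo", 5)

# ...then sync in either order; both arrive at the same state and value.
assert merge(alpha, bravo) == merge(bravo, alpha)
assert value(merge(alpha, bravo)) == 8
```

Real CRDT libraries ship far richer types (sets, maps, text), but they all rest on this same guarantee: merges commute, so intermittent connectivity produces delayed convergence rather than conflicting data.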
The third enabling technology is zero-trust network architecture, which provides the security wrapper. The Department of Defense has long acknowledged that it cannot produce all of these technologies organically, and a 2025 executive order on defense acquisition reform went further, mandating a “first preference for commercial solutions” across all pending defense contracting actions. Zero-trust architecture, when properly deployed, makes military data flows generally invisible and inaccessible to adversaries—allowing sensor data to reach shooters, and shooter data to reach sensors, with minimal bandwidth overhead. The government continues to make major investments and progress toward the realization of its zero trust strategy, including tactical edge implementations.
Together, these three technologies enable a comprehensive data mesh that connects the global sensing enterprise with the tactical shooter enterprise, replacing the centralized hub-and-spoke model that has defined military command and control for decades.
But having the tools is not the same as having them work together. The government and commercial engineers who are trying to wire these systems into a coherent whole face a scattered landscape of implementation standards published in different formats, data models maintained by different organizations on different networks, and validation processes that were designed for a slower era of acquisition.
Those standards, models, and processes are the key to unlocking the value of ontology management, conflict-free replicated data types, and zero-trust network architecture—because the value of a data mesh, like any network, grows as the square of its connected nodes. This is Metcalfe’s law in action: Every system that cannot interoperate is not merely absent from the network but actively reduces the return on every system that can. The tools are arriving. The integration is not.
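The arithmetic behind that claim is simple to verify. Using pairwise connections as a proxy for network value (the usual reading of Metcalfe's law, where value scales roughly as the square of node count), the cost of one non-interoperable system is disproportionate to its share of the force:

```python
# Metcalfe's law illustration: if network value is proportional to the number
# of possible pairwise connections, it grows as n*(n-1)/2, roughly n squared.

def pairwise_links(n: int) -> int:
    """Number of distinct pairs among n connected nodes."""
    return n * (n - 1) // 2

full = pairwise_links(20)       # 20 interoperable systems -> 190 connections
minus_one = pairwise_links(19)  # one system cannot interoperate -> 171

# Losing 5% of the nodes costs 10% of the network's connections.
assert full == 190 and minus_one == 171
```

The penalty compounds as the mesh grows, which is why every stovepiped system taxes the return on every connected one.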
The Integration Gap
The problem is not that interoperability standards do not exist. Standards like the Open Mission Network Interface have gained wide adoption across the combined joint all-domain command-and-control space precisely because they were published as code and made available to developers. The problem is that the specifications, data models, and interface definitions the joint force depends on are scattered across multiple repositories, classification domains, and organizational sponsors—each with different access requirements and different levels of machine readability. A software engineer at Fort Bragg and a data officer at Indo-Pacific Command are often solving the same integration problem independently because neither can easily discover what the other has already built.
The policy mandate to close this gap already exists. The tri-service MOSA directive requires modular open systems approaches in all new acquisition programs. The software acquisition pathway mandates iterative, continuous delivery of software capabilities using published DevSecOps reference design specifications. Continuous delivery ensures warfighters can frequently and securely receive and use the latest features, much like your smartphone apps do today. And the Department of Defense’s API strategy recognizes that application programming interfaces are essential to interoperability and joint warfighting, just like any common banking, rideshare, or navigation app is essential to life in a digitally integrated society. In the age of AI, the government has a leading role to play in the safe, suitable, and effective integration of these systems. That role begins with making the authoritative standards findable, accessible, and consumable as code in one place.
Pockets of excellence exist. The enduring problem, though, is that these repositories are fragmented across different classification domains, different authentication schemes, and different organizational sponsors. The git.mil repository houses various government-centric projects and is easily accessible to Common Access Card holders. The TAK.gov code repository hosts the Tactical Assault Kit software suite as well as the Open Mission Network Interface specifications. The STITCHES repository is the long-term home of DARPA’s mosaic warfare integration suite. The Test Resource Management Center provides a rich repository of models for modeling and simulation purposes.
Modern commercial vendors also provide their own software development kits and application programming interface documentation on their public websites.
The Defense Logistics Agency’s ASSIST database is the official Department of Defense repository for military specifications and standards, and the agency’s data modernization initiative is making progress toward digital delivery. But ASSIST has historically published these standards as PDF documents—human-friendly, not machine-readable. (No, uploading the PDFs into GenAI.mil and asking Gemini for a JSON or Protocol Buffers structure does not make them machine-readable.)
For the Joint Warfighting Concept to work at machine speed, the most urgent priority is as-code delivery and sustainment of the machine-to-machine standards that underpin joint interoperability: Tactical Digital Information Link–J, Variable Message Format, Joint Range Extension Applications Protocol–C, and the broader body of tactical datalink standards. Critical specifications like the Open Mission Systems and Universal Command and Control Interface standards face the same challenge: in my experience, they are often squirreled away on stovepiped systems across the Defense Research and Engineering Network and require dedicated sponsorship to access. They are not completely inaccessible, but the friction is substantial, and friction across many channels is a major contributor to today's integration gap. By contrast, in the private sector, integration accelerates in the open.
If the department is serious about warfighting advantage through machine-speed interoperability and AI-accelerated decision-making, these standards need to be centrally maintained and published in machine-readable specifications—protocol buffer definitions, JSON schemas, API documents—that commercial and government developers can consume directly in their applications and, in due time, agentic frameworks.
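The difference between a PDF standard and an as-code standard is easiest to see side by side. The fragment below is a hypothetical sketch of what a published, machine-readable message specification could look like: a JSON Schema a developer consumes directly, with a toy required-field check standing in for a full validator. The field names are invented for illustration and are not drawn from any real tactical datalink standard.

```python
import json

# Hypothetical fragment of an as-code interoperability spec: a JSON Schema
# for a track message, published in a registry instead of locked in a PDF.
# Field names are illustrative only.
TRACK_SCHEMA = {
    "type": "object",
    "required": ["track_id", "latitude", "longitude", "timestamp"],
    "properties": {
        "track_id": {"type": "string"},
        "latitude": {"type": "number", "minimum": -90, "maximum": 90},
        "longitude": {"type": "number", "minimum": -180, "maximum": 180},
        "timestamp": {"type": "string"},  # ISO 8601
    },
}

def missing_fields(message: dict, schema: dict) -> list:
    """Toy check; a real consumer would run a full JSON Schema validator."""
    return [f for f in schema["required"] if f not in message]

# An incoming message can be checked against the spec automatically, with no
# human reading a document to decide whether the integration is correct.
msg = json.loads('{"track_id": "T-1", "latitude": 36.1, "longitude": 127.9}')
assert missing_fields(msg, TRACK_SCHEMA) == ["timestamp"]
```

Because the schema itself is data, it can be versioned in a repository, diffed between releases, validated in a continuous integration pipeline, and consumed by agentic frameworks, none of which is possible with a scanned PDF.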
Three Things the Department of Defense Should Do
Leaders from government and industry have already decided what products, standards, and technologies will populate tomorrow’s battlefields. The Department of Defense needs to provide opinionated technical leadership on integration standards and a modernized assessment process to sustain the current pace of innovation. Three steps would quickly close the integration gap.
First, the department should establish a unified, accessible code repository and data model registry for combined joint all-domain command-and-control integration standards. The existing repositories at git.mil, TAK.gov, and STITCHES.mil prove the model works, but they are siloed, as are many others across the military services and the intelligence community. A single authoritative registry—federated if necessary, but discoverable through a common portal—would allow any developer, military or commercial, to find the canonical data models, protocol buffer definitions, and interface specifications for the systems their software must interoperate with. This is not a technology problem. It is an organizational decision that senior leaders can make today and should not cede to industry.
Second, the department should publish critical interoperability standards, including the Open Mission Systems and Universal Command and Control Interface standards and tactical datalink specifications, in machine-readable formats with open access for credentialed developers. Standards locked in PDF documents and gated behind cumbersome access processes cannot drive the rapid, iterative integration that commercial software development demands. If the department wants commercial firms to build interoperable products at commercial speed, it must make the specifications broadly available in the formats those firms actually use: protocol buffers, JSON and XML schemas, and API documentation. The Open Mission Network Interface specifications hosted in the https://git.tak.gov/standards repository are a strong example of how this can work.
Third, the department should modernize the operational test and evaluation process for commercial AI-enabling technologies to ensure seamless transition from technology to capability. Soldiers, Marines, sailors, airmen, and guardians are already providing feedback on these tools in the field. That feedback needs to flow into a continuous, formalized assessment process that asks the right questions: Does this technology make discrete, doctrinal warfighting functions faster? Safer? More resilient? More distributed? The current operational test and evaluation framework was designed for hardware programs of record with multiyear development cycles. It is not designed for commercial software that ships updates monthly. The department should commit to fully proliferating working technologies based on warfighter feedback while funding continuous doctrinal assessment of these technologies’ effects on how the force fights.
The goal of modern operational test and evaluation is not to verify that the software works; that remains the job of developmental test organizations like the Joint Staff's J6 directorate and the services' own developmental test functions. The goal is to assess how these investments change the character of the fight, to provide objective feedback as commercial teams iterate on their designs, and to encode these assessments into doctrine and tactics so that the next unit to receive the technology arrives in theater with a playbook. While the emergence of commercial capabilities has accelerated realization of the latest Joint Warfighting Concept, objective assessment of mission outcomes has been shelved in the interest of speed. That rigor must be reinstated as part of the road to fielding combat capability.
From Technology to Capability
The integration gap is not a technology or resourcing problem. It is a coordination problem. Ontology management, conflict-free replicated data types, and zero-trust network architecture are no longer experimental technologies awaiting investment decisions. They are arriving on the battlefield right now, funded by major programs of record and delivered by commercial firms at commercial speed.
Cold War perceptions of the bomber gap and the missile gap drove investments that formed a credible deterrent. The integration gap demands the same urgency—because a force that can reconfigure faster than it can be targeted is a force that deters conflict. The risk is not that the joint force will fail to buy the right plumbing. The risk is that each pipe will be plumbed to a different standard, and when the force needs to push water through the whole system at once, nothing will connect.
Accessible data model specifications, a unified integration registry, and a modernized AI-focused assessment process are not glamorous reforms. They do not fly, shoot, or explode. But they will determine whether the joint force’s historic investments in AI-enabling infrastructure produce a data mesh that works across services and combatant commands—or a collection of proprietary stovepipes that merely look modern. The tools are here. It’s time to close the integration gap.
Major Ryan McLean is a flight test engineer at the Air National Guard Air Force Reserve Command Test Center and a student at Air Command and Staff College’s eSchool of Graduate Professional Military Education. He has spent fourteen years in defense engineering leading technology programs across the Department of Defense, including the Tactical Assault Kit and Maven Smart System, and deployed to Afghanistan with the 101st Airborne Division.
The views expressed are those of the author and do not reflect the official position of the United States Military Academy, the Department of the Army, the Air National Guard Air Force Reserve Command Test Center, the Air Force Reserve, the Department of Defense, or any other government agency.
Image credit: Chustine Minoda, US Air Force
