NATO Open Source — A new way of building software and services

Markus Sandelin
April 13, 2024


A White Paper

Markus Sandelin (markus.sandelin@mil.fi) / Finnish Defence Forces
v. 0.06 — April 30th, 2024

TL;DR: I propose an open-source-like developer community combined with a NATO-owned, cloud-like infrastructure ecosystem inside NATO, both operated and maintained by commercial partners.

Change log:

v.0.01 (April 12th 2024): Initial release

v.0.02 (April 13th 2024): Proofreading, updates based on feedback — thanks to all commenters and supporters so far. Published on Medium.

v.0.03 (April 17th 2024): Updates based on discussion in the LinkedIn comments thread and offline within the FDF.

v.0.04 (April 20th 2024): Expanded some concepts, added NATO Open Cloud concept.

v.0.05 (April 25th 2024): Added product management concepts

v.0.06 (April 30th 2024): Compared and iterated concepts from NATO Digital Policy Committee’s working paper: AC/322-D(2024)0017, A Data Centric Reference Architecture for the Alliance, 13 March 2024

Preface

Software has become exponentially more complicated in the last 20 years. Cloud services were the response to mobile applications leveraging APIs; during that topological phase of the internet, the cloud was the only infrastructure that could support them. Since then, even cloud services themselves have become too complicated, and their benefits have started to erode when competing against more localised, distributed architectures.

The purchasing of software has changed greatly during this time. Today, _nothing_ works off the shelf in a defence scenario. This leads to a situation where many nations pay horrific license costs for custom software because the buyers are afraid to build their own.

This has to change.

Open source and commercial

Open source software has been around for a long time. Its core principles are tied to large developer and user communities that act as the testers and sometimes contributors to the software at hand. It is a tested, working model.

Commercial software started with a software house building a solution for a single customer. Personally, I doubt there is even a single success story of a software suite built for a “universal client”. Commercial providers build what their customers want to pay for, and new features are paid for. In general, software makers have one to three main clients who set the direction of the software. Those clients in turn decide what their *de facto* in-house dev team will build for them next.

The boring parts of tech infrastructure land in no-man’s land. They are not sexy enough to turn into a commercial product, and too specific and custom to be open sourced. The only option for data storage and conduits is to build them yourself. Luckily, there are endless projects and programs building the modular components of these un-sexy necessities.

At the same time, we face difficulties integrating even different versions of the same software suite. As clients demand increasingly bespoke things, the software companies themselves are choking and quoting open change windows of six months or more. During a war, six months means death.

There are no existing or ready solutions in the market for NATO. This will not change.

The solution: NATO Open Source

NATO is a rare beast in its uniqueness. Its partner nations share similar operating environments, and their DOTMLPFI parameters are similar. Defence is one of the few domains whose data has long been parametrised and structured. It is standing on a pile of legacy software waiting to be changed. The time is now to flip the switch and design how NATO will handle the paradigm change that the war in Ukraine has revealed.

The four key tenets were summed up excellently by UK colleagues:

- The software is commercially supported (but not owned)
- The software is open source and therefore bespoke-able
- The software exchanges data using — or can be converted easily — to open standards
- The software has a large vested interest group involved in its development — be that a global civilian community or a global military community.

By leading software development from the front together with commercial partners and their developers, a much stronger and more durable tech ecosystem could be built for and run by NATO.

An overall vision is needed, divided into individually buildable and maintainable applications that talk to each other using industry standards. These apps need to be developed iteratively in bite-sized sub-projects, e.g. to add new features, with regular sprints and exercises to test progress and refine direction and development priorities.

Development cycles

In order for the model to work, it has to be phased into clear development cycles and processes. Simplified, it could be:

1. Specifying business requirements
2. Analysing business requirements into a concept
3. Evaluating existing NATO and/or open source initiatives
4. Finding pilot nations from among the allies
5. Making a proof of concept with a pilot nation
6. Expanding the proof-of-concept pilot to more nations
7. Alpha release — start of maintenance/operations
8. Beta release
9. V1.0 and full production
10. Constant development; repeat until
11. End of life / merger into another code base
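
As a thought experiment, the cycle above can be sketched as a simple linear state machine. This is a minimal sketch; the phase names are my own shorthand, not NATO terminology:

```python
from enum import Enum, auto

class Phase(Enum):
    """Illustrative lifecycle phases, mirroring the numbered list above."""
    REQUIREMENTS = auto()
    CONCEPT = auto()
    EVALUATION = auto()
    PILOT_NATIONS = auto()
    PROOF_OF_CONCEPT = auto()
    EXPANDED_PILOT = auto()
    ALPHA = auto()          # start of maintenance/operations
    BETA = auto()
    PRODUCTION = auto()     # V1.0
    CONTINUOUS_DEV = auto()
    END_OF_LIFE = auto()    # or merger into another code base

def next_phase(current: Phase) -> Phase:
    """Advance one step; END_OF_LIFE is terminal."""
    members = list(Phase)
    if current is Phase.END_OF_LIFE:
        return current
    return members[members.index(current) + 1]

assert next_phase(Phase.ALPHA) is Phase.BETA
```

The point of the sketch is only that each phase has a single, auditable successor, which is what makes the cycle governable.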

NATO has put the enablers in place. We have a legal framework (NCoDe) and we have a NATO Software Factory where the source code can be managed or at least stored. If you add a bunch of enthusiastic developers that solve real operational problems then you have all the ingredients for a successful solution, as long as the respective service providers are willing to deploy those free software assets as part of their services. But here comes the problem: Do you want to help the operators to solve their problems, or do you want to satisfy a lengthy acquisition process? If we take digital transformation seriously we need to take risk, potentially fail fast and learn quickly. — Gernot Friedrich, NATO Digital Staff

The technical approach shares similarities with Gartner’s Bimodal IT concept, where Mode 1 is optimized for areas that are more predictable and well-understood. It focuses on exploiting what is known, while renovating the legacy environment into a state that is fit for a digital world. Mode 2 is exploratory, experimenting to solve new problems and optimized for areas of uncertainty. These initiatives often begin with a hypothesis that is tested and adapted during a process involving short iterations, potentially adopting a minimum viable product (MVP) approach.

In this model, there exists a three-spoke design:

  1. The infrastructure software components (Mode 1) would be a federated “cloud-like” collection of services operated by allied nations, divided between two tiers: Core operators, offering a geographically positioned, full-fledged service package, and Edge operators, offering specific services relevant to geographical and/or other needs. Each nation would be connected to the NATO data mesh and provide APIs both to and from nations.
  2. The application software components (Mode 2) would be built by multi-national teams including NATO staff, allied staff and time-and-materials-based assets from professional services providers. This allows normal competition, meets the requirements for tenders and evens out the playing field with software providers. NSPA could operate this procedure together with NCIA.
  3. Commercial-partner-provided operations and maintenance, including customer support and technical documentation in a standardised format, would be tendered out on four-year contracts, allowing a 6-12 month hand-over period between providers. This could be split into smaller chunks if deemed necessary.
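
To make the two-tier federation in the first spoke concrete, here is a purely illustrative sketch of core and edge operators as a service registry. The nation codes, service names and tier-preference logic are my assumptions, not an actual NATO design:

```python
from dataclasses import dataclass, field

@dataclass
class NationNode:
    """One allied operator in the federated 'cloud-like' mesh."""
    nation: str
    tier: str                      # "core" (full service package) or "edge" (specific services)
    services: set[str] = field(default_factory=set)

def resolve(service: str, nodes: list[NationNode], prefer_tier: str = "edge") -> list[str]:
    """Return nations offering a service, preferring the requested tier first."""
    offering = [n for n in nodes if service in n.services]
    return [n.nation for n in sorted(offering, key=lambda n: n.tier != prefer_tier)]

# Hypothetical mesh: one core operator, one edge operator.
mesh = [
    NationNode("NLD", "core", {"storage", "compute", "identity"}),
    NationNode("EST", "edge", {"identity"}),
]
assert resolve("identity", mesh) == ["EST", "NLD"]
```

Preferring edge operators for latency-sensitive services while falling back to core operators is one plausible routing policy; the real policy would be a governance decision.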

In short: non-commercial ownership and initial release software build driven by NATO, then commercial partners handling development, operations and maintenance.

Delivery

NCIA already maintains a repository of software builds, and it could continue this task in the future. These builds and versions would be audited, accredited and distributed much as they are now. Individual core components could be audited separately, in a modular way, allowing a more layered approach to security.

The code base could be openly available to all allied nations and to those on NATO’s fringes, such as partner nations. They would run standard software repository and version control tooling, licensed by NATO. For the first time, there could be a software ecosystem similar in importance to what Tidepedia has become for CWIX and similar exercises.

In short: Delivery methods already exist.

Common data architecture and APIs

One of the most important elements in scaling the gains is to tie the NCDF and the NATO data mesh solidly into the ecosystem, together with FMN requirements and other doctrines, implemented in a way that makes them intrinsic parts of the fabric of NATO Open Source.

The community needs to define API structures, UI frameworks, message formats and other basic elements that have existed in civilian components for decades. The goal is to stop rebuilding the same functionality repeatedly across projects; the NCDF and similar initiatives already provide the tools for doing so.
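
As an illustration only, a common message envelope of the kind such standards would define might look like the following. The field names and the schema reference are hypothetical, not taken from the NCDF or any NATO standard:

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

@dataclass
class MessageEnvelope:
    """Hypothetical common envelope carried by every inter-nation API message."""
    message_id: str
    producer: str          # originating nation or system
    schema: str            # reference to an agreed open-standard payload schema
    classification: str
    issued_at: str         # ISO 8601 timestamp
    payload: dict

def to_wire(msg: MessageEnvelope) -> str:
    """Serialise to JSON with stable key order, for reproducible exchange."""
    return json.dumps(asdict(msg), sort_keys=True)

msg = MessageEnvelope(
    message_id="msg-001",
    producer="FIN",
    schema="example.track.v1",
    classification="UNCLASSIFIED",
    issued_at=datetime.now(timezone.utc).isoformat(),
    payload={"lat": 60.17, "lon": 24.94},
)
assert json.loads(to_wire(msg))["producer"] == "FIN"
```

The value is not in these particular fields but in every project agreeing on one envelope, so that routing, classification handling and auditing can be built once.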

The NATO Digital Backbone is a federation of networks and systems that provides the technical means for a resilient, scalable, and secure digital service continuum including cloud and edge services. This federation connects sensors, decision makers, actors and effectors, across the various organizational, national, operational and security domain boundaries, supporting basic activities and current operations (BACO) up to the maximum level of effort — AC/322-D(2024)0004 (INV)

Based on the NATO DPC AC/322 working paper, I summarise its key points as follows:

  1. NATO’s Digital Backbone uses a model-driven approach to manage application logic and data independent of technology.
  2. Standardization is crucial for this paradigm, enabling uniform data exchange and minimizing obsolescence challenges.
  3. Select a data-centric architecture as the basis for a Data Management Framework.
  4. Use Data Meshes as a decentralized approach to integrate legacy systems.
  5. Adopt a Model Driven Architectural approach to separate application logic from platform.
  6. Innovative path: Determine strategic data requirements, harmonize NATO Data standards, assess and rationalize data definitions, deploy target data structures, and create data bridges and APIs.
  7. Established path: Review existing NATO projects and activities, align data definitions and structures with business needs, update NATO Data standards to align with NCDF, and standardise web APIs.

In short: Common data architecture has been planned, and is being tested at the moment.

Powering NATO Open Source — The Hybrid Cloud Solution

The success of NATO Open Source hinges on a robust and secure IT infrastructure. This chapter explores the proposed hybrid cloud solution, leveraging both centralized cloud services and cloud edge services provided by NATO allies.

Centralized Cloud Services:

  • Core Infrastructure: A central cloud, hosted by a trusted provider, would house core applications, data storage, and collaboration tools for the NATO Open Source ecosystem. This ensures standardized platforms for development and deployment.
  • Security and Compliance: The central cloud provider would offer the highest levels of security and compliance with NATO’s strict data protection regulations.

Cloud Edge Services:

  • National Clouds: Member states can contribute cloud resources and services at the network edge. These “cloud edge” services would provide additional processing power, storage, and disaster recovery capabilities closer to operational areas. This enhances responsiveness and reduces latency for time-sensitive applications.
  • Specialized Expertise: Nations with specific cloud expertise can offer specialized services within the NATO Open Source ecosystem. For example, a nation with advanced cyber security capabilities could provide secure cloud storage for sensitive data.

Benefits of a Hybrid Cloud:

  • Flexibility and Scalability: The hybrid cloud provides the flexibility to scale resources up or down based on project needs. Centralized services provide a stable foundation, while cloud edge services allow for customization and regional optimization.
  • Resilience and Redundancy: Distributing services across a central cloud and multiple national clouds ensures redundancy and minimizes the impact of outages in any single location.
  • Sovereignty and Control: NATO retains control over core applications and data in the central cloud. Member states can control their own cloud edge services, addressing potential concerns over data sovereignty.

Challenges and Considerations:

  • Standardization and Interoperability: Standardized protocols and APIs are crucial to ensure seamless communication and data exchange between the central cloud and disparate cloud edge services.
  • Security Integration: Security measures across the entire hybrid cloud environment need to be robust and consistently enforced. This includes joint threat assessments and coordinated vulnerability management across all cloud platforms.
  • Governance and Cost Sharing: A clear governance model needs to be established to manage resource allocation, service level agreements, and cost-sharing between NATO and member states.

Conclusion:

The hybrid cloud solution is the cornerstone of NATO Open Source. NATO’s AC/322 working paper suggests a similar mechanism, calling it the “NATO Digital Backbone”. By combining centralized and cloud edge services, NATO fosters collaboration, leverages national expertise, and ensures a secure, scalable, and resilient foundation for its open-source software development efforts. In short, this follows structures similar to Federated Mission Networking (FMN), but for a continuous operational environment.

Building the Foundation — Architecture and Governance for NATO Open Source

The built and owned components would be documented using NAF principles. This creates a holistic roadmap of what exists, what has been planned and what will be needed. Again, things could continue much like they do currently.

A strong foundation is essential for any successful endeavor. This chapter explores the architectural principles and standardized governance practices that will underpin the NATO Open Source initiative.

Common Enterprise Architecture (CEA):

A common enterprise architecture (CEA) defines a blueprint for the technology infrastructure and software components of NATO Open Source. This standardized approach offers several benefits:

  • Reduced Complexity: Standardized modules and interfaces streamline development and deployment, eliminating the need to reinvent the wheel for each project.
  • Improved Interoperability: A shared architecture ensures seamless communication and data exchange between different software components, fostering collaboration and reducing integration headaches.
  • Enhanced Maintainability: Standardized modules are easier to maintain and update, reducing costs and improving overall system efficiency.
  • Increased Security: A defined security framework within the CEA ensures consistent security practices throughout the development lifecycle, mitigating vulnerabilities.

Standardized Modules:

The CEA will be built using pre-defined, reusable modules that encapsulate specific functionalities. These modules can include:

  • Data Services: Modules providing standardized data access, storage, and manipulation capabilities.
  • Authentication and Authorization: Modules ensuring secure user access and permission management.
  • Security Services: Modules offering encryption, intrusion detection, and other security features.
  • User Interface Components: Reusable building blocks for creating user interfaces that are consistent and user-friendly across applications.
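
To show what a “standardized module” contract could look like in practice, here is a minimal sketch using a structural interface. The `DataService` name and its methods are hypothetical examples, not part of any defined CEA:

```python
from typing import Protocol

class DataService(Protocol):
    """Illustrative contract every reusable data-service module would implement."""
    def read(self, key: str) -> bytes: ...
    def write(self, key: str, value: bytes) -> None: ...

class InMemoryDataService:
    """Trivial reference implementation, used only to show the contract in action."""
    def __init__(self) -> None:
        self._store: dict[str, bytes] = {}

    def read(self, key: str) -> bytes:
        return self._store[key]

    def write(self, key: str, value: bytes) -> None:
        self._store[key] = value

# Any module satisfying the contract is interchangeable behind the interface.
svc: DataService = InMemoryDataService()
svc.write("doctrine/fmn", b"spiral-5")
assert svc.read("doctrine/fmn") == b"spiral-5"
```

Defining modules as contracts rather than implementations is what lets a national contribution be swapped for a commercial one without touching the applications above it.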

Standardized Governance Principles:

Effective governance is vital to ensure the smooth operation and continuous evolution of the NATO Open Source ecosystem. Key governance principles include:

  • Open Collaboration: Fostering a collaborative environment where all stakeholders, including NATO, member states, and industry partners, contribute to decision-making and development efforts.
  • Transparency: Maintaining clear communication channels and ensuring all participants have access to relevant information about projects, standards, and decision-making processes.
  • Standardized Processes: Implementing well-defined processes for development, deployment, version control, and maintenance to ensure consistency and quality.
  • Meritocratic Decision-Making: Encouraging decision-making based on technical merit and the best interests of the alliance, fostering innovation and efficient resource allocation.
  • Continuous Improvement: Establishing a framework for ongoing evaluation, feedback, and improvement of NATO Open Source architecture, governance practices, and software components.

Benefits of Standardized Governance:

  • Reduced Risk: Clear processes and decision-making frameworks mitigate risks associated with development and deployment.
  • Increased Efficiency: Standardized workflows lead to faster development cycles and improved resource utilization.
  • Improved Quality: Defined quality standards ensure the software components are robust, secure, and meet NATO’s requirements.
  • Enhanced Agility: Flexible governance allows for rapid adaptation to new technologies and evolving needs.

Conclusion:

A well-defined common enterprise architecture and standardized governance principles provide the foundation for a successful NATO Open Source ecosystem. Standardized modules promote collaboration and efficiency, while strong governance ensures quality, security, and adaptability. By establishing a robust architectural and governance framework, NATO sets the stage for a collaborative and innovative approach to developing software that meets the critical needs of the alliance.

Planning, building and operating this type of infrastructure requires a genuine enterprise architecture, and will require a three-phased approach covering existing, short-term and long-term designs.

In short: Common architecture frameworks exist and are in use.

Product management

The open source component and product portfolio needs leadership, as established earlier in this white paper. I suggest a two-tier product management structure, similar to the U.S. Department of Defense’s xTAK products (Team Awareness Kit / Tactical Assault Kit): a TAK Product Center acts as the central decision-making and prioritisation element, complemented by a Configuration Steering Board (CSB) in which several U.S. agencies and military units participate to provide the top-level business requirements.

This two-tier approach mirrors the mechanism a warfighting force has in the field, while the strategic command and goals come from the upper echelons of the chain of command. Big picture and tactical implementation.

Benefits:

  • Enhanced Decision-Making: A two-tier structure separates strategic vision from tactical implementation, leading to more focused decision-making at each level.
  • Streamlined Prioritization: The NATO Open Source product center, acting as the central hub, can prioritize development efforts based on input from the steering group, ensuring resources are directed towards the most critical needs.
  • Improved Collaboration: The steering group, with participation from various agencies and units, fosters collaboration and ensures diverse perspectives are considered during product development.
  • Increased Agility: The separation of concerns allows for quicker tactical decisions and adaptations within the product center, leading to a more agile development process.
  • Alignment with NATO’s Structure: Mirroring the “warfighting force” and “strategic command” dynamic reflects NATO’s existing structure, promoting familiarity and ease of adoption.

By implementing this two-tier structure, NATO can effectively lead and manage its open-source product portfolio. The benefits of streamlined decision-making, improved collaboration, and increased agility will position NATO for success in the open-source software landscape.

Risks and requirements

Do we have the right people designing, building and planning the alliance’s digital future? We have the functional specialists, but are we able to muster the right teams to lead this kind of convergence? I question this, because a lot of the planning is done by commercial partners and not NATO itself.

Several nations are in a situation with their software providers where they are paying a lot of money and getting few results. They are exhausted, lacking the resolve and willingness to take responsibility for pushing a joint ecosystem instead of minor integration work between messaging standards. NATO needs to step in: no single nation is able to do this alone, and the alternative is leaving it to large corporations to try instead.

With the proposed approach, total costs will be lower, and milestones can be set to keep the focus. Product and project managers can come from commercial partners, and they can help with running the teams. That leaves us the most important part: providing the functional and operative view from NATO personnel and allies.

The issue is that over the past couple of years, the NATO nations in the resource community (RPPB, IC and the IS/NOR) have explicitly stated in various policy papers that the Agency *should not* create software (even with industry), but should instead adopt existing national solutions unchanged, or buy COTS with minimal modification. … Alternatively, NATO nations would need to develop these collaborative open-source solutions independently of any NATO management or investment, and then offer them as proven, off-the-shelf products or services in which NATO then invests once completed. — Paul Howland, NCIA

The challenge for open source is in overall project management and maintenance. These are areas that do not lend themselves well to a distributed, community-developed program. The most successful open source projects generally have very clear leadership and ownership. Therefore, the leadership of each software product rests on NATO’s shoulders, differentiating it from a traditional open source project.

In short: Risks are known, but manageable and the benefits outweigh the risks.

Finances

Development can be done by NATO or allied nations. This investment should count towards the 2% target for nations. In the same way that arms purchases support allied economies, this would give nations a new way to get their software houses working on a part of NATO’s technical foundations. Technologically apt countries might even become more proficient at providing software than at warfighting capabilities.

Each country willing to pilot new proof-of-concept projects would need to commit resources and personnel for testing, and commit to using the software after it is successfully built. After that, NATO could charge a license fee in the same manner it does now through NCIA for applications relying on commercial licenses. The more the services are used, the cheaper they would get.

Incentivisation

Any nation that builds components and shares them in the NATO Open Source ecosystem would be compensated based on the use of their software, in terms of seats and/or installations, or through a credit system allowing the country to save on future ecosystem purchases. The result: nations are motivated to keep building and improving their products for others to use.

Final notes

This document is the first draft on a complicated issue; it has faults and logical errors and needs a lot more work. But it is the beginning of a thought process that could end up providing major benefits for the alliance as a whole. All feedback is welcome, whether as comments on the post or in the LinkedIn discussion thread.

In Bydgoszcz, April 12th 2024.

Markus
