Avoiding the Military’s Software Security Black Hole: Measuring Software Adoption Approval

Ryan Lewis
5 min read · Jun 6, 2023


Authors: Ryan Lewis and John Speed Meyers

In the past year, there has been a marked increase in the number of reports and commissions highlighting the critical importance of accelerating technology development and adoption by the US military and intelligence agencies. These accounts rightly point out that the future competitiveness of the US military depends on rapidly integrating emerging technologies into the national security ecosystem. Yet most of their recommendations have focused on changes and reforms to the acquisition system. While such changes are no doubt critical, other variables also affect the technology adoption timeline. In particular, there have been few studies of software security practices and their effect on software procurement and availability, despite the widespread belief that the software security processes built into military procurement (often referred to as the “authority to operate,” or ATO, process) unnecessarily slow building and buying software while achieving few of the desired security goals.

This state of affairs, if true, should concern the leadership of the U.S. military and intelligence community (IC). Because most commentators agree that software, whether in machine learning applications, embedded in weapon systems, or running back-office IT systems, has become essential to the future of military power, the possible existence of processes that hamper or outright impede the adoption and sustainment of new software should be a cause for concern. Acquisition and security officials within U.S. national security organizations seem to be caught between potentially conflicting requirements: adopt and maintain an ever-growing volume of new software faster while maintaining a consistent security and risk posture. Unfortunately, it is hard to know the magnitude of the impact without authoritative data and systematic analysis. To date, aspects of the ATO process can be likened to a black hole, sucking in software systems without allowing information about the process to escape, which limits the ability to improve and reform it to meet future requirements.

It’s therefore time for the military and IC to undertake systematic studies that examine the ATO process across a wide variety of military and intelligence software programs, uncovering the extent to which the current process affects procurement and program adoption as well as the extent to which it improves an agency’s security posture. If nothing else, it would be helpful to collect and aggregate data on ATO timelines, approvals, rejections, submission types, and the like, so that practitioners, researchers, and policymakers can better quantify the extent of the challenge and evaluate the impact of previous reforms. For instance, the availability of Small Business Innovation Research (SBIR) data led to a proliferation of analysis and policy reform recommendations (SBIR substack series). The rest of this article describes early indications of a potential ATO problem and outlines an ATO “observatory” approach to examining it.
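To make the suggestion concrete, here is a minimal sketch of what an aggregated ATO dataset and some basic summary statistics might look like. The record fields, categories, and sample values below are illustrative assumptions, not an actual government schema or real program data.

```python
from dataclasses import dataclass
from datetime import date
from statistics import median

# Hypothetical record shape for a single ATO submission; the fields are
# illustrative assumptions, not an actual DoD or IC schema.
@dataclass
class AtoRecord:
    program: str
    submission_type: str   # e.g., "initial", "continuous", "fast-track"
    submitted: date
    decided: date | None   # None if still pending
    approved: bool | None  # None if still pending

def timeline_days(record: AtoRecord) -> int | None:
    """Days from submission to decision, or None if no decision yet."""
    if record.decided is None:
        return None
    return (record.decided - record.submitted).days

def summarize(records: list[AtoRecord]) -> dict:
    """Aggregate the kind of basic metrics the article suggests collecting."""
    decided = [r for r in records if r.decided is not None]
    durations = [timeline_days(r) for r in decided]
    return {
        "total_submissions": len(records),
        "pending": len(records) - len(decided),
        "approved": sum(1 for r in decided if r.approved),
        "rejected": sum(1 for r in decided if not r.approved),
        "median_days_to_decision": median(durations) if durations else None,
    }

# Illustrative, made-up sample data only.
sample = [
    AtoRecord("Program A", "initial", date(2022, 1, 10), date(2022, 5, 2), True),
    AtoRecord("Program B", "fast-track", date(2022, 3, 1), date(2022, 4, 15), True),
    AtoRecord("Program C", "initial", date(2022, 6, 20), None, None),
]
print(summarize(sample))
```

With real records in place of the made-up sample, the same simple aggregation could start to answer the questions this article raises: how long ATOs actually take, how often submissions are rejected or stall, and whether reforms move the median timeline.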

ATO Canaries

Because there is no authoritative data on the U.S. military and IC’s ATO processes, analysts interested in ATO are currently reduced to relying on either their own first-hand experiences of the process or limited survey results from organizations serving the federal technology market. What “data” do exist are mixed, presenting an inconsistent, and sometimes troubling, picture of the current process. At a minimum, these initial insights should prompt more comprehensive data collection and analysis.

From an anecdotal perspective, one of the authors, while consulting for an investor focused on early-stage venture capital investments with IC applications, found that the investor was convinced the ATO process was dooming his portfolio companies’ efforts to deploy their software on real missions. Nearly two dozen interviews with staff at the portfolio companies, with the investor’s own staff, and even with government personnel revealed that there was no hard data on the ATO process. How long did the process usually take for the specific type of software in question? No one could say for sure. Who was in charge? Again, no one could say for sure. Commentators can be forgiven, then, for seeing the process as a black hole that sucks in software programs and deforms the best-laid plans, all without emitting any information back to military and IC leaders about its inner workings.

From a more quantitative perspective, some companies supporting the federal technology market have published survey results on the federal ATO process and its impact on the government’s wider security and technology adoption goals. Security Compass published “The 2021 State of Secure Development & ATO in U.S. Government Agencies,” which includes results from 122 respondents: 52 from federal agencies, with the remainder from state and local agencies. The results were mixed. On the one hand, roughly 76% of the federal respondents said they were satisfied with the ability of their security teams and information system security managers (ISSMs) to stay on top of changing compliance requirements. On the other hand, 72% of federal respondents said it took two months or longer to achieve an ATO, and 38% indicated it took longer than four months. These timelines pose a significant challenge to the adoption of emerging technologies, such as generative artificial intelligence, that are evolving almost daily.

An “Observatory” for Black Holes

Of course, the survey data and experiences described above might not reflect aggregate performance across the U.S. military and IC. It is hard to say without more definitive data. Reformers might even argue that the recent growth of continuous ATO processes and fast-track ATO mechanisms, which are meant to reduce the toil of government software security reviews and the likelihood of procurement delays, is already shrinking the ATO problem. At a surface level, these types of reforms make sense. That said, it would be especially useful to measure their impact on the approval process, particularly given the growing demand to increase the volume and speed of dual-use and open source software adopted by the U.S. government.

To remedy this, it’s time for the military and IC to build an ATO “observatory”: research teams dedicated to understanding the ATO process as it currently exists throughout U.S. national security organizations. At least one of these observatories should sit outside the federal government to allow relatively greater independence while still maintaining access to sensitive information. These observatories should combine empirical software engineering expertise with software security knowledge and a sensitivity to government and military processes and culture. Furthermore, the resulting data and initial findings should be made available to practitioners and researchers to increase industry participation in developing and experimenting with potential solutions. The importance of such an initiative is only likely to grow in the coming months as U.S. national security organizations seek to accelerate their adoption of emerging software technologies.

Ryan Lewis is a Partner at SRI Ventures and a Senior Associate (Non-Resident) at the Center for Strategic and International Studies (CSIS). John Speed Meyers is a principal research scientist at software supply chain security company Chainguard. The views expressed here are those of the authors alone.
