Bridging the “Valley of Death”

Defense Acquisition
December 29, 2016


By Anthony Davis & Tom Ballenger

San Francisco, CA. Photo by U.S. Army Cpl. Timothy Yao.

The “valley of death” between technology development efforts and production programs has long been a problem in both government and private industry. Despite the U.S. Special Operations Command’s (SOCOM) reputation for agile development and rapid acquisition, the command has been no exception. This article describes a new methodology that captures the discrete actions needed to prepare for a technology transition and measures organizational confidence in the success of that transition. Initial indications are that this process significantly increases the likelihood of successful technology transition, and that the associated metrics and methodology could be quickly and easily adopted by other acquisition organizations to help them bridge their own “valleys of death” and avoid failed or suboptimal transitions.

In 2014, the command’s Special Operations Forces Acquisition, Technology, and Logistics organization (SOF AT&L) began addressing its transition shortcomings by moving an experienced, proven program executive officer (PEO) to direct the Science and Technology (S&T) organization. The PEO previously had been quite vocal regarding the command’s lack of success in regularly transitioning technologies to a program of record. After roughly a year in the position, the director had made numerous changes to increase the likelihood of successful transitions. Despite those efforts, the director still had no real way to measure or predict the probability of transition success, either for individual projects or across the portfolio. A team was chartered to identify appropriate leading and lagging metrics and began work on the problem.

During the research process, the team identified a separate but related issue. While the S&T project managers clearly understood that transition of their technology was a desired outcome, there was little common ground between that goal and the mandate of the PEOs’ program managers, who were driven by the cost, schedule, and performance of their existing programmatic acquisition strategies.

So, the final challenge to the team was to (1) develop a series of metrics to measure the transition success of each S&T project, (2) ensure those metrics could be aggregated to the portfolio level, and (3) incorporate a mechanism that ensured S&T project managers and PEO program managers would have a common understanding of the mechanisms and motivations for transition.

The search for appropriate tools began with some known constraints. Ideally, a transition support metric would be easy to implement and would actually decrease the portfolio management workload. It must fit within funding realities and the existing data infrastructure. It must reflect the important balance between innovation opportunities and operational outcomes. To minimize cultural resistance to adoption, it must avoid external benchmarking as a measure of success. Most importantly, it must support the SOCOM SOF AT&L customer.

An MQ-9 Reaper Extended Range sits in a hangar Nov. 17, 2015, at Creech Air Force Base, NV. The Reaper is one of several remote-piloted aircraft used by SOCOM. Photo by U.S. Air Force Airman 1st Class Christian Clausen.

Open-source research revealed a common theme across government and commercial development. While the ingredients and pathways of technological progress are well understood, there are few best-practice or standard mechanisms to measure and manage technology transition efforts. In some cases, projects were initiated or even completed before transition potential was determined. In other cases, project initiation required approval from an external oversight council to ensure alignment with the program enterprise. Neither of these extreme approaches is appropriate for SOCOM S&T implementation. The search continued for a solution between these extremes.

The Government Accountability Office (GAO) has studied this issue for more than 40 years. In multiple reports dating back to 1974, GAO has called for better transition metrics and more active management of transition efforts. In recent years, they highlighted the success of transition commitment metrics used by the Joint Capability Technology Demonstration and Future Naval Capabilities programs. These scales scored each project by whether a transition agreement was complete, in progress or absent. Implementation of standardized transition assessment was a step in the right direction.

The innovation environment at SOCOM AT&L encourages risk taking in S&T. Signed transition agreements represent a very high standard for projects. Special Operations PEOs seek to retain their programs’ agility and will not readily commit to unproven solutions. A transition commitment metric tailored for use in SOCOM S&T needs to recognize more incremental precursor steps. The Technology Readiness Level (TRL) scale fills a similar role in the realm of technology risk. GAO recommended DoD-wide adoption of TRL in 1999 following successful use by NASA and the U.S. Air Force. It is well-understood, universally accepted, and applicable across a wide variety of technologies. It is as useful as it is simple. We set out to establish a similar tool for transition management.

The simplicity and applicability of TRL became the tailoring benchmark for a new transition commitment metric. The team first replaced the term “commitment” with “confidence” to better reflect a dynamic continuum rather than a binary condition. The new Transition Confidence Level (TCL) scale has the same numerical range and objective, accomplishment-based approach as the TRL scale. The 1–9 scaling was adopted as a matter of convenience but later proved to support some compelling data visualization relative to TRL. The steps follow a logical arc from uncertainty to a completed transition, as shown in Table 1.

Like the TRL chart, the steps enable status scoring for a project, and they form a roadmap for the progress and coordination typically needed for transition success. In that sense, the TCL chart is both a scorecard and a checklist. The defining characteristics of each level are tailorable to organizational behaviors or changing dynamics between technology developers and PEO leaders. The chart retains its usefulness as long as it represents the organization’s desired steps between project initiation inputs and completed transitions. The current iteration allows a project to proceed to TCL 4 based only on internal S&T Directorate activities. These precursor steps provide a progress report on the S&T team’s transition planning during initial project incubation. Advancement to TCL 5 and beyond requires explicit cooperation and increasing coordination with a program office. A project at TCL 7 or 8 merits senior leader attention to ensure high-level coordination for funding, contract actions, and organizational handover. We expect the contents of the chart to evolve to meet emerging process changes and support maturing relationships with transition stakeholders.
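For organizations that want to encode the chart directly in their own tooling, a minimal Python sketch follows. The level descriptions are illustrative placeholders rather than the official Table 1 wording; only the thresholds noted above (internal-only work through TCL 4, program office coordination from TCL 5, senior leader attention at TCL 7 and 8, completed transition at TCL 9) are taken from this article.

```python
# Illustrative TCL lookup table. The level descriptions are placeholders,
# not the official Table 1 wording; only the thresholds (internal-only
# through 4, program office coordination from 5, senior leader attention
# at 7 and 8, completed transition at 9) follow the text above.
TCL_CHART = {
    1: "Transition path unidentified; project relevance under assessment",
    2: "Candidate transition targets identified by the S&T team",
    3: "Internal transition plan drafted within the S&T Directorate",
    4: "Internal transition planning complete (last internal-only step)",
    5: "Program office engaged; initial coordination under way",
    6: "Program office actively coordinating requirements and schedule",
    7: "Senior leader attention: funding and contract actions aligned",
    8: "Senior leader attention: organizational handover in work",
    9: "Transition complete; technology owned by the program of record",
}

def requires_program_office(tcl: int) -> bool:
    """Levels 5 and above require explicit program office coordination."""
    return tcl >= 5

def merits_senior_leader_attention(tcl: int) -> bool:
    """Levels 7 and 8 merit senior leader attention per the chart."""
    return tcl in (7, 8)
```

Keeping the definitions in one tailorable table, as here, is what lets the configuration manager evolve the chart without retraining scorers.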

Implementation of the TCL metric included workforce training, project assessments, TCL chart configuration management, and incorporation of TCL data entry into the Directorate’s knowledge management portal. Workforce training was not difficult. Each technologist and project manager was already familiar with transition planning, command expectations, and the use of similar tools such as the TRL scale. Introducing TCL simply assigned a number and standardized a reporting framework for a process the workforce was already executing. Project assessments were straightforward. The technology transition lead for the Directorate became the configuration manager for the TCL chart and would control its contents and evolution. The knowledge management portal modification was completed via established change request procedures. Of note, the data entry method for the portal did not include the TCL definitions, only the number. This decoupled configuration management of the TCL scale from the portal modification process. Once each project had a TCL value and action officers could keep that value updated in the portal, management metrics could be extracted to inform portfolio decisions across diverse efforts and projects.
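To illustrate why storing only the number keeps the portal decoupled from the chart’s configuration management, a hypothetical project record might carry just the TCL integer and leave the definitions to the chart’s configuration manager. The field names below are notional, not the portal’s actual schema.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class ProjectRecord:
    # Notional portal record: only the TCL number is stored, so the chart's
    # definitions can evolve without touching the portal schema.
    name: str
    trl: int  # Technology Readiness Level, 1-9
    tcl: int  # Transition Confidence Level, 1-9

def portfolio_averages(projects: list[ProjectRecord]) -> tuple[float, float]:
    """Average TRL and TCL across the portfolio for management reporting."""
    return mean(p.trl for p in projects), mean(p.tcl for p in projects)
```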

The implemented TCL metric enables consistent, uniform discussions of transition likelihood across different types of technologies. The steps capture the organization’s pathway for S&T and program coordination, encouraging both sides of the “valley of death” to lean toward each other to close the gap. Especially for those steps requiring accord between S&T leaders and program managers, it provides a dispassionate, objective framework for discussions and organizational progress. It makes project relevance and transition outcomes a part of every project discussion while contributing to portfolio transparency. The ability to adapt the characteristics of each level ensures relevance as organizational relationships and needs change. Finally, TCL can quickly cue leaders in both the S&T and program spheres to imbalances in the portfolio. The ability to quickly identify outliers allows leaders to allocate their time and attention where they are needed most.

At the individual project level, TCL quantifies a project’s transition status. At the portfolio level, it provides an organizational health indicator that can cue leader decisions. While individual project officers strive for the highest TCL possible for their projects, a very high average TCL for the entire portfolio may indicate inappropriate risk avoidance. If every project will transition, the valiant failures of a dynamic research organization are missing. Conversely, a very low average TCL may indicate a lack of relevance to supported programs. In the case of SOCOM S&T, the target average TCL is intended to hover between 4 and 7. It will probably reflect some seasonality under fiscal rules, as cohorts of new projects drive down portfolio TCL upon initiation. As projects mature, the average TCL will increase until it is driven down by a new class of projects funded with the following year’s appropriation. Likewise, once projects complete their transition and leave the portfolio, their high TCL scores drop out of the average and are replaced by those of new, lower-TCL projects. While not directly coupled, the average TRL of the portfolio will follow similar ebbs and flows. An example visualization of average TRL and TCL is shown in Figure 1.
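A minimal sketch of the resulting portfolio health check follows, using the 4-to-7 target band described above. Grouping by fiscal-year cohort to expose seasonality is an assumption about how a portal export might be organized, not a prescribed report.

```python
from collections import defaultdict
from statistics import mean

def average_tcl_by_cohort(projects):
    """Average TCL per fiscal-year cohort, to expose the seasonal dip that
    each new appropriation's projects introduce. `projects` is an iterable
    of (fiscal_year, tcl) pairs -- a notional export format."""
    cohorts = defaultdict(list)
    for fiscal_year, tcl in projects:
        cohorts[fiscal_year].append(tcl)
    return {fy: mean(tcls) for fy, tcls in sorted(cohorts.items())}

def portfolio_in_band(tcls, low=4, high=7):
    """Check whether the portfolio average sits in the 4-7 target band:
    too high suggests risk avoidance, too low suggests lack of relevance."""
    avg = mean(tcls)
    return low <= avg <= high, avg
```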

The ability to measure transition confidence in a scale calibrated to technology readiness enables some helpful visualization. The hypothetical S&T portfolio in Table 2 includes data for current TRL, current TCL, and budget. A quick graphic presents a powerful visual tool, shown in Figure 2. Money and time will tend to move projects to the right. Project relevance and program office coordination will tend to move projects toward the top. Relative budget size is an indicator of command priority and risk tolerance. Taken together, these metrics reveal that expensive projects in the bottom right of the chart might be consuming resources best spent on projects at the top left of the chart. No specific behavior rules are needed. The chart is a decision-support tool that graphically presents key data for numerous projects to enable leaders to make more informed decisions no matter the trade space.
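For readers who want to reproduce a Figure 2 style view from their own data, a rough matplotlib sketch follows. The project records, axis limits, and budget-to-marker-size scaling are notional choices, not part of the published figure.

```python
import matplotlib.pyplot as plt

# Notional portfolio data: (name, current TRL, current TCL, budget in $M).
projects = [
    ("Project A", 3, 6, 2.0),
    ("Project B", 7, 2, 8.5),   # mature technology, weak transition pull
    ("Project C", 5, 5, 4.0),
    ("Project D", 8, 8, 6.0),   # on glideslope for transition
]

names, trls, tcls, budgets = zip(*projects)

fig, ax = plt.subplots()
# Marker area scaled by budget so relative priority and risk tolerance are visible.
ax.scatter(trls, tcls, s=[b * 100 for b in budgets], alpha=0.5)
for name, trl, tcl in zip(names, trls, tcls):
    ax.annotate(name, (trl, tcl), textcoords="offset points", xytext=(5, 5))

ax.set_xlim(0.5, 9.5)
ax.set_ylim(0.5, 9.5)
ax.set_xlabel("Technology Readiness Level (TRL)")
ax.set_ylabel("Transition Confidence Level (TCL)")
ax.set_title("Portfolio view: money and time push right, coordination pushes up")
plt.show()
```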

Because TCL does not invoke any external standards, S&T organizations are only making internal comparisons. This alleviates concerns about different missions, stakeholders and desired outcomes amongst the many diverse development organizations. Leaders can set their own internal goals and manage against them.

TCL can also contribute to project storyboards for both current status and archiving. When combined with TRL and financial execution data and goals over time, TCL supports a powerful visualization: a single timeline of obligations, expenditures, TRL, and TCL. An example is shown in Figure 3. Using averages for TRL and TCL, the storyboard can cover multiple projects within a function or the entire portfolio, allowing performance comparisons between divisions or from year to year.
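A sketch of a Figure 3 style storyboard follows, plotting dollars and the two 1-to-9 scales against a shared timeline on twin axes. The quarterly values are notional and serve only to show the layout.

```python
import matplotlib.pyplot as plt

# Notional project storyboard data by quarter (illustration only).
quarters     = ["FY1Q1", "FY1Q2", "FY1Q3", "FY1Q4", "FY2Q1", "FY2Q2"]
obligations  = [0.5, 1.2, 1.8, 2.4, 2.9, 3.0]   # cumulative $M
expenditures = [0.1, 0.6, 1.1, 1.9, 2.5, 2.9]   # cumulative $M
trl          = [3, 3, 4, 5, 6, 6]
tcl          = [1, 2, 3, 4, 5, 7]

fig, money_ax = plt.subplots()
level_ax = money_ax.twinx()  # second y-axis for the 1-9 scales

money_ax.plot(quarters, obligations, marker="o", label="Obligations ($M)")
money_ax.plot(quarters, expenditures, marker="o", label="Expenditures ($M)")
level_ax.plot(quarters, trl, marker="s", linestyle="--", label="TRL")
level_ax.plot(quarters, tcl, marker="^", linestyle="--", label="TCL")

money_ax.set_xlabel("Quarter")
money_ax.set_ylabel("Cumulative dollars ($M)")
level_ax.set_ylabel("TRL / TCL (1-9)")
level_ax.set_ylim(0, 9)
fig.legend(loc="upper left")
fig.suptitle("Project storyboard: execution, TRL, and TCL on one timeline")
plt.show()
```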

SOCOM S&T has implemented TCL and requires its project managers to track and report the measure, along with TRL, for each of their projects on a recurring basis. The lack of subjectivity in the scale makes it easy to score projects, monitor progress over time, and quickly assess average TCL for the entire portfolio or subordinate areas. TCL quickly identifies the outliers, allowing leadership to concentrate on candidates for more direct senior coordination, candidates for divestment, and candidates requiring additional funding versus projects “on glideslope” for transition. The data and visualizations can be used explicitly for a management-by-exception approach or as a tailorable decision-support tool for portfolio management.
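As one way such a management-by-exception screen might be automated, the sketch below flags the three categories of outliers named above. The specific thresholds are assumptions chosen for illustration, not command policy.

```python
def flag_exceptions(projects, gap=3):
    """Screen a portfolio for outliers needing leadership attention.
    `projects` is an iterable of (name, trl, tcl) tuples; the `gap`
    threshold is an assumed tuning knob, not an official rule."""
    flags = {}
    for name, trl, tcl in projects:
        if tcl >= 7:
            flags[name] = "near transition: candidate for senior coordination"
        elif trl - tcl >= gap:
            flags[name] = "mature but lacking pull: candidate for divestment"
        elif tcl - trl >= gap:
            flags[name] = "strong pull, immature: candidate for added funding"
    return flags
```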

The adoption of TCL has provided a wealth of insight into the progress of the S&T portfolio toward transition with a minimum of additional data entry. Additionally, the presence of this data on SOF AT&L’s real-time dashboard provides complete transparency and understanding among the project manager, S&T director, program manager, and PEO. The command believes the tool has immediate potential application to numerous S&T organizations and portfolios and is easily adaptable to fit each organization’s particular needs.

SOCOM S&T plans to continue use of TCL and TRL as complementary measures of project performance, and will continue maturing visualization tools to support informed leadership decision making. The command welcomes any inputs or ideas for how to improve the metrics or visualizations, and is interested in discussing those ideas further.

Davis is the U.S. Special Operations Command director of agile acquisition. He previously was the program executive officer for command, control, communications, and computer systems and director of science and technology. Ballenger is an aviation systems analyst with JHNA, Inc. A retired U.S. Army officer, he provides contract science and technology support to USSOCOM.

The authors can be contacted at anthony.davis@socom.mil and tom.ballenger@jhna.com.

This article was originally published at dau.dodlive.mil and first appeared on December 29, 2016.
