Morgan Dwyer, Brenen Tidwell, and Alec Blivas
Acquisition reform occurs in cycles. For example, to increase acquisition speed, the most recent cycle restructured the Pentagon to
reduce and decentralize the Office of the Secretary of Defense’s (OSD) oversight of major defense acquisition programs (MDAPs).
Using a qualitative and quantitative analysis of past reform cycles and MDAP cycle times (i.e., the time to field new capabilities),
this analysis observes that:
• Even though OSD oversight activities take time, they do not appreciably reduce MDAP acquisition speed;
• Instead, strong, centralized OSD oversight may reduce MDAP cycle times and cycle time growth; and
• The Pentagon has historically fielded new MDAP capabilities at average speeds that are comparable to external benchmarks.
Based on these findings, recent reforms—which reduced and decentralized OSD oversight—may not increase MDAP acquisition
speed. And although the Pentagon does field some MDAPs quite slowly, reformers should not use the experience of worst-case
programs to motivate future reforms of the entire acquisition system.
In defense acquisition, reform is constant. Over the past six decades, reforms have been initiated, implemented, and evaluated, only to be initiated all over again. This pattern—and its repetition throughout history—has led some to describe acquisition reform as a “never-ending cycle” whereby discrete periods of time are characterized by different initiatives.1 Although these initiatives consistently seek to reduce cost, shorten schedules, and increase performance, reformers’ priorities have varied throughout history. Today’s reformers, for example, are focused primarily on acquisition speed (e.g., see the National Defense Authorization Act (NDAA) 2016 Secs. 804, 810, 821, 823, 825 and NDAA 2017 Secs. 805, 806, 807, 901).2 Reformers’ focus on speed is due, in part, to perceptions that U.S. technological advantage vis-à-vis its adversaries is eroding and that the timelines to field new capabilities are dramatically different between the Department of Defense (DOD) and the U.S. private sector.3
To evaluate those perceptions, this brief compares MDAP cycle times to external benchmarks. It also evaluates recent reforms’ potential to increase acquisition speed by comparing cycle time statistics across the various cycles of acquisition reform.
Today’s reforms aim to speed up the acquisition process as defined by DOD Directive 5000.1. The traditional process, depicted in Figure 1, consists of several milestones. At each milestone, senior DOD officials review progress and determine whether programs should continue to the next phase. Traditionally, officials from the OSD have reviewed and approved DOD’s largest programs (i.e., MDAPs).
DOD typically initiates MDAPs at milestone B, after which full-scale system engineering begins. Next, DOD reviews system designs at milestone C. Pending milestone approval, programs begin low-rate production and system testing.
Once test results are satisfactory, DOD certifies that programs have reached initial operating capability (IOC) and that systems are ready for use.
Today’s reforms aim to shorten the time spent between program initiation and IOC. To achieve this objective, reformers created alternative acquisition pathways (e.g., NDAA 2016 Sec. 804’s “middle tier acquisition”) that largely eschew traditional, OSD-led oversight activities.4 Reformers also delegated much of OSD’s authority to conduct MDAP milestone reviews back to the military services.5
Reformers’ focus on oversight is unsurprising. Oversight—which often takes the form of reporting requirements and reviews—can lengthen program schedules by adding activities that take time to complete. For example, the Government Accountability Office found that, in a sample of 24 programs, staff spent an average of two years completing the steps necessary to pass an OSD-led milestone review and 5,600 total staff days documenting that work.6 Relatedly, RAND found that 5 percent of a program office staff’s time was dedicated to regulatory and statutory compliance,7 and researchers at the George Washington University found that between 5 and 40 percent of a contractor’s time was spent complying with oversight requirements.8 By decentralizing and delegating acquisition oversight, today’s reformers hope to reduce the time that programs dedicate to OSD-led oversight activities, thereby shortening the duration between program initiation and IOC. DOD, through its National Defense Strategy, has embraced reformers’ focus on speed and affirmed that it must “deliver performance at the speed of relevance.”9
Importantly, today’s focus on speed is not unique. Rather, recent moves to decentralize OSD oversight follow nearly six decades and multiple cycles of prior acquisition reform. Although the specifics of each reform initiative are distinct and complex, from a macroscopic perspective it is possible to characterize past cycles according to the mechanisms that reformers employed. This brief focuses on one mechanism—
OSD oversight’s centralization or decentralization—which has been the focus of prior research and which uniquely affects MDAPs.10 The brief acknowledges, however, that reformers sometimes employ multiple mechanisms simultaneously and that these mechanisms may interact in non-simple, non-obvious ways. This analysis, therefore, provides just one perspective on acquisition reform cycles and MDAP cycle times.
Acknowledging these limitations, Table 1 identifies eight reform cycles—including today’s—and classifies those cycles according to their preference for centralized or decentralized oversight.11 These cycles are also summarized briefly below:
1. McNamara Reforms: Secretary Robert McNamara leveraged authorities granted by the DOD Reorganization Act of 1958 to centralize OSD control over military service budgets and major program decisions.12
2. Defense Systems Acquisition Reform Council: Deputy Secretary David Packard created the Defense Systems Acquisition Reform Council (DSARC) to limit OSD involvement in the acquisition process. Through the DSARC, OSD assessed programs at discrete milestones but otherwise delegated management responsibility to the military services.13
3. Brown Strengthens Control: In response to Packard’s “management by objective” approach, Secretary Harold Brown sought to regain and centralize OSD authority over the acquisition process.14
4. Acquisition Improvement Program: In response to Brown’s tighter OSD control, Secretary Caspar Weinberger and Deputy Secretary Frank Carlucci initiated the Acquisition Improvement Program to enable the “controlled decentralization” of OSD’s authority.15
5. Defense Acquisition Board: Congress initiated a series of reforms—including the creation of an undersecretary of defense for acquisition—aimed at centralizing and strengthening OSD control over the acquisition process.16 Toward this end, OSD established the Defense Acquisition Board to oversee MDAPs throughout their lifecycle.17
6. Mandate for Change and Transformation: During this extended period—which spanned nearly two administrations—OSD emphasized deregulation and management streamlining but not scrupulous oversight of early program decisions.18 DOD also heavily relied on Total System Performance Responsibility (TSPR) contracts during this period. These contracts delegated a significant amount of authority and responsibility to DOD contractors and in doing so eroded the department’s ability to conduct rigorous oversight.19
7. Weapon Systems Acquisition Reform Act: Responding to cost growth during the prior cycle, Congress implemented a series of reforms aimed at centralizing OSD authority—especially over early program milestones.20 OSD’s Better Buying Power initiative attempted to further strengthen program management throughout the system lifecycle.21
8. Restructuring AT&L: Today’s reformers intend to increase acquisition speed and strengthen DOD’s technological edge by splitting the Office of the Under Secretary of Defense for Acquisition, Technology, and Logistics into two separate offices. To reduce cycle times, the procurement-focused office has delegated much of its oversight authority to the military services.22
These cycles provide a framework for assessing DOD’s historic acquisition speed. Specifically, by classifying programs according to reform cycle or cycle type (i.e., centralized or decentralized oversight), it is possible to observe past reforms’ macroscopic impact on acquisition speed. This analysis can then be used to inform expectations for today’s reforms and to help benchmark DOD’s future “speed of relevance.”
Acquisition speed can be assessed using two variables: cycle time and cycle time growth. Cycle time is the time elapsed between program initiation (typically milestone B, but sometimes milestone C) and IOC.23 Cycle time growth is the percent change between a program’s estimated and actual cycle times.24 Cycle time, therefore, represents the speed with which DOD fields new capabilities; cycle time growth represents the accuracy with which DOD is able to predict that speed.
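The two measures can be made concrete with a short sketch. This is illustrative only: the program dates and estimates below are hypothetical examples, not values drawn from DAMIR or RAND data.

```python
# Illustrative computation of the two acquisition-speed measures
# described above. All dates and estimates are hypothetical.

def cycle_time(initiation_fy: float, ioc_fy: float) -> float:
    """Years elapsed between program initiation (milestone B or C) and IOC."""
    return ioc_fy - initiation_fy

def cycle_time_growth(estimated_years: float, actual_years: float) -> float:
    """Percent change between a program's estimated and actual cycle times."""
    return 100.0 * (actual_years - estimated_years) / estimated_years

# A hypothetical MDAP estimated to reach IOC in 6 years but fielded in 9:
actual = cycle_time(initiation_fy=2005, ioc_fy=2014)        # 9 years
growth = cycle_time_growth(estimated_years=6.0, actual_years=actual)  # 50.0
```

Under this definition, a program that fields exactly on its original estimate has zero cycle time growth, regardless of how long its cycle time actually is.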
Using data from the Defense Acquisition Management Information Retrieval (DAMIR) System and RAND’s Defense Systems Cost Performance Database,25 cycle time and cycle time growth were calculated for all MDAP programs and subprograms for which data was available.26 MDAPs represent DOD’s most costly and complex programs; therefore, in many ways, they are not representative of much of the technology that DOD acquires.
However, MDAP data is readily available. Furthermore, changes to OSD oversight affect MDAPs more significantly than any other programs.
For these reasons, this analysis and its conclusions are limited to MDAPs only.
Additionally, several assumptions were made when collecting and labeling data. Most importantly, it is assumed that MDAPs are most significantly affected by the policies in place at program initiation.27 Therefore, even if MDAPs spanned more than one reform cycle, they are classified according to the cycle in which they were initiated. It is also important to note that MDAP schedule data is not always reliable or of high quality; many other assumptions were therefore required to collect the data, and these assumptions may affect the analysis results. For more detail on the data collection and analysis assumptions used in this brief, please refer to a forthcoming report on this topic, as well as to the detailed endnotes provided at the conclusion of this brief.28
Ultimately, schedule data was collected for over 200 active and complete MDAP programs and subprograms that DOD initiated from fiscal year (FY) 1963 to the present.29 Using this data, it can be observed that despite numerous reform cycles, acquisition speed has remained relatively stable.
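The labeling rule described above (assign each MDAP to the reform cycle in force at initiation, even when the program spans several cycles) can be sketched as follows. The cycle start years here are placeholders chosen for illustration; the actual boundaries come from Table 1, not from this sketch.

```python
import bisect

# Placeholder (start_fy, cycle_name) pairs, sorted by start year.
# These years are illustrative only, not the Table 1 boundaries.
CYCLES = [
    (1961, "McNamara Reforms"),
    (1969, "DSARC"),
    (1977, "Brown Strengthens Control"),
    (1981, "Acquisition Improvement Program"),
    (1986, "Defense Acquisition Board"),
    (1994, "Mandate for Change and Transformation"),
    (2009, "WSARA"),
    (2016, "Restructuring AT&L"),
]
STARTS = [fy for fy, _ in CYCLES]

def classify(initiation_fy: int) -> str:
    """Label an MDAP by the reform cycle in force at program initiation."""
    i = bisect.bisect_right(STARTS, initiation_fy) - 1
    if i < 0:
        raise ValueError("program predates the first reform cycle")
    return CYCLES[i][1]

# A program initiated in FY 1988 may span later cycles, but it is
# labeled by its initiation cycle:
label = classify(1988)  # "Defense Acquisition Board" under these placeholder years
```

Because the label depends only on the initiation year, a long-running program carries a single cycle label through its entire life, which is exactly the simplification the brief's classification assumption makes.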
ASSESSING THE “SPEED OF RELEVANCE”
To assess whether DOD fields systems at the “speed of relevance,” DOD cycle times were compared to external benchmarks from the U.S. private sector and China’s People’s Liberation Army (PLA). The comparisons are limited, however, by the availability and quality of open-source data. The best option, therefore, is to compare the data set of over 200 MDAP cycle times to a handful of benchmark systems with rough schedule estimates.
To estimate non-DOD cycle times, the analysis leverages a DARPA report that contains data on the U.S. private sector and uses open-source reporting on PLA systems. In both instances, it is assumed that the dates reported are consistent with the definitions of program initiation and IOC that were used for MDAPs. For PLA systems in particular, program initiation dates were identified using media reports that stated when the PLA began system development or issued contracts. Such assumptions, of course, limit the ability to draw definitive conclusions. As such, U.S. private-sector and PLA cycle times were used only as rough benchmarks for the “speed of relevance.”
Acknowledging these limitations and using DARPA’s U.S. private-sector data, commercial aircraft cycle times increased from approximately four to seven years since 1965.40 Commercial vehicle cycle times decreased during this time, from approximately seven to two years.41 As shown in Table 5, DOD’s mean aircraft and vehicle cycle times are consistent with the U.S. private sector, but DOD’s worst-case MDAPs significantly exceeded private-sector cycle times. As above, Table 5 contains all complete MDAPs and active MDAPs initiated between FY 1963 and FY 2014 for which data was available.
Based on limited, open-source data on example PLA systems, DOD average cycle times, for the most part, appear to outpace those of comparable PLA systems—even though the PLA frequently accelerates technology development using espionage, intellectual property theft, and foreign military procurement.42 For example, although DOD’s mean cycle time for aircraft is 6.6 years, the PLA appears to have fielded the J-20 and the Y-20 in approximately 15 and 10 years, respectively.43 Compared to the DOD aircraft shown in Table 5, these example PLA cycle times are closer to DOD’s worst-case cycle time for aircraft.
DOD’s mean cycle time for subs and ships—7.5 years—also appears to outpace some open-source PLA examples. For instance, the PLA appears to have fielded both the Type 093 Shang-class submarine and the Type 052A destroyer in approximately 10 years.44 Notably, the PLA appears to have fielded its new aircraft carrier, the Type 001A Shandong (CV17), rather quickly, in approximately five years.45 Compared to DOD capabilities, however, many of these benchmark systems appear inferior by at least some performance metrics.46 Even so, in each example the PLA’s cycle times do appear to outpace DOD’s worst-case cycle times.
THE FUTURE FOR REFORM
This brief demonstrates the utility of using acquisition history to improve the defense community’s understanding of current and future reforms. Using a mix of qualitative and quantitative analysis, this brief observes that reforms which decentralize OSD oversight do not appreciably decrease MDAP cycle time. Instead, it finds that centralized OSD oversight may help reduce cycle times and cycle time growth.
Based on these findings, recent reforms—which instead decentralized OSD oversight—may be ill-suited to achieve their objective of increasing speed, at least for MDAPs.
However, MDAPs are DOD’s most costly and complex programs and do not represent all of the technology that DOD acquires. Acquisition reform itself is complex, and countless factors besides OSD oversight—including workforce, industrial base health, budget, and regulations—all affect acquisition speed in non-simple and non-obvious ways.
This analysis contributes but one perspective on reform cycles and cycle times within an extensive history of acquisition reform.
Morgan Dwyer is a fellow in the International Security Program and deputy director for policy analysis in the Defense-Industrial Initiatives Group at the Center for Strategic and International Studies (CSIS) in Washington, D.C. Brenen Tidwell was a research intern with the Defense-Industrial Initiatives Group at CSIS. Alec Blivas was a program coordinator with the International Security Program at CSIS.
This material is based upon work supported by the Acquisition Research Program under Grant No. HQ00341910011. The views expressed in written materials or publications, and/or made by speakers, moderators, and presenters, do not necessarily reflect the official policies of the Department of Defense nor does mention of trade names, commercial practices, or organizations imply endorsement by the U.S. government.