The July 24 Defense News article, “David Norquist has one word for you: Analytics,” refers to Mr. Norquist’s emphasis on analytics at the Office of the Secretary of Defense as a means for assessing program performance and return on investment. In Mr. Norquist’s written response to policy questions provided in advance by the Senate Armed Services Committee, he stated that “the Department should have an active analytical staff to provide it with relevant data and objective perspectives.” Data analytics drives budget and resource decisions; the quality of those decisions depends upon analytics, and analytics in turn depends upon the availability of data. The Department of Defense does not possess the repositories of data it needs to employ such analytics.

All of the potential of “Big Data” and analytics to achieve transformative improvements in efficiency rests upon having both a plan and a process to collect data in a repository where it is accessible to analysts. Mr. Norquist’s testimony does not acknowledge that the services’ many programs do not collect and archive the relevant data from which performance assessments can be derived through analysis.

While it is seemingly intuitive that the services should collect, analyze and archive data, there is no directive from the secretary requiring the service components to do so. Some do and some don’t; the decision is left to each service rather than directed by the Office of the Secretary of Defense, or OSD.

The DoD Instructions in the 8320 series promulgate policy to the services on sharing data, but they do not identify the data attributes and sources within programs from which data should be collected. If a program exists, it has a budget line and a program element. Every program element should have a directive to collect and archive data.

Nowhere is the failure to collect and analyze performance data more evident than in the domain of force readiness and training and the assessment of trainee performance. Using Marine Corps ground training systems as an example, systems that train technical skills whose performance can be objectively assessed, i.e., “scored,” have been developed, tested and fielded with no requirement to record and archive trainee performance for analysis. Even when data is recorded locally at the point of training, there is no strategy or infrastructure to transport it from the field to the cloud.

Ironically, the Marine Corps does have a program of record for a cloud-hosted repository of training data and individual training performance: the Marine Corps Training Information Management System, or MCTIMS. Inexplicably, MCTIMS mostly hosts data that clerks enter manually, such as rifle qualification and physical fitness test scores. The only automated training data that populates the MCTIMS records of Marines consists of completion certificates for online courses and annual refresher training such as cyber awareness, combating human trafficking, operational security and tobacco cessation courses. The gamut of individual and unit collective training events has no data recorded or analyzed in MCTIMS.

As one example, the Marine Corps’ Indoor Simulated Marksmanship Trainer program, fielded since 1993, has no requirement and no ability to record and upload shooters’ performance data. The National Defense Authorization Act for fiscal 2018 contained the following direction to the Department of Defense:

"Therefore, the committee directs the Secretary of Defense to provide a briefing … that includes, at a minimum: a detailed description of the evaluation metrics that each service plans to use to ensure all new small arms simulation training systems and programs of instructions are tested to demonstrate clear and repeated live fire transfer proficiency and combat readiness prior to system acquisition; an assessment of the live fire performance results of existing simulation and synthetic small arms training systems being utilized; a plan to ensure future systems are capable of data collection that gives commands the ability to maintain and track individual and squad-level training records, provide trend analysis and forecast models to reduce training time and accurately determine live fire transfer readiness, and train to multiple proficiency levels and threat evolutions; and, examples of current simulation and synthetic small arms training systems that are documenting cost savings in ammunition, travel, training time, and expedited and improved qualification and remediation rates."

Almost two years have passed since the FY18 NDAA was signed into law, and neither OSD nor the Marine Corps has enacted the policy necessary to ensure that Congress is provided the “plan to ensure future systems are capable of data collection.”

In practical terms, data analytics is more of a buzzword than an implemented process in many areas of the DoD. The examples cited above come specifically from the training domain, but a data collection strategy is lacking in many other domains. It’s time for OSD to publish an instruction mandating the collection and archiving of performance data for all manner of programs, with particular attention to data relating to the key performance parameters of acquisition programs.

Walt Yates graduated from Texas A&M University in 1990 with a Bachelor of Science in mechanical engineering technology and was commissioned a second lieutenant in the U.S. Marine Corps. Upon graduation from The Basic School, he was assigned to the field artillery military occupational specialty and served ashore and deployed afloat in billets including platoon commander, fire direction officer and battery commander, along with an assignment on recruiting duty. He retired as a colonel; his last assignment was as the Corps’ program manager for training systems in Orlando, Florida.
