The Pentagon’s top tester has offered a mixed review of the U.S. Army’s Network Integration Evaluation (NIE) program, as well as of one of the cornerstone technologies that the service has billed as the backbone of its future tactical communications network.
The report, written by J. Michael Gilmore, director of Operational Test & Evaluation (DOT&E), was released Jan. 15. Gilmore wrote that the Army’s “execution of the NIEs has shown steady improvement” since the first event in the spring of 2011, lauding the service for incorporating lessons learned from previous events and making the event more expansive, and more challenging, with each iteration. The latest NIE in November 2012, Gilmore wrote, “was the best planned and executed NIE of the three conducted to date.”
Despite that, Gilmore’s office concluded that the Army not only attempts to evaluate too many new technologies at each NIE, but also pursues a schedule-driven agenda to meet its NIE objectives rather than an “event-driven schedule appropriate to acquisition system development.”
DOT&E added that “too many immature systems in the NIE challenges the Army’s capacity to employ appropriate instrumentation, collect relevant data, and conduct full and adequate assessments, detracting from the Army’s capability to perform focused evaluations.”
One of the major selling points of the NIE has been that it involves putting an entire brigade combat team in an operational setting for weeks at a time, pitting the force against a thinking, independent enemy that maneuvers and reacts to the team’s moves.
Gilmore’s office found that despite this unique and rigorous evaluation environment, too many field service representatives (FSRs) from industry attend each event to advise units and to repair and replace broken equipment. As a result, the evaluation does not reflect what a unit will experience once the equipment is fielded in a combat environment.
“The density of contract FSRs and spare radios far exceeded what is likely to be the case within fielded brigades,” Gilmore wrote, adding that “easy access and over reliance on FSR support resulted in the test unit not having to realistically execute its field-level maintenance actions. Failure to accurately replicate ‘real world’ maintenance and logistics support causes operational availability rates and ease of maintenance to be overestimated in NIEs.”
The Army acknowledges that the NIE remains a work in progress, service leadership said at a Jan. 9 Industry Day event, but it is committed to working more closely with industry by planning the semi-annual event up to two years in advance, the better to align its modernization schedule with the defense industry’s research and development cycle.
As for some of the specific technologies evaluated at the NIE, Gilmore’s shop continued to be highly critical of the Joint Tactical Radio System (JTRS) Handheld, Manpack, and Small Form Fit (HMS) Manpack Radio, made by General Dynamics.
When asked for comment, General Dynamics declined to address the criticisms in the report. In a written statement, the company called the radio a “first of its kind” and “the only radio that effectively connects lower-tier, dismounted soldiers into the ‘big’ Army network.”
Although Gilmore’s office found the two-channel HMS Manpack to be “not operationally effective” after the conclusion of NIE 12.2 in May 2012, the Army awarded General Dynamics C4 Systems a $306 million production order for 3,726 HMS radios on Nov. 30 — the first step toward a total acquisition objective of 71,814 radios — after subsequent assessments met the service’s standards.
While General Dynamics and the Army insist that the necessary fixes to the program have been made, DOT&E complains that the two “did not address the previous recommendations to perform adequate developmental testing prior to operational testing and to complete necessary documentation to support developmental and operational testing.”
The report also faults the program for focusing on its projected fielding schedule at the expense of developmental testing. Operational testing, however, “continues to reveal problems that developmental testing should have identified and fixed,” Gilmore concluded.