The Soldier's Network - U.S. Army: Testing The Waters

For as long as there has been a professional Army, there has been testing to make sure Soldiers get weapon systems that work. We are experts at testing tanks and trucks, howitzers and helmets, antennas and ammunition. But what about an Android app, downloaded in the field to a ruggedized smartphone or tablet? Or a commercially developed radio that adds range and drops weight every other year?

These are the questions the Army test and acquisition communities are tackling today to align our testing approaches to the innovative and adaptable systems that will support Force 2025. Equipping the future force to prevail over emerging challenges will require agility in how we deliver new capabilities, especially in the rapidly progressing realm of network, mission command and cyber. Smart, agile testing is a critical ingredient for these programs’ success, and the U.S. Army Test and Evaluation Command (ATEC) is partnering with program executive offices (PEOs) to establish constructs that maintain testing rigor and independent evaluation while responding to the realities of new technology.

Striking this balance will help reduce testing time and cost to the Army and enable us to maintain overmatch against our adversaries. But while these efforts are informed by years of testing knowledge, recent process improvements and lessons learned from the Network Integration Evaluation (NIE) and other events, some technologies push the boundaries of previous experience. With limited military precedent for acquiring and testing these types of capabilities, ATEC and PEO Command, Control and Communications – Tactical (C3T) are combining models from the commercial world, our own best practices and select test cases to lay the groundwork for smart testing for the future.

 

TACTICAL APPS


When you download a new app to your smartphone, you don’t run tests on the phone each time—you trust that the app has been validated and will add new capability to an already functioning system. To provide a similar experience for Soldiers, the Army is establishing a battle rhythm and the supporting technology framework that will allow us to build and deliver tactical apps for various mission functions as needs arise. With the advance validation of the hardware devices and software frameworks hosting the apps, government and industry developers can leverage approved software development kits and style guides that exist or are being implemented for various computing environments.

This approach allows a variety of organizations to create and sponsor apps, while making sure that they are building to the appropriate standard—much like Apple’s iOS, Google’s Android or another operating system would serve as the “broker” to ensure that a new feature does no harm to the existing product.
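To make the “broker” idea concrete, here is a minimal sketch, in Java, of the kind of pre-validation check a hosting framework could run on a submitted app: confirming that it was built against an already validated software development kit and that it requests only services the framework exposes. The SDK versions, permission names and class names are illustrative assumptions, not an actual Army toolchain or published interface.

```java
import java.util.List;
import java.util.Set;

/**
 * Minimal sketch of a broker-style pre-validation check for a tactical app
 * submission. All names, versions and permission lists are hypothetical;
 * a real framework would apply its own approved SDKs and style-guide rules.
 */
public class AppSubmissionCheck {

    // Hypothetical SDK versions already validated for the hosting environment.
    private static final Set<String> APPROVED_SDK_VERSIONS = Set.of("1.2", "1.3");

    // Hypothetical whitelist of services an app may request from the host framework.
    private static final Set<String> APPROVED_PERMISSIONS =
            Set.of("READ_POSITION", "PUBLISH_OVERLAY", "READ_UNIT_ROSTER");

    /** Returns true only if the submission builds against an approved SDK
     *  and requests nothing outside the approved permission set. */
    public static boolean passesPreValidation(String sdkVersion, List<String> requestedPermissions) {
        if (!APPROVED_SDK_VERSIONS.contains(sdkVersion)) {
            return false; // unapproved SDK: developer must rebuild against a validated kit
        }
        return APPROVED_PERMISSIONS.containsAll(requestedPermissions);
    }

    public static void main(String[] args) {
        // An app built on SDK 1.3 that only needs position data passes;
        // one asking for an unapproved service does not.
        System.out.println(passesPreValidation("1.3", List.of("READ_POSITION")));    // true
        System.out.println(passesPreValidation("1.3", List.of("RAW_RADIO_ACCESS"))); // false
    }
}
```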

The focus then becomes tailoring the level of testing to the nature of the app itself, whether it is a complex integrated-fires app or radio-configuration app, or a simpler sunrise-sunset app, calculator or training vignette. Some apps are essentially stand-alone capabilities, such as a calculator, while others need to interface with other data, such as pulling GPS location information and other services from the Joint Battle Command – Platform (JBC-P) system. This wide variety means that the program manager or other app sponsor needs to work closely with the testing counterpart early in the development process to determine the risk-reward trade-off and the best test approach.
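The short Java sketch below illustrates why that distinction matters for test planning: a stand-alone calculation carries little integration risk, while an app that consumes host-provided data, such as position information, adds interoperability concerns. The PositionService interface is a hypothetical stand-in for whatever services the JBC-P hosting environment actually exposes; it is not a real API.

```java
/**
 * Sketch contrasting a stand-alone app with one that depends on host services.
 * PositionService is a hypothetical placeholder used only to show why the two
 * app types carry different integration and test risk.
 */
public class AppTypesSketch {

    /** Hypothetical host-provided service an interfacing app would consume. */
    interface PositionService {
        double latitude();
        double longitude();
    }

    /** Stand-alone capability: no external data, minimal integration risk. */
    static double sunriseOffsetHours(int dayOfYear) {
        // Toy calculation only; a real sunrise app would use a proper almanac model.
        return 6.0 + 1.5 * Math.sin(2 * Math.PI * (dayOfYear - 80) / 365.0);
    }

    /** Interfacing capability: pulls position from the host, so it needs an
     *  interoperability assessment in addition to functional testing. */
    static String reportPosition(PositionService service) {
        return String.format("Current position: %.4f, %.4f", service.latitude(), service.longitude());
    }

    public static void main(String[] args) {
        System.out.println(sunriseOffsetHours(172));
        PositionService stub = new PositionService() { // stubbed host service for the example
            public double latitude()  { return 31.8072; }
            public double longitude() { return -106.4245; }
        };
        System.out.println(reportPosition(stub));
    }
}
```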

We will also take into account the app’s life expectancy. An app showing Soldiers how to identify symptoms of a specific disease outbreak, for example, would probably have a shorter shelf life than an app for fires coordination. The goal is a cooperative assessment between the materiel developer and the testing experts of what’s required for each capability, so that the Army can deliver technologies more quickly and still provide confidence that they’re going to perform as needed for the Soldier, without overburdening the testing process.

All apps will be vetted in the developmental testing and laboratory environment for functionality and security. Those that interface with other systems will also receive an interoperability assessment before fielding. For the more involved apps that also require an operational test, our goal is to establish a cycle wherein apps can be tested in batches a few times a year, allowing sponsors and testers to pool resources and reduce costs.
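As a simplified sketch of that tiered flow, the Java example below assigns test phases based on whether an app interfaces with other systems and whether it is operationally complex. The phase names and decision inputs are assumptions for illustration; in practice, the determination is made cooperatively by the app sponsor and the test community.

```java
import java.util.ArrayList;
import java.util.List;

/**
 * Minimal sketch of the tiered test flow described above. The tiers and the
 * decision inputs are simplified assumptions, not an official test policy.
 */
public class TestTierPlanner {

    enum TestPhase {
        DEVELOPMENTAL_FUNCTIONAL_AND_SECURITY,
        INTEROPERABILITY_ASSESSMENT,
        BATCHED_OPERATIONAL_TEST
    }

    static List<TestPhase> planPhases(boolean interfacesWithOtherSystems, boolean operationallyComplex) {
        List<TestPhase> phases = new ArrayList<>();
        phases.add(TestPhase.DEVELOPMENTAL_FUNCTIONAL_AND_SECURITY); // every app is vetted here
        if (interfacesWithOtherSystems) {
            phases.add(TestPhase.INTEROPERABILITY_ASSESSMENT);       // only data-sharing apps
        }
        if (operationallyComplex) {
            phases.add(TestPhase.BATCHED_OPERATIONAL_TEST);          // pooled a few times a year
        }
        return phases;
    }

    public static void main(String[] args) {
        System.out.println(planPhases(false, false)); // e.g., a calculator
        System.out.println(planPhases(true, true));   // e.g., an integrated-fires app
    }
}
```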

These tests will vary in location and complexity depending on the capabilities involved. The Army will structure the scenarios and survey questions to obtain data and user assessments focused on the new apps, so that we can determine whether the apps meet requirements and whether Soldiers obtained the desired results. This battle rhythm, planned for implementation in the next few years, will allow us to develop, test and deliver tactical apps within a year, and sometimes faster when needed for a rapid response. That may still sound like an eternity compared with the commercial world, but it will allow us to maintain safety and security for Soldiers while offering a vast improvement over today’s four- to five-year cycle for mission command capability.

An early example of the new model is the On Demand Information Networking (ODIN) application developed for the Mounted Android Computing Environment (MACE), a standard framework hosted on JBC-P that enables organizations to build tactical apps for the well-known Android environment. The familiarity of Android makes the apps easier for Soldiers to use and for developers to build. Through MACE, the apps are developed once and are capable of running on multiple hardware platforms at multiple echelons and across multiple networks. In 2014, this framework enabled PEO C3T’s project manager for tactical radios (PM TR) to quickly prototype ODIN, which is designed to allow Soldiers to dynamically connect radio networks over the air, reducing to three clicks and three minutes a process that now requires several days and even weeks to plan and execute.

Unlike a stand-alone app, ODIN does not just ride on a computer and interface with an operator—it needs to share and display information across the network so that users can view and join available radio nets. As PM TR personnel worked to create ODIN, they not only applied the MACE standards, but also interacted regularly with PM Mission Command within PEO C3T to determine what information the app would need to pull from and push across the Blue Force Tracking network; how it would function in bandwidth-disadvantaged environments; and how it would connect with other aspects of the JBC-P system.
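To illustrate the bandwidth consideration, the following Java sketch shows a compact, fixed-length “radio net advertisement” record of the kind an app like ODIN would need to push across a constrained network so that other users can see available nets. The field layout, sizes and class name are hypothetical assumptions, not the actual ODIN or Blue Force Tracking message format.

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

/**
 * Hypothetical sketch of a compact "radio net advertisement" record suited to a
 * bandwidth-disadvantaged network. Field layout and sizes are illustrative only.
 */
public class NetAdvertisement {
    private final short netId;      // 2 bytes: identifier of the advertised radio net
    private final byte channel;     // 1 byte: channel/preset index
    private final String shortName; // up to 16 ASCII bytes, truncated to keep the record small

    public NetAdvertisement(short netId, byte channel, String shortName) {
        this.netId = netId;
        this.channel = channel;
        this.shortName = shortName.length() > 16 ? shortName.substring(0, 16) : shortName;
    }

    /** Serializes to a fixed 19-byte record to minimize load on the network. */
    public byte[] toBytes() {
        ByteBuffer buf = ByteBuffer.allocate(19);
        buf.putShort(netId);
        buf.put(channel);
        byte[] name = shortName.getBytes(StandardCharsets.US_ASCII);
        buf.put(name);
        // Pad the remaining name bytes with zeros so every record is the same length.
        for (int i = name.length; i < 16; i++) {
            buf.put((byte) 0);
        }
        return buf.array();
    }

    public static void main(String[] args) {
        byte[] record = new NetAdvertisement((short) 42, (byte) 3, "CO A CMD NET").toBytes();
        System.out.println("Record length in bytes: " + record.length); // 19
    }
}
```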

This team effort extended into risk reduction in the integrated laboratories at Aberdeen Proving Ground, MD, followed by an operational assessment with ATEC and Soldiers at NIE 15.1, which began in October 2014 at Fort Bliss, TX, and White Sands Missile Range, NM. In all, ODIN took less than a year from idea to test. As the Army moves forward to formally evaluate the host MACE framework itself over the next year, we are documenting lessons learned from the ODIN experience to help shape future agile app development and evaluation for the MACE infrastructure.

 

RADIO MARKETPLACE


A Soldier uses a MUOS-equipped Manpack Radio to communicate at the Army’s Network Integration Evaluation 13.1. (Photo courtesy of the U.S. Army)

Another area of technology that demands a new testing approach is the emergence of software-defined tactical radios, which use high-bandwidth waveforms to transmit and receive voice, data, video and other information across great distances and beyond line of sight. Approved by the Office of the Secretary of Defense, the Army’s acquisition strategy for the Handheld, Manpack, Small Form Fit (HMS) Rifleman and Manpack radios does not follow the traditional structure of development, test, achieving full-rate production and then buying large quantities of a system from a single vendor. Instead, the strategy calls for a “radio marketplace” in which multiple qualified vendors will compete for smaller-quantity delivery orders on a regular basis, driving innovation through competition.

The purchase of nondevelopmental-item commercial radios for military operations is possible because the radios will use secure, standard waveforms that are owned by the government, certified by the Joint Tactical Networking Center and made available to run on multiple hardware models that industry produces. But while comparisons to smartphones exist in the commercial communications market—with most consumers upgrading their cellphones every few years as both phone and networking technology evolve—the radio marketplace concept is new territory for the Army.

To support the strategy, the Army has developed an iterative process of qualification testing and operational testing that will allow for maximum vendor participation and flexibility for technology to evolve in areas such as weight, range, processing power and battery life. Each vendor who wins a Rifleman or Manpack radio contract will first provide a limited number of radios to participate in a qualification test, involving straightforward lab and field assessments in a controlled environment. Products that meet specifications will proceed to an operational test featuring Soldier scenarios in a more dynamic environment. The operational test results will inform the Army’s selection of vendors, who will then compete for delivery orders for each radio.
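A minimal Java sketch of that qualification gate is below: a candidate radio advances to operational test only if its measured results meet every specification threshold. The metrics and threshold values are hypothetical placeholders; actual thresholds come from the radio requirements documents, and the follow-on operational test is scenario-based rather than a simple pass/fail check.

```java
/**
 * Sketch of a qualification-test gate for marketplace radios. The metrics and
 * thresholds are hypothetical placeholders, not program specifications.
 */
public class QualificationGate {

    static class RadioResult {
        final String vendor;
        final double weightKg;     // measured weight of the radio with battery
        final double rangeKm;      // demonstrated range in the field assessment
        final double batteryHours; // continuous operating time on one battery

        RadioResult(String vendor, double weightKg, double rangeKm, double batteryHours) {
            this.vendor = vendor;
            this.weightKg = weightKg;
            this.rangeKm = rangeKm;
            this.batteryHours = batteryHours;
        }
    }

    // Hypothetical pass/fail thresholds for a Manpack-class radio.
    static final double MAX_WEIGHT_KG = 7.0;
    static final double MIN_RANGE_KM = 10.0;
    static final double MIN_BATTERY_HOURS = 8.0;

    /** Returns true if the candidate meets every threshold and may proceed to operational test. */
    static boolean proceedsToOperationalTest(RadioResult r) {
        return r.weightKg <= MAX_WEIGHT_KG
                && r.rangeKm >= MIN_RANGE_KM
                && r.batteryHours >= MIN_BATTERY_HOURS;
    }

    public static void main(String[] args) {
        RadioResult candidate = new RadioResult("Vendor A", 6.2, 12.5, 9.0);
        System.out.println(candidate.vendor + " proceeds: " + proceedsToOperationalTest(candidate));
    }
}
```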

The strategy also includes on-ramp opportunities for vendors whose technologies mature significantly following the first round of tests, and the Army is developing the protocols to support that process. We must also be able to tailor the tests to focus on specific new features—such as greater range, processing power or battery life—as they emerge on the market, without neglecting the rest of the system. As with apps, the key to successful testing supporting the radio marketplace will be flexibility and communication between the acquisition and test communities, so that we can decide in partnership how to match the process to the product.

 

FACING THE FUTURE TOGETHER

Aside from specific capabilities, we have taken other steps to deepen the acquisition-test partnership that the future force demands. For example, we are holding regular summits and methodology discussions that bring together program managers, test experts and affected Army commands to foster dialogue and increase understanding of areas such as cyber and the collection of reliability, availability and maintainability data.

By helping program offices grasp different test techniques and boosting testers’ knowledge of the technologies they will evaluate, these sessions contribute to successful test preparation and execution, saving time and cost. Additionally, as the Army continues to refine its cyber mission, assign responsibilities and identify needed capabilities, we will need to apply smart testing and procurement principles so we can rapidly respond to and correct vulnerabilities. ATEC, U.S. Army Cyber Command and PEOs that provide defensive and offensive cyber capabilities are contributing to an overall test strategy to better examine cyber protection early and often at the system-of-systems level.

 

CONCLUSION

We are also moving forward together to execute the Army’s revised construct for the NIE. Beginning in FY16, it will become an annual event focused on program-of-record testing. The new Army Warfighter Assessment, also held at Fort Bliss, will take the place of the other semiannual NIE event and provide a more experimental venue to focus on doctrinal development supporting new technologies for Force 2025 and Beyond.

While Army leadership is still finalizing several details of the transition, we believe the new approach will make disciplined use of test resources while continuing to modernize the network incrementally for the future force. Like other initiatives to support Force 2025, the future NIE construct will require the acquisition and test communities to adapt in several areas, from the timing of risk reduction and training to holding tests at other venues so that we continue to meet milestones. In all of these cases, it is our role to work together to provide the agility that will translate the 2025 vision into reality.

For more information, visit http://www.atec.army.mil/ or http://peoc3t.army.mil/c3t/; or see the related articles at http://www.army.mil/standto/archive_2014-10-21/, http://www.army.mil/article/125511/ and http://www.army.mil/article/127310/.

 

MG PETER D. UTLEY is the commanding general of ATEC. He holds an M.A. in national security and strategic studies from the U.S. Naval War College and a B.S. in biology from The Citadel, the Military College of South Carolina.

MG DANIEL P. HUGHES is the PEO for C3T. He holds an MBA from Oklahoma City University, an M.S. in national resource strategy from the Industrial College of the Armed Forces and a B.A. in political science from the University of Texas at Arlington. He is Level III certified in program management and a member of the Army Acquisition Corps.

Source: MG Daniel Hughes & MG Peter Utley, “Testing The Waters,” Army AL&T Magazine, 1/7/2015