A group of civilian information technologists came away from the Navy’s Trident Warrior 2010 exercise convinced they had proven that the nation’s public cloud computing capacity could be part of the service’s IT future, and that this future didn’t have to be far off.
Using a demonstration scenario in which part of coastal California was on fire, the technologists stationed a simulated amphibious fleet offshore. In a real crisis, the fleet would deploy Marines to help fight the blaze, and the ships would stay in the vicinity to provide communications, information and other support.
The ships used on-shore computer infrastructure owned by a private company, Amazon. The fleet received, processed and sent data through the cloud. Information was categorized with different levels of security based on the users. Military data was the most secure, followed by information for fire departments, followed by information for concerned civilians.
Cloud advocates saw great promise.
“But the Navy did nothing with it,” said Kevin Jackson, a former E-2C Hawkeye pilot who was then vice president of the information services company Dataline, a participant in the exercise. Now he is the vice president of cloud services for NJVC, a Northern Virginia firm.
The Navy’s inaction has commercial cloud advocates concerned. They worry that the service is being slow to accept the public cloud partly because of what they consider a misconception that only a private cloud — meaning one run by the government — can be secure. For its part, the Navy says it is looking to experts at the Defense Information Systems Agency to provide secure cloud capacity. Public cloud advocates say the Navy risks missing a chance to shrink manpower demands, reduce the need for expensive space and cooling equipment on ships, and provide more flexibility to share information and rapidly upgrade computing tools.
Cloud advocates wonder if they are going to have to fight to get the technology into combat information centers at sea. In part, they are struggling with the Navy’s culture and tradition of independent ocean operation, as well as its procurement mechanism, which is designed more for systems than for services. They also are aware that the Navy is still smarting from an earlier venture into commercially owned and provided information technology, the Navy-Marine Corps Intranet. That information system for shore sites was provided by EDS, which was later bought by Hewlett-Packard.
Rolling out NMCI proved to be an expensive, morale-taxing project for the Navy. The service is in the early stages of procuring the successor to NMCI, called the Next Generation Enterprise Network, and this time, it wants to do more of the work itself.
“While we were telling the Navy that the cloud would enhance the delivery of information technology, the Navy was in the earliest stages of the NGEN procurement,” Jackson said. “They had gotten burned with the previous IT services contract, so they are really gun-shy. One of the biggest problems was ownership of the IT infrastructure and of the data that infrastructure uses.”
That would have to be addressed in any cloud-services contract.
The nature of cloud computing is that hardware is rented and paid for as it is used, and the client depends on the cloud-service provider to secure the data and keep it accessible. Money is saved by giving the service provider the freedom to divvy up data and shift it wherever there is excess storage capacity. The thought of the military storing sensitive information in a public cloud gives heartburn to many military information security experts. Who owns the data? How can security be verified? What happens to the data when it’s time to end the contract?
On top of those questions lies the weight of the NMCI history.
“The Navy and Marine Corps really got burned, and I’m saying after Trident Warrior that this cloud is great. And the Navy is wary,” Jackson said.
Navy Capt. Shawn Hendricks, who is responsible for procuring and standing up NGEN, said that the network will be built with cloud computing in its future. But, he added, that linkup would be with a cloud created by the Defense Information Systems Agency, not a commercial vendor. Cloud experts call that a private cloud.
In addition, as a shore network, NGEN is designed for compatibility with the shipboard Consolidated Afloat Network Enterprise System, the new computing equipment and software that Northrop Grumman has been selected to install on Navy ships, starting with two destroyers. CANES isn’t designed to link with the cloud — at least not yet.
CLOUD BY AMAZON
The Trident Warrior 2010 demonstration was run using Amazon as the cloud-services provider. Amazon and Google have been pioneers in cloud computing. For the exercise, Amazon provided the Navy a collaboration environment whose applications ran in the cloud.
Metadata was used for information security, and encryption was enabled. Existing shipboard radios handled the link to the cloud infrastructure quite well.
But that was in littoral waters, where on-shore capability could be used. What about when the water is blue and land is thousands of miles away?
“It’s a nonstarter for now,” Jackson said flatly. Therein lies his and others’ main concern, because the Navy still sees the open ocean as its primary operating area. “One of the key aspects of the cloud is ubiquitous access. Because you’re talking about large amounts of data and bandwidth requirements, you need a very robust wireless infrastructure, which only exists on land and in littoral regions. You don’t have it in the middle of the ocean.”
Actually, there is a large amount of bandwidth aboard ship, but it has to be managed to accommodate demands for communications and for the data that runs weapons systems. Add a future of increased intelligence, surveillance and reconnaissance capability, and there’s a struggle to find room for ubiquitous access to the cloud.
CANES is being installed on 54 of more than 300 Navy ships under a $637.8 million contract with Northrop Grumman that was announced in February and was briefly protested by Lockheed Martin.
Lockheed withdrew its protest in 10 days.
Besides linking five existing systems, CANES offers a virtualized infrastructure, meaning a specific computing environment and its data can be quickly replicated on other machines. Proponents say CANES, therefore, can be a springboard to future cloud computing. But the question is how far the Navy will have to leap off that springboard to take the cloud to sea.
“Virtualization is generally required but not sufficient for implementing cloud computing,” Jackson said.
Former Navy Chief Information Officer Rob Carey, long a champion of cloud computing, has said that ships with CANES onboard can become “gray clouds” within their own hulls.
And since the Navy deploys ships in battle groups, it is studying using the bandwidth of each vessel to add cloud capability to the group, once CANES installations provide uniform and streamlined operation of what is now a hodgepodge system of IT control.
The Space and Naval Warfare Systems Command (SPAWAR), which includes a cloud computing office in its CANES program suite, declined to comment on CANES for this story.
Even if the Navy’s concerns with the business model of cloud computing can be overcome, there remain worries about data security in a commercial environment. That is despite a nudge from Congress, which, through the National Defense Authorization Act of 2012, told the Defense Department to consider commercial cloud providers before turning to government services.
But those concerns are a canard, cloud proponents say, because security procedures are readily transferable to the cloud. And, added Jackson, because the cloud “is highly automated and standardized, it enables reduction in the use of humans. Fewer humans actually enhance the security aspect of cloud IT infrastructure.”
The Navy’s mission is changing, with increased demand for humanitarian assistance and operations in littoral waters. Some see that change as a reason for hybrid information technology aboard ship, which would allow conventional data exchange at sea and the option of switching to cloud computing near shore. It would, cloud advocates say, facilitate passing data back and forth with other services and with civilian agencies operating on land.
It also would enable software updates or “patches” to be transmitted to the ship quickly. Some of these updates now have to wait until the ship is in port.
But “things change slowly in the Navy,” Jackson said. “And there has to be a mindset change. The whole ethos of the Navy is the ship has to be able to operate independently in blue ocean, so why would you do anything to leverage something that’s off-board, not on the ship, like the cloud?”
Sometimes you do it because you have to do it. Navy Chief Information Officer Terry Halvorsen has mandated a $2 billion cut in IT operations and says that he sees a migration to cloud computing as a means to that end.
“The federal chief information officer said that the government’s transition to the cloud would take at least 10 years,” Jackson said. “The Department of Defense transition to the cloud will probably take twice that long. And the Navy probably will be toward the end.
“But I think NGEN and CANES are trying to embrace the new models around cloud computing. I think it’s quite possible — in fact, I think it’s probable — that if you look back at the demonstration that we did in 2010, the Navy is going to leverage CANES to support operations ashore with that model.”
For now, though, the Navy studies the cloud with an eye to the future. And it doesn’t say when that future will be.
This story appears in the April edition of C4ISR Journal.