How Government's Driving Cloud Computing Ahead

Federal agencies are testing use cases as well as pushing standards and definitions that could help the business world.

July 6, 2009


Cloud computing may still be emerging as an IT delivery model, but U.S. government agencies are forging ahead with plans to adopt cloud services or build their own. The attitude among government technology decision makers seems to be that the benefits outweigh the risks and that the risks can be mitigated with planning and careful implementation.

With a nudge from federal CIO Vivek Kundra -- Kundra was an early adopter of Google Apps when he was CTO for the District of Columbia -- a growing number of federal agencies are plugging into the cloud. The Defense Information Systems Agency (DISA), for example, is well along in building an internal cloud in its data centers. And NASA's Ames Research Center recently launched a cloud computing environment called Nebula.

At the same time, government technology planners are working to ensure that the rollouts go smoothly. The National Institute of Standards and Technology has drafted a definition of cloud computing to keep implementers on the right track. And the General Services Administration has issued an RFI to cloud service and platform providers, in an effort to scope out the market in advance of demand.

At the Federal IT On A Budget Forum in Washington in May, speakers from the Army, DISA, GSA, NASA, NIST, and the departments of Defense, Energy, and Interior hashed through many of the problems they face as their organizations adopt, or contemplate adopting, cloud platforms and services. Security, compliance with federal regulations, interoperability, and IT skills development all came up as issues still to be resolved.

Yet a can-do mind-set has government technology managers sounding like it's a matter of when, not if, they'll overcome those hurdles and implement cloud services. "You're seeing adoption in some places you never would have expected," says Henry Sienkiewicz, technical program director with DISA's Computing Services Division.

DISA, which provides IT services to the military branches of the Defense Department, is among the most progressive of the early movers, with a strategy for implementing infrastructure as a service, platform as a service, and software as a service. As part of that effort, DISA is building an internal cloud, called the Rapid Access Computing Environment, that will let its clients access data center resources from a self-service portal with a drop-down menu of virtualized IT services. Among the benefits it hopes to achieve are lower IT costs, pay-per-use accounting, accelerated deployment of mainframe-class systems, data center standardization, and flexibility in scaling up and down.
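RACE's actual portal and interfaces aren't public, so the sketch below is purely hypothetical, but it captures the model the article describes: a menu of standardized virtual server offerings that a client selects and provisions on demand, with pay-per-use pricing attached. The offering names, specs, and rates are invented for illustration.

```python
# Hypothetical illustration only -- not RACE's real catalog or API.
# It shows the shape of a self-service model: pick a standardized offering
# from a menu, provision it on demand, and get a pay-per-use cost estimate.
SERVICE_CATALOG = {
    "small-linux":   {"cpus": 1, "ram_gb": 2, "rate_per_hour": 0.10},
    "medium-linux":  {"cpus": 2, "ram_gb": 4, "rate_per_hour": 0.20},
    "large-windows": {"cpus": 4, "ram_gb": 8, "rate_per_hour": 0.45},
}

def provision(offering, project, hours_estimate):
    """Turn a menu selection into a provisioning order with a cost estimate."""
    spec = SERVICE_CATALOG[offering]
    return {
        "project": project,
        "offering": offering,
        "spec": spec,
        "estimated_cost": round(spec["rate_per_hour"] * hours_estimate, 2),
    }

order = provision("medium-linux", project="logistics-pilot", hours_estimate=720)
print("Estimated cost for one month of use: $%.2f" % order["estimated_cost"])
```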

Sienkiewicz expresses optimism that DISA will be able to work through the inevitable challenges. In the area of cloud security, he cites recent "breakthroughs" in user access and control. For example, DISA is adopting a model in which "tenant" applications must comply with a standardized hosting environment, thereby inheriting the access controls of the host. On the question of interoperability, APIs will be the answer--though Sienkiewicz admits he's not sure how that interoperable environment is going to be built. "We don't know who's going to define it, but we're not going to allow vendor lock-in," he says.

Government IT managers and cloud computing vendors have huddled several times over the past few months in early attempts to define standards and specifications. As is always the case with such industry efforts, the standards process takes time. The trick for DISA and other cloud computing implementers is to develop service architectures that won't require an overhaul in the future based on specs yet to be defined. The way some are doing that is by building on tech platforms that seem well positioned now: VMware's vSphere and the open source Eucalyptus software, for example.

There's a level of bureaucracy, to be sure, in getting a new model embraced by government. But private-sector adopters might take interest in what the government is producing, given the debate over definitions that seems to pop up whenever cloud computing gets discussed. NIST's definition of cloud computing, now in its 14th draft, serves as a starting point for government agencies. The document describes five "essential characteristics" of the cloud, three delivery models, and four deployment models. "These definitions, attributes, and characteristics will evolve and change over time," write authors Peter Mell and Tim Grance of NIST's IT Laboratory.

NIST has been working with other government agencies, including the GSA, and with tech vendors on its cloud definition. NIST's definition--all 677 words of it--is more comprehensive than some of the ad hoc attempts floating around the industry, articulating the difference between platform as a service and infrastructure as a service, and describing "community clouds" that are shared by organizations with common interests. "We're scientists, and we weren't content with fuzzy definitions that encompassed anything and everything," Mell says. "We took a taxonomical approach to it that was not always common in definitions."

Regulation Not A Deal Breaker
NIST also is looking closely at how cloud computing fits in with regulatory requirements, most importantly the Federal Information Security Management Act. Security controls described in NIST publications are applicable, Mell says, though case studies are lacking. Speaking at the IT On A Budget Forum, Mell called cloud security, despite the challenges, "a doable thing."

Some federal agencies have policies requiring IT administrators to physically inspect data centers where agency data is stored, as well as other agency-specific security requirements that cloud providers might find hard to satisfy. NIST plans to release FISMA guidance this summer that would let one federal agency certify and accredit cloud providers for others, lowering one barrier to adoption.

NIST also participates in the discussion of cloud standards. The government won't mandate specific cloud standards, Mell says, but it does see itself as a catalyst for creating them. "We believe data and application portability between clouds is very important, and we believe having standard cloud interfaces so you can provision resources from the cloud using standards-based mechanisms is very important," he says. NIST is working to identify minimum standards for portability and interoperability.

In another sign of the government's growing interest, the GSA last month issued an RFI to infrastructure-as-a-service providers. The RFI includes a list of 45 questions in areas such as pricing, service-level agreements, operational procedures, data management, security, and interoperability.

At the same time that these due diligence efforts are under way, some agencies are forging ahead with cloud computing prototypes and early deployments. NASA Ames Research Center's recently launched Nebula combines open source components to create a self-service platform of computing, storage, and network resources.

NASA describes Nebula as a combination of infrastructure, platform, and software as a service, and the space agency has created an IT architecture to support that. Components include the Eucalyptus software developed at the University of California at Santa Barbara, the Lustre file system deployed on 64-bit storage nodes, the Django Web application framework, the Solr indexing and search engine, and an integrated development environment. Nebula will be compatible with Amazon Web Services, which means AWS-compatible tools will work with it and Nebula virtual servers can run on Amazon's Elastic Compute Cloud.
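To make the AWS-compatibility point concrete, here is a minimal sketch that assumes an EC2-compatible endpoint of the kind Eucalyptus exposes. The hostname, credentials, and image ID are placeholders, and the port and path are Eucalyptus defaults rather than anything NASA has published; the point is that the same boto client code used against Amazon EC2 can simply be pointed at the private cloud.

```python
# Sketch only: talking to an EC2-compatible private cloud (e.g. Eucalyptus,
# which Nebula builds on) with boto, the same library used against AWS.
# Endpoint, keys, and image ID below are placeholders, not NASA's values.
import boto
from boto.ec2.regioninfo import RegionInfo

region = RegionInfo(name="nebula", endpoint="cloud.example.gov")  # placeholder host
conn = boto.connect_ec2(
    aws_access_key_id="ACCESS_KEY",       # credentials issued by the private cloud
    aws_secret_access_key="SECRET_KEY",
    region=region,
    is_secure=False,
    port=8773,                            # Eucalyptus' default API port
    path="/services/Eucalyptus",          # Eucalyptus' default API path
)

# The EC2 API calls are unchanged: list images, then launch a small instance.
for image in conn.get_all_images():
    print("%s %s" % (image.id, image.location))

reservation = conn.run_instances("emi-00000000", instance_type="m1.small")
print("Launched: %s" % [i.id for i in reservation.instances])
```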

Among Nebula's potential uses: rapid development of policy-compliant, secure Web applications, plus education, public outreach, collaboration, and mission support. Nebula could also support a proposed overhaul of NASA's Web sites. In a draft white paper published last month, Chris Kemp, CIO of NASA Ames, proposed a Web application framework that would include templates for user-generated blogs and wikis and an API for other development. He compared it conceptually to Salesforce.com's Force.com and Google's App Engine services.
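As a rough illustration of the template-plus-API idea (not Kemp's actual design), a Django-based framework like the one in Nebula's stack could hand site teams reusable views and templates along these lines; the app, model, and template names here are hypothetical.

```python
# Hypothetical sketch of a reusable, template-driven blog view on a
# Django-based platform; app, model, and template names are invented.
from django.conf.urls.defaults import patterns, url   # Django 1.x-era URL helpers
from django.shortcuts import render_to_response, get_object_or_404
from siteframework.blog.models import BlogPost        # hypothetical blog app

def post_detail(request, slug):
    """Render one user-generated blog post with the shared site template."""
    post = get_object_or_404(BlogPost, slug=slug)
    return render_to_response("blog/post_detail.html", {"post": post})

urlpatterns = patterns("",
    url(r"^blog/(?P<slug>[-\w]+)/$", post_detail, name="blog_post_detail"),
)
```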

It's likely that federal agencies eventually will want to connect their internal clouds into an über-cloud of shared data, applications, and IT resources. As part of its other work in this area, NIST has presented the idea of a federal cloud infrastructure where agencies would have their own cloud instances or nodes. Such an approach would make it possible to develop applications that work across cloud nodes, create a more secure environment, ward off vulnerabilities and attacks, and centrally manage cloud resources using tools designed specifically for a federal cloud. All of that work, however, would be dependent on the nascent standards effort.

Documents published in support of the 2010 federal budget lay out plans for cloud computing, including the funding of a number of pilot projects, yet another sign that cloud computing will play an increasingly important role in the federal government's IT policies and strategy. "The cloud computing investment in the 2010 budget reflects the administration's desire to drive down costs, drive innovation across the federal government, and make sure we're making available technologies to the workforce that may be available to them elsewhere," federal CIO Kundra said in an interview.

Cloud vendors see the opportunity. Amazon Web Services recently held a two-day training session in Washington, D.C., for IT service firms that work with government agencies. And Susie Adams, CTO of the Microsoft group that works with federal agencies, said interest is high in Microsoft's Azure cloud services as a way to quickly scale IT resources to satisfy Obama's mandate for increased transparency.

With so much going on in cloud computing, government tech pros will need to be ambidextrous. They must grapple with unresolved issues around security and governance, while simultaneously mapping a path ahead.
Courtesy of InformationWeek.com
