Implementation of Cloud Computing Solutions in Federal Agencies: Part 4 – Cloud Computing for Defense and Intelligence


(This is part 4 of the series entitled “Implementation of Cloud Computing Solutions in Federal Agencies”. First published on Forbes.com, this series provides the content of a whitepaper I recently authored. A copy of the complete whitepaper will be available at NJVC.com starting September 7, 2011.)

The defense and intelligence communities are not immune to cloud computing. Arguably more than any other government agencies, their missions require a fabric of utility computing that scales on demand and enables self-discovery and self-service access to secure, timely and relevant information in support of the mission, whether individual or shared. The traditional IT model requires system engineering that binds most software to the hardware; it does not provide an enterprise suite of functionality, nor does it allow for increased flexibility or a governed lifecycle of services. Designing software to be independent of the hardware allows an operating system, applications and data to "live" across the enterprise, and is fundamental to the transformation of compute, storage and network functionality.

Defense is dealing with a $78 billion budget cut, the first since September 11, 2001, and another $100 billion in other cost-cutting measures over a five-year period commencing in FY 2012. Defense Secretary Robert Gates is directing that the cuts come from agency administrative and structural areas; for example, the Office of the Assistant Secretary of Defense for Networks and Information Integration, the Business Transformation Agency, and the Joint Forces Command are in the process of being eliminated or disestablished, with some essential functions transferred to other organizations within the Pentagon.

In an official statement on the proposed budget cuts provided on January 6, 2011, Secretary Gates said: "First, reforming how the department uses information technology, which costs us about $37 billion a year. At this time all of our bases and headquarters have their own separate IT infrastructure and processes, which drive up costs and create cyber vulnerabilities. The department is planning to consolidate hundreds of data centers and move to a more secure enterprise system, which we estimate could save more than $1 billion a year." Department of Defense Chief Information Officer Teri Takai also publicly commented on the potential IT budget cuts at an April 21, 2011, INPUT event, noting DoD's support for moving some of its IT operations, particularly data centers, to the cloud.

Cloud Computing and Mission Support 

Information is often the decisive discriminator in modern conflict. Studies of recent mission failures highlighted this fact, finding that many of these failures were caused by:

  • Existence of data silos
  • Human-based document exploitation process
  • Reliance on "operationally proven" processes and filters, typically adopted to compensate for inadequate computational power or limited decision time

Also disturbing is that in most of these cases, the critical piece of information necessary for the mission was already in hand. The failure was not in obtaining the information, but in locating it and applying it to the mission at hand. Cloud computing uniquely addresses all of these important issues.

Data silos evolved from a system-centric IT procurement policy and an almost reflexive reliance on relational database technology. In developing early data processing systems, the high cost of memory and storage placed a premium on the efficiency of application data access and retrieval. Relational database technology effectively addressed this need, which in turn led to its pervasive use across government. In modern IT system development, memory and storage are cheap, and getting cheaper, which has led to the internet-scale storage and search paradigms that are the stuff of everyday use today. The world's largest databases cannot, in fact, be searched quickly using a relational database management approach. Today's ability to search multi-petabyte data stores in milliseconds, a capability realized in cloud-based storage, virtually eliminates the need for data silos.

Documents are the persistent records of human activity. As such, they are used to provide insight into the societal structure and processes of our opponents. Conflict, however, is entity and event centric. The intelligence professional must therefore interpret documents and translate that data into operationally relevant entities and events. The time- and resource-intensive nature of this skillcraft is perfectly suited to the precision search and analytic capabilities of the modern compute cloud. The use of highly standardized and virtualized commodity infrastructure not only makes the automation of this function possible, it enables real-time, continuous processing of the now-digital document flow of our adversaries. This automation also removes the human from a tedious task, freeing intelligence professionals to apply higher-order analysis and insight.
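The automated document exploitation described above can be sketched in miniature. The snippet below is only an illustration, not any agency's actual pipeline: the corpus, the entity patterns, and the two-worker pool are all hypothetical stand-ins for a cloud of commodity nodes, each running the same extraction "map" step over its share of the document flow before the results are merged.

```python
import re
from multiprocessing import Pool

# Hypothetical mini-corpus standing in for a digitized document flow.
DOCUMENTS = [
    "Convoy departed Kabul on 2009-02-01 toward Bagram.",
    "Meeting observed in Mosul on 2009-02-03.",
    "Supply shipment reached Bagram on 2009-02-05.",
]

# Illustrative entity patterns: ISO dates and a fixed set of place names.
DATE_RE = re.compile(r"\b\d{4}-\d{2}-\d{2}\b")
PLACE_RE = re.compile(r"\b(Kabul|Bagram|Mosul)\b")

def extract_entities(doc):
    """Map step: turn one free-text document into (place, date) event records."""
    return [(place, date)
            for place in PLACE_RE.findall(doc)
            for date in DATE_RE.findall(doc)]

def exploit_corpus(docs):
    """Fan the map step out across workers, then merge (reduce) the results."""
    with Pool(processes=2) as pool:
        per_doc = pool.map(extract_entities, docs)
    return sorted(record for records in per_doc for record in records)

if __name__ == "__main__":
    print(exploit_corpus(DOCUMENTS))
```

Each worker touches only the documents handed to it, so the same code scales by adding nodes rather than by filtering the input, which is the point of the paragraph above.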

The human-based document exploitation process led directly to an institutional reliance on the aforementioned "operationally proven" processes and filters. Instantiated through pages of structured query language (SQL) and the ubiquitous goal of obtaining an appropriate "working set" of data, these time-honored processes were born from the need to meet critical decision timelines within a computationally inadequate environment. Cloud techniques and technologies can now be used to work on all of the data. With the ability to leverage the power of a supercomputer at will, the working-set requirement is now an anachronism, and critical decision timelines can be met more easily.
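The contrast between a pre-filtered working set and working on all the data can be made concrete. In this hedged sketch (the partitioned store, shard contents, and keyword are all invented for illustration), no node ever assembles a filtered working set; instead every partition is scanned in parallel and the hits are merged, in the spirit of the scatter/gather pattern cloud data stores use.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical partitioned data store: each inner list stands in for one
# storage node's shard of (timestamp, keyword) event records.
PARTITIONS = [
    [(1, "convoy"), (2, "meeting")],
    [(3, "convoy"), (4, "shipment")],
    [(5, "meeting"), (6, "convoy")],
]

def scan_partition(partition, keyword):
    """Each node scans only its local shard; no pre-filtered working set."""
    return [ts for ts, kw in partition if kw == keyword]

def query_all(keyword):
    """Scatter the scan to every partition in parallel, then gather the hits."""
    with ThreadPoolExecutor() as pool:
        hits = pool.map(scan_partition, PARTITIONS, [keyword] * len(PARTITIONS))
    return sorted(ts for shard in hits for ts in shard)

print(query_all("convoy"))  # -> [1, 3, 6]
```

Because the scan is embarrassingly parallel across shards, query latency is governed by the slowest single shard, not by the total data volume, which is why the working-set filter becomes unnecessary.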

Cloud computing is unique in its ability to address these critical defense and intelligence mission needs, and that is why it is critical to our national defense. As a bonus, cloud computing offers defense and intelligence agencies the ability to increase efficiency and realize marked cost savings over system lifecycles, alleviating some of the pressure of budget reductions. Moving IT operations to the cloud will also enhance collaboration.

https://www.defense.gov/speeches/speech.aspx?speechid=1527


G C Network
