Cloudera must be reading the script!
“Cloud computing leapt out as the most obvious way to address enterprise large data problems” – Ken Pierce, IT Specialist, DIA-DS/C4ISR

“We view Hadoop as the key enabler…[in] optimizing the [cloud infrastructure] platform to ingest and present information effectively in the petascale.” – Robert Ames, Director & Deputy CTO, IBM Federal

Successful mission accomplishment in the DoD, DHS, and Intelligence Communities revolves around the ability to process "Big Data". Hadoop is all about processing "Big Data".

The ability to process big data is crucial to mission accomplishment because it is the core technology for processing terabyte-sized datasets with online applications. This capability is also needed to enable low latency in automated decision tools. Since a typical software engineer has never used a thousand machines in parallel to process a petabyte of data, new software tools are critical to the successful implementation of solutions in this domain. That's where Hadoop comes in.

Apache Hadoop is a Java software framework that supports data-intensive distributed applications. This open-source implementation of Google's distributed file system and MapReduce technologies enables applications to work with thousands of nodes and petabytes of data. Cloudera was founded to provide enterprise-level support to users of Apache Hadoop, and its team has extensive experience and deep expertise in the commercial use of open-source software and Hadoop.
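To illustrate the programming model Hadoop popularized, here is a minimal sketch of MapReduce in plain Python — a single-process word count, not Hadoop's actual Java API. The function names (`map_fn`, `shuffle`, `reduce_fn`, `word_count`) are illustrative choices, not part of any Hadoop interface; in a real cluster the framework distributes these three phases across thousands of nodes.

```python
from collections import defaultdict
from itertools import chain

def map_fn(line):
    # Map phase: emit a (key, value) pair for each word in a line of input.
    return [(word.lower(), 1) for word in line.split()]

def shuffle(pairs):
    # Shuffle phase: group all emitted values by key, as the framework
    # would when routing map output to reducers.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_fn(key, values):
    # Reduce phase: aggregate all values for one key (here, sum the counts).
    return key, sum(values)

def word_count(lines):
    mapped = chain.from_iterable(map_fn(line) for line in lines)
    grouped = shuffle(mapped)
    return dict(reduce_fn(k, v) for k, v in grouped.items())

# Example: word_count(["big data", "big clusters"]) counts "big" twice.
```

The appeal of the model is that `map_fn` and `reduce_fn` are small, stateless functions the engineer writes, while the hard distributed-systems work — partitioning, shuffling, and fault tolerance — is handled by the framework.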

During the Cloud Computing Summit, I met Christophe Bisciglia, a Google alumnus who recently founded Cloudera. During his time at Google, Christophe created and managed the Academic Cloud Computing Initiative. His success led to an extensive partnership with the National Science Foundation (NSF), which makes Google-hosted Hadoop clusters available for research and education worldwide. Our discussions quickly focused on how Hadoop makes the automation of intelligence exploitation feasible.

I can’t wait to see the fruit of this potential marriage.

Follow me at https://Twitter.com/Kevin_Jackson
