Cloudera must be reading the script!

“Cloud computing leapt out as the most obvious way to address enterprise large data problems” – Ken Pierce, IT Specialist, DIA-DS/C4ISR

“We view Hadoop as the key enabler…[in] optimizing the [cloud infrastructure] platform to ingest and present information effectively in the petascale.” – Robert Ames, Director & Deputy CTO, IBM Federal

Successful mission accomplishment in the DoD, DHS and Intelligence Communities revolves around the ability to process “Big Data”. Hadoop is all about processing “Big Data”.

The ability to process big data is crucial to mission accomplishment because it is the core technology for processing terabyte-sized datasets with online applications. This capability is also needed to enable low-latency, automated decision tools. Since a typical software engineer has never used a thousand machines in parallel to process a petabyte of data, new software tools are critical to the successful implementation of solutions in this domain. That’s where Hadoop comes in.

Apache Hadoop is a Java software framework that supports data-intensive distributed applications. This open-source implementation of Google’s distributed file system and MapReduce technologies enables applications to work with thousands of nodes and petabytes of data. Cloudera was founded to provide enterprise-level support to users of Apache Hadoop, bringing extensive experience and deep expertise in the commercial use of open-source software and Hadoop.
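To make the MapReduce programming model concrete, here is a minimal sketch of the canonical word-count job written against the Hadoop Java API (the class name and the input/output paths are illustrative assumptions, not something from this post): the mapper emits a (word, 1) pair for every token it sees, and the reducer sums the counts for each distinct word across the whole dataset.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Canonical word-count sketch: counts occurrences of each word across the input files.
public class WordCount {

  // Map phase: tokenize each input line and emit (word, 1) for every token.
  public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reduce phase: sum the counts emitted for each distinct word.
  public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class); // local pre-aggregation before the shuffle
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input directory
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // HDFS output directory
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Packaged into a jar, a job like this would typically be submitted with something along the lines of `hadoop jar wordcount.jar WordCount /input /output`; the framework takes care of splitting the input, scheduling map and reduce tasks across the cluster, and re-running any tasks that fail, which is precisely the machinery that spares developers from hand-coordinating thousands of machines.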

During the Cloud Computing Summit, I met Christophe Bisciglia, a Google alumnus who recently founded Cloudera. During his time at Google, Christophe created and managed the Academic Cloud Computing Initiative. His success led to an extensive partnership with the National Science Foundation (NSF) that makes Google-hosted Hadoop clusters available for research and education worldwide. Our discussions quickly focused on how Hadoop made the automation of intelligence exploitation feasible.

I can’t wait to see the fruit of this potential marriage.

Follow me at https://Twitter.com/Kevin_Jackson
