Cloud Storage 2.0 Set To Dominate Market

The enterprise data storage marketplace is poised to become a battlefield. No longer the quiet backwater of cloud computing services, storage is now displacing compute as the focus of this global transition. A brief overview of recent storage market history explains why this shift is important.

Before 2007 and the birth of the cloud computing market we are witnessing today, the on-premises model hosted in large local data centers dominated enterprise storage. Key marketplace players were EMC (before the Dell acquisition), NetApp, IBM, HP (before it became HPE) and Hitachi. Company employees managed information technology resources (compute, storage, network), and companies tightly controlled their data in facilities they managed. Data security, legal and regulatory concerns were, for the most part, very localized. The data itself was highly structured (e.g., relational databases and SQL) in support of serially executed, mostly static business processes. This structured approach worked because consumer segments in most industries were homogeneous, segregated and relatively static. Companies also felt relatively safe in their industry verticals because of the high financial and operational barriers prospective new competitors would face.
 
 
On March 13, 2006, Amazon Web Services launched the Simple Storage Service (S3). Although not widely appreciated at the time, that announcement was the launch of Cloud Storage 1.0 and heralded a gradual but steady global adoption of cloud-based storage services. Surveys show that by 2016 approximately 30% of all businesses had transitioned to cloud-based storage. Although cloud compute services and application management were the primary reasons for migrating to the cloud, the rapid growth of unstructured data (social media, Hadoop, big data analytics) significantly heightened the importance of cloud-based storage. Rapidly changing business processes that increased the need to target smaller consumer segments (localization, online retail) also contributed to rapid growth in data volume and breadth. Over time Google, Microsoft, Rackspace, and other cloud storage vendors entered the market. Coincident with the transition to cloud storage, international data security, legal and regulatory concerns also grew. Even though the daily news greeted everyone with multiple high-profile data breaches and data loss incidents, fines were minimal and very few mandatory notification laws existed. Cloud storage technology of this era was characterized by:
  • A 2005-2008 technology base;
  • Hosting primarily on Linux or Windows operating systems;
  • Proprietary, incompatible and competing APIs;
  • A real threat of vendor lock-in;
  • Additional charges for data manipulation activity (puts, gets, deletes), a usage sketch of which follows this list; and
  • A continuing requirement to manage multiple storage options and pricing tiers.
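
For readers who never worked with that generation of services, the sketch below illustrates, in Python with the boto3 SDK, the vendor-specific, per-request style of a Cloud Storage 1.0 API. The bucket and key names are placeholders and the SDK choice is mine for illustration; the point is that each put, get and delete was a separate, typically metered request issued through a provider-specific client library.

```python
# Minimal sketch of Cloud Storage 1.0-era object operations.
# Bucket and key names are placeholders; each call is a separate request
# of the kind vendors typically metered and billed individually.
import boto3

s3 = boto3.client("s3")  # provider-specific SDK and endpoint: switching vendors meant rewriting this layer

# PUT: upload an object (a billable write request)
s3.put_object(Bucket="example-bucket", Key="reports/2017/q4.csv", Body=b"col1,col2\n1,2\n")

# GET: retrieve the object (a billable read request)
response = s3.get_object(Bucket="example-bucket", Key="reports/2017/q4.csv")
data = response["Body"].read()

# DELETE: remove the object (also counted as a data manipulation request)
s3.delete_object(Bucket="example-bucket", Key="reports/2017/q4.csv")
```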

 

As if this were not challenging enough, vertical industry barriers were shattered by digital transformation and the elimination of significant startup capital investment requirements.

As we all prepare the champagne and noisemakers for the birth of 2018, Cloud Storage 2.0 is already with us. The massive transition to cloud computing has commoditized storage, and industry observers expect storage to become an IT utility. Increased data volumes and the sourcing of unstructured data (crowdsourcing, social media analytics) have elevated the importance of previously benign enterprise storage technology decisions. Many business processes are now expected to be executed dynamically and in parallel (agile business, social media customer service). Blended consumer segments and the need to target and satisfy individual consumers are now common. There are also significant changes on the data security front. These changes include:
  • Significant fines for data loss or breach;
  • Mandatory data breach reporting laws; and
  • Heightened international data security, privacy, legal and regulatory concerns (e.g., national data sovereignty laws, Brexit and the EU General Data Protection Regulation (GDPR)).

 

The corporate risk introduced by these dramatic changes means that cloud storage vendors need to drastically up their game as well. Customers no longer want to continually balance the cost of storing data against the risk associated with intentional deletion. In fact, expanding legal and regulatory requirements are now the driving force behind operational needs for real-time retrieval of complex data assets (e.g., biometrics, social media analytics, dynamic data streams). These needs mean that Cloud Storage 2.0 minimum requirements now include:
  • New and improved purpose-built operating systems;
  • Native control of the storage disk for higher density and faster access speed;
  • Solutions optimized for the storage and analysis of unstructured data;
  • A significant reduction in the number of storage tiers and options;
  • Elimination of separate charges for data manipulation activity (puts, gets, deletes);
  • Storage immutability (data or objects cannot be unintentionally modified after creation);
  • Significantly reduced pricing; and
  • Use of standards-based interface APIs (the sketch after this list illustrates this requirement together with immutability).
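
To make two of these requirements concrete, here is a minimal sketch, again in Python with boto3, of a standards-based (S3-compatible) API being pointed at an arbitrary vendor endpoint and of a write-once-read-many (WORM) retention being applied so the object cannot be modified or deleted before its retention date. The endpoint, credentials, bucket and retention date are hypothetical, and the object-lock feature shown is one vendor-offered way to achieve immutability, not something prescribed here.

```python
# Sketch: standards-based (S3-compatible) API plus object immutability (WORM retention).
# Endpoint URL, credentials, bucket and key are placeholders for whatever vendor is being evaluated.
import base64
import hashlib
from datetime import datetime, timezone

import boto3

client = boto3.client(
    "s3",
    endpoint_url="https://storage.example-vendor.com",  # any S3-compatible vendor endpoint
    aws_access_key_id="EXAMPLE_KEY",
    aws_secret_access_key="EXAMPLE_SECRET",
)

body = b'{"subject": "0001"}'
# Some S3-compatible endpoints require a Content-MD5 header on object-lock writes.
content_md5 = base64.b64encode(hashlib.md5(body).digest()).decode()

# Store an object with compliance-mode retention: until the retention date the object
# cannot be overwritten or deleted, addressing the "storage immutability" requirement.
# The target bucket must have been created with object lock enabled.
client.put_object(
    Bucket="compliance-archive",
    Key="biometrics/record-0001.json",
    Body=body,
    ContentMD5=content_md5,
    ObjectLockMode="COMPLIANCE",
    ObjectLockRetainUntilDate=datetime(2025, 1, 1, tzinfo=timezone.utc),
)
```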

These are the many reasons why enterprises must think carefully before accepting storage services from the current cloud industry leaders. Don't settle for a Cloud Storage 1.0 band-aid when you should buy a Cloud Storage 2.0 solution. When your team is evaluating options:
  • Compare access speeds and select the vendor that can offer the fastest possible access (a rough timing sketch follows this list);
  • Use storage with a pricing structure that allows you to retain all of your data for as long as needed;
  • Make sure your company is ready to meet the new data security regulations;
  • Choose cloud storage that is interoperable across the most extensive ecosystem (partners, storage applications, formats); and
  • Always evaluate the solution's scalability, durability, immutability, and legal compliance capabilities.
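
As a starting point for the access-speed comparison, a rough timing sketch such as the one below can be run from the location where your workload will actually live. The vendor endpoints, bucket and test object are placeholders, and a real evaluation would also vary object sizes, concurrency and geography.

```python
# Rough sketch for comparing GET latency across candidate S3-compatible vendors.
# Endpoints, bucket and key are placeholders; credentials are assumed to be configured.
import time

import boto3

CANDIDATES = {
    "vendor-a": "https://s3.vendor-a.example.com",
    "vendor-b": "https://s3.vendor-b.example.com",
}


def median_get_latency(endpoint_url: str, bucket: str, key: str, trials: int = 20) -> float:
    """Return the median wall-clock time in seconds to fetch one test object."""
    client = boto3.client("s3", endpoint_url=endpoint_url)
    samples = []
    for _ in range(trials):
        start = time.perf_counter()
        client.get_object(Bucket=bucket, Key=key)["Body"].read()
        samples.append(time.perf_counter() - start)
    samples.sort()
    return samples[len(samples) // 2]


for name, url in CANDIDATES.items():
    latency = median_get_latency(url, bucket="eval-bucket", key="test-object-1mb.bin")
    print(f"{name}: {latency:.3f} s median GET latency")
```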

Change is happening now, so don't get fooled!


( This content is being syndicated through multiple channels. The opinions expressed are solely those of the author and do not represent the views of GovCloud Network, GovCloud Network Partners or any other corporation or organization.)

 

Cloud Musings

( Thank you. If you enjoyed this article, get free updates by email or RSS – © Copyright Kevin L. Jackson 2017)

Follow me at https://Twitter.com/Kevin_Jackson