Cloud Storage 2.0 Set To Dominate Market


The enterprise data storage marketplace is poised to become a battlefield. No longer the quiet backwater of cloud computing services, the focus of this global transition is now shifting from compute to storage. A brief review of recent storage market history explains why this transition is important.
 

 

Before 2007 and the birth of the cloud computing market we are witnessing today, the on-premises model hosted in large local data centers dominated enterprise storage. Key marketplace players were EMC (before the Dell acquisition), NetApp, IBM, HP (before it became HPE), and Hitachi. Company employees managed information technology resources (compute, storage, network), and companies tightly controlled their data in facilities they managed. Data security, legal, and regulatory concerns were, for the most part, very localized. The data itself was highly structured (e.g., relational databases queried with SQL) in support of serially executed, mostly static business processes. This structured approach worked because consumer segments in most industries were homogeneous, segregated, and relatively static. Companies also felt relatively safe in their industry verticals because of the high financial and operational barriers prospective new competitors would face.
 
 
On March 13, 2006, Amazon Web Services launched the Simple Storage Service (S3). Although not widely appreciated at the time, that announcement was the launch of Cloud Storage 1.0 and heralded a gradual but steady global adoption of cloud-based storage services. Surveys show that by 2016 approximately 30% of all businesses had transitioned to cloud-based storage. Although cloud compute services and application management were the primary reasons for migrating to the cloud, the rapid growth of unstructured data (social media, Hadoop, big data analytics) significantly heightened the importance of cloud-based storage. Rapidly changing business processes that increased the need to target smaller consumer segments (localization, online retail) also contributed to rapid growth in data volume and breadth. Over time Google, Microsoft, Rackspace, and other cloud storage vendors entered the market. Coincident with the transition to cloud storage, international data security, legal, and regulatory concerns also grew. Even though the daily news greeted everyone with multiple high-profile data breaches and data loss incidents, fines were minimal and very few mandatory notification laws existed. Cloud storage technology was characterized by:
  • A 2005–2008 technology base;
  • Hosting primarily on Linux or Windows operating systems;
  • Proprietary, incompatible, and competing APIs;
  • The threat of vendor lock-in once a vendor was selected;
  • Additional charges for data manipulation activity (puts, gets, deletes); and
  • A continuing requirement to manage multiple storage options and pricing tiers.
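To make the last two points concrete, here is a minimal sketch in Python of how per-operation charges and multiple pricing tiers combine into a monthly bill. Every price and tier name below is an illustrative assumption, not any real vendor's rate card.

```python
# Hypothetical Cloud Storage 1.0 price list -- all numbers are
# illustrative only, not any actual vendor's pricing.
TIERS = {
    "standard":   {"gb_month": 0.023,  "per_1k_puts": 0.005, "per_1k_gets": 0.0004},
    "infrequent": {"gb_month": 0.0125, "per_1k_puts": 0.010, "per_1k_gets": 0.001},
}

def monthly_cost(tier: str, gb_stored: float, puts: int, gets: int) -> float:
    """Storage charge plus separate charges for data manipulation (puts/gets)."""
    p = TIERS[tier]
    return (gb_stored * p["gb_month"]
            + puts / 1000 * p["per_1k_puts"]
            + gets / 1000 * p["per_1k_gets"])

# 1 TB stored, 100,000 writes, and 1,000,000 reads in one month:
print(round(monthly_cost("standard", 1000, 100_000, 1_000_000), 2))    # 23.9
print(round(monthly_cost("infrequent", 1000, 100_000, 1_000_000), 2))  # 14.5
```

Note that the "cheaper" tier wins here only because of this particular access pattern; a read-heavy workload can flip the comparison, which is exactly the tier-management burden the list above describes.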

 

As if this were not challenging enough, vertical industry barriers were shattered by digital transformation and the elimination of significant startup capital investment requirements.
As we all prepare the champagne and noisemakers for the birth of 2018, Cloud Storage 2.0 is already with us. The massive transition to cloud computing has commoditized storage, and industry observers expect storage to become an IT utility. Increased data volumes and new sources of unstructured data (crowdsourcing, social media analytics) have elevated the importance of previously benign enterprise storage technology decisions. Many business processes are now expected to execute dynamically and in parallel (agile business, social media customer service). Blended consumer segments and the need to target and satisfy individual consumers are now common. There are also significant changes on the data security front. These changes include:
  • Significant fines for data loss or breach;
  • Mandatory data breach reporting laws; and
  • Heightened international data security, privacy, legal, and regulatory concerns (e.g., national data sovereignty laws, Brexit, and the EU General Data Protection Regulation (GDPR)).

 

Corporate risk introduced by these dramatic changes means that cloud storage vendors need to drastically up their game as well. Customers no longer want to continually balance the cost of storing data against the risk associated with intentional deletion. In fact, expanding legal and regulatory requirements are now the driving force behind operational needs to execute real-time retrieval of complex data assets (e.g., biometrics, social media analytics, dynamic data streams). These needs mean that Cloud Storage 2.0 minimum requirements now include:
  • Using new and improved purpose-built operating systems;
  • Native control of storage disk for higher density and faster access speed;
  • Solutions optimized for the storage and analysis of unstructured data;
  • Significant reduction of multiple storage tiers and options;
  • Elimination of separate charges for data manipulation activity (puts, gets, deletes);
  • Storage immutability (data or objects cannot be unintentionally modified after creation);
  • Significantly reduced pricing; and
  • Use of standards-based interface APIs.
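The immutability requirement above is worth illustrating. Here is a toy write-once object store in Python, an in-memory model rather than any vendor's API: once an object exists under a key, later writes to that key are rejected instead of silently overwriting the data.

```python
class WriteOnceStore:
    """Toy in-memory model of immutable (write-once) object storage."""

    def __init__(self):
        self._objects = {}

    def put(self, key: str, data: bytes) -> None:
        # Immutability: an existing object can never be modified or replaced.
        if key in self._objects:
            raise PermissionError(f"object '{key}' is immutable; cannot overwrite")
        self._objects[key] = bytes(data)

    def get(self, key: str) -> bytes:
        return self._objects[key]

store = WriteOnceStore()
store.put("audit/2017-12-01.log", b"login ok")
print(store.get("audit/2017-12-01.log"))   # b'login ok'
try:
    store.put("audit/2017-12-01.log", b"tampered")
except PermissionError as err:
    print("rejected:", err)
```

Production systems enforce this at the storage layer (often with retention periods and legal holds), but the contract is the same: writes create, they never mutate.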

These are the many reasons why enterprises must think carefully before accepting storage services from the current cloud industry leaders. Don't settle for a Cloud Storage 1.0 band-aid when you should buy a Cloud Storage 2.0 solution. When your team is evaluating options:
  • Compare access speeds and select the vendor that can offer the fastest possible access;
  • Use storage with a pricing structure that allows you to retain all of your data for as long as needed;
  • Make sure your company is ready to meet the new data security regulations;
  • Choose cloud storage that is inter-operable across the most extensive ecosystem (partners, storage applications, formats); and
  • Always evaluate the solution’s scalability, durability, immutability, and legal compliance capabilities.
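One simple way to apply the checklist above is a weighted scorecard. The criteria weights and the two vendor ratings below are purely illustrative assumptions; substitute your own priorities and assessments.

```python
# Criteria drawn from the evaluation checklist above; the weights are
# illustrative and should reflect your organization's own priorities.
WEIGHTS = {
    "access_speed": 0.25,
    "retention_pricing": 0.25,
    "regulatory_readiness": 0.20,
    "interoperability": 0.15,
    "durability_immutability": 0.15,
}

def scorecard(ratings: dict) -> float:
    """Weighted score from 1 (poor) to 5 (excellent) ratings per criterion."""
    return sum(WEIGHTS[c] * ratings[c] for c in WEIGHTS)

# Two hypothetical vendors:
vendor_a = {"access_speed": 4, "retention_pricing": 3, "regulatory_readiness": 5,
            "interoperability": 4, "durability_immutability": 5}
vendor_b = {"access_speed": 5, "retention_pricing": 5, "regulatory_readiness": 2,
            "interoperability": 3, "durability_immutability": 4}

print(round(scorecard(vendor_a), 2))   # 4.1
print(round(scorecard(vendor_b), 2))   # 3.95
```

Note how the vendor with the fastest access and lowest price still loses here: weak regulatory readiness drags it down, which is precisely the Cloud Storage 2.0 point.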

Change is happening now, so don't get fooled!


( This content is being syndicated through multiple channels. The opinions expressed are solely those of the author and do not represent the views of GovCloud Network, GovCloud Network Partners or any other corporation or organization.)

 

Cloud Musings

( Thank you. If you enjoyed this article, get free updates by email or RSS – © Copyright Kevin L. Jackson 2017)

Follow me at https://Twitter.com/Kevin_Jackson