Is Data Classification a Bridge Too Far?


Today data has replaced money as the global currency for trade.

“McKinsey estimates that about 75 percent of the value added by data flows on the Internet accrues to “traditional” industries, especially via increases in global growth, productivity, and employment. Furthermore, the United Nations Conference on Trade and Development (UNCTAD) estimates that about 50 percent of all traded services are enabled by the technology sector, including by cross-border data flows.”

As the global economy has become fully dependent on the transformative nature of electronic data exchange, its participants have also become more protective of data’s inherent value. The rise of this data protectionism is now so acute that it threatens to restrict the flow of data across national borders. Data-residency requirements, widely used to buffer domestic technology providers from international competition, also tend to introduce delays, costs and limitations to commerce in nearly every business sector. This impact is widespread because it is also driving:

  • Laws and policies that further limit the international exchange of data;
  • Regulatory guidelines and restrictions that limit the use and scope of data collection; and
  • Data security controls that route and allow access to data based on user role, location and access device.
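The last bullet describes access decisions driven jointly by user role, location and device — in effect, attribute-based access control. A minimal sketch of that idea follows; the roles, regions and device names are illustrative assumptions, not from any specific policy:

```python
# Illustrative attribute-based access policy: a user may reach data only
# when role, location and device all satisfy the rules. All values here
# are hypothetical examples, not a real organization's policy.
ALLOWED_REGIONS = {
    # (role, data_residency_region) -> regions the user may connect from
    ("analyst", "EU"): {"EU"},
    ("analyst", "US"): {"US", "EU"},
    ("admin", "EU"): {"EU", "US"},
}
MANAGED_DEVICES = {"corporate-laptop", "secure-workstation"}

def allow_access(role: str, data_region: str, user_region: str, device: str) -> bool:
    """Grant access only when role, location and device all pass policy."""
    permitted = ALLOWED_REGIONS.get((role, data_region), set())
    return user_region in permitted and device in MANAGED_DEVICES
```

Even this toy version shows why data-residency rules add cost: every access path must now carry and evaluate location attributes, not just identity.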

A direct consequence of these changes is that the entire business enterprise spectrum is now faced with the challenge of how to classify and label this vital commerce component.

Figure 1– The data lifecycle

The challenges posed here are immense. Not only is an extremely large amount of data created every day, but businesses still need to manage and leverage their huge stores of old data. This stored wealth is not static: every bit of data has a lifecycle through which it must be monitored, modified, shared, stored and eventually destroyed. The growing adoption of cloud computing technologies adds even more complexity to this mosaic. Another widely unappreciated reality, now being highlighted in boardrooms everywhere, is how these changes affect business risk and internal information technology governance. Broadly lumped under cybersecurity, the scarcity of legal precedent in this domain collides almost daily with the need for headline-driven, rapid-fire business decisions.
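The lifecycle stages named above can be sketched as a simple state machine. The stage names and allowed transitions below are illustrative assumptions — real lifecycle policies vary by organization and regulation:

```python
from enum import Enum

class DataStage(Enum):
    CREATED = "created"
    STORED = "stored"
    SHARED = "shared"
    MODIFIED = "modified"
    ARCHIVED = "archived"
    DESTROYED = "destroyed"

# Illustrative allowed transitions between lifecycle stages.
TRANSITIONS = {
    DataStage.CREATED:   {DataStage.STORED},
    DataStage.STORED:    {DataStage.SHARED, DataStage.MODIFIED, DataStage.ARCHIVED},
    DataStage.SHARED:    {DataStage.MODIFIED, DataStage.STORED},
    DataStage.MODIFIED:  {DataStage.STORED},
    DataStage.ARCHIVED:  {DataStage.DESTROYED},
    DataStage.DESTROYED: set(),
}

def can_transition(current: DataStage, nxt: DataStage) -> bool:
    """Return True if the lifecycle policy allows moving to the next stage."""
    return nxt in TRANSITIONS[current]
```

Modeling the lifecycle explicitly is what makes it monitorable: once destroyed, for example, data can never legitimately re-enter circulation.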

To deal with this new reality, enterprises must standardize and optimize the complexity associated with managing data. Success in this task mandates a renewed focus on data classification, data labeling and data loss prevention. Although these data security precautions have historically been glossed over as too expensive or too hard, the penalties and long-term pain associated with a data breach incident have raised the stakes considerably. According to the Global Commission on Internet Governance, the average financial cost of a single data breach can exceed $12,000,000 [1], which includes:

  • Organizational costs: $6,233,941
  • Detection and escalation costs: $372,272
  • Response costs: $1,511,804
  • Lost business costs: $3,827,732
  • Victim notification costs: $523,965
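Summing the components cited above confirms the headline figure — the breakdown totals just under $12.5 million:

```python
# Breach-cost components quoted above from the Global Commission on
# Internet Governance report; the dictionary key names are my own labels.
breach_costs = {
    "organizational":           6_233_941,
    "detection_and_escalation":   372_272,
    "response":                 1_511_804,
    "lost_business":            3_827_732,
    "victim_notification":        523_965,
}

total = sum(breach_costs.values())
print(f"Total estimated breach cost: ${total:,}")  # $12,469,714
```

Notably, lost business alone accounts for roughly 30 percent of the total — the reputational cost of a breach rivals the direct operational one.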

So is adequate data classification still just simply a bridge too far?

While the competencies required to implement an effective data management program are significant, they are not impossible. Relevant skillsets are, in fact, foundational to the deployment of modern business automation which, in turn, represents the only economical path towards streamlining repeatable processes and reducing manual tasks. Minimum steps include:

  • Improving enterprise awareness of the importance of data classification;
  • Abandoning outdated or unrealistic classification schemes in favor of less complex ones;
  • Clarifying organizational roles and responsibilities while removing those tailored to specific individuals;
  • Focusing on identifying and classifying data, not data sets; and
  • Adopting and implementing a dynamic classification model.[2]
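To make the last step concrete, here is a minimal sketch of rule-based dynamic classification, assuming a simple three-tier scheme (public / internal / confidential). The rules, patterns and tier names are illustrative assumptions, not drawn from the cited report:

```python
import re

# Ordered rules: the first (most restrictive) match wins. Patterns are
# deliberately simplistic examples — real classifiers use far richer rules.
RULES = [
    ("confidential", re.compile(r"\b\d{3}-\d{2}-\d{4}\b")),    # SSN-like pattern
    ("confidential", re.compile(r"\b\d{16}\b")),               # card-number-like
    ("internal",     re.compile(r"(?i)\b(salary|contract|invoice)\b")),
]

def classify(text: str) -> str:
    """Assign the most restrictive label whose rule matches the data item."""
    for label, pattern in RULES:
        if pattern.search(text):
            return label
    return "public"
```

The point of the "dynamic" model is visible even here: classification happens per data item at the moment of inspection, rather than being assigned once to an entire data set and never revisited.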

The modern enterprise must either build these competencies in-house or work with a trusted third party to move through these steps. Since the importance of data will only increase, the task of implementing a modern data classification and modeling program is destined to become even more business critical.

( This post was brought to you by IBM Global Technology Services. For more content like this, visit Point B and Beyond.)

[1] Global Cyberspace Is Safer Than You Think: Real Trends in Cybercrime, Centre for International Governance Innovation, 2015, https://www.cigionline.org/sites/default/files/no16_web_1.pdf


[2] Recommended steps adapted from “Rethinking Data Discovery And Data Classification” by Heidi Shey and John Kindervag, October 1, 2014, available from IBM at https://www-01.ibm.com/common/ssi/cgi-bin/ssialias?htmlfid=WVL12363USEN

Cloud Musings

( Thank you. If you enjoyed this article, get free updates by email or RSS – © Copyright Kevin L. Jackson 2015)

Follow me at https://Twitter.com/Kevin_Jackson