Could Budget Sweeps Fix Your Cybersecurity Problem?

A recent roundtable discussion in Washington, DC with Federal IT and Cyber leaders focused on the business drivers, challenges and evolving strategies around cybersecurity in government.  After an opening presentation by Jim Quinn, the lead systems engineer for the Continuous Diagnostics and Mitigation program at the Department of Homeland Security, the discussion highlighted the need for data security. Key takeaways included:

  • A new emphasis on data-level security across government that puts security controls closer to the data itself, rather than focusing on the perimeter.
  • The urgency around data security is increasing: 71 percent of agencies have been breached, a threefold increase from three years ago.
  • Agencies must deal with an expanding requirement to add more and more capabilities to mission systems, with the understanding that protecting data is part of the mission.
  • Agencies that only focus their time, energy and budget on meeting various mandates are having trouble keeping up with evolving cyber threats.
  • While agencies have much flexibility in how they acquire, manage and deliver information and services, they are still responsible for protecting their data. Agencies must, therefore, approach data security at the enterprise level.
  • Data security is a matter of law. 44 U.S.C., Sec. 3542 directs agencies to ensure the confidentiality, integrity, and availability of government data.

As I’ve written many times before, organizations need to focus on how to transition to a hybrid IT future. The broader information technology marketplace is also undergoing a dramatic shift toward data-centric security. Data management has moved from the management of structured data into an environment where real-time analysis and reporting of streaming data is essential.

International commerce is also entering an environment of stricter data management regulations and national data sovereignty laws that, if violated, carry the possibility of punishing remedies and fines. This rapid progression has also driven a massive change in information technology services. Cloud and managed service providers are meeting this need through the innovative creation and deployment of API-accessible, immediately consumable data manipulation services. Enterprise IT organizations have shown themselves unable to keep pace with the blistering increase in the number and breadth of these broader IT marketplace services. It’s also not cost-effective, or even desirable, for them to try.
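To make the idea of an API-accessible, immediately consumable storage service concrete, here is a minimal Python sketch assuming an S3-compatible endpoint. The endpoint URL, bucket name, credentials, and file names are placeholders for illustration, not any specific provider's values.

```python
# Minimal sketch: consuming an API-accessible object storage service.
# Assumes an S3-compatible endpoint; all names and credentials below are
# placeholders, not real values.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://storage.example.com",  # hypothetical S3-compatible endpoint
    aws_access_key_id="YOUR_ACCESS_KEY",
    aws_secret_access_key="YOUR_SECRET_KEY",
)

# Upload a data set and read it back -- the service is consumed on demand
# over its API, with no on-premises storage hardware to provision.
s3.upload_file("mission_data.csv", "agency-archive", "2018/mission_data.csv")
response = s3.get_object(Bucket="agency-archive", Key="2018/mission_data.csv")
print(response["Body"].read()[:100])
```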

With the recent focus on data-level security and year-end budget sweeps around the corner, shouldn’t your agency be looking at how to better store and protect its data? Mandates around IT Modernization and Cloud Computing aren’t going away any time soon either. With cloud and managed service provider data storage solutions so accessible, your current on-premises solution may be hurting your mission in many ways, including:
  • High CAPEX driven by significant upfront equipment costs leads to poor ROI with long payback periods (see the illustrative calculation after this list);
  • High OPEX characterized by recurring power, cooling and rack space expenses;
  • Expensive monthly hardware and software maintenance and support fees;
  • Excessive system administration cost and complexity that lead to high ongoing operations expenses;
  • Obsolescence concerns caused by storage vendors that regularly retire products and discontinue support plans, often subjecting customers to costly and disruptive upgrades;
  • High mission operational risk due to an inability to replicate live data to a secondary data center; and
  • Complex legacy storage solutions that are difficult to configure and administer.
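The illustrative calculation below uses purely hypothetical figures (not vendor pricing) to show how upfront CAPEX plus recurring maintenance translates into the long payback periods noted above when compared with a pay-as-you-go storage service.

```python
# Back-of-the-envelope comparison with purely hypothetical numbers --
# substitute your agency's actual figures.
onprem_capex = 250_000        # upfront storage array purchase ($)
onprem_opex = 40_000          # annual power, cooling, rack space, admin ($/yr)
onprem_maintenance = 30_000   # annual hardware/software support fees ($/yr)

cloud_cost_per_tb_year = 72.0  # hypothetical flat rate ($/TB/yr)
capacity_tb = 200

onprem_annual = onprem_opex + onprem_maintenance
cloud_annual = cloud_cost_per_tb_year * capacity_tb

print(f"On-premises: ${onprem_capex:,} upfront plus ${onprem_annual:,}/yr")
print(f"Cloud storage: ${cloud_annual:,.0f}/yr with no upfront CAPEX")
```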

Take a minute to think about this. Maybe those year-end sweep dollars would be better spent on improving your mission readiness with a cloud storage solution like Wasabi. Wasabi is hot cloud storage. It’s being used to archive data or to keep a second copy because the price for storage on Wasabi is so low, and they’ve made cloud storage prices predictable with no egress charges. It’s also secure, with eleven nines of durability. Wasabi offers immutability, so your data is protected from the most common causes of data loss. Finally, Wasabi is high-performing: six times faster than its competitors. It’s easy to test by signing up for a free trial at wasabi.com.
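As a hedged sketch of the immutability feature mentioned above: Wasabi exposes an S3-compatible API, so something along the following lines should work, assuming support for S3 Object Lock. The bucket name, object key, file name, and retention date are placeholders, and the endpoint is an assumption rather than configuration guidance.

```python
# Hedged sketch: protecting archived objects with immutability via an
# S3-compatible Object Lock API. Bucket, key, file name, and retention
# date are placeholders; credentials are read from the environment.
from datetime import datetime, timezone
import boto3

s3 = boto3.client("s3", endpoint_url="https://s3.wasabisys.com")  # assumed endpoint

# Object Lock must be enabled when the bucket is created.
s3.create_bucket(Bucket="agency-archive", ObjectLockEnabledForBucket=True)

# Until the retention date passes, this object cannot be overwritten or
# deleted -- protection against accidental or malicious data loss.
with open("backup-2018-09-30.tar.gz", "rb") as backup:
    s3.put_object(
        Bucket="agency-archive",
        Key="backups/backup-2018-09-30.tar.gz",
        Body=backup,
        ObjectLockMode="COMPLIANCE",
        ObjectLockRetainUntilDate=datetime(2025, 9, 30, tzinfo=timezone.utc),
    )
```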

This post was brought to you by Wasabi Hot Storage 

 

Cloud Musings

( Thank you. If you enjoyed this article, get free updates by email or RSS – © Copyright Kevin L. Jackson 2016-2018)

Follow me at https://Twitter.com/Kevin_Jackson