Could Budget Sweeps Fix Your Cybersecurity Problem?

A recent roundtable discussion in Washington, DC, with federal IT and cybersecurity leaders focused on the business drivers, challenges, and evolving strategies around cybersecurity in government. After an opening presentation by Jim Quinn, lead systems engineer for the Continuous Diagnostics and Mitigation program at the Department of Homeland Security, the discussion highlighted the need for data security. Key takeaways included:

  • A new emphasis on data-level security across government is putting security controls closer to the data itself rather than focusing on the perimeter (a minimal encryption sketch follows this list).
  • The urgency around data security is increasing, with 71 percent of agencies having been breached, which is a threefold increase from three years ago.
  • Agencies must deal with an expanding requirement to add more and more capabilities to mission systems, with the understanding that protecting data is part of the mission.
  • Agencies that only focus their time, energy and budget on meeting various mandates are having trouble keeping up with evolving cyber threats.
  • While agencies have much flexibility in how they acquire, manage and deliver information and services, they are still responsible for protecting their data. Agencies must, therefore, approach data security at the enterprise level.
  • Data security is a matter of law. 44 U.S.C., Sec. 3542 directs agencies to ensure the confidentiality, integrity, and availability of government data.
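
To make the data-level security point concrete: the idea is that records are protected before they ever reach a disk or a network, so a perimeter breach exposes only ciphertext. Below is a minimal sketch, assuming Python and the widely used cryptography package; the key handling and record contents are hypothetical placeholders, not any agency's actual design.

```python
# Minimal data-level security sketch: encrypt a record before it leaves the app.
# Assumes the 'cryptography' package (pip install cryptography). Key handling is
# simplified -- in practice the key would come from a KMS/HSM, not a local value.
from cryptography.fernet import Fernet

def encrypt_record(plaintext: bytes, key: bytes) -> bytes:
    """Return an authenticated, encrypted blob safe to hand to any storage tier."""
    return Fernet(key).encrypt(plaintext)

def decrypt_record(token: bytes, key: bytes) -> bytes:
    """Reverse the operation for authorized readers holding the key."""
    return Fernet(key).decrypt(token)

if __name__ == "__main__":
    key = Fernet.generate_key()            # placeholder for a KMS-managed key
    blob = encrypt_record(b"SSN=123-45-6789", key)
    print(blob)                            # ciphertext is all the storage layer sees
    print(decrypt_record(blob, key))       # original record, recoverable only with the key
```

The architectural point is that the same protection travels with the data whether it lands on an on-premises array or in a cloud bucket.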

As I’ve written many times before, organizations need to focus on how to transition to a hybrid IT future. The overall information technology marketplace is also undergoing a dramatic shift toward data-centric security. Data management has moved from the management of structured data into an environment where real-time analysis and reporting of streaming data is essential.
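
As a small illustration of that shift from static, structured data toward streaming analysis, the sketch below flags a burst of failed logins as events arrive. The event feed, window, and threshold are hypothetical placeholders; a real deployment would read from a log pipeline.

```python
# Minimal streaming-analysis sketch: alert on failed logins in a sliding window.
from collections import deque
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=5)
THRESHOLD = 3                      # hypothetical alert threshold

def monitor(events):
    """Yield an alert whenever failed logins within WINDOW reach THRESHOLD."""
    recent = deque()
    for ts, kind in events:
        if kind == "failed_login":
            recent.append(ts)
        while recent and ts - recent[0] > WINDOW:
            recent.popleft()       # drop events that fell out of the window
        if len(recent) >= THRESHOLD:
            yield f"ALERT at {ts}: {len(recent)} failed logins in {WINDOW}"

if __name__ == "__main__":
    t0 = datetime(2018, 9, 28, 12, 0)
    feed = [(t0 + timedelta(minutes=i), "failed_login") for i in range(4)]
    for alert in monitor(feed):
        print(alert)
```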

International commerce is also entering an environment of stricter data management regulations and national data sovereignty laws that, if violated, introduce the possibility of punishing penalties and fines. This rapid progression has also driven a massive change in information technology services. Cloud and managed service providers are meeting this need through the innovative creation and deployment of API-accessible, immediately consumable data manipulation services. Enterprise IT organizations have shown themselves unable to keep pace with the blistering increase in the number and breadth of broader IT marketplace services. It’s also not cost-effective, or even desirable, for them to try.
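
To show what "API-accessible, immediately consumable" looks like in practice, the sketch below stores and retrieves an object against a generic S3-compatible endpoint using boto3. The endpoint URL, bucket name, and credentials are placeholders, not any specific provider's values.

```python
# Minimal sketch of consuming an API-accessible storage service (S3-compatible).
# Requires boto3 (pip install boto3); endpoint, bucket, and credentials are
# hypothetical placeholders supplied by whatever provider an agency selects.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.example-provider.com",   # placeholder endpoint
    aws_access_key_id="ACCESS_KEY_PLACEHOLDER",
    aws_secret_access_key="SECRET_KEY_PLACEHOLDER",
)

with open("backup.tar.gz", "rb") as f:                # upload a backup copy
    s3.put_object(Bucket="agency-archive",
                  Key="backups/2018-09-28.tar.gz", Body=f)

obj = s3.get_object(Bucket="agency-archive", Key="backups/2018-09-28.tar.gz")
print(obj["ContentLength"], "bytes retrieved")         # verify it came back
```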

With the recent focus on data-level security and year-end budget sweeps around the corner, shouldn’t your agency be looking at how to better store and protect its data? Mandates around IT modernization and cloud computing aren’t going away anytime soon either. With cloud and managed service provider data storage solutions so accessible, your current on-premises solution may be hurting your mission in many ways, including:
  • High CAPEX, driven by significant upfront equipment costs, leading to poor ROI and long payback periods (see the rough cost sketch after this list);
  • High OPEX characterized by recurring power, cooling and rack space expenses;
  • Expensive monthly hardware and software maintenance and support fees;
  • Excessive system administration cost and complexity that lead to high ongoing operations expenses;
  • Obsolescence concerns caused by storage vendors that regularly retire products and discontinue support plans, often subjecting customers to costly and disruptive upgrades;
  • High mission operational risk due to an inability to replicate live data to a secondary data center; and
  • Complex legacy storage solutions that are difficult to configure and administer.
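
To put the CAPEX and OPEX concerns above into rough numbers, a back-of-the-envelope comparison is easy to script. Every figure below is a hypothetical placeholder; substitute your agency's actual costs and capacity.

```python
# Back-of-the-envelope storage cost sketch. All numbers are hypothetical
# placeholders, not quotes from any vendor.
ON_PREM_CAPEX = 250_000          # upfront array, racks, installation ($)
ON_PREM_OPEX_PER_YEAR = 60_000   # power, cooling, space, maintenance, admin ($/yr)
CLOUD_COST_PER_TB_MONTH = 6.0    # flat per-TB price with no egress fees ($)
CAPACITY_TB = 200
YEARS = 5

on_prem_total = ON_PREM_CAPEX + ON_PREM_OPEX_PER_YEAR * YEARS
cloud_total = CLOUD_COST_PER_TB_MONTH * CAPACITY_TB * 12 * YEARS

print(f"{YEARS}-year on-premises estimate: ${on_prem_total:,.0f}")
print(f"{YEARS}-year cloud storage estimate: ${cloud_total:,.0f}")
print(f"Difference: ${on_prem_total - cloud_total:,.0f}")
```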

Take a minute to think about this. Maybe those year-end sweep dollars would be better spent on improving your mission readiness with a cloud storage solution like Wasabi. Wasabi is hot cloud storage. It’s being used to archive data or to hold a second copy, because the price for storage on Wasabi is so low and the company has made cloud storage prices predictable with no egress charges. It’s also secure, with 11 nines of durability. Wasabi offers immutability, so your data is protected from the most common causes of data loss. Finally, Wasabi is high-performing: six times faster than its competitors. It’s easy to test by signing up for a free trial at wasabi.com
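
The immutability mentioned above generally maps to the S3 object-lock capability exposed by S3-compatible storage services. The sketch below writes a second copy that cannot be overwritten or deleted until a retention date passes; the endpoint, credentials, bucket, and dates are placeholders, and the bucket is assumed to have been created with object lock enabled.

```python
# Sketch of writing an immutable second copy via S3 object lock.
# Requires boto3; endpoint, credentials, and bucket are hypothetical placeholders,
# and the target bucket must have been created with object lock enabled.
from datetime import datetime, timezone
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.example-provider.com",
    aws_access_key_id="ACCESS_KEY_PLACEHOLDER",
    aws_secret_access_key="SECRET_KEY_PLACEHOLDER",
)

with open("finance-db.bak", "rb") as f:
    s3.put_object(
        Bucket="agency-immutable-archive",
        Key="second-copy/finance-db-2018-09-28.bak",
        Body=f,
        ObjectLockMode="COMPLIANCE",                  # cannot be shortened or removed
        ObjectLockRetainUntilDate=datetime(2019, 9, 28, tzinfo=timezone.utc),
    )
print("Copy written; it cannot be altered or deleted until the retention date.")
```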

This post was brought to you by Wasabi Hot Storage 

 

Cloud Musings

(Thank you. If you enjoyed this article, get free updates by email or RSS – © Copyright Kevin L. Jackson 2016-2018)

Follow me at https://Twitter.com/Kevin_Jackson