How to Put Public Sector Data Migration Hassles on the Road to Extinction

With careful planning and the right technology, Federal, State and Local Government IT Leaders can overcome fears of data migrations, breaking free from archaic procedures to lead the pack 
By David Wegman, Senior Vice President, Integrated Accounts, Vision Solutions


Jurassic World, the latest installment in the Jurassic Park film series, opened this week – and there's a lot of hype surrounding the premiere as fans immerse themselves in a world of Mesozoic Era-inspired fantasy. While the creatures that make the theme park their home are strikingly realistic, their real-life counterparts became extinct millions of years ago. Many believe that the once-mighty dinosaur population fell in large part because it failed to evolve with the changing world around it. Public sector institutions face a similar plight today, especially as technology advancements demand they constantly evolve to keep pace.
Much like the dinosaurs fought for survival, governmental organizations must fight for resources. They must embrace change in order to thrive, and part of that involves modernizing systems, streamlining processes and migrating vast amounts of data. However, many organizations postpone such work due to uncertainty about the impact and the technology risks involved, including the downtime inherent in most migration methodologies.
Many public sector CIOs and IT leaders are concerned about the fallout from failed migrations, which are a painful waste of time and resources. Their concerns are not unfounded: in its 2015 State of Resilience report, Vision Solutions revealed that 36 percent of respondents had experienced a migration failure. While failures are relatively common, they are not inevitable. A thorough planning process and the right resources go a long way toward improving the chances of success.
Regardless of the reason for a migration, significant complexity and potential pitfalls litter the path from point A to point B. In addition to understanding the migration process and identifying who is going to do the work, users must assess the potential negative impact of downtime.
The fact is, even migrations that sound simple are inherently complex. Most servers and databases are not standalone instances within a data center; they are interconnected with other systems and databases, including mid-range and mainframe platforms, and that variety complicates migrations today. These various platforms need to be coordinated in migration waves to mitigate their impacts. All of this adds complexity, and complexity presents risk.
The first task toward a successful migration is to map out a thorough migration plan beforehand: IT needs to determine what will be migrated, what it is all connected to, who will do the work, when a "migration window" can be secured from the business unit, and how the team will navigate the many real-time issues that may arise along the way. Planning ahead for potential issues gives IT clarity on the factors that may affect the migration process, allowing the migration team to address problems both in advance and as they occur.
All migrations need testing before deployment, and testing adds to their time- and resource-intensive nature. Traditional pre-migration testing can take anywhere from a couple of weeks to a couple of months, depending on how complex the applications, databases and server interconnections are in the data center. The process typically involves periodically halting production to take snapshots of data and then testing those snapshots. Each time the process completes, IT must restore the database and start over, which usually means multiple test runs and multiple cycles within the organization. The entire process can take anywhere from hours to entire days, depending largely on the factors outlined above.
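To make that repetition concrete, here is a minimal, purely illustrative Python sketch of the halt/snapshot/test/restore cycle described above. Every function name is a hypothetical, simulated stand-in (the sleeps represent hours of real work), not any vendor's actual tooling.

```python
"""Illustrative sketch only: simulates the traditional halt/snapshot/test/
restore cycle. All functions are hypothetical stand-ins, not real tooling."""

import time


def take_snapshot(database: str) -> str:
    print(f"Pausing production writes on {database} and copying data...")
    time.sleep(0.5)  # stands in for the hours a real copy can take
    return f"{database}-snapshot"


def validate_on_test_server(snapshot: str) -> bool:
    print(f"Restoring {snapshot} to the test server and running validation...")
    time.sleep(0.5)
    return True  # assume the applications pass this round of checks


def traditional_test_cycles(database: str, cycles: int = 3) -> None:
    # Each cycle repeats the pause/snapshot/restore/test loop, which is why
    # traditional pre-migration testing can stretch from weeks to months.
    for i in range(1, cycles + 1):
        print(f"--- Test cycle {i} of {cycles} ---")
        snapshot = take_snapshot(database)
        validate_on_test_server(snapshot)
        print("Resetting the test environment before the next cycle.\n")


if __name__ == "__main__":
    traditional_test_cycles("agency_records_db")
```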
IT often faces an uphill battle in convincing leadership to agree to a migration project because of downtime risks and impacts. This can cause substantial delays, compounding the migration's complexity. Elected and appointed officials may not always have exact numbers on hand, but they do realize that downtime is costly. Nineteen percent of respondents in the Vision Solutions 2015 State of Resilience report indicated that the cost of downtime ranged from $10,000 to $50,000 per hour; at those rates, a single eight-hour migration window could cost an organization anywhere from $80,000 to $400,000. Fearing the steep costs of downtime and the associated risks, leadership may hesitate to green-light migration projects, preventing successful execution.
How can public sector IT leaders address this problem? Now more than ever, government IT leaders need to seek out trusted partners with a track record of helping similar organizations complete important migrations and system upgrades. Rather than reacting to perceived risks, limited expertise or the cost of downtime, they should be confident and proactive, following best practices that set them up for success. Organizations that embrace proven technology and methodologies will be well positioned to realize the full benefits of smooth migrations.
Uncertainty, risk and extended downtime don’t need to be migration realities. By working with a trusted partner and utilizing modern technology and methodology, public sector IT leaders can achieve near-zero downtime during migrations, minimizing impact on the organization and users. But to do so, they should consider the following when selecting a migration solution:
1. Real-time replication is paramount: Organizations should look for solutions that offer the most flexibility and the most current data possible while minimizing the impact on users during testing and migration. This typically requires a software-based solution that replicates any activity taking place on the production server to the target server in real time, allowing IT to keep the production server up and running rather than freezing it or periodically pausing it for snapshots. The production server remains fully functional, data is as current as the last transaction and users continue working. IT can test applications on the new server, and prove the migration methodology and plan, without impacting the production environment. Ultimately, this leaves IT more productive on other tasks and improves uptime, all while the migration is taking place. (A minimal illustrative sketch of this replication idea follows this list.)
A second consideration is how to take the distance between the production server and the target server out of the equation. Because real-time replication sends changes as they occur, it minimizes the load on the communication line, and distance becomes less of an issue. When coupled with compression and throttling in a product, this creates a high degree of efficiency.
Finally, because the databases and servers are kept in sync at all times, IT does not need to freeze production and wait for final validation of the testing server before performing the migration. Weekend migrations are no longer the norm: the switch to the new environment can occur whenever the organization is ready and take place in as little as 20 minutes, a notable improvement over switch times in traditional migrations.
2. Unified consoles simplify the process: Another feature government IT leaders should demand in a migration solution is a unified console that lets IT work on all types of migrations with a common workflow across operating systems and platforms. This provides a major advantage because it reduces the need for the different skill sets typically required for different types of migrations by platform or workload.
While IT staff still need to understand the underlying architectures and databases, a uniform console and workflow reduces training time and makes the most of the existing team's skill set. A single operator can perform parallel migrations across multiple platforms after product training, minimizing the drain on resources.
3. Consolidating migration streams delivers faster execution: Simultaneous execution also eases the impact on the organization. A solution that allows users to run parallel streams of migrations saves significantly more time than traditional methods. This approach facilitates near-zero downtime, shortens time to completion, reduces costs and frees IT resources to focus on other projects.
4. Automation minimizes risk: Traditional methods typically require a fair degree of manual work, which means a higher degree of risk. While no migration can happen without people, solutions that use automation to reduce the amount of human interaction required diminish that risk. This is important for organizations to keep in mind: a solution that lacks APIs and the ability to automate as much of the work as possible introduces additional human interaction, and therefore risk. That lack of automation often leads to failed migrations, or migrations that run over budget or last longer than expected.
5. Hardware- and software-independent solutions enable flexibility: Every server is different and topologies change rapidly. Differences in server types, chipsets, storage devices, databases and versions all need to be addressed in a migration plan. A hardware- and software-independent solution reduces the risk in these areas by allowing users to migrate data seamlessly from any one type of environment to another. The options are virtually endless: from physical to virtual to cloud, across any operating system, chipset or storage device.
Using platform-independent technology makes many scenarios possible, including migrating between storage from different vendors, migrating to a server located anywhere in the world, consolidating servers with many-to-one migrations and moving operations to a new data center across extended distances with very little downtime.
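As referenced in point 1 above, the following is a minimal, purely illustrative Python sketch of the real-time replication idea: every write to a "production" store is streamed to a "target" store as it occurs, so production is never frozen for snapshots and the target stays as current as the last transaction. The names and data structures here are hypothetical stand-ins, not Vision Solutions' product or any real replication API.

```python
"""Illustrative sketch only: a toy change-stream replicator. Writes to the
'production' store are queued as they happen and applied to the 'target'
store in the background, so production never stops taking transactions."""

import queue
import threading
import time

change_log = queue.Queue()  # stands in for the real-time replication stream
production = {}             # the live data store users keep writing to
target = {}                 # the new server being kept in sync


def write_to_production(key, value):
    # Every production write also emits a change record in real time.
    production[key] = value
    change_log.put((key, value))


def replication_worker(stop_event):
    # Applies each change to the target as it arrives, keeping both sides in
    # sync while production remains fully available to users.
    while not stop_event.is_set() or not change_log.empty():
        try:
            key, value = change_log.get(timeout=0.1)
        except queue.Empty:
            continue
        target[key] = value


if __name__ == "__main__":
    stop_event = threading.Event()
    worker = threading.Thread(target=replication_worker, args=(stop_event,))
    worker.start()

    # Users keep working: production takes writes while replication runs.
    for i in range(5):
        write_to_production(f"record-{i}", f"value-{i}")
        time.sleep(0.05)

    stop_event.set()
    worker.join()
    print("Target current as of the last transaction:", target == production)
```

In a real product, that change stream would also be compressed and throttled over the network link, which is why distance matters far less than it does with snapshot-based copies.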
Evolve Continuously to Achieve Migration Success
While data migrations will always entail a certain amount of risk and downtime, modern solutions have greatly improved the chances of a positive outcome. IT leaders at Federal, State and Local governments and their agencies who act confidently, rather than out of fear, to take on important migrations will come out on top of the food chain, evolving to thrive to the benefit of both their organizations and the taxpayer.

 

(This content is being syndicated through multiple channels. The opinions expressed are solely those of the author and do not represent the views of GovCloud Network, GovCloud Network Partners or any other corporation or organization.)

Cloud Musings

(Thank you. If you enjoyed this article, get free updates by email or RSS – © Copyright Kevin L. Jackson 2015)

Follow me at https://Twitter.com/Kevin_Jackson
Posted in G C Network