How to Put Public Sector Data Migration Hassles on the Road to Extinction


With careful planning and the right technology, Federal, State and Local government IT leaders can overcome fears of data migration, breaking free from archaic procedures to lead the pack
By David Wegman, Senior Vice President of Integrated Accounts, Vision Solutions


Jurassic World, the latest installment in the Jurassic Park film series, opened this week, and there is plenty of hype surrounding the premiere as fans immerse themselves in a world of Mesozoic Era-inspired fantasy. While the creatures that make the theme park their home are strikingly realistic, their real-life counterparts went extinct millions of years ago. Many believe the once-mighty dinosaur population fell in large part because it failed to evolve with the changing world around it. Public sector institutions face a similar plight today, as technology advancements demand that they constantly evolve in order to keep up.
Much as the dinosaurs fought for survival, governmental organizations must fight for resources. They must embrace change in order to thrive, and part of that involves modernizing systems, streamlining processes and migrating vast amounts of data. However, many organizations postpone such work because of uncertainty about its impact and the technology risks involved, including the downtime inherent in most migration methodologies.
Many public sector CIOs and IT leaders are concerned about the fallout from failed migrations, which are a painful waste of time and resources. Their concerns are not unfounded: in its 2015 State of Resilience report, Vision Solutions revealed that 36 percent of respondents had experienced a migration failure. Yet while failures are relatively common, they are not inevitable. A thorough planning process and the right resources go a long way toward improving the chances of success.
Regardless of the reason for a migration, significant complexity and potential pitfalls litter the path from point A to point B. In addition to understanding the migration process and identifying who will do the work, users must assess the potential negative impact of downtime.
The fact is, migrations are complex; even those that sound simple rarely are. Most servers and databases are not isolated instances within a data center but are interconnected with other systems and databases, including mid-range and mainframe systems, and that variety complicates today's migrations. These various platforms must be coordinated in migration waves to mitigate their impact on one another. All of this presents complexity, and complexity presents risk.
The first step toward a successful migration is mapping out a thorough plan beforehand: IT needs to determine what will be migrated, what it is all connected to, who will do the work and when a "migration window" can be secured from the affected unit, as well as how the team will navigate the many real-time issues that may arise along the way. Planning ahead gives IT clarity on the factors that may affect the migration, allowing the team to address problems both in advance and as they occur.
All migrations need testing before deployment, and testing further adds to the time- and resource-intensive nature of the work. Traditional pre-migration testing can take anywhere from a couple of weeks to a couple of months, depending on the complexity of the applications, databases and server interconnections in the data center. The process typically involves periodically halting production to take snapshots of data and then testing those snapshots. Each time the process completes, IT must restore the database and start over, and most organizations require multiple test runs and cycles. Each cycle can take anywhere from hours to entire days, depending largely on the factors outlined above.
IT often faces an uphill battle in convincing leadership to agree to a migration project because of downtime risks and impacts, and the resulting delays only compound a migration's complexity. Elected and appointed officials may not have exact numbers on hand, but they do realize that downtime is costly. Nineteen percent of respondents in the Vision Solutions 2015 State of Resilience report put the cost of downtime at $10,000 to $50,000 per hour; at the top of that range, a single eight-hour outage represents $400,000. Fearing such costs and the associated risks, leadership may hesitate to green-light migration projects, preventing successful execution.
How can public sector IT leaders address this problem? More than ever, they need to seek out trusted partners with a track record of helping organizations like theirs complete important migrations and system upgrades. Rather than reacting to perceived risk, limited expertise or the cost of downtime, they should act confidently and proactively, following best practices that set them up for success. Organizations that embrace proven technology and methodologies will be well positioned to realize the full benefits of smooth migrations.
Uncertainty, risk and extended downtime do not need to be migration realities. By working with a trusted partner and using modern technology and methodology, public sector IT leaders can achieve near-zero downtime during migrations, minimizing the impact on the organization and its users. To do so, they should consider the following when selecting a migration solution:
1. Real-time replication is paramount: Organizations should look for solutions that offer the greatest possible flexibility and data currency while minimizing the impact on users during testing and migration. This typically requires a software-based solution that replicates any activity on the production server to the target server in real time, allowing IT to keep the production server up and running rather than freezing it or periodically pausing it for snapshots. The production server remains fully functional, data is as current as the last transaction and users keep working. IT can test applications on the new server, and prove out the migration methodology and plan, without touching the production environment. Ultimately, IT stays productive on other tasks with improved uptime, all while the migration is taking place. (A minimal sketch of this replication pattern appears after this item.)
A second consideration is how to take the distance between the production and target servers out of the equation. Because real-time replication sends changes as they occur, it minimizes use of the communication line, and distance becomes less of an issue. When coupled with compression and throttling in a product, this creates a high degree of efficiency.
Finally, because the databases and servers are kept in sync at all times, IT does not need to freeze production and wait for a final validation of the test server before performing the cutover. Weekend migrations are no longer the norm: the switch to the new environment can occur whenever the organization is ready, and it can take as little as 20 minutes, a notable improvement over switch times in traditional migrations.
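To make this concrete, here is a minimal sketch of the replication pattern described above, with compression and a crude bandwidth throttle. It is illustrative only: the TargetServer class, the queue of change records and the bandwidth figure are assumptions made for the example, not any vendor's actual API.

```python
"""Minimal sketch of real-time replication with compression and
throttling. All names here are hypothetical, not a product API."""
import queue
import threading
import time
import zlib

class TargetServer:
    """Stand-in for the migration target; applies replicated changes."""
    def __init__(self):
        self.rows = {}

    def apply(self, payload: bytes) -> None:
        key, _, value = zlib.decompress(payload).partition(b"=")
        self.rows[key] = value  # target is as current as the last transaction

def replicate(changes: "queue.Queue[bytes]", target: TargetServer,
              max_bytes_per_sec: int = 1_000_000) -> None:
    """Ship each captured change, compressed and throttled to a cap."""
    while True:
        entry = changes.get()
        if entry is None:                     # sentinel: capture stopped
            break
        payload = zlib.compress(entry)        # less line usage over distance
        target.apply(payload)
        time.sleep(len(payload) / max_bytes_per_sec)  # crude throttle

# Production keeps running; writes are captured as they occur.
captured: "queue.Queue[bytes]" = queue.Queue()
target = TargetServer()
worker = threading.Thread(target=replicate, args=(captured, target))
worker.start()
for i in range(3):                            # simulated production writes
    captured.put(f"row{i}=value{i}".encode())
captured.put(None)
worker.join()
print(target.rows)                            # target is already in sync
```

Because the target stays as current as the last transaction, the cutover can be scheduled whenever the organization is ready.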
2. Unified consoles simplify the process: Another feature government IT leaders should demand in a migration solution is a unified console that lets IT run all types of migrations with a common workflow across operating systems and platforms. This is a major advantage because it reduces the need for the different skill sets typically required for different types of migrations by platform or workload.
While IT staff still need to understand the underlying architectures and databases, a uniform console and workflow reduces training time and maximizes the existing team's skill set. After product training, a single operator can perform parallel migrations across multiple platforms, minimizing the drain on resources.
3. Consolidating migration streams delivers faster execution: Simultaneous execution also eases the impact on the organization. A solution that can run parallel streams of migrations saves significantly more time than traditional methods. This approach facilitates near-zero downtime, shortens time to completion, contains costs and frees IT resources for other projects (see the first sketch after this list).
4. Automation minimizes risk: Traditional methods typically require a fair amount of manual work, and manual work means risk. While no migration can happen without people, solutions that use automation to reduce the required human interaction diminish that risk. This is important to keep in mind: a solution without APIs to automate as much of the work as possible reintroduces human interaction, and with it the failed migrations, budget overruns and schedule slips that so often follow. The first sketch after this list also shows how scripted, API-driven steps might look.
5. Hardware- and software-independent solutions enable flexibility: Every server is different, and topologies change rapidly. A migration plan must account for moves across server types, chipsets, storage devices, databases, versions and the like, and a hardware- and software-independent solution reduces the risk in each of these areas. This model allows users to migrate data seamlessly from any one type of environment to another. The options are virtually endless: physical to virtual to cloud, across any operating system, chipset or storage device (an adapter-style sketch follows this list).
Using platform-independent technology makes many scenarios possible, including migrating between storage from different vendors, migrating to a server located anywhere in the world, consolidating servers with many-to-one migrations and moving operations to a new data center across extended distances with very little downtime.
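The following sketch loosely illustrates points 3 and 4 together: several migration streams running in parallel, each driven end to end by scripted steps rather than manual hand-offs. The MigrationStream class, its step names and the server names are all hypothetical, invented for this example rather than taken from any specific product.

```python
"""Hypothetical sketch: parallel, fully scripted migration streams."""
from concurrent.futures import ThreadPoolExecutor, as_completed

class MigrationStream:
    """One scripted migration: a single source/target workload pair."""
    STEPS = ("validate", "initial_sync", "replicate", "verify")

    def __init__(self, source: str, target: str) -> None:
        self.source, self.target = source, target
        self.log: list[str] = []

    def run(self) -> str:
        # In a real tool each step would be an API call; scripting the
        # whole sequence keeps manual hand-offs (and human error) out.
        for step in self.STEPS:
            self.log.append(step)
        return f"{self.source} -> {self.target}: {', '.join(self.log)}"

streams = [
    MigrationStream("finance-db01", "cloud-db01"),
    MigrationStream("hr-db02", "cloud-db02"),
    MigrationStream("gis-app03", "cloud-app03"),
]

# Running the streams in parallel shortens total elapsed time compared
# with migrating one workload at a time.
with ThreadPoolExecutor(max_workers=len(streams)) as pool:
    futures = [pool.submit(stream.run) for stream in streams]
    for future in as_completed(futures):
        print(future.result())
```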
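And for point 5, one way to picture hardware and software independence is an adapter design in which the migration engine talks only to abstract source and target interfaces, so any platform can plug in its own implementation. Again, this is a sketch under assumed names (SourceAdapter, TargetAdapter), not a description of any particular product.

```python
"""Hypothetical sketch: platform independence via adapters."""
from typing import Iterator, Protocol

class SourceAdapter(Protocol):
    def read_changes(self) -> Iterator[bytes]: ...

class TargetAdapter(Protocol):
    def apply(self, change: bytes) -> None: ...

# Because the engine only talks to these two interfaces, the same code
# can move data physical-to-virtual, virtual-to-cloud, or across
# vendors: each platform simply supplies its own adapter.
def migrate(source: SourceAdapter, target: TargetAdapter) -> None:
    for change in source.read_changes():
        target.apply(change)

class InMemorySource:
    """Toy source standing in for any database, OS or storage device."""
    def read_changes(self) -> Iterator[bytes]:
        yield from (b"row1", b"row2")

class PrintingTarget:
    """Toy target; a real adapter would write to the new environment."""
    def apply(self, change: bytes) -> None:
        print("applied", change)

migrate(InMemorySource(), PrintingTarget())
```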
Evolve Continuously to Achieve Migration Success
While data migrations will always entail a certain amount of risk and downtime, modern solutions have greatly improved the chances of a positive outcome. IT leaders in Federal, State and Local governments, and at their agencies, who act confidently rather than out of fear when taking on important migrations will come out at the top of the food chain, evolving to thrive for the benefit of both their organizations and the taxpayer.

 

(This content is being syndicated through multiple channels. The opinions expressed are solely those of the author and do not represent the views of GovCloud Network, GovCloud Network Partners or any other corporation or organization.)

Cloud Musings

(Thank you. If you enjoyed this article, get free updates by email or RSS – © Copyright Kevin L. Jackson 2015)

Follow me at https://Twitter.com/Kevin_Jackson