LiveMigrator: Moving Petabytes of Big Data to the Cloud WITHOUT Impeding Productivity
By Van Diamandakis, Feb 26, 2020
Let’s consider two simple statements:
a) Enterprises want to move their big data to the cloud; and
b) Enterprises can’t afford to lose data consistency or shut down even a fraction of their business during migration.
Until now, points a) and b) have been difficult, if not impossible, to reconcile.
The fact is that enterprises want to fully embrace the benefits of big data in a cloud or hybrid cloud environment. They’re seeking the benefits of increased productivity, security and global data access for employees.
And most enterprises are already in the cloud. According to a recent Gartner survey, 94% of IT professionals responding were using at least one public or private cloud (McAfee put that number at 97%), and 84% had adopted a multi-cloud solution. Of those using a public cloud solution, 80% were using more than one service provider. And IDG found that 41% of enterprises are already migrating storage/archive/backup/file servers to the cloud, and 21% plan to do so in the coming year.
Yet until now, enterprises have been hard-pressed to overcome the major hurdle of moving their large-scale on-prem data to the cloud without either losing consistency or losing business. Large-scale data migration has thwarted CIOs for years because of the "live" nature of business data, which changes minute to minute and needs to be available 24/7 without interruption.
The Old Way
Legacy technology migrates data by copying it. The problem is that, at petabyte scale, copying takes days. Since the data changes during copying, this multi-pass methodology requires a re-scan after the copy completes, to discover the changes that occurred mid-copy and apply them to the copy. For large data sets this pass is itself lengthy. And of course, while the re-scan is underway, additional changes occur to the live data…
Let’s take a simple example:
We have a petabyte of data to migrate. This migration can be very lengthy – let’s say 10 days for the sake of example. Since the app using this data set is live, in the course of each business day, approximately 100GB of data is either added or changed.
Thus, following the ten-day copying period, we end up with a backlog of 1,000 GB (10 days × 100 GB per day) of inconsistent data.
To address this delta, traditional migration solutions take another pass. It takes less time to scan than to copy – so we’ll say ten hours to rescan the petabyte of data for changes. Yet during these ten hours – essentially a workday – another 100 GB has changed. Another pass is required, during which an additional 100 GB changes…and so on, and so on.
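The arithmetic above can be sketched in a few lines. This is a back-of-the-envelope model using only the illustrative numbers from the example (a 10-day initial copy, 100 GB of churn per day, and a rescan pass that takes roughly one business day); it shows why the backlog of inconsistent data never reaches zero while the application stays live:

```python
# Back-of-the-envelope model of the legacy copy-then-rescan approach.
# All figures are the illustrative numbers from the example above.

def rescan_passes(change_per_pass_gb: float, passes: int) -> list:
    """Return the backlog (GB) of inconsistent data before each rescan pass."""
    backlog = 10 * 100  # 1,000 GB accumulates during the initial 10-day copy
    history = []
    for _ in range(passes):
        history.append(backlog)
        # The rescan copies the current backlog, but while it runs
        # (~10 hours, roughly a business day) another 100 GB changes.
        backlog = change_per_pass_gb
    return history

print(rescan_passes(100, 5))  # [1000, 100, 100, 100, 100]
```

Each pass clears the previous backlog but accrues a fresh one, so the migration chases its own tail indefinitely unless writes are frozen.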
In the end – businesses generally need to choose: shut down the business to achieve consistency, or accept the inconsistency. It’s a lose-lose deal.
And Then Came LiveMigrator
The core of WANdisco’s LiveMigrator solution is a unique technology that enables non-blocking file system read/write. This means that during petabyte-scale migration, LiveMigrator enables applications to fully access, ingest, and update data.
This is a big deal. The reason? Previous generations of cloud migration tech required costly and risky multiple passes to migrate data. A migration could take three to six months, blocking users from making changes for the entire period. That is hardly compatible with point b) above: enterprises can't afford to shut down even a fraction of their business during migration.
LiveMigrator’s non-blocking technology enables seamless migration of petabytes of unstructured data from on-premises data centers to any cloud vendor in one pass. Even as data is moving to the cloud, applications continue to access the existing on-prem environment, while users can choose to direct new workloads or queries at cloud assets.
What’s more, LiveMigrator automatically keeps on-premises data consistent with migrated cloud-based data, forming a hybrid cloud environment, while still complying with strict availability and performance service level agreements. And LiveMigrator requires no changes to the dependent application interface and does not impede application performance.
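This post doesn't describe LiveMigrator's internals, but the general idea behind a single-pass "live" migration is well known: combine one bulk copy with a continuous journal of changes, so writes that land during the copy are forwarded to the target rather than rediscovered by a full rescan. The toy sketch below is purely illustrative (the in-memory `source`/`target` dictionaries and `record_write` hook are invented for the example, not WANdisco's API):

```python
import queue

# Toy illustration of single-pass migration with live change capture.
# NOT WANdisco's implementation: source/target are stand-in dictionaries.

source = {f"/data/file{i}": f"v0-{i}" for i in range(5)}  # live on-prem data
target = {}                                               # cloud copy
journal = queue.Queue()  # change events captured while migration runs

def record_write(path: str, value: str) -> None:
    """Application write: updates the live source AND journals the event."""
    source[path] = value
    journal.put((path, value))

def bulk_copy() -> None:
    """Single bulk pass: copy everything that exists at scan time."""
    for path in list(source):
        target[path] = source[path]

bulk_copy()
record_write("/data/file1", "v1-1")  # a live write lands after the bulk pass

# Drain the journal: changes are replayed on the target directly,
# so no full rescan of the namespace is ever needed.
while not journal.empty():
    path, value = journal.get()
    target[path] = value

print("consistent:", target == source)
```

Because changed paths are pushed to the migrator as they happen, the cost of catching up is proportional to the churn, not to the size of the whole data set, which is what breaks the endless-rescan cycle described earlier.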
The Bottom Line
Migrating big data to the cloud requires skill and expertise — and 62% of big data migration efforts are harder to complete than expected or fail. Data migration obstacles – like how to move petabyte-scale big data to the cloud without stopping business – have been the stumbling block to innovation long enough. At WANdisco we decided to bring this to an end. Now, LiveMigrator changes everything.
About the author
Van Diamandakis, SVP of Marketing, WANdisco
Van is a proven Silicon Valley technology executive with over 25 years of operational experience and a track record of leading global marketing transformations and driving companies to meaningful financial events, including IPOs and acquisitions. He has been at the forefront of B2B technology marketing and brings a unique ability to marry creativity, data, technology and leadership to rapidly build brand equity and successfully navigate tech companies through inflection points, accelerating revenue growth and valuation.