Tuesday, July 18, 2017

How to Quickly Import 100TB+ Data into AWS, GCP or Azure

How do you quickly import large volumes of data into Microsoft Azure, Google Cloud Platform (GCP), or Amazon Web Services (AWS)?

For enterprises and SMBs, the offline transfer services below are a safer option than copying data over the internet, especially when the dataset is in the 10-500 TB range.
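To see why shipping an appliance beats the wire at this scale, here is a back-of-the-envelope sketch. The line speeds and the 80% usable-throughput factor are illustrative assumptions, not measurements:

```python
def transfer_days(terabytes, gigabits_per_second, efficiency=0.8):
    """Days needed to move `terabytes` at `gigabits_per_second`,
    assuming only `efficiency` of the raw line rate is usable."""
    bits = terabytes * 8 * 10**12  # decimal TB -> bits
    seconds = bits / (gigabits_per_second * 10**9 * efficiency)
    return seconds / 86400

for gbps in (0.1, 1, 10):
    print(f"100 TB at {gbps} Gbps: ~{transfer_days(100, gbps):.1f} days")
```

At 1 Gbps, 100 TB takes roughly a week and a half even before retries and contention, which is why the vendors will ship you hardware instead.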

AWS Snowball (50 TB and 80 TB appliances)
https://aws.amazon.com/snowball/

AWS Snowmobile (for enterprises; up to 100 PB per vehicle. It is a ruggedized shipping container, and AWS engineers help connect it to a high-speed network switch)
https://aws.amazon.com/snowmobile/
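Snowball jobs can also be ordered programmatically. Below is a hedged sketch of building the request for an 80 TB import job via boto3's `snowball` client; the bucket ARN, address ID, and IAM role ARN are placeholders you would get from your own account, and the final API call is left commented out:

```python
def snowball_job_params(bucket_arn, address_id, role_arn, capacity="T80"):
    """Build the create_job request for a Snowball IMPORT job.
    `capacity` is the appliance size preference (e.g. T50 or T80)."""
    return {
        "JobType": "IMPORT",
        "Resources": {"S3Resources": [{"BucketArn": bucket_arn}]},
        "AddressId": address_id,       # from snowball create-address
        "RoleARN": role_arn,           # IAM role Snowball assumes
        "SnowballCapacityPreference": capacity,
        "ShippingOption": "SECOND_DAY",
    }

# With credentials configured, the order itself would look like:
# import boto3
# client = boto3.client("snowball")
# response = client.create_job(**snowball_job_params(
#     "arn:aws:s3:::my-import-bucket", "ADID...", "arn:aws:iam::123456789012:role/snowball"))
```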

Google Cloud Platform (100 TB and 480 TB appliances)
https://cloud.google.com/data-transfer/
AWS Competition Play: https://cloud.google.com/storage/transfer/

Microsoft Azure (Import/Export Service) - Microsoft's Azure Import offering is the clunkiest of the three
https://azure.microsoft.com/en-us/services/storage/import-export/
Process: https://docs.microsoft.com/en-us/azure/storage/storage-import-export-service
Pricing: https://azure.microsoft.com/en-us/pricing/details/storage-import-export/

Azure/Snowball Application: https://goo.gl/k73Tyg

NOTE: AWS has a huge lead in import functionality and application support. It also helps that Amazon runs a business built on shipping and receiving, which is worth factoring into your operational decisions. Always consider whether you might someday need to export everything OUT of the platform; the prospect is low once this much data has been moved in, but it should be part of the plan.

Google is moving quickly to compete with Amazon, though. Its 100 TB and 480 TB options are much cheaper and far less complicated than Azure's.

Research Articles:
http://www.infoworld.com/article/2991286/cloud-computing/how-aws-azure-google-import-data-in-bulk.html

