
How to Quickly Import 100TB+ Data into AWS, GCP or Azure

How do you quickly import large volumes of data into Microsoft Azure, Google Cloud Platform, or Amazon Web Services (AWS)?

Using a physical import service is a safer option for an enterprise or SMB than copying the data over the internet, especially when the data set is in the 10-500TB range or larger.
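To see why, here is a rough back-of-the-envelope estimate (a quick Python sketch; the 1 Gbps link speed and ~70% sustained efficiency are assumptions, not measurements):

# Rough transfer-time estimate for copying data over the internet.
# The link speed and efficiency below are illustrative assumptions.
def days_to_transfer(terabytes, link_gbps=1.0, efficiency=0.7):
    bits = terabytes * 1e12 * 8                      # decimal TB to bits
    seconds = bits / (link_gbps * 1e9 * efficiency)  # sustained throughput
    return seconds / 86400                           # seconds per day

for tb in (10, 100, 500):
    print(f"{tb} TB over 1 Gbps: about {days_to_transfer(tb):.0f} days")

Even on a dedicated 1 Gbps link, 500TB is roughly a two-month copy job, which is why all three providers will ship you hardware instead.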

AWS Snowball (50TB and 80TB capacities)
https://aws.amazon.com/snowball/
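Once a Snowball job completes, the data lands in an S3 bucket. A minimal post-import sanity check with boto3 (the bucket name and prefix below are hypothetical) that counts objects and totals their size:

import boto3

BUCKET = "my-import-bucket"    # hypothetical bucket the Snowball job imported into
PREFIX = "snowball-import/"    # hypothetical key prefix

s3 = boto3.client("s3")
paginator = s3.get_paginator("list_objects_v2")

total_objects = 0
total_bytes = 0
for page in paginator.paginate(Bucket=BUCKET, Prefix=PREFIX):
    for obj in page.get("Contents", []):
        total_objects += 1
        total_bytes += obj["Size"]

print(f"{total_objects} objects, {total_bytes / 1e12:.2f} TB imported")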

AWS Snowmobile (for enterprises moving up to 100PB per vehicle; the vehicle is a ruggedized shipping container, and AWS engineers help connect it to a high-speed switch)
https://aws.amazon.com/snowmobile/

Google Cloud Platform (100TB and 480TB)
https://cloud.google.com/data-transfer/
AWS Competition Play: https://cloud.google.com/storage/transfer/
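Google's appliances also land data in a Cloud Storage bucket, so the same kind of post-import check works with the google-cloud-storage client library (a sketch; the bucket name is a placeholder):

from google.cloud import storage

BUCKET = "my-transfer-bucket"    # placeholder bucket name

client = storage.Client()
total_objects = 0
total_bytes = 0
for blob in client.list_blobs(BUCKET):
    total_objects += 1
    total_bytes += blob.size

print(f"{total_objects} objects, {total_bytes / 1e12:.2f} TB in {BUCKET}")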

Microsoft Azure (Import/Export Service) - Microsoft's import offering is the clunkiest of the three
https://azure.microsoft.com/en-us/services/storage/import-export/
Process: https://docs.microsoft.com/en-us/azure/storage/storage-import-export-service
Pricing: https://azure.microsoft.com/en-us/pricing/details/storage-import-export/
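Azure Import/Export copies the drive contents into Blob storage, so a similar sanity check can be done with the azure-storage-blob SDK (a sketch; the connection string and container name are placeholders):

from azure.storage.blob import BlobServiceClient

CONN_STR = "<storage-account-connection-string>"   # placeholder
CONTAINER = "import-data"                          # placeholder container name

service = BlobServiceClient.from_connection_string(CONN_STR)
container = service.get_container_client(CONTAINER)

total_objects = 0
total_bytes = 0
for blob in container.list_blobs():
    total_objects += 1
    total_bytes += blob.size

print(f"{total_objects} blobs, {total_bytes / 1e12:.2f} TB in {CONTAINER}")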

Azure/Snowball Application: https://goo.gl/k73Tyg

NOTE: AWS has a huge lead in import functionality and application support, and it helps that Amazon already runs a business built around shipping and receiving; that is worth factoring into your operational decisions. Always consider whether you might someday need to export everything back "OUT" of the platform. Once this much data has been moved in, that is an unlikely prospect, but it should still be part of the plan.

Google is moving quickly to compete with Amazon, though. Its 100TB and 480TB options are much cheaper than Azure's and far less complicated.

Research Articles:
http://www.infoworld.com/article/2991286/cloud-computing/how-aws-azure-google-import-data-in-bulk.html

