Here we are in the last lesson for this module, where we're going to take a look at understanding the Import/Export service inside of Azure.
Our objectives: we're going to understand what exactly the Import/Export service is, and then we'll review an example import workflow.
So first, what is the Azure Import Export service?
It allows you to securely import data into blob containers or file shares inside your storage account.
What you do is pack the data onto disk drives and ship them to the Azure data center where the storage account is located. These disk drives can be either solid-state drives or traditional hard disk drives.
Now, the disks you use can be your own, or you can use Microsoft-supplied disks. Microsoft will ship you up to five encrypted solid-state drives, up to 40 terabytes per import order that you create.
They'll ship these to you, you configure the disk drives and copy your data onto them, and then you send them back to be imported.
Some use cases for the Import/Export service: you need to do a large data migration to the cloud, or you need to distribute some content to a partner, so you package it up and have it imported into their storage account. You could also use it simply as a backup mechanism or for data recovery.
Let's take a look at some of the import export components.
First, you have the actual import export service.
This is located inside the Azure portal, and you create it just like you would any other resource, like a storage account, virtual network, or virtual machine. Next, you have the WAImportExport tool. This is a command-line utility you can download from Microsoft, and it plays an important part in generating the import job.
It prepares your drives for import. It can actually copy the data.
It will encrypt those drives with BitLocker,
it generates a journal file when you create the import job,
and it also identifies the number of drives for your export jobs.
Now, there are two versions of the WAImportExport tool. You're going to use version 1 if you're working with your Azure blob (container) storage,
and you'll use version 2 if you're working with your Azure Files storage.
Now let's take a look at what a typical import job workflow looks like.
First, you as the customer will prepare your data using the WAImportExport tool: copy your data over to the disk drives, generate the journal files, and encrypt the drives with BitLocker. Once this is completed, you'll package it all up and ship it to the Azure data center in the region where the Azure storage account is located.
After the data is received by Microsoft, they will actually import it for you into the storage account.
And finally, if you're using your own disks for the job, Microsoft will package up the disks and return them to you.
Let's also jump over to Microsoft Docs and take a look at the full process of creating an import job.
The first thing I want to check out is the Microsoft Azure Import/Export Tool download page, and what I want to show you is the system requirements. This is going to require some type of Windows system, since you are encrypting the drives using BitLocker.
Let's switch over to Microsoft Docs. This article talks about how to prepare for an import/export job. If we scroll down here, first we take a look at preparing the drives.
You'll need to format them with NTFS, enable BitLocker, and then get the BitLocker key using the manage-bde command.
Finally, you'll prepare the drives using this command with the WAImportExport tool. You'll specify a couple of things, like the journal file, a session ID, your drive letter, your BitLocker key,
the source directory or drive, the destination drive, and your blob type.
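Putting those prep steps together, a session at an elevated Windows command prompt might look roughly like this sketch. The drive letter X:, the journal file name, session ID, source directory, and destination virtual directory are all hypothetical, and the BitLocker key placeholder would come from the manage-bde output; check the current WAImportExport documentation for the exact switches before running anything.

```bat
REM Format the data drive with NTFS (quick format).
format X: /fs:NTFS /q

REM Turn on BitLocker for the drive, then print its protectors to get the key.
manage-bde -on X:
manage-bde -protectors -get X:

REM Prepare the drive with version 1 of the WAImportExport tool (blob storage).
REM Journal file, session ID, and destination virtual directory are illustrative.
WAImportExport.exe PrepImport /j:ImportJob1.jrn /id:session#1 /t:X ^
    /bk:<BitLocker-recovery-key> /srcdir:C:\DataToImport ^
    /dstdir:importcontainer/ /blobtype:BlockBlob
```

The journal file produced here is what you'll upload later when you create the import job in the portal.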
Next, let's skip over to creating an import job. You'll log into the portal, go to All services > Storage > Import/export jobs, and create it just like you would any other resource. In this example, we're selecting "Import into Azure," and we're specifying the name of our job, our subscription, and the resource group to store the job in.
Finally, we'll get into the job details, where you'll upload those journal files you created back in step one
and specify your storage account. It will auto-select the drop-off location, which is the same region as the storage account.
Next, you'll fill out the return shipping information so you can get your drives back if you're using your own drives. And finally, in the summary for the job, you'll see the next steps, which specify the data center to ship the drives to.
Finally, once you ship the drives, go back and update the job with the tracking information.
And once Microsoft receives the drives, they will upload the data into your Azure storage account.
And if you used your own drives, they will ship them back to you, using information from the job.
Next, I want to jump back over to the Azure portal, into our govt 2020 storage account.
Over here on the left, we have Data transfer, and this is actually a neat little tool to help you figure out the best way to get your data into Azure.
Let's say I have something small like 50 gigabytes,
and I have a good network connection that I could transfer it over
and let's just say my frequency is that I'm only going to upload this data once.
This will then show the different solutions we have for getting our data in there, such as using the AzCopy utility, PowerShell, the CLI, or just the web-based portal inside of Azure. We also see Storage Explorer, the REST APIs, and File Sync.
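As a taste of the online options it lists, an AzCopy upload of a local folder might look something like this sketch. The storage account name, container, and SAS token below are placeholders, not real values.

```bat
REM Upload a local folder to a blob container with AzCopy v10.
REM Account, container, and SAS token are placeholders.
azcopy copy "C:\DataToUpload" ^
    "https://<account>.blob.core.windows.net/<container>?<SAS-token>" --recursive
```

For a one-time 50 GB upload over a decent connection, a tool like this is usually all you need.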
If I scroll back up and change my estimated data size, let's say I have something larger, like one petabyte,
it's going to change my options around.
Mostly, our network ones stay the same.
But if I scroll down, I do get some offline data transfer options, such as using Azure Data Box.
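To see why the recommendations shift toward offline transfer at this scale, here's a quick back-of-the-envelope calculation. It's a sketch with illustrative assumptions (a dedicated 1 Gbps link with no protocol overhead), not anything the portal itself computes:

```shell
# Rough time to push 1 PB over a sustained 1 Gbps link (illustrative numbers only).
data_bytes=$((1000 ** 5))                 # 1 PB in bytes
link_bps=$((1000 ** 3))                   # 1 Gbps link speed in bits per second
seconds=$(( data_bytes * 8 / link_bps ))  # total bits divided by link speed
days=$(( seconds / 86400 ))
echo "1 PB at 1 Gbps: about ${days} days"
```

At roughly three months of nonstop transfer, shipping physical drives starts looking very reasonable.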
If I go back up and click on browse all solutions,
it shows all the options we have for getting data into our storage account,
and down here I just wanted to show, under the offline options, that we have the Azure Import/Export service.
This is a neat tool if you need to figure out the best way to get your data into your storage account.
That does it for a quick demo. Let's come back to the slides and wrap this up.
That does it for this lesson. And it's the last lesson inside of Module four.
Today we reviewed what the Import/Export service is,
we looked at the import job workflow, and then we reviewed the documentation for creating an import job.
And that actually does it for Module four. Coming up next is Module five, but there's only one lesson in it, where we're going to have a quick summary of the course. You're almost done. Let's jump over there now and get into the last module.