Data Migrations and Azure

Video Transcription
Hello and welcome back to Cybrary's Microsoft Azure Administrator AZ-103 course. I'm your instructor, Will Carlson, and this is Episode 24: Data Migrations and Azure.
In today's episode, we're gonna talk about some of the different ways, both online and offline, to get various amounts of data into Azure effectively.
To get started, we're gonna go ahead and jump here into the portal and go back to the storage account that we've been working on for these past two episodes.
And we're gonna look here at the data transfer blade.
And this is a nice little quasi-calculator: depending on the amount of data that you have to send, your network bandwidth, and how often you're going to send that data in, Azure will make some recommendations on the various options available to you. So to start, we're going to select a relatively small amount of data over a relatively capable amount of bandwidth
coming into Azure one time,
and we'll see a couple of different options here. You'll notice that all of these options are network data transfer options, or online options, and that's going to be because
the data can be transferred over relatively quickly. In as little as three hours, online transfer is clearly effective.
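The logic behind that recommendation is simple arithmetic. Here's a rough sketch of the math the Data transfer blade is doing; the 24-hour cutoff is an illustrative assumption, not Azure's actual rule:

```python
# Estimate upload time and pick online vs. offline transfer.
# The 24-hour threshold is an assumption for illustration only.

def transfer_hours(data_gb: float, bandwidth_mbps: float) -> float:
    """Hours to upload data_gb gigabytes over a bandwidth_mbps link."""
    megabits = data_gb * 8 * 1000          # GB -> megabits
    return (megabits / bandwidth_mbps) / 3600

def recommend(data_gb: float, bandwidth_mbps: float) -> str:
    """Suggest a transfer family based on estimated upload time."""
    if transfer_hours(data_gb, bandwidth_mbps) <= 24:
        return "online (AzCopy, Storage Explorer, REST API)"
    return "offline (Data Box family)"

# 100 GB over a 100 Mbps link takes only a couple of hours, so online wins.
print(round(transfer_hours(100, 100), 1))
print(recommend(100, 100))
```

Feed it half a petabyte over the same link and the estimate jumps to years, which is exactly when the offline devices later in this episode start making sense.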
AzCopy is a command-line tool that you're going to ultimately install. You can find the installer here,
and you'll run AzCopy from the command line, essentially just pointing it at the storage that you want to move over to Azure and telling it where you want that to go in your Azure storage account.
Now, the first thing you're going to have to do is authenticate to that storage account, and the type of storage that you're moving into Azure will determine the methods available to you to log in.
Azure AD is going to be very similar to the way you log in using any of the Azure command-line tools: you'll simply authenticate with a username and password.
SAS is going to use a SAS key, which we talked about in the previous episode.
Once you've done that, you simply select the files that you would like to upload and where you want them to go, and AzCopy goes to work, moving those files over the network for you.
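If you wanted to script that, it might look something like this. The account name, container, local path, and SAS token are all placeholders, and azcopy has to already be installed and on your PATH:

```python
# Sketch of driving AzCopy from a script with SAS authentication.
# All names and the SAS token below are placeholder values.
import subprocess

def build_azcopy_copy(source_dir: str, account: str,
                      container: str, sas: str) -> list[str]:
    """Build an 'azcopy copy' command that uploads a local folder to blob storage."""
    dest = f"https://{account}.blob.core.windows.net/{container}?{sas}"
    return ["azcopy", "copy", source_dir, dest, "--recursive"]

cmd = build_azcopy_copy("/data/to-upload", "mystorageacct", "backups", "sv=...&sig=...")
print(" ".join(cmd))

# Uncomment to actually run the upload once azcopy is installed:
# subprocess.run(cmd, check=True)
```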
We also talked about Azure Storage Explorer, which is a client-side installed piece of software, and you can use Storage Explorer to upload those same files, should you choose to, not from the command line but from a GUI-based piece of software.
And the Azure Storage REST API would be the API that you would use if you had in-house development that wanted to script some of this storage upload into the Azure environment.
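As a hedged sketch of what that looks like at the REST level, here's the Blob service's "Put Blob" operation authorized with a SAS token; the account, container, and SAS values are placeholders:

```python
# Build a "Put Blob" REST request against the Blob service.
# The x-ms-blob-type header is required by the Blob REST API.
# Account, container, and SAS token are placeholder values.
import urllib.request

def put_blob_request(account: str, container: str, blob_name: str,
                     sas: str, data: bytes) -> urllib.request.Request:
    """Construct (but don't send) a Put Blob request for a block blob."""
    url = f"https://{account}.blob.core.windows.net/{container}/{blob_name}?{sas}"
    return urllib.request.Request(
        url,
        data=data,
        method="PUT",
        headers={"x-ms-blob-type": "BlockBlob",
                 "Content-Length": str(len(data))},
    )

req = put_blob_request("mystorageacct", "videos", "movie.mp4", "sv=...&sig=...", b"demo")
print(req.get_method(), req.full_url)
# urllib.request.urlopen(req) would perform the upload against a real account.
```

In practice, in-house developers would more likely reach for the Azure Storage SDK for their language, which wraps these same REST calls.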
Now, as soon as we change the amount of data coming in over this connection, we're gonna get some different options, due to the time required to upload that information.
And you'll notice that what we've added is Data Box Edge and Data Box Gateway.
Now, we'll start with the Data Box Gateway. It's an on-premise virtual machine that you install as a VM in your environment, and the Data Box Gateway gives us some additional functionality beyond AzCopy.
The Data Box Gateway allows you to aggregate multiple streams of data, say from different branches or different production environments, into the Data Box Gateway, where it's stored, and then it will ultimately upload that information according to rules that you can set based on the bandwidth available during certain times of the day.
So, for example, during production hours we want to use a very limited amount of bandwidth, and then after hours it can upload a whole lot more, a lot faster.
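A toy illustration of how a time-of-day bandwidth rule like that behaves (this is not the gateway's actual API, just the concept):

```python
# Toy time-of-day bandwidth schedule: throttle during production hours,
# open the pipe overnight. Caps are made-up illustrative numbers.

def bandwidth_cap_mbps(hour: int) -> int:
    """Return the upload cap in Mbps for a given hour of day (0-23)."""
    if 8 <= hour < 18:   # production hours: stay out of the way
        return 50
    return 1000          # after hours: use the full pipe

print(bandwidth_cap_mbps(10))  # mid-morning: throttled to 50
print(bandwidth_cap_mbps(23))  # overnight: full 1000
```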
Data Box Gateway is also great because it integrates with on-premise workloads. A great example would be a local backup job that needs local storage to run. You could use the Data Box Gateway as the repository for that job, and then, once the job is complete,
the Data Box Gateway would have the smarts to go ahead and upload that storage, or those files,
into your Azure storage.
The Data Box Gateway does support SMB and NFS network file shares being sent into Azure Blob and Azure Files, so you have the full suite of options using the Data Box Gateway as a virtual machine in your environment.
Now, the Data Box Edge is going to be a similar thing to the
Data Box Gateway, but it's going to be a hardware appliance that sits in your environment,
and this gives you some additional functionality that the Data Box Gateway doesn't have. Primarily, you're able to do more pre-processing on the data. Think removal of personally identifiable information before it gets uploaded.
And you can also do some interesting machine learning things within the Data Box Edge itself, along with all of the same bandwidth utilization rules available in the Data Box Gateway.
And that rounds out all of the online ways that you can send information into Azure. Again, it all depends on the amount of data that you have and the transfer frequency. So the on-premise virtual machine or piece of hardware may be really interesting, particularly if you have regular amounts of data that you want constantly uploaded into the Azure storage environment.
But what happens when the storage amount gets relatively large?
At that point, we're gonna be running into primarily offline storage options.
An example use case I've heard of for this is data that's being collected and stored by Internet of Things type devices on offshore oil rigs. Clearly, they don't have great bandwidth out in the middle of the ocean, so they can use some of these offline products to store the information, and they simply ship it back to Microsoft, and the information is ultimately consumed
by their Azure storage environment.
And our options here are the Azure Data Box, the Data Box Disk, and the Azure Data Box Heavy.
Now, the Data Box Disks are going to be disks that are ultimately sent to you by Microsoft.
They offer up to 35 terabytes of usable capacity, and you can have up to five disks per order.
Something interesting about the Data Box Disk is that it only supports blob storage as a destination. You're going to interface with the disks via a USB interface, and they're going to be encrypted with AES 128-bit encryption.
Now, for even more storage than that, you start stepping into the Data Box, and the Data Box itself is going to be a ruggedized computer, essentially, in a hard plastic case,
and it's gonna carry 80 terabytes of usable capacity in a single device.
It supports Azure Blob or Files,
and you're going to interface with it via a 1 or 10 gig network interface.
You can copy data to the Data Box using the standard network-attached storage protocols SMB and NFS.
And that device is going to use AES 256-bit encryption.
The Data Box Heavy is going to be essentially a Data Box, but it's the much bigger brother of the Data Box.
The Data Box Heavy is going to support up to 800 terabytes of usable capacity, and essentially all of the other options on the Data Box Heavy are the same as with the Data Box.
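Putting those three devices together, the selection logic is just "smallest device whose usable capacity covers the data set." A quick sketch using the usable-capacity figures from this episode:

```python
# Pick the smallest offline Data Box product that fits a data set.
# Capacities are the usable figures mentioned in this episode (TB).

OFFLINE_OPTIONS = [
    ("Data Box Disk", 35),    # blob destination only, USB, AES-128
    ("Data Box", 80),         # blob or files, SMB/NFS copy, AES-256
    ("Data Box Heavy", 800),  # same interfaces as Data Box, much bigger
]

def pick_offline_device(data_tb: float) -> str:
    """Return the smallest offline device that fits, or suggest multiple orders."""
    for name, capacity_tb in OFFLINE_OPTIONS:
        if data_tb <= capacity_tb:
            return name
    return "multiple Data Box Heavy orders"

print(pick_offline_device(20))   # Data Box Disk
print(pick_offline_device(500))  # Data Box Heavy
```

Remember the Data Box Disk caveat from above: it only supports blob as a destination, so a files workload would push you to the Data Box even below 35 TB.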
Now, if I reduce the amount of data that I need to get into Azure a little bit, I'm provided with one other additional option, and that's Azure Import/Export. This is gonna be an offline way for you to ship your own disk drives to Microsoft to get the data into the Microsoft Azure storage environment.
We're gonna walk through the process of setting it up, at least here in the portal, right now.
I can come up here to Create a resource, and I can type in "import export job."
I'm gonna go ahead and click on that, and we're gonna create an import/export job,
and the first thing we're gonna have to do is name the job and select whether we're going to import or export data from Azure.
We're gonna do this as an import, leave my Free Trial subscription there, select the IT resource group, and hit OK.
Now, you'll notice up here at the top it talks about a journal file. This journal file is a manifest generated by the WAImportExport tool that you run on-premise, listing the files that you want to move into the Azure storage environment. So
the machine that has your disks in it, the ones you're copying the data to,
is where you'll run the WAImportExport tool.
It's ultimately the tool that you use to generate this journal file, to BitLocker-encrypt the files on that drive, and to copy the files to that drive.
You'll upload that journal file here so that Azure knows what files to expect coming in on that disk.
You'll select a destination storage account,
and then you'll select OK. Now, obviously, we can't do that today; we don't have a journal file selected here. But once you've done that, you simply enter the shipping information so that Microsoft knows how to get that information back to you,
and then you're off to the races.
I do recommend you look through the documentation on the import/export job and the WAImportExport tool so that you know and understand what some of the limitations are there. For example, only certain types of hard drives are supported.
You'll also need to know that only certain storage accounts are supported.
So please check out the online documentation about the import/export tool and supported drives for the import job.
Now, we can come back here real quickly and select that we want this to be an export job. You'll see that everything is essentially the same.
We're going to select the storage account that we're gonna export data from.
We're going to select what data we want to export,
and then we'll select OK,
and you'll see this return shipping information screen. This is the exact same screen that we would see on the import job if we had uploaded that journal file.
You'll fill out the information for return shipping,
and then you'll simply ship your drives to Microsoft. They'll load the data on them and return them to you. While we're talking about data migration, I want to go back up here to Create a resource.
We're gonna look for a data box.
I'm gonna select Azure Data Box and click Create.
I'm gonna leave my subscription on Free Trial, leave this as an import, and we're gonna select United States here for me.
And I'm going to say we're gonna put this data into the Central US region and hit Apply. Now, you'll see that for my pay-as-you-go account, I do not have access to Data Box Disk, Data Box, or Data Box Heavy. Microsoft will not send me that equipment on a prepaid plan. However, I can come in here and select to do the import/export, which is the process that we just walked through.
If you were on an EA plan, or an enterprise plan, a Cloud Services plan, part of the Microsoft Partner Network, or a few other offer types, you would be able to select and set up Data Box options here as well. And the process would be to order the Data Box, get it in house, hook it up to power and to your network, migrate the files over,
and simply ship the appliance back to Microsoft.
Now, there's one other option available to us here that we talked about, and we're gonna go up to Create a resource to see that.
I'm gonna type in "data box,"
and we're gonna select Data Box Edge / Data Box Gateway. Now, remember, the Data Box Gateway is going to be the virtual appliance, and the Data Box Edge is going to be the hardware appliance. You can see that there are still limitations here on the pay-as-you-go subscription, but we're gonna go ahead and accept that limitation.
We're gonna select where we want to deploy these resources to, and note there is a limitation here on what's available. We're gonna make this a Data Box Gateway, because that's our only option, and here you can see the options and the cost for Data Box Gateway and Data Box Edge.
We don't have access to the Data Box Edge on the pay-as-you-go plan, but I can click here and create the Data Box Gateway.
You can see a little bit more about the prerequisites for the virtual machine here, but suffice it to say that you can launch this on Hyper-V or VMware. Once you've launched and provisioned the virtual machine, you connect to it, set it up, hook it up to your Azure account, and then you can begin to have the Data Box Gateway sending data into the Azure environment.
So in today's episode, we talked about a number of different ways to get large amounts of data into the Azure environment when your network bandwidth simply will not tolerate the amount of data or continuous workloads that you may need to get into Azure, and how we can optimize those workloads to make sure that they get into the Azure storage environment efficiently.
I recommend you look through and read the documentation regarding all of these options and become a little more familiar with what they are, and definitely know the process for requesting these devices and these options for AZ-103.
Coming up next, we're gonna begin talking about and deploying a content delivery network, so that all of the home movie goodness that you have is available to all of your relatives scattered all over the world.
Thanks for joining me today and I'm looking forward to the next episode.
AZ-103 Microsoft Azure Administrator

This is a training course for the Microsoft Azure AZ-103 Certification. The Microsoft Azure Administrator training course teaches students to perform tasks like managing Azure subscriptions and resources, implementing and managing storage, deploying and managing virtual machines (VM) and networks, and managing identities!
