Managing Data Using Storage Explorer

Video Transcription
All right, you made it. This is the start of module four, where we're gonna be talking about how to manage your data. And first, we're gonna take a look at how we can use Storage Explorer.
The objectives: we're gonna take a look at our different Storage Explorer data actions,
and then jump out and take a look at a demo of working with containers and table storage.
So first, let's take a look at the different data actions that we have inside of Storage Explorer. They change depending on what service we're working with.
First off, inside of containers, we can do uploads and downloads of files. We can also add new folders to change up our file system hierarchy. We can also copy URLs to specific blob objects, and select all, copy, and paste. And we can also clone a folder with a new name, as well as delete and undelete.
So these are some pretty standard file system data actions that we can take.
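One thing worth knowing about those folder actions: a blob container actually has a flat namespace, and the "folders" Storage Explorer shows are just prefixes in the blob names. Here's a minimal stdlib sketch of how a clone-with-new-name action can be modeled on top of that (the blob names and data are made up for illustration, not from the demo):

```python
# Sketch: blob "folders" as name prefixes in a flat container namespace.
# The container is modeled as a dict of blob-name -> bytes.

def clone_folder(container: dict, src_prefix: str, dst_prefix: str) -> None:
    """Clone every blob under src_prefix/ to dst_prefix/, like the
    'clone with new name' folder action."""
    for name in list(container):
        if name.startswith(src_prefix + "/"):
            new_name = dst_prefix + name[len(src_prefix):]
            container[new_name] = container[name]

container = {
    "logfiles/app1.log": b"...",
    "logfiles/app2.log": b"...",
    "images/logo.png": b"...",
}
clone_folder(container, "logfiles", "logfiles-backup")
print(sorted(container))
```

Because folders are only name prefixes, cloning or renaming a "folder" really means copying every blob whose name starts with that prefix.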
Next, we have files, which is very similar to our containers, but we do have the extra action here of connecting to a virtual machine. This is going to create a script, or set of commands, for us to run on our virtual machines in order to connect up our managed file share.
Next, we have queues. Now, remember, queues are just places where we can store and retrieve messages for applications to use.
So, inside of our queues we can view messages, add messages, dequeue messages, or even clear out the queue.
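Those four queue actions follow simple first-in, first-out semantics. Here's a quick in-memory sketch using Python's `deque` to show what each action does conceptually (message text is made up; a real Azure queue would go through the storage SDK or REST API):

```python
from collections import deque

# Conceptual sketch of the queue data actions: add, view (peek),
# dequeue, and clear. FIFO: the oldest message comes out first.
queue = deque()

# Add messages
queue.append("process-order-1001")
queue.append("process-order-1002")

# View (peek) the front message without removing it
front = queue[0]

# Dequeue: retrieve and remove the oldest message
msg = queue.popleft()

# Clear out the queue entirely
queue.clear()
print(front, msg, len(queue))
```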
Finally, we have tables, where we can store data for applications. We can run queries against our data, do imports or exports of the data,
add data records, select our column options for viewing, and also take a look at table statistics.
So you can see, using Storage Explorer, we have a wide range of actions that we can use to manage data inside our different services.
That does it for some of the concepts there. Next, we're gonna jump out to Storage Explorer and take a look at a demo where we're going to upload and download data from our container. Then we're gonna jump over to our table service, upload a set of data into it, and run some queries against it. Let's jump over to our demo.
Here we are back in Storage Explorer, and I've gone ahead and disconnected from our previous demos using shared access signatures and everything, and reconnected using my Azure Active Directory account, so I have access to everything. So let's go and expand out blob containers. And what I'm gonna do is right-click and actually create a new blob container here,
and I'm gonna just name it Log files.
So here, inside of our new container, let's go ahead and get some data into it. First, I'm going to select the upload button here, and I'm going to choose an entire folder of log files that I have. Just click the three dots to open our file dialog and select the Log Files folder I have here.
Next in the dialog, we can select what blob type we want it to be.
And since these are log files, I'm going to select append blobs.
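Append blobs suit log files because writes only ever add data to the end of the blob; there's no in-place editing like a block blob allows. A small in-memory sketch of that append-only behavior (this is a stand-in to show the semantics, not the real SDK):

```python
# Sketch of append-blob semantics: each write appends a block to the
# end, which is why this blob type fits log workloads.

class AppendBlobSketch:
    def __init__(self):
        self._blocks = []

    def append_block(self, data: bytes) -> None:
        # Append blobs only support appending; existing data is immutable.
        self._blocks.append(data)

    def content(self) -> bytes:
        return b"".join(self._blocks)

blob = AppendBlobSketch()
blob.append_block(b"2023-01-01 INFO app started\n")
blob.append_block(b"2023-01-01 WARN disk 80% full\n")
print(blob.content().decode())
```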
Finally, I'm just gonna leave it at the root of our container for the destination directory.
So go ahead and click on upload.
It's queuing the transfer of our Log Files folder into our container, and it's generating a shared access signature to do so.
This does take a few minutes to upload. We can see the speed that it's uploading at, as well as the percentage done.
So we'll go ahead and speed this up and wait till it's complete.
And now our transfer is complete. And if we go into our Log Files folder here and minimize our activities, we can see all the log files I have. I have 50 of them, and they're about one meg each.
So next let's go into the download option. I'll select all these files. Click on download.
I'll navigate to our demo folder and create a new folder.
I'll select this folder.
We'll bring back up our Activities pane, and we can see we're starting to do a transfer from our container's Log Files folder into our local file system.
And again we can see it says using SAS, meaning it's generating a shared access signature in order to accomplish this on the back end for us.
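A shared access signature is just a set of query parameters, one of which is an HMAC signature over a string describing what access is granted and for how long, computed with the account key. The sketch below shows only the general shape; the real string-to-sign format is more involved and is defined by the Azure Storage REST documentation, and the key and parameter values here are made up:

```python
import base64
import hashlib
import hmac
from urllib.parse import urlencode

# Simplified illustration of how a SAS token is formed: sign a string
# describing permissions and expiry with the account key, then append
# the signature as a query parameter. NOT the real string-to-sign format.

def make_sas_sketch(account_key_b64: str, permissions: str, expiry: str) -> str:
    string_to_sign = f"{permissions}\n{expiry}"
    key = base64.b64decode(account_key_b64)
    sig = base64.b64encode(
        hmac.new(key, string_to_sign.encode(), hashlib.sha256).digest()
    ).decode()
    return urlencode({"sp": permissions, "se": expiry, "sig": sig})

# Hypothetical account key, for illustration only
demo_key = base64.b64encode(b"demo-key").decode()
sas = make_sas_sketch(demo_key, "rl", "2030-01-01")
print(sas)
```

The service can recompute the same signature from the parameters and its copy of the key, which is how it validates the token without any stored state.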
All right, that was pretty quick for the download. Let's go ahead and switch over to Windows Explorer and take a look at the new files we just downloaded.
And here in our demo folder, I have our download folder. If we go into it, there are all the log files I just downloaded from our container.
So you can see we can use Storage Explorer to upload or download
large amounts of data from our containers. So let's go ahead and switch back to Storage Explorer.
I'm going to collapse this and let's go into tables
and I'm going to right click and create a new table,
and we're just gonna name it movie data. And here I have a blank table with no information currently in it.
Let me switch over to a CSV file. What I have here is a top 50 list of movies and how much they grossed during their theatrical runs. You'll notice that column C is actually named PartitionKey, and column E is named RowKey.
This is how you establish a unique record inside of our table data: it combines a PartitionKey and a RowKey.
So here, for the partition key, I have the name of the movie, and for the row key, I have the year it came out.
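In other words, the pair (PartitionKey, RowKey) acts like a composite primary key: inserting an entity with the same pair replaces the existing record, while changing either half creates a new one. A quick dict-based sketch of that rule (movie names and gross figures are illustrative, not the demo's actual data set):

```python
# Sketch: Table storage identifies each entity by (PartitionKey, RowKey).
# A dict keyed on that tuple mimics the uniqueness rule; writing the same
# pair again upserts (replaces) the entity.

table = {}

def upsert(entity: dict) -> None:
    key = (entity["PartitionKey"], entity["RowKey"])
    table[key] = entity

upsert({"PartitionKey": "Avatar", "RowKey": "2009", "Gross": 2_900_000_000})
upsert({"PartitionKey": "Titanic", "RowKey": "1997", "Gross": 2_200_000_000})
# Same movie name but a different year is a distinct entity:
upsert({"PartitionKey": "Avatar", "RowKey": "2022", "Gross": 76_000_000})
# Same (PartitionKey, RowKey) replaces the earlier record:
upsert({"PartitionKey": "Titanic", "RowKey": "1997", "Gross": 2_250_000_000})

print(len(table))
```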
Let's go back to Storage Explorer. We'll click on Import, and I'll select my CSV file here.
I'm zoomed in a little bit, so it's kind of hard to see, but it's taking the first row of our data here and making sure everything matches up. So I have Int64 for the gross value, a name, its peak ranking, and we have the PartitionKey and the RowKey here.
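What that type-mapping step is doing, conceptually, is coercing each CSV column (which is all text) into a table data type such as Int64 before the rows become entities. A small sketch with a made-up two-row CSV, standing in for the real 50-row file:

```python
import csv
import io

# Sketch of the import step: read the CSV and coerce the columns you
# marked as numeric (Int64-style) from strings to integers.
# The data below is hypothetical, not the demo's actual file.

CSV_DATA = """Name,Gross,PartitionKey,Rank,RowKey
Avatar,2900000000,Avatar,1,2009
Titanic,2200000000,Titanic,3,1997
"""

# Column types as you'd pick them in the import dialog
COLUMN_TYPES = {"Gross": int, "Rank": int}

def import_csv(text: str) -> list:
    rows = []
    for row in csv.DictReader(io.StringIO(text)):
        for col, cast in COLUMN_TYPES.items():
            row[col] = cast(row[col])
        rows.append(row)
    return rows

entities = import_csv(CSV_DATA)
print(entities[0])
```

Getting the types right here matters later: numeric comparisons in queries only behave as expected if the column was imported as a number rather than a string.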
So let's go ahead and select Insert, and it's gonna upload this data automatically for us.
And I can sort the data very easily here using the columns. Let me sort by rank real quick,
and then we can go and create a query. I can take some of this data here; we'll say the gross value.
Select the right data type,
and I want to find all the movies that have grossed $2 billion or more.
Once I have the query configured, I can click the play button, or Run, and it'll bring back our top five results here.
Let me take my RowKey, which is the year the movie was released. We'll switch this over and say I want to find all the movies that are less than or equal to 1999. Let's find all our top movies that came out before the year 2000.
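The two queries above are just filters over the entities. Here's a sketch of both against a small hypothetical in-memory list (the real service expresses these as OData filter strings, along the lines of `Gross ge 2000000000`):

```python
# Sketch of the two demo queries over illustrative movie entities:
#  1) gross of $2 billion or more
#  2) release year (stored in RowKey as text) of 1999 or earlier

movies = [
    {"PartitionKey": "Avatar", "RowKey": "2009", "Gross": 2_900_000_000},
    {"PartitionKey": "Titanic", "RowKey": "1997", "Gross": 2_200_000_000},
    {"PartitionKey": "Jaws", "RowKey": "1975", "Gross": 470_000_000},
]

# Query 1: the two-billion-dollar club
two_billion_club = [m for m in movies if m["Gross"] >= 2_000_000_000]

# Query 2: movies from before the year 2000. RowKey is a string,
# so convert it before comparing numerically.
pre_2000 = [m for m in movies if int(m["RowKey"]) <= 1999]

print([m["PartitionKey"] for m in two_billion_club])
print([m["PartitionKey"] for m in pre_2000])
```

Note the year comparison in the second query: because RowKey is stored as text, it has to be compared as a number (or zero-padded) for "less than or equal to 1999" to behave correctly.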
And here we just have four of those.
So you can see, very easily, we can take a blank table and, if we have an existing data set, import it using Storage Explorer.
That does it for our demo. Let's jump back to the slides and wrap this up.
In this lesson, we took a look at our different Storage Explorer data actions,
and then we jumped out to our demo, where we uploaded and downloaded data in our containers and uploaded data into our table.
Coming up next, we're gonna take a look at another tool we can use to get data into our storage services: the AzCopy command-line tool. See you in the next episode.