Time
14 hours 28 minutes
Difficulty
Intermediate
CEU/CPE
15

Video Transcription

00:00
Hello, Cybrarians. Welcome to Lesson 3.7 of this course, AZ-301 Microsoft Azure Architect Design.
00:10
Here is an overview of the demo that I'll be walking you through today.
00:16
In Azure, I currently have two services running. I have an Azure Blob Storage account, and I have a CSV file containing customer information already uploaded to that Azure Blob Storage.
00:32
I also have an Azure SQL database created already in Azure.
00:39
Let me go ahead and show you these data sources, and then I'll show you the rest of what we'll be doing.
00:47
I'm currently in the Azure portal. You can see my storage account here, and that's my SQL server, which also has my SQL database. Great. And if I refresh, you can see my SQL database over there.
01:00
If I click to connect to my SQL database and I click on Query editor...
01:06
...I probably do not have permission at the network level, so let's go ahead and give myself permission at the network level. To do that, I'll need to go to my server,
01:17
and I need to go to Firewalls and virtual networks, and I need to select the option to add my client IP address. I'll save that, so that should give me network connectivity.
01:30
The other reason why I brought you to this pane is to point out this configuration to you: "Allow Azure services and resources to access this server". I need to have this configuration enabled on my SQL server in order for Data Factory to be able to connect to it.
01:49
So let's go back to my SQL database, and I'm going to connect to it using the Query editor.
02:00
So now I'm logged in to my SQL database. If I expand this, you'll see that I currently have no tables, so let's go ahead and create a table.
02:12
This query is going to create a simple table that's going to have these different columns.
02:17
So I'll go ahead and run the query,
02:21
and that succeeded
02:23
and if I refresh this, I should see my new table over here. So that's good.
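The exact query isn't shown in the transcript, but based on the four columns described later in the demo (first name, last name, age, and email), the table definition would look something like the sketch below. This uses Python's built-in sqlite3 module as a stand-in for Azure SQL, and the table and column names are assumptions:

```python
import sqlite3

# In the demo this runs in the Azure SQL Query editor; an in-memory
# sqlite3 database stands in for it here.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE CustomerData (
        FirstName TEXT,
        LastName  TEXT,
        Age       INTEGER,
        Email     TEXT
    )
""")

# The table starts out empty, just as in the demo.
count = conn.execute("SELECT COUNT(*) FROM CustomerData").fetchone()[0]
print(count)  # 0
```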
02:30
The other thing that I'll show you is my storage account and the CSV file that I already have in it. So let's go back to my resource group and select the storage account.
02:38
I have a container,
02:42
a blob container, called customerdata. And within that blob container is the CSV file, called customerdata.csv.
02:50
If I click on that, and then click on Edit, you can see the information in it. It essentially just contains four columns,
02:58
comma separated: first name, last name, age, and email.
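A tiny sketch of what a file with that shape parses to, using Python's csv module. The actual values in the demo file are not shown, so the two sample rows here are made up:

```python
import csv
import io

# Stand-in for customerdata.csv: same four columns as the demo file,
# but with made-up rows for illustration.
sample = """FirstName,LastName,Age,Email
Ada,Lovelace,36,ada@example.com
Alan,Turing,41,alan@example.com
"""

reader = csv.DictReader(io.StringIO(sample))
rows = list(reader)
print(rows[0]["FirstName"])  # Ada
```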
03:01
So that's what I currently have. Let's see what I'm going to be showing you in the rest of this demo.
03:16
So the first thing that I'll be showing you is I'll be creating an Azure Data Factory.
03:22
I'll then go ahead and create a pipeline within that data factory.
03:27
Then, in my pipeline, I'll specify linked services both to my Azure Blob Storage and to my Azure SQL database.
03:35
I'll specify the dataset for the Azure blob, which will point to the CSV file, and the dataset for Azure SQL, which will point to the Azure SQL database table I just created.
03:49
I'll then create a copy activity to link those together, to copy the data from the CSV file into my SQL database table. And of course, all of that will execute within the integration runtime in Azure Data Factory.
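Behind the designer UI, Data Factory stores all of these objects (pipeline, copy activity, datasets, linked services) as JSON definitions. The sketch below is a heavily abbreviated Python rendering of how the pieces in this demo reference each other; the names are loosely based on the demo, and this is not the full ADF schema:

```python
# Abbreviated sketch of how the Data Factory objects in this demo
# reference each other; not the complete ADF JSON schema.
pipeline = {
    "name": "CustomerDataAzureBlobToAzureSQL",
    "activities": [{
        "name": "CustomerDataCSVToSQL",
        "type": "Copy",
        "inputs": [{"referenceName": "CustomerDataCSV"}],       # blob dataset
        "outputs": [{"referenceName": "AzureSQLTableOutput"}],  # SQL dataset
    }],
}

# Each dataset points at a linked service, which holds the connection info.
datasets = {
    "CustomerDataCSV": {"linkedService": "AzureBlobStorage"},
    "AzureSQLTableOutput": {"linkedService": "AzureSQLDatabase"},
}

activity = pipeline["activities"][0]
print(activity["type"])  # Copy
```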
04:03
We've verified the prerequisites already, so let me go ahead and show you the creation of an Azure Data Factory service.
04:13
This is a visual representation of what we'll be doing. Let's go create the service.
04:18
I'm back in the Azure portal. What I'll do is I'll go back over here and click on Create a resource,
04:26
and I'll type in "data factory".
04:32
I'll click on Data Factory, and I'll click on Create,
04:36
and I'll specify a name for my data factory. I'll just call it "adf"
04:43
plus the suffix that I'm using for every resource that I'm creating for this particular lesson.
04:49
I'll put it in a resource group; I'm choosing the resource group I created earlier,
04:55
and I'll specify the location as UK South. I'm not going to be using CI/CD for this, so I'm going to disable the option to use Git,
05:02
and I'll go ahead and click on Create.
05:04
It's as simple as that; it's creating the data factory. So after we've created the data factory, we can go in to build the pipelines.
05:15
So the data factory is now successfully created. That didn't take long to create at all; it only took a few seconds, actually.
05:23
And if I click on Go to resource, you can see the Azure Data Factory
05:27
resource over here.
05:30
So the next thing that I'll be showing you is how to create an Azure Data Factory pipeline.
05:35
And here's the visual representation of what we'll be doing. Before I go ahead and create anything like the linked services, I'll first of all need to create the pipeline in Azure Data Factory.
05:46
So let's go ahead and do that now.
05:50
I'm back in the Azure Data Factory service in the Azure portal. What I'll click on is Author & Monitor here.
06:01
So in order to create a new pipeline, I could use the wizard option here,
06:06
but what I'm going to be doing is I'm going to be using the proper authoring
06:06
option over here. So let's click on Author on the left-hand side, and I have the option to specify what I want to author.
06:17
I'll go ahead and click on Add new resource, and I'll click on New pipeline.
06:24
So let's give the pipeline a name.
06:28
I'm giving it the name "customer data Azure blob to Azure SQL",
06:31
and that's all I need to do for now: create the pipeline and give it a name.
06:36
There's no need to click on save or anything for now, because we're still going to be working within this pipeline.
06:44
So the next thing I'll be showing you is how to create linked services for Azure Blob Storage and Azure SQL in Data Factory.
06:55
So, back in my pipeline, for me to create the linked services, I go to the lower-left corner over here and I click on Connections.
07:06
Now, when I click on Connections, it opens up this Connections tab, and I have Linked services. I'll go ahead and click on New for a new linked service, and I can specify different categories and then pick the option I want. Now, if I go under the Azure category,
07:20
I should be able to see Azure Blob Storage, so I'll go ahead and click on that and click on Continue.
07:27
I'll have to give this linked service a name. To keep it simple, I'll call it "Azure Blob Storage".
07:33
And when it comes to the runtime that this is going to execute in, rather than creating a new one or building a custom one, I'm just going to
07:44
use the default Azure integration runtime.
07:47
It's going to ask me for the authentication method to connect to this Azure Blob Storage account.
07:54
I'm going to leave that set to account key. Now, the better way may be to use something like managed identity, or even a SAS, actually. But for now, let's just use account key to keep it simple.
08:07
I'll specify that I'm not going to be integrating with Key Vault; I'm just going to be providing the connection string.
08:13
And the option that I want to use to select the account is "From Azure subscription", so I'll go ahead and select my Azure subscription, and I'll select my storage account directly from here. And there's my storage account over there.
08:28
So once I've selected the storage account,
08:31
I have the option to click on Test connection over here. So let's go ahead and click on Test connection.
08:37
So that's successful; that's great. So I'll click on Create,
08:41
and that created the linked service for Azure Blob Storage. Let's do the same thing for Azure SQL Database.
08:50
So under Linked services over here, I'll just go ahead and click on New, and I'll go under the Azure category and look for Azure
08:58
SQL Database.
09:01
And here we go: Azure SQL Database. I'll click on Continue,
09:05
and to keep it simple again, I'll call it "Azure SQL Database". I'll leave it to run in the default integration runtime, and I'm using the connection string. I'll go ahead and select my subscription.
09:16
I'll select my server name, which is this,
09:20
and I'll select my database name, which is that database there. So let's go ahead and select that.
09:26
And the authentication type that I'm using is SQL authentication.
09:31
So I'll specify my username for connecting to my SQL database,
09:37
and I'll specify my password.
09:48
I can then test the connection again,
09:50
and that's successful, so I'll go ahead and click on Create. So now Azure Data Factory knows how to connect to these two data sources: Azure Blob and Azure SQL.
10:01
The next thing I'll show you is how to create the datasets: for the CSV file in Blob Storage, and for the Azure SQL database table,
10:13
which will be our output. So let's create the dataset for the input and the dataset for the output.
10:18
Let's go ahead and do that now.
10:20
Back in the Azure portal,
10:22
for me to be able to create the datasets, I go back to my pipeline
10:26
over here
10:28
and within my pipeline I can click on
10:31
New, and I can click on New dataset here.
10:35
So let's click on New dataset. I'm going to be selecting a dataset for Azure Blob Storage, and for Azure SQL Database as well. So, dataset for Azure Blob Storage; let's click on Continue,
10:45
and you can see the supported format types.
10:48
So in our case we'll be using CSV, so I'll go ahead and select CSV and click on Continue,
10:56
and I'll need to specify a name for that. So let's specify
11:01
"customer data CSV". Then I'll specify the linked service: I'll specify the Azure Blob Storage linked service that I created earlier. And this is where I have to specify the input. I can specify the container, directory, and file, but the easiest way is I can just click on Browse. I go into the customerdata container, and I can see
11:22
my file. So let's select that.
11:24
So that automatically populated,
11:26
and I'll leave the import schema as "From connection/store", so that's fine. I'll go ahead and click on OK.
11:37
And what this has done is it's giving me the option to modify the name over here, so I'll just leave this
11:43
the way that it is. I'll show you a feature, actually:
11:46
if I click under the Connection section here, I can test the connection for my linked service again, but here's the option to preview my data. So if I click on that option,
11:58
it will actually reach out and preview my data so I can see what my data looks like. So that's great.
12:03
The other thing that we can also do here, which I'll point out to you: let's say my column delimiter is not a comma but something else; I can select what I'm using for the delimiter.
12:16
And I can specify other options. The other thing that I want to select is the option that says "First row as header", because the first row in this CSV file is indeed my header. So let's go ahead and select that option.
12:33
So that's selected; so far, so good.
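These two dataset settings, the column delimiter and "first row as header", map directly onto how any CSV reader is configured. A small illustration with Python's csv module, where the semicolon-delimited sample data is made up for the example:

```python
import csv
import io

# Same columns as the demo file, but delimited with semicolons instead
# of commas, to show why the delimiter setting matters.
sample = "FirstName;LastName;Age;Email\nAda;Lovelace;36;ada@example.com\n"

# DictReader treats the first row as the header, like the
# "First row as header" option in the dataset settings.
reader = csv.DictReader(io.StringIO(sample), delimiter=";")
row = next(reader)
print(row["Email"])  # ada@example.com
```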
12:35
Let's go ahead and create the dataset for the Azure SQL database table.
12:41
So what I'll do is I'll go back to my pipeline
12:46
and I'll go to New
12:50
dataset.
12:50
This time around, I'll look for a dataset for Azure SQL Database, and I'll click on Continue,
12:56
and I'll give it a name.
12:58
So what I'll do is I'll give it the name "Azure SQL table output". My linked service should be the Azure SQL Database one that I created earlier.
13:09
And then
13:11
I can specify my table name here. So that's the table that I created earlier, so let's select that table,
13:18
and the import schema will be from connection/store, so I'll go ahead and click OK to this also.
13:24
So it gives us the information there. I can click on the Connection tab, and if I preview the data, there should be nothing there; it's currently empty.
13:31
So the next thing that I'll show you is how to create the copy activity, which ties everything together, so to speak.
13:39
So
13:41
what I'm going to be doing is I'm going to be creating this copy activity, which is going to be copying information from my input dataset to the output dataset.
13:50
Let's go ahead and do that. So if I go back to my pipeline tab over here,
13:56
on the pipeline tab you can see the different activities that are available. And there's a lot of them, right? So the one I'll be using will be under the "Move & transform"
14:05
category.
14:07
I can drag this Copy data activity; that's one of the most popular activities. So if I drag that to this working pane, or to this workspace, the Copy data option, it's going to give me the information down here. So first of all, I have to give it a name.
14:24
I'll give it the name "customer data CSV to SQL". So let's go ahead and put that in there. I can give it a description if I want to, but the main part is I need to specify the source. That's where I specify my source dataset, which is the CSV file.
14:41
And I can again preview the data, and I can make several modifications there.
14:46
And then I have the sink; the sink is the destination, so I can specify my destination dataset to be my Azure SQL table,
14:56
and I'll specify that.
14:58
So it's going to give me some options. You can actually say "Auto create table" here, but I'm not going to be using that. The other important one is the mapping,
15:07
because in my case, some of the headers don't map directly. So if I go into the Mapping tab and I click on Import schemas, it's going to import the schemas from the source and the destination so I can map them to each other.
15:20
So here we go: that's the source and that's the destination. So let's have a look at that very quickly,
15:26
so you can see that, under the source
15:28
and the destination, I have the FirstName and LastName columns, which are not mapped to anything. In this case, I just have "first" as the header column, so I'll select "first" to be mapped to FirstName, and I'll select "last" to be mapped to LastName. The other headers are matched already.
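The column mapping that the copy activity applies here is essentially a rename step performed on every row. The exact header names in the demo file aren't fully shown, so this sketch assumes the source headers are first, last, age, and email:

```python
# Assumed source headers -> destination columns; "first" and "last"
# are the two that didn't auto-match in the demo.
column_map = {
    "first": "FirstName",
    "last": "LastName",
    "age": "Age",
    "email": "Email",
}

source_row = {"first": "Ada", "last": "Lovelace", "age": "36", "email": "ada@example.com"}

# Apply the mapping, as the copy activity does for each row it moves.
sink_row = {column_map[k]: v for k, v in source_row.items()}
print(sink_row["LastName"])  # Lovelace
```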
15:50
So once I have these, I can go ahead and test the pipeline.
15:56
I always recommend testing the pipeline before we go ahead and publish it; that's always good practice. Otherwise, we could end up publishing something that doesn't work.
16:07
So I'll go ahead and click on Debug here,
16:14
and it's going to start running that,
16:18
and we're going to be able to view the status and see what that looks like over here.
16:25
And by the way, while it's going through this testing, what I'll do is I'll go back to Azure.
16:32
Let's go back to my resource group and go into Azure SQL, so I can connect back to my database.
16:45
I'll put in my database information, and I can see my table over here, so that's good.
16:53
Let's see how the task is doing. It says "Succeeded" now, so that looks good. Let's go ahead and verify within my SQL
17:02
database here.
17:03
So what I'm going to be doing is I'm going to be
17:07
selecting
17:08
all from my table, and I'll go ahead and run that query. And there we go: you can see all this information now populated, because Data Factory has moved it across. And I can actually check the count; I think I have 300 users.
17:22
So let's go ahead and execute the query to count. I'll go ahead and execute: that's 300 users already inserted into this table, so that's great.
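The verification here is just a SELECT * followed by a SELECT COUNT(*). Sketched against the same stand-in table (sqlite3 instead of Azure SQL, with generated placeholder rows standing in for the 300 real records):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE CustomerData (FirstName TEXT, LastName TEXT, Age INTEGER, Email TEXT)"
)

# Generated placeholder rows stand in for the 300 records the copy
# activity inserted in the demo.
rows = [(f"First{i}", f"Last{i}", 30, f"user{i}@example.com") for i in range(300)]
conn.executemany("INSERT INTO CustomerData VALUES (?, ?, ?, ?)", rows)

# The same check as in the demo: count the rows that landed in the table.
count = conn.execute("SELECT COUNT(*) FROM CustomerData").fetchone()[0]
print(count)  # 300
```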
17:33
The final thing that we can do is publish this pipeline, so that it's something we can reuse. At the top, you can click on Publish all.
17:45
When I click on Publish all, it's going to deploy this pipeline to the data factory,
17:52
and what that means is I can now schedule it. So, that's successfully published. I can add a trigger: I can trigger this entire workflow either manually, or I can trigger it on a schedule, so that it takes the data from time to time and moves it from one source to the destination.
18:10
I can also trigger it based on an event,
18:11
so that's one of the advantages of Azure Data Factory.
18:15
So that brings me to the end of this particular demonstration. I hope you found it useful, and I'll see you in the next lesson.

Up Next

AZ-301 Microsoft Azure Architect Design

This AZ-301 training covers the skills that are measured in the Microsoft Azure Architect Design certification exam. Learn strategies to plan for the exam, target your areas of study, and gain hands-on experience to prepare for the real world.

Instructed By

David Okeyode
Cloud Security Architect
Instructor