Hello, and welcome to Lesson 3.6 of the AZ-301 Microsoft Azure Architect Design course.
Here are the learning objectives for this video.
We'll start out by giving an overview of the Azure Data Factory service.
This will help you to get a clearer understanding of what the service is and what it does.
Then we'll build on that understanding by showing a sample scenario of where Azure Data Factory can be of use.
Finally, I'll cover some built-in connectors that we can use with the Azure Data Factory service. So, what is Azure Data Factory?
According to Microsoft, it's a cloud-based data integration service that allows us to orchestrate and automate data movement and data transformation.
So what that means is that this service is about automating the movement of data between different external data stores.
Here's a quick example.
Let's consider a case where we have certain user information stored in Azure Blob storage,
and we want to move this information from its current state into an Azure SQL Database table.
This is a use case that Azure Data Factory can help us to achieve.
But how are we going to achieve this with Azure Data Factory? Here's how.
The first thing that we need to do is give Azure Data Factory the addresses of both external data stores, and the keys or the authorization to be able to access them.
This is called a linked service in Azure Data Factory.
So in the scenario that we have on the screen, we need to create two linked services:
one for Azure Blob storage, and the other for Azure SQL.
And the linked services are what will contain the connection strings that Data Factory can use at run time to connect to Azure Storage and also to connect to the Azure SQL database.
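To make the linked services concrete, here's a minimal sketch of roughly what the two definitions look like in Azure Data Factory's JSON format, written as plain Python dictionaries. The names, the placeholder connection strings, and the account details are all made up for illustration; they are not values from this lesson.

```python
import json

# Rough sketch of an Azure Blob storage linked service definition.
# Name and connection string are hypothetical placeholders.
blob_linked_service = {
    "name": "BlobStorageLinkedService",  # hypothetical name
    "properties": {
        "type": "AzureBlobStorage",
        "typeProperties": {
            # Data Factory uses this at run time to reach the storage account.
            "connectionString": "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>"
        },
    },
}

# Rough sketch of an Azure SQL Database linked service definition.
sql_linked_service = {
    "name": "AzureSqlLinkedService",  # hypothetical name
    "properties": {
        "type": "AzureSqlDatabase",
        "typeProperties": {
            "connectionString": "Server=tcp:<server>.database.windows.net;Database=<db>;User ID=<user>;Password=<pwd>"
        },
    },
}

# Render the definitions as the JSON that Data Factory would store.
print(json.dumps(blob_linked_service, indent=2))
print(json.dumps(sql_linked_service, indent=2))
```

Notice that each linked service pairs a store type with the credentials needed to reach it; that is all a linked service is.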
Then we need two datasets.
A dataset is defined as a named view of data that references the data that we want to use as input or as output.
What does that mean?
So let's go back to our scenario again.
The Azure Blob dataset that we'll be creating is going to specify the file that we want to work with. In this case, it's going to specify the blob folder and the object that contains our input data, which is our CSV file.
The Azure SQL Database table dataset will specify the SQL table, which will be the output.
In other words, datasets help us to define a dataset for the data that we want to work with as the input, and a dataset for where the output is, which is a SQL table.
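Here's a similar sketch of roughly what the two dataset definitions look like in ADF's JSON format, again as plain Python dictionaries. The dataset names, container, file name, table name, and the linked service names they reference are all hypothetical examples.

```python
# Input dataset: names the CSV blob we want to read, via a blob linked service.
# All names here (dataset, linked service, container, file) are made up.
input_dataset = {
    "name": "InputCsvDataset",  # hypothetical name
    "properties": {
        "type": "DelimitedText",
        "linkedServiceName": {
            "referenceName": "BlobStorageLinkedService",  # hypothetical linked service
            "type": "LinkedServiceReference",
        },
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "container": "userdata",   # example blob folder/container
                "fileName": "users.csv",   # example input CSV file
            }
        },
    },
}

# Output dataset: names the SQL table we want to write, via a SQL linked service.
output_dataset = {
    "name": "OutputSqlDataset",  # hypothetical name
    "properties": {
        "type": "AzureSqlTable",
        "linkedServiceName": {
            "referenceName": "AzureSqlLinkedService",  # hypothetical linked service
            "type": "LinkedServiceReference",
        },
        "typeProperties": {"tableName": "dbo.Users"},  # example output table
    },
}
```

The key idea is that each dataset is just a named pointer into a store that a linked service already knows how to reach.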
After we've defined these, we need to define the job that will move the data.
In Azure Data Factory, this job is referred to as an activity.
The most popular activity in Azure Data Factory is the copy activity, which simply moves data from a source to a destination.
An activity usually goes within a pipeline. So what that means is we typically have multiple chains of activities, and then we group them together within something called a pipeline in Azure Data Factory.
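Putting the pieces together, here's a sketch of a pipeline containing one copy activity, again as an ADF-style JSON definition written as a Python dictionary. The pipeline name, activity name, and the dataset names it references are hypothetical examples, not values from the lesson.

```python
# Rough sketch of a pipeline wrapping a single copy activity.
# All names (pipeline, activity, datasets) are hypothetical.
pipeline = {
    "name": "CopyUsersPipeline",  # hypothetical name
    "properties": {
        "activities": [
            {
                "name": "CopyBlobToSql",
                "type": "Copy",  # the copy activity: moves data source -> sink
                "inputs": [
                    {"referenceName": "InputCsvDataset", "type": "DatasetReference"}
                ],
                "outputs": [
                    {"referenceName": "OutputSqlDataset", "type": "DatasetReference"}
                ],
                "typeProperties": {
                    "source": {"type": "DelimitedTextSource"},  # read the CSV blob
                    "sink": {"type": "AzureSqlSink"},           # write the SQL table
                },
            }
        ]
    },
}
```

Note how the activity only refers to datasets by name; the datasets in turn refer to linked services, which hold the actual connection details. That layering is the whole design.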
So this whole process involves connecting to external data stores and copying data from a source to a destination.
By the way, the destination is also referred to as a sink in Azure Data Factory.
But this whole process that we're describing is going to have to execute somewhere.
It's going to have to execute in an environment; it's going to need compute to be able to perform the processes of connecting to data stores and collecting information regarding the datasets.
That's where the integration runtime comes in.
The integration runtime in Azure Data Factory is the compute infrastructure used by Azure Data Factory to provide its data integration capabilities. In other words, an integration runtime is what provides the bridge between our activities
and the linked services.
And of course, all of these will be defined under the Azure Data Factory service.
This slide is just to give you a little bit of an overview of some of the built-in connectors that exist in Azure Data Factory. So when we're talking about being able to move data between different data sources, and being able to do data transformation,
this is the scope of what we're referring to.
Whenever you see light blue, that means Azure Data Factory supports read and write; in other words, it can use this data store as an input or as an output. And wherever you see light purple,
that means it supports that store as an input only; it supports read only. So, for example, we could move data from Azure Blob into Azure Cosmos DB, or from Azure Data Lake Gen2 into Salesforce. So we can move data between these different external data sources,
these external data stores.
So here's some other useful information about Azure Data Factory.
Azure Data Factory has two versions: we have version 1, and we have version 2.
Version 2 is an improvement over version 1. It adds support for capabilities like pipeline runs, activity runs, and trigger runs. In other words, it's much more mature, and it has many more capabilities and integrations than version 1 does. So we always want to use version 2 going forward.
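As an illustration of one of those version 2 capabilities, here's a rough sketch of a schedule trigger definition that would kick off a pipeline on a recurring basis, again written as an ADF-style JSON dictionary in Python. The trigger name, schedule, and the pipeline name it references are all hypothetical; this is a sketch of the general shape, not an exact definition from the lesson.

```python
# Rough sketch of a version 2 schedule trigger (the feature behind "trigger runs").
# Trigger name, start time, and the referenced pipeline name are hypothetical.
schedule_trigger = {
    "name": "DailyCopyTrigger",  # hypothetical name
    "properties": {
        "type": "ScheduleTrigger",
        "typeProperties": {
            "recurrence": {
                "frequency": "Day",   # run once per day
                "interval": 1,
                "startTime": "2020-01-01T00:00:00Z",  # example start time
            }
        },
        "pipelines": [
            {
                "pipelineReference": {
                    "referenceName": "CopyUsersPipeline",  # hypothetical pipeline
                    "type": "PipelineReference",
                }
            }
        ],
    },
}
```

Each firing of a trigger like this produces a trigger run, which in turn produces pipeline runs and activity runs; those are the run-history concepts that version 2 added.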
Here are some of the other use cases of Azure Data Factory beyond what I described earlier.
So, for example, it allows us to build dynamic data pipelines for big data workflows. Some of these we've already talked about: data movement and data transformation as we move data between different data sources.
Here's another good use case for Azure Data Factory: we can migrate SSIS to Azure Data Factory.
So what that means is, the SQL Server Integration Services packages that we may be running on premises, we can actually do a lift and shift of this service and shift it into Azure Data Factory. And what that means is that we can have a whole platform-as-a-service solution, with SQL Server Integration Services running in Azure Data Factory,
and the databases existing in Azure SQL or Azure SQL Managed Instance.
This brings me to the end of this lesson. Thanks very much for watching, and I'll see you in the next lesson.