Hello and welcome back to Cybrary's Microsoft Azure Administrator AZ-103 course. I'm your instructor, Will Carlson, and this is Episode 14: Logs in Azure.
In today's episode, we're gonna create a Log Analytics workspace, which is just where logs are aggregated and stored in Azure. We're then gonna configure a resource to actually send its logs off to that workspace, and then we're gonna query those logs in Azure itself.
Jumping right into the portal, we're gonna get started by coming up here to All services
and searching for Log Analytics.
We're gonna click here on Log Analytics workspaces,
and we're gonna add a workspace.
This is relatively straightforward. This name does need to be unique,
and we're gonna attach this to a resource group of our choosing
and put it in the region that we would like it to be in.
As for the pricing tier, you can click on that,
and in my free subscription I don't have any other options, but you can check on Log Analytics pricing in more detail by clicking on this documentation link right here.
We're gonna go ahead and select OK,
and that's going to deploy that Log Analytics workspace. While that's working, I want to mention
and remind you that Log Analytics is not on by default. We did not have a workspace until we created one. That also means that logging is not turned on for our resources by default either, so we're gonna have to deploy logging onto the resources that we want logging for.
Logging is also not available in all regions, and each subscription can have multiple
workspaces as well. One use of that may be to segregate your logs, with networking, Linux, and Windows logs each in their own workspace. But now that that workspace is created, like I mentioned, we've gotta configure a resource to send logs into this workspace.
And in order to do that, we're gonna refresh this page.
Our Log Analytics workspace shows up right here; gonna go ahead and click on that.
And then I'm gonna come down here to Workspace Data Sources,
gonna click on Virtual machines,
and then we're gonna click on this virtual machine that I want to get connected into Log Analytics.
Because this is an Azure-based virtual machine, this is really pretty simple. I can deploy the Log Analytics agent right from here simply by selecting Connect.
It's gonna take a little bit of time to get that Log Analytics agent installed on this Linux machine, and when that's done, this machine will begin sending its logs over to the Log Analytics workspace.
It is interesting to note that the Log Analytics agent can be installed on both
systems here in Azure and on-premises systems as well.
Now that this virtual machine is configured to send logs into the workspace, we're going to give it a little bit of time to run. And while we wait on some logs to populate there,
I'm gonna go back here into Log Analytics workspaces, click back on this workspace,
and talk very briefly about some of the data sources here in a workspace. We already deployed a virtual machine to send logs into the workspace.
You can also have logs that are sent to a storage account ultimately be consumed by the Log Analytics workspace as well.
There are a number of places here in Azure where you can have logs sent into a storage account, and then you can have a Log Analytics workspace consume those logs from that storage account,
like the Azure activity logs we've talked about in previous episodes. Those are gonna be the ARM API logs. By default those are stored for around three months, and if you want to keep them for longer, you can ultimately have them consumed by a workspace as well.
So there are a number of ways and sources of logs for Log Analytics workspaces
that are built in here to Azure. You can also deploy the agent onto on-premises resources and have it store the logs in the workspace as well.
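As a sketch of what querying those collected activity logs can look like once they're in a workspace, here's a hypothetical Kusto query against the AzureActivity table (the table name Log Analytics uses for subscription activity log data; the column names here are assumptions and may vary by schema version):

```kusto
// Hypothetical example: count activity log operations over the last 7 days,
// grouped by operation name. Assumes activity logs are connected to this workspace.
AzureActivity
| where TimeGenerated > ago(7d)
| summarize Count = count() by OperationName
| order by Count desc
```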
Now we're gonna go back over to the Monitor blade, back into Logs, and check out some of the logs that have been sent by our virtual machine.
And here in Azure Monitor Logs, you can see the log analysis window. If you have any experience with SQL, the language that's here in Log Analytics will be familiar to you. The language is called Kusto. If you'd like to look up a little bit more about Kusto, you're definitely welcome to, and again, the queries are very similar to SQL queries.
Also, if you have any experience with other log analytics tools such as Splunk, Graylog, and a whole host of others, this will seem very familiar to you as well.
Since we just recently got our virtual machine sending logs over here to Log Analytics, we don't have a whole lot of interesting information here yet, but we have enough to show you the gist of Kusto and how the Log Analytics query tool works.
So if I click on Computer availability here and go ahead and run this list-heartbeats query, that's gonna show me all of the heartbeats that have been sent from our virtual machine.
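That built-in heartbeats query is essentially a query against the Heartbeat table. A minimal version might look something like this (the projected column names are assumptions based on the common Heartbeat schema):

```kusto
// List heartbeats received from connected machines over the last hour
Heartbeat
| where TimeGenerated > ago(1h)
| project TimeGenerated, Computer, OSType, ComputerIP
```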
I can expand that, and it will show me all of the details of this particular log message. Now, again, I want to remind you:
Azure hasn't reinvented the wheel here. Log Analytics is simply a way to consume logs that we generate in our IT lives all across the board. So think of it as a centralized Windows Event Viewer.
So none of these logs are particularly unique in and of themselves. They're going to be different based on the source of those logs; the content of those logs is gonna be largely dependent upon the source.
A Fortinet piece of equipment may send different-looking logs than a Cisco piece of equipment, but ultimately they're both likely to be syslog in format.
So, a little bit about Kusto here: you can see that we're searching the Heartbeat table, a pipe character is going to separate commands, and we're searching where the time generated is greater than one hour ago.
This is how Kusto is gonna work; the pipe character is very important. There are gonna be a lot of other Kusto commands, such as summarize and bin and a whole host of others, so I advise you to
take a little bit of time to look through the Microsoft documentation on Kusto and, for AZ-103 at least, to be familiar with some of the main commands within Kusto; where, summarize, and bin are some of the major ones that you might look for, with Heartbeat here as an example table.
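To show summarize and bin working together, here's a small sketch that buckets the heartbeats we just looked at into 5-minute bins per computer (column alias names are my own):

```kusto
// Count heartbeats per computer in 5-minute buckets over the last hour
Heartbeat
| where TimeGenerated > ago(1h)
| summarize HeartbeatCount = count() by Computer, bin(TimeGenerated, 5m)
| order by TimeGenerated asc
```

A healthy machine should show a steady count in every bucket; a gap in the buckets is a quick visual cue that the machine stopped reporting.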
Another interesting thing about log data is that it is immutable. You cannot go in and delete one individual log entry here in Log Analytics.
The logs have to be deleted or removed either by deleting the workspace completely or through the use of retention rules set here in Azure.
The results here on screen are also limited to a meager 10,000 rows. So if you have more than that, you'll have to filter down to fewer than that, or it will truncate the results for you.
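Rather than relying on that truncation, you can cap the result set yourself in the query; a minimal sketch:

```kusto
// Explicitly return at most 100 rows; take makes no ordering guarantee,
// so sort first if you want the most recent entries
Heartbeat
| sort by TimeGenerated desc
| take 100
```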
Another interesting thing is that you can save queries here for reuse, because, much like with metrics, if I come out of Log Analytics and back, this will clear out and start over fresh.
I can export this query. I can also, as with some other similar areas of Azure, pin this to the dashboard and see these results on my dashboard. That's not particularly telling for this particular log, but maybe you wanted to see failed login events for a particular Windows server or your domain controller;
you might want that pinned to your dashboard so you could see if there were new alerts there.
You can also create an alert rule. So if there was a particular log that fired off, say, five failed login attempts or a locked-out user account, and you wanted to be alerted about that for a particularly sensitive machine, you could set up your query here in Kusto, run it, make sure the results were what you expected,
and then you could ultimately
create an alert rule from this Log Analytics workspace.
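A query behind that kind of failed-login alert might look like the sketch below. The SecurityEvent table and event ID 4625 (failed Windows logon) are assumptions here; that table is only populated when Windows security event collection is enabled for the workspace, and the machine name is purely hypothetical:

```kusto
// Hypothetical: surface computers with 5+ failed Windows logons (EventID 4625)
// in the last 15 minutes, suitable as the basis for an alert rule
SecurityEvent
| where TimeGenerated > ago(15m)
| where EventID == 4625
| where Computer == "sensitive-server01"   // hypothetical machine name
| summarize FailedAttempts = count() by Computer, TargetAccount
| where FailedAttempts >= 5
```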
So in today's episode, we talked about the fact that we store logs in Azure using a Log Analytics workspace.
We now know that logs are not sent there by default; we have to configure resources to send their log files to the workspace.
And we also talked a bit about Kusto and the Log Analytics query tool, and how it's very similar to a Splunk or a Graylog central logging tool as well.
Coming up next, we're going to be talking about some ways in Azure, as an administrator, to help you control costs, both by
reducing resource utilization, particularly on items that are not being used to their fullest extent, and also by setting up some alerting and monitoring related to budgets. Thanks so much for joining me today. I'm looking forward to the next episode.