Hello and welcome back to the Splunk Enterprise Certified Administrator course on Cybrary. This is lesson 10.2, where we'll be doing a deep dive into props and transforms and the configurations that you'll commonly be making in those files. The learning objectives are to first talk about what makes props and transforms unique, then talk about which settings each of them configures. We'll do a deep dive into each file, respectively, talk about the "magic eight" props configurations that you should always be making for each of your data inputs, and then talk about how props and transforms work together, and we'll show you an example of that.
Why are we learning this? This is going to be crucial for making sure that you configure your data inputs so that the data looks good on ingest, has the proper field names, and is routed properly. Making these configurations for your Splunk inputs is crucial to a good, well-managed Splunk environment, and the props "magic eight" will also be huge for increasing performance efficiency across your entire environment. So it's very important that we know this information.
So what makes props and transforms unique? First, as we've discussed on and off throughout the course, they apply at multiple times, or at least props does: props configurations can occur at ingest, at index time, or at search time. So that's important to know. The other thing that's unique about these two files is that they work in tandem: if you create a transforms stanza, it does not do anything unless you invoke it via props.conf. So that is important to know.
So what does props configure? Basically line breaking, timestamp extraction, and character set encoding. It allows you to do some data anonymization, though for more advanced cases you'll need to use transforms as well. It also lets you do basic rerouting of your data, and what I mean by that is just a basic change of a metadata field. It allows you to invoke transforms stanzas, and it also handles your field extractions, field aliases, calculated fields, lookups, things of that nature.
Among the things we mentioned, there are eight specific configurations that should be made for every single input that you have in Splunk, and this is a list of those eight settings, so we'll go through them basically in categories in the next few slides. First, there are the settings that define event format. There are four of those, but they occur at different times.
SHOULD_LINEMERGE, as we discussed previously, you should set to false, because instead of using the default line breaking, where every event is broken into individual lines and then reassembled, we're going to cut that out. So get rid of SHOULD_LINEMERGE by setting it to false, and then use LINE_BREAKER to express a regular expression that identifies a unique pattern signifying one event ending and another beginning. That way your events will only break on the boundary between events, versus per line. If you set those two settings at the parsing phase on your indexer or heavy forwarder, you get to eliminate that entire line-merging pipeline, and that gives you quite a bit of efficiency.
Then settings seven and eight are for your forwarders, and they basically interact with the way that Splunk forwards data. Generally it'll send it in blocks of, I believe, 512 megabytes, but I could be wrong on that, so don't quote me. When Splunk does it just based on size, you essentially run the risk of cutting off parts of events. So if you don't define for Splunk how to find the end of an event, it leads to you having to resend data and send confirmations back and forth ("hey, did you get a complete message?"), and it just makes the Splunk-to-Splunk communication a lot more convoluted. If you use these two settings, instead of sticking to a hard-set size limit on what data it sends, Splunk will make sure it always sends at the end of a complete event. So you're going to set EVENT_BREAKER_ENABLE = true to say, hey, we're going to use the event breaker, and then specify that same regex that you were using in LINE_BREAKER in EVENT_BREAKER, so that Splunk knows how to break events properly.
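To make those first four settings concrete, here's a sketch of what they might look like in props.conf. The sourcetype name and the regex are hypothetical; yours will depend on your actual data.

```ini
# props.conf -- hypothetical sourcetype; adjust the regex to your data
[my_custom_log]
# Skip the line-merging pipeline entirely
SHOULD_LINEMERGE = false
# Break events on newlines followed by a date like 2024-03-01;
# Splunk discards whatever the first capture group matches
LINE_BREAKER = ([\r\n]+)\d{4}-\d{2}-\d{2}
# Forwarder-side settings: chunk outbound data on event boundaries
EVENT_BREAKER_ENABLE = true
EVENT_BREAKER = ([\r\n]+)\d{4}-\d{2}-\d{2}
```

Notice that LINE_BREAKER and EVENT_BREAKER use the same regex, just as described above.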
The next couple of settings are going to assist Splunk with timestamp extraction. Pulling timestamps from logs is very, very important to Splunk. These happen at parsing time as well, and if you don't specify these settings, Splunk will basically run through the log over and over, trying to match different time formats, which can lead to a lot of processing. With these three settings you can explicitly tell Splunk where to find the timestamp, what it should look like, and how many characters long it should be, and that will make timestamp extraction much more efficient. TIME_PREFIX is just a regex that specifies what data in the log precedes the timestamp; a lot of the time the timestamp is the first thing, so it's just a caret saying, hey, the timestamp is right at the beginning of the log. TIME_FORMAT uses strptime variables to describe what the timestamp will look like, and MAX_TIMESTAMP_LOOKAHEAD is just the number of characters that are in the timestamp.
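TIME_FORMAT uses strptime-style variables, the same ones most languages expose. As a rough illustration (this log line and format string are made up for the example, not taken from the slides), here's how a format string maps onto a timestamp, using Python's datetime:

```python
from datetime import datetime

# Hypothetical log event; the timestamp is the first thing in the line,
# so TIME_PREFIX would just be ^ and MAX_TIMESTAMP_LOOKAHEAD would be 19
log_line = "2024-03-01 14:22:05 ERROR sessionID=abc123 request failed"
timestamp_text = log_line[:19]  # the 19 characters the lookahead covers

# Equivalent of: TIME_FORMAT = %Y-%m-%d %H:%M:%S
parsed = datetime.strptime(timestamp_text, "%Y-%m-%d %H:%M:%S")
print(parsed)
```

The corresponding props.conf settings for this hypothetical format would be TIME_PREFIX = ^, TIME_FORMAT = %Y-%m-%d %H:%M:%S, and MAX_TIMESTAMP_LOOKAHEAD = 19.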
And then finally you have TRUNCATE, which tells Splunk that if a single event is longer than this number of bytes, snip it at this value. The default is 10,000, and you may have to tweak that if your events regularly come in larger than 10,000. I've seen this before with JSON events, where they get truncated and that breaks the formatting; they're not even recognized as JSON anymore, because they don't have the closing curly brackets at the end. So you just have to evaluate your data to see what's the longest event that we expect a legitimate event to be, maybe build in a little bit of buffer, and then set that as your TRUNCATE value.
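Putting all eight together, a complete stanza might look like this sketch. Every name and value here is illustrative rather than taken from the slides; tune the regexes, format, lookahead, and truncate limit to your own data.

```ini
# props.conf -- the "magic eight" for a hypothetical input
[my_custom_log]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)\d{4}-\d{2}-\d{2}
EVENT_BREAKER_ENABLE = true
EVENT_BREAKER = ([\r\n]+)\d{4}-\d{2}-\d{2}
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%d %H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD = 19
# Longest legitimate event we expect, plus some buffer
TRUNCATE = 15000
```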
Now, how does props.conf work? Essentially you're going to have a stanza that specifies what data you're applying to. It will either be a sourcetype by default, or you specify host or source by putting host:: or source:: and then a value in the stanza. Then you'll have all your relevant attributes for that data within, or underneath, that stanza name. Here's an example of what that would look like: you can see the first example is just applying to a sourcetype, and the second one demonstrates how you can use source (it could be a host as well) with that double-colon syntax. Then whatever attributes are under there will be applied to that data.
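The stanza scoping just described can be sketched like this; the sourcetype, host, and path here are hypothetical placeholders.

```ini
# props.conf -- stanza headers scope which data the settings apply to

# A bare stanza name matches a sourcetype
[access_combined]
TRUNCATE = 15000

# The host:: prefix scopes the settings to a host
[host::web-server-01]
TIME_FORMAT = %Y-%m-%d %H:%M:%S

# The source:: prefix scopes the settings to a source path
[source::/var/log/app/*.log]
SHOULD_LINEMERGE = false
```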
So what does transforms configure? Now that we've talked basically all about props: anything that's a little bit more complex than what props can normally handle. A lot of what transforms does is based on matching a regex against the log and then rekeying fields; that's what filtering out events, regex-based rerouting, and anonymizing data all basically do. Also, if you need to set an advanced field, like a multivalue or delimiter-based field, or a field that begins with a number or an underscore, you can do those as well. And I'll show you basically the same thing we just talked about with props: we'll talk quickly about what makes up the transforms.conf file, and then we'll look at an example. The stanza is just an arbitrary name that describes what the settings underneath it are meant to do, and one stanza basically equates to one transform; you put all the settings underneath it, very similar to props.
And then you will actually invoke the transforms via props, and we'll show you how to do that as well. So here are a couple of examples of stanzas. The first one is [redirect]; that's just an arbitrary name, but it's searching whatever data it's applied to in props for the pattern "application", and then it's changing the index value. It takes whatever is in FORMAT and applies it to whichever key is specified in DEST_KEY. There are special ways you have to reference the metadata fields, and you can find those in the transforms.conf documentation
under the Keys section, for more information on that. The next one basically finds your session ID and obfuscates it, so that you don't actually have it in your raw log. You can see this regex basically captures everything up to where sessionID equals some word characters, plus everything after it, and the FORMAT uses $1 and $2 to refer to whatever content is in the first or second capture group, respectively. It then rewrites the actual session ID with those hash characters, and rewrites the actual raw log with all of that content. So it's saying the first capture group is everything up until the session ID; then it matches the session ID but replaces it with an obfuscated value; and then it puts in the .*, which is just the rest of the log. So basically: capture everything, replace the session ID, and then put everything back together again.
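That $1/$2 rewrite is the same backreference logic you'd use in any regex engine. Here's a small Python sketch of the idea; the log line, pattern, and number of hash characters are made up for illustration, not copied from the slide.

```python
import re

# Hypothetical raw event containing a session ID we want to obfuscate
raw = "2024-03-01 14:22:05 user=alice sessionID=abc123 status=ok"

# Group 1: everything up to and including "sessionID=".
# The session ID itself (\w+) is matched but not captured.
# Group 2: the rest of the log line.
pattern = r"(.*sessionID=)\w+(.*)"

# Equivalent of FORMAT = $1########$2 written back to _raw
anonymized = re.sub(pattern, r"\1########\2", raw)
print(anonymized)
```

The real session ID is gone from the output, but everything around it is put back exactly as it was.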
Then finally you have a couple more. One will discard events by changing the queue to nullQueue, which just means don't index, if the raw log is greater than 500 characters. Then you have one that will create a new indexed field called error code, with whatever is in the first capture group; there, you need WRITE_META = true to write a new indexed field. And finally, one rekeys the host field with whatever value is found in the first capture group of that regex.
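Sketches of those kinds of transforms.conf stanzas might look like this; the stanza names, regexes, and values are hypothetical stand-ins for whatever is on the slides.

```ini
# transforms.conf -- illustrative stanzas, not the exact slide contents

# Reroute events matching a pattern to a different index
[redirect]
REGEX = application
DEST_KEY = _MetaData:Index
FORMAT = app_index

# Discard events whose raw text exceeds 500 characters
[discard_long_events]
REGEX = ^.{500,}
DEST_KEY = queue
FORMAT = nullQueue

# Create an indexed field from the first capture group
[extract_error_code]
REGEX = error_code=(\d+)
FORMAT = error_code::$1
WRITE_META = true

# Rekey the host metadata field from the log contents
[rekey_host]
REGEX = hostname=(\S+)
DEST_KEY = MetaData:Host
FORMAT = host::$1
```

The exact key names (such as _MetaData:Index and MetaData:Host) are the special references covered in the Keys section of the transforms.conf documentation.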
Then, in order to actually use these, you have to specify a stanza in props.conf to set the scope of where you want to apply the transforms. You then use the transforms attribute to call them. Everything that comes after the TRANSFORMS- hyphen is just an arbitrary name, and after the equals sign you have to specify the actual stanza names from your transforms.conf, and then it will apply them appropriately. So that's how it works: you define it first in transforms and then invoke it via props.
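The invocation from props.conf might look like this sketch; the sourcetype, the class names after the hyphen, and the stanza names being called are all hypothetical.

```ini
# props.conf -- invoke transforms.conf stanzas for a sourcetype
[my_custom_log]
# "route" and "cleanup" are arbitrary names; the values on the right
# must match stanza names defined in transforms.conf
TRANSFORMS-route = redirect
TRANSFORMS-cleanup = discard_long_events, rekey_host
```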
So we covered the two reasons why props and transforms are unique, which are that they work in tandem and that props applies at multiple times. We talked about which settings props and transforms are each capable of configuring and looked at some examples. We talked about the eight key props configurations and why you should always be making them, and then we also demonstrated how props and transforms work in conjunction with each other. So that's everything you need to know about props and transforms, and you're ready to start making these configurations on your own. We'll see you in the next video, where we'll do some labs doing just that.