Time
8 hours 30 minutes
Difficulty
Beginner
CEU/CPE
10

Video Transcription

00:03
Okay, so for our next section in module three, we're going to talk about text streams.
00:09
Let me show you quite a few commands here
00:12
that you have available to you to manipulate
00:16
standard output, standard input and
00:19
the stream of data flowing between programs.
00:24
And these are really handy tools
00:28
to be aware of
00:29
because they allow you to do lots of different things with files and
00:33
filter out the information you want right away, without having to go hunting for it in a more manual sense.
00:43
Now, if you remember in the previous section we talked a little bit about standard input, standard output and standard error.
00:52
So these are the I/O streams. I can filter the input, I can filter the output, and I can also filter the errors.
00:59
And I may want to do this for various reasons, because I'm manipulating information. Maybe I'm trying to merge log files, get some information out of a log file, or get some sort of information counted.
01:11
There's lots of different reasons why I would do some of these things.
01:15
So
01:17
let's look at some examples.
01:19
You've already seen me use the pipe character
01:23
and the greater than symbol
01:26
pipe will take standard output
01:30
and send it to the standard input of some other command, which follows the pipe character,
01:36
whereas the redirection
01:38
symbol, the greater-than sign, will take the output that would normally go to standard out and send it to a file, typically.
01:48
I can also use the less-than symbol
01:51
to use a file as input for standard input,
01:55
so I can do that in both directions
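As a rough sketch of those three forms (the file name here is just a placeholder):
$ ls -l | wc -l          # pipe: ls's standard output becomes wc's standard input
$ ls -l > listing.txt    # > sends standard output into a file, overwriting it
$ wc -l < listing.txt    # < feeds a file to the command's standard input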
01:59
And sometimes the commands can get a little bit
02:02
difficult to read, because you've got a command, maybe you're piping the output of that to something else, and then you've got redirection from standard input to something else. It can look a little bit confusing,
02:15
but as I'll show in some of the examples here, it's not too difficult to understand once you practice a little bit.
02:22
So I've got a handful of commands here that we'll cover, and then
02:25
I'll go to another selection of commands that you need to know for the exam.
02:30
We start with the cat (concatenate) command, which I have already used several times.
02:35
This basically takes some argument on the command line and sends it to standard output.
02:39
Assuming it's a text file, this works as you would expect. If it's a binary file, you might get some strange characters on the screen, and your terminal might beep.
02:50
You can't output a binary file to the screen the way you normally would be able to view a text file.
02:55
I can also run the od command to do an octal dump.
03:00
This might be useful for programming purposes, debugging purposes.
03:05
I'll demonstrate it, but it's
03:08
not a very commonly used command by any means.
03:12
Then we have the split command, which does just what it sounds like: it allows me to take a file and split it into multiple pieces of a certain size.
03:20
This was typically done in the old days when you wanted to take a large file and fit it on some
03:25
removable media, like a floppy disk or a tape.
03:30
If it's too big, I can split it into smaller pieces, copy those separately, and get around that problem.
03:37
I've already used word count (wc) several times.
03:39
This is very handy for being able to count lines and count words.
03:45
We'll use head and tail to look at the beginning section of a file and also the end section of the file, specifying the number of lines that we want.
03:54
Then there are expand and unexpand.
03:58
These have a very easy-to-understand purpose: one converts tabs to spaces, and the other converts spaces to tabs.
04:05
Then we have translate (tr).
04:08
Translate lets me take
04:11
two different character sets and translate from one to the other.
04:15
An easy-to-understand example: I've got a file that's all lowercase and I want to make it all uppercase, or the other way around,
04:20
and I can convert other characters as well, and we'll look at the details for that.
04:26
And then we have the pr command,
04:28
which takes a file and
04:30
formats it in such a way that it's more conducive to printing.
04:35
Let's have a look
04:38
at some of these options.
04:42
You've already seen a pipe and the redirect, but I can still
04:46
do a quick example here. So I'm in my home directory
04:53
and
04:54
I'm going to...
04:57
if I run the ll command, I've got a bunch of files.
05:00
Now, I can run ll, and I can pipe it
05:05
to another command
05:09
I can pipe it into sort.
05:10
We'll see sort here in just a minute.
05:15
Now you'll notice that
05:16
the output is different.
05:20
Now, I just ran sort
05:26
without any arguments.
05:29
It sorts on the first field that it encounters. So in this case, it actually sorted on
05:34
the permissions string.
05:38
And then it goes to the next field, and the next field after that. Since these first fields are a little bit different, see how it sorted there a little bit? The second, third, and fourth fields are all the same.
05:47
And then it went
05:51
to the rest of the listing here.
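A rough equivalent of that demo (ll is commonly an alias for ls -l, so it's spelled out here; the second line is just an extra variation for comparison):
$ ls -l | sort            # sorts the long listing on its first field, the permissions string
$ ls -l | sort -k 5 -n    # sort numerically on field 5, the file size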
05:54
Now I could do something a little bit more interesting. Instead of piping this to sort, I can pipe it to grep.
06:00
And let's say I want to look for
06:02
the string file1.
06:06
This is kind of a roundabout way of doing this. I could have more easily just
06:11
typed ll file1.
06:14
Sorry.
06:17
Yeah, it's the same thing if I list
06:20
file1 with a
06:23
whoops
06:26
with the wildcard.
06:29
The nice thing about using the pipe is that I can take any command and send its output to another command,
06:36
and I can also do something like this, as I've done before: I can save the output.
06:43
So I can
06:45
save this in an output file called
06:47
out.txt.
06:50
So now I don't get anything to standard out.
06:54
But if I look in my shell, I've got a file called out.txt which I just created, and I can cat this
07:00
oops,
07:01
extra character there
07:04
and I see there's the output of my command.
07:08
So saving the output of a command is really useful
07:11
because I may need that later, or maybe I'm using it as part of a script.
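Put together, the round trip looks roughly like this (out.txt is the throwaway name used here):
$ ls -l | grep file1 > out.txt   # save the filtered listing instead of printing it
$ cat out.txt                    # confirm what landed in the file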
07:17
Now let's look at the octal dump command, od.
07:21
I can just run this
07:25
or rather, look at its help first.
07:27
So my options are that I can
07:30
specify various settings
07:33
writing the octal bytes of a file. Let me do a man page on this real quick.
07:42
So it lets me send the contents of a file in octal to standard out.
07:47
But it also supports other formats,
07:51
so decimal format and
07:56
floating point.
08:00
But for a simple demonstration, I will just
08:03
run od against
08:05
a nice binary that I know exists, /sbin/ifconfig,
08:11
and there's the ifconfig program in octal format.
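Roughly (the path to ifconfig varies by distribution; the second line is just an extra option worth knowing):
$ od /sbin/ifconfig | head    # octal dump of a binary, trimmed to the first few lines
$ od -c /etc/hostname         # -c shows characters and backslash escapes instead of octal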
08:18
then we have our split command
08:22
and what I've done is I've copied /var/log/messages
08:26
to my local directory here, so I've got a nice large file to work with.
08:37
If I run split,
08:37
with --help,
08:41
I can see that I can split by the number of bytes,
08:45
or I can split by the number of lines. So if I cat
08:50
messages.txt and pipe this
08:54
to wc -l, I can see that it's got 19,419 lines.
09:00
It's a pretty easy way
09:01
to get that information.
09:13
Now, with the split command,
09:15
since I've got a large file, messages.txt,
09:20
what I can do is decide to split this in various ways. I can say split
09:24
based on the number of lines. I know it was about 19,000.
09:28
I'm gonna put it into
09:31
5,000-line chunks.
09:37
Now, if I do a long listing, by default it names the files starting with a default naming of xaa. Then it goes to xab, xac, and so on,
09:50
and you can change the naming. But for illustration purposes,
09:54
this is fine.
09:56
So now, to test to see if this worked, I can cat xaa,
10:01
pipe that to wc -l, and it's 5,000 lines.
10:05
I'm going to go ahead and remove these. You'll notice I'm using -rf.
10:09
This says remove the files that I specify and force removal, so I don't have to acknowledge each one.
10:16
I know that x* will match those files, so I can go ahead and remove those.
10:22
Maybe I want to split based on
10:26
the number... so that was the number of lines, but I might also want to split
10:31
based on
10:35
the size of each
10:37
chunk.
10:39
So I've got
10:41
1.83 megabytes.
10:45
I want to split on
10:48
100,000.
10:52
I actually think I can specify this in...
11:03
Oh, that's right: -n is the number of chunks, so I can split it into an even number of pieces.
11:09
-b is the byte size,
11:15
and I think I can specify that in
11:18
megabytes as well.
11:22
So these are the units
11:28
So split -n 5 messages.txt
11:33
should split it into five evenly sized pieces, which is really convenient. If I know that my storage medium only supported 400,000 bytes, let's say,
11:43
I could say, okay, split it based on that divisor.
11:48
But then I can do it again with
11:52
-b and I'll say
11:54
100 meg.
11:58
Oh, actually, I should erase these other ones first.
12:03
Now run split
12:07
in 100-meg increments,
12:16
and it looks like...
12:20
well, maybe I shouldn't have done 100 meg on a file this small.
12:26
Let's try that again.
12:30
It only gives me one file,
12:31
but you see the usefulness: I can break it up by the number of lines or by size, or divide it into evenly sized chunks.
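A sketch of the three variations shown (byte counts are just examples):
$ split -l 5000 messages.txt     # by line count; output files are xaa, xab, xac, ...
$ split -b 400000 messages.txt   # into 400,000-byte pieces
$ split -n 5 messages.txt        # into five roughly equal chunks (GNU split)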
12:41
I've used the word count command several times.
12:43
Now, with word count,
12:46
I typically use it with the
12:50
-l option to count lines, but I can also count characters, I can count bytes, and I can count actual words.
12:58
So if I cat
13:01
messages.txt
13:03
and pipe that to wc -l, I get that 19,000.
13:07
But now, if I count the number of words, that's going to be much larger: 227,000.
13:13
and a word is some string of text
13:16
that's separated by a delimiter. It could be a space, it could be a semicolon in this case, or some other
13:22
delimiter, and I can define which delimiters I want to use as a separator.
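The wc options mentioned here, roughly:
$ wc -l messages.txt   # count lines
$ wc -w messages.txt   # count words
$ wc -c messages.txt   # count bytes
$ wc -m messages.txt   # count characters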
13:28
Then we have our head and tail commands. If I run head against messages.txt,
13:35
by default it gives me the first 10 lines.
13:39
Very useful. Sometimes there's information at the beginning of a file you want to look at.
13:43
I can also run head
13:46
with
13:46
a number,
13:48
and it will show me that number of lines. So head -5 shows me the first 5 lines.
13:54
Similarly, I can run tail
13:56
on messages.txt
14:00
to see the last 10 lines.
14:01
Or if I specify something else,
14:05
I can see, say, the last 20 lines.
14:09
So this is really useful
14:11
just to see the beginning or the end of the file.
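A quick sketch of those, on the same file:
$ head messages.txt         # first 10 lines by default
$ head -n 5 messages.txt    # first 5 lines
$ tail messages.txt         # last 10 lines
$ tail -n 20 messages.txt   # last 20 lines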
14:13
And even more useful is to
14:18
use the tail -f option.
14:22
So
14:24
I've got to remove this.
14:35
So what I'm going to do is run tail -f
14:41
against
14:43
out.txt.
14:46
Now, out.txt only has 22
14:52
lines of data in it
14:54
and you'll notice I didn't get my prompt back. So tail -f is waiting for more information.
14:58
So what I can do is open a new tab.
15:09
Oops, it didn't like my password.
15:20
So now I'm gonna echo a new line of data
15:28
to out.txt.
15:31
Oops. Almost made a mistake there.
15:33
We were talking about redirection earlier.
15:35
A single redirect will create a new file or overwrite an existing file. What I actually want to do is append my data, so I have to use a double redirection
15:43
in order to do this.
15:46
So now I've sent
15:48
a new line of data
15:52
to the file. That's what out.txt now contains.
15:56
But because I was running tail -f
16:00
against this file. That new line that I just added shows up on my standard output.
16:04
So you might think, why is this useful? The reason this is useful
16:08
is because there may be times when I want to
16:14
monitor a log file.
16:17
For instance, if I watch
16:22
/var/log/messages
16:25
and I do something like open a new shell
16:30
or do something else that generates a new log entry, it'll spit it out to the screen.
16:36
So
16:37
all right, there. We got another message.
16:40
So, "starting session 51": that was the new shell I just opened up here.
16:48
That's my new
16:51
session that it was talking about. I'm going to go ahead and close that,
16:59
and so this is useful because I can continue to monitor this file in a separate window without having to constantly re-run a tail command or open the file to look at it.
17:07
I can just do Control-C to get out of there.
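Roughly, the two halves of that demo (out.txt is the same scratch file; the echoed text is just an example):
$ tail -f out.txt                         # window 1: watch the file; Ctrl-C to stop
$ echo 'a new line of data' >> out.txt    # window 2: >> appends, and the line appears in window 1
$ tail -f /var/log/messages               # the same idea applied to a live log file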
17:12
So, expand and unexpand. I have some files here;
17:18
for instance, I have
17:19
my file1.txt, which I created earlier, has some items in it, and file2.txt has a set of items, which we'll look at here in a little bit.
17:36
So there are spaces there in
17:40
the file.
17:45
So now if I create a file
17:48
that has some tabs in it,
17:51
like file3.txt here, I've got four lines with tabs.
17:56
I can run the expand command and tell it:
18:00
maybe I don't want these tabs. I want to
18:03
reduce each tab to... instead of what looks like
18:08
five... well, we have
18:15
seven characters, I want to reduce each tab to two characters.
18:22
Or actually, that should give me two. But...
18:33
oh, it's two, including the first line. So if I change it to four,
18:37
it's 1, 2, 3, 4.
18:42
So it's pretty useful for doing a little bit of formatting.
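A sketch of that, using the file3.txt created here:
$ expand -t 2 file3.txt    # turn each tab into 2 spaces
$ expand -t 4 file3.txt    # or into 4 spaces
$ unexpand -a file3.txt    # convert runs of spaces back into tabs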
18:48
Our next command to consider is translate (tr),
18:53
and tr, to me, is kind of useful. So, for instance, if I have a
18:57
file
19:00
called translate.txt, where I just typed in the alphabet.
19:03
This is a real simple example.
19:06
But what I can do is I can
19:08
cat this file
19:11
and pipe that to tr,
19:15
and the syntax here is a little bit tricky. But basically what I'm saying is I want to translate all letters A through, let's say, M.
19:26
Oops.
19:27
I want to translate those characters only
19:30
to lowercase.
19:37
So you notice
19:38
a through M
19:41
were uppercase. Now they're lowercase, and I can go backwards and forwards with this, of course.
19:45
Looking at the help for tr,
19:51
I can do things like... let me go back up here for a second.
19:56
I can delete characters, so I can take
19:57
lowercase letters and get rid of them all, or get rid of all my uppercase letters.
20:02
Or I can translate
20:04
using these different
20:08
sequences here: all letters and digits,
20:11
or all letters. So, alphanumeric: I can translate letters and digits in that same range.
20:18
I can translate all my control characters and non-printable characters, things like form feeds, newlines, and so on. I can get rid of spaces.
20:27
It's a little bit similar to the way that the stream editor, sed, works,
20:32
but I can do it in a little bit more of a
20:36
precise manner, to basically say: take all these characters and convert them to these other characters.
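A sketch of those tr calls on the translate.txt file used here:
$ cat translate.txt | tr 'A-M' 'a-m'           # lowercase only the letters A through M
$ tr '[:upper:]' '[:lower:]' < translate.txt   # lowercase everything, using a character class
$ tr -d '[:space:]' < translate.txt            # -d deletes the listed characters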
20:42
Then I have the print-formatting command, the pr command.
20:45
So if I run
20:48
pr against messages.txt, which we know is that 19,000-
20:55
line long file,
20:56
I'll run pr and send it to more,
21:00
and you'll notice
21:02
it gives me a nice header at the top, tells me what the date and time is
21:07
what the file is called, and the page number,
21:08
and as I hit space, I get my next page number.
21:15
So if I was printing this, this could be pretty handy, and you can obviously change the number of lines per printed page according to your needs.
21:23
pretty handy.
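Roughly, the pr calls would look like this (the page length of 40 is just an example):
$ pr messages.txt | more        # paginate with a header: date, file name, page number
$ pr -l 40 messages.txt | more  # -l sets the number of lines per page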
21:26
So those are some good commands to be familiar with. We have some other ones to look at as well. Before we're done with this module,
21:32
I can number the lines in a file, which is very handy.
21:37
And I can also control the width of the file as I send it to standard out or to some other
21:45
output.
21:47
We've got the sort command. This lets me manipulate data and sort it according to various
21:52
parameters that I specify.
21:56
Now, the uniq command has a little bit of overlap with sort, because with uniq
22:00
I want to remove duplicates, but I can also do this with the sort command directly.
22:04
And we've got the cut command, which is very
22:07
useful when I've got a file that has delimiters. I can easily go in and only see certain pieces of that file.
22:18
The paste command lets me take two files that have matching field definitions and paste them together.
22:25
This has limited usefulness for your typical daily tasks, but we'll see how it works nonetheless.
22:33
You saw the split command earlier; the converse of that is the join command. I can take multiple files and join them
22:41
into a
22:44
single file, if I wish.
22:45
Then we'll go through the stream editor, or sed,
22:48
and show a couple of quick examples for that.
22:52
And then finally the diff command, which lets me compare two files directly to see what the differences might be.
23:00
And that's useful when you've got two different config files and you're not sure what changes someone made. I can run the diff command to see where they
23:07
might be
23:10
differing from each other.
23:11
All right, so let's have a look.
23:15
First, we'll start with number lines, nl.
23:21
So, nl numbers the lines in the file. I've got some different options, but the basic function is just to run nl against some file.
23:30
We know that this file messages.txt that I copied earlier has 19,000 lines, 19,419. So it numbers each of those.
23:38
now this just went to standard out. I could also redirect this
23:45
as we saw earlier,
23:47
to
23:48
messages... let's call it numbered,
23:59
and I've saved that file
24:00
with the line numbers attached. Pretty handy.
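As a sketch of that, using the same file names:
$ nl messages.txt | head                # number every line, peek at the first few
$ nl messages.txt > messages.numbered   # save a numbered copy, as in the demo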
24:06
And we have the format command, fmt.
24:10
So, if I format...
24:12
if I run fmt against file1.txt,
24:17
what it's doing is joining the lines together
24:21
that were
24:22
previously separate.
24:26
You can see this because I can cat the old file,
24:30
but what I might want to do to make this more useful is use the -w
24:38
option... whoops.
24:41
Okay, the width.
24:44
So I said, okay, only give me a five-character width. So it'll give me five characters for the first column,
24:51
five for the second. Even though a word is longer than five, it'll still print it.
24:55
And so this way I can
25:00
tell the formatter how wide I want it to be. If I say 20 characters, it'll fit up to 20 characters on each line,
25:07
and you can see why that might be useful for taking certain kinds of data and formatting it to be printed or to be sent to another program.
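A quick sketch of the fmt calls shown here:
$ fmt file1.txt         # re-flow lines to the default width (roughly 75 columns)
$ fmt -w 20 file1.txt   # wrap so each output line is at most 20 characters wide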
25:18
I don't really use fmt too much. The sort command, however, is
25:22
much more useful, much more regularly useful.
25:29
So what I've done is I've copied the password file
25:32
to my current directory and called it password.txt. It's a good one to demonstrate some of these techniques.
25:40
So if I cat password.txt,
25:44
it's in the order that it's in. But maybe I don't want that. Maybe I want to sort it
25:51
Now, it'll look at the first field and sort according to that value.
26:00
There it is. I should have cleared the screen first to see that more easily. Let's do that again
26:06
and there, pipe this to more so we don't lose everything scrolling off the screen.
26:10
So now we can see the sorting on that first field; everything is alphabetical,
26:15
and I can save this output if I want to: sort this file and
26:19
capture that.
26:22
But let's look at a different file. I've got one here called sort.txt, which I created,
26:27
and this is a series of numbers. These could be the first field of data entries, could be anything, but it's a good way to see how this works.
26:34
Now, if I run sort against sort.txt,
26:41
it sorts them, but not in the way that we might expect.
26:45
This is what's called lexicographical sorting. So 11 is greater than 1 according to this sorting method, 14 is greater than 11, and 12 is also greater than 1.
26:57
This kind of sorting is not the way humans normally think of sorting, so it's a little bit unusual,
27:03
but it is still a
27:04
way to sort characters according to a certain algorithm.
27:10
Now, if I want to sort these numerically, I can type sort -n
27:15
for this file
27:18
whoops, that was the wrong one: sort -n for sort.txt.
27:22
Now it shows me the numbers sorted in an order that's more predictable. This is the way people normally sort things.
27:30
But as you can see, I've got some
27:33
duplicates here.
27:34
So what I can do
27:37
is add the -u option to say: sort according to the numerical setting, but also remove the duplicates. So now I've got my numbers sorted correctly with the duplicate lines removed. That's very handy.
27:55
Now, I can also run the uniq command.
28:03
It's good because... well, let's do this again. So there's sort.txt,
28:07
a bunch of unsorted numbers with a bunch of duplicates. If I run uniq against this data,
28:12
I can get rid of the duplicates, but it's still not sorted.
28:15
So, as I mentioned before, there's more than one way to do things in Unix and Linux.
28:21
I can run uniq
28:22
and then pipe this output to sort -n.
28:26
The duplicates are already removed, but I want it in numerical order. So now I've got the same result.
28:32
So I can do this in multiple ways. Personally, I think sort -nu makes more sense
28:37
because it's less typing, but you can see why
28:41
that might be
28:42
an option.
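A sketch of the two approaches on the same sort.txt; note that uniq on its own only collapses duplicates that happen to sit on adjacent lines:
$ sort sort.txt            # lexicographic: 11 sorts ahead of 2
$ sort -n sort.txt         # numeric order
$ sort -nu sort.txt        # numeric order with duplicates removed
$ uniq sort.txt | sort -n  # the long way around: drop adjacent duplicates, then sort numerically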
28:47
All right, now we have the cut command.
28:49
So, since I've got my password file,
28:56
that's the copy of the passwd file, maybe I only want to look at
29:00
a listing of the username, which is the first field, and the user ID number, which is the third field.
29:08
So what I can do is cat password.txt
29:12
and I can pipe this to cut.
29:15
Now, if I run the command without any
29:19
options,
29:21
it tells me I need to specify some parameters
29:23
so let's look at the help for cut.
29:29
The main thing that we're concerned with is the delimiter.
29:34
It talks about the delimiters here;
29:37
in this case the delimiter is
29:40
the colon character, and then I want to tell it which fields I want.
29:47
So I will cat password.txt,
29:52
pipe that to cut
29:52
and tell it that my delimiter is
29:57
the colon character.
30:00
And then I want to look at fields one and three.
30:04
So
30:06
that did exactly what I wanted.
30:08
I've got each username, that's the first field, and the third field was the user ID number; we know root is zero,
30:18
and so on.
30:23
Maybe I also want to look at the home directory,
30:30
so that should be field number seven.
30:34
So I want to see 1, 3, and 7.
30:38
Pretty simple.
30:40
The problem with cut is that you need a
30:44
file that's got delimiters and is consistently formatted that way.
30:49
Otherwise, it's hard to tell the program where you want to do the cutting to display that information.
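A sketch of those cut commands on the copied password.txt:
$ cat password.txt | cut -d ':' -f 1,3   # colon delimiter; show fields 1 and 3
$ cut -d ':' -f 1,3,7 password.txt       # same thing without cat, adding field 7 as in the demo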
30:57
Then we have our paste command.
31:00
So
31:02
if I cat
31:03
file1
31:07
.txt and file2.txt,
31:10
we can see that those are two files with six lines each.
31:15
I can paste them together, and this might be useful in certain cases
31:18
where I've got data that's very consistently formatted.
31:22
And I want to
31:25
take these two files like this and put them together, paste them together side by side.
31:30
So now I can see I've got line one from both files pasted together, line two, and all the way through line six.
31:37
There are certain instances where this capability is pretty useful.
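Roughly, on the two files shown (the comma is just an example of an alternate separator):
$ paste file1.txt file2.txt         # matching lines side by side, separated by a tab
$ paste -d ',' file1.txt file2.txt  # -d swaps the separator for a comma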
31:41
I can also join files together.
31:45
So if I wanted to join file1
31:48
with file2,
31:52
now, since
31:55
these files share a common
31:59
field in the beginning,
32:00
which is the line number in this case,
32:02
it doesn't show this in the output, but I've joined the two files together.
32:07
This is joining it just to standard out, but I didn't actually create anything. I would have to save this
32:15
as a
32:16
separate item.
32:20
So remember, I ran split.
32:22
I could run split,
32:27
-n 2 on messages.txt.
32:30
And we see that I've just created these two files, right?
32:36
Now, what I can do is I can join
32:40
xaa with xab
32:45
and save the output as a file called
32:47
joined.txt.
32:53
Oh, see, it's complaining that they're not sorted. So
32:58
the join command does require
33:00
files to be sorted.
33:00
And so... that's probably a bad example. What I did earlier was
33:07
I took the file1 and file2 that I was using and created sorted versions of those.
33:12
So if I take file1
33:15
sorted
33:21
and file2 sorted,
33:22
I can join those together
33:31
Now, that didn't send any output to my screen because I'm
33:38
joining them together, but not actually showing the output on standard out.
33:44
If I had a really large file, there is a way to join things together like I just showed,
33:51
and I think what you have to do there is use the
33:53
nocheck-order option, which tells the join command: don't bother checking whether the two files you're trying to join together are sorted already.
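A sketch of that sequence; the sorted copies are assumed to be named file1.sorted and file2.sorted (the exact names aren't given here), and --nocheck-order is the GNU join spelling of the option mentioned:
$ sort file1.txt > file1.sorted
$ sort file2.txt > file2.sorted
$ join file1.sorted file2.sorted > joined.txt   # join on the first field; inputs must be sorted
$ join --nocheck-order xaa xab                  # skip the is-it-sorted check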
34:06
All right, moving on, we have the sed command.
34:09
Now, sed
34:13
is a stream editor, so I can use regular expressions
34:16
to change words, modify text, and do search-and-replace type operations.
34:27
So I've got a file called sed.txt,
34:30
and it just contains a sentence: "this is a test using sed."
34:35
What I can do is cat sed.txt,
34:38
pipe this to sed,
34:42
And I can
34:45
change, For instance, the word
34:47
"this" to "that." So I use s for substitute.
34:52
I'll substitute this for that.
34:59
Whoops, I forgot something.
35:00
There's always some detail that you need to specify. If I'm using regular expressions, I need to tell sed with the -e
35:09
option.
35:17
Okay, so my message here is telling me this is unterminated. What I forgot to do was type in the trailing slash.
35:25
So syntax is obviously very important.
35:28
What I'm doing is saying
35:30
Substitute this
35:31
for that.
35:32
And if I don't use the trailing slash before my single quote, it doesn't know what I'm trying to accomplish.
35:40
now, if I run it
35:44
previously it said "this is a test"; now it says "that is a test."
35:50
pretty useful
35:52
And sed has quite a few
35:54
options
35:57
for doing more advanced things. For instance,
36:00
I can run this again,
36:04
and now I'll change this too,
36:07
sending the output back to sed.txt.
36:10
I'm sorry, I'll call it
36:13
sed2.txt instead.
36:20
So what I've done is made a change to the file and then created a new file with that output.
36:27
I can also do this in a little bit more complex way.
36:30
I can say: also substitute
36:32
test for experiment
36:37
Whoops.
36:54
All right, so I made a little bit of a minor error there,
36:59
and what I did wrong was...
37:01
the syntax again is kind of tricky. Sometimes we have to refer to a man page and double-check our work.
37:07
So I'm using sed -e, which lets me
37:10
use regular expressions, but I'm making multiple replacements on the same line,
37:17
and I'll get rid of this.
37:22
So the
37:24
file we're going to operate on, you can see it;
37:29
I'm already catting it.
37:30
"This is a test using sed" — that's what sed.txt contains. So I changed "this" to "that" and then changed "test" to "experiment."
37:39
So now we have "that is an experiment using sed" instead of "this is a test using sed."
37:45
I did this using cat at the beginning of the command line.
37:49
I could have just as easily run sed directly
37:52
and then specified,
37:58
sed.txt at the end of the line. So again, there are multiple ways: I can use cat to send the output through a pipe, or run it this way
38:06
for more convenience.
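Putting that demo together, roughly (sed.txt and sed2.txt are the names used above):
$ cat sed.txt | sed -e 's/this/that/'                    # substitute 'this' with 'that'
$ sed -e 's/this/that/' -e 's/test/experiment/' sed.txt  # two substitutions in one pass, reading the file directly
$ sed -e 's/this/that/' sed.txt > sed2.txt               # save the edited stream as a new file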
38:09
And then we have to look at our diff command.
38:13
So if I run,
38:15
let's say I'm going to cat
38:17
file1.txt
38:21
and file2.txt.
38:23
Each has six lines. They're all different except for the last line which matches.
38:29
And I did this on purpose in order to illustrate how diff works.
38:35
So if I run diff on file1 and file2,
38:38
it tells me that the first 5 lines
38:42
in the first file, which is the arrow pointing to the left, that's the first file I specified,
38:47
are different than
38:50
the next five lines in the second file
38:53
Line six, which exists in both files, I called "matched" just to illustrate this. Since line six matches, it doesn't send me any output.
39:07
Now, I can show only those lines that match, if I wish;
39:17
there are ways to do that. I can specify the output width of the listing I'm looking at,
39:25
and quite a few other things that are really useful. But the main thing with diff is that I'm trying to
39:32
compare two files together and see where the differences are
39:37
in order to
39:39
be able to do some work with that information
39:47
so I could ignore matching lines.
39:52
I can ignore the case
39:54
or deal with spaces, and so on. Sometimes diff will get a little bit confused when you've got spaces and tabs.
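A sketch of those diff options (the flags shown are the common GNU diff ones):
$ diff file1.txt file2.txt     # '<' lines come from the first file, '>' lines from the second
$ diff -i file1.txt file2.txt  # ignore differences in case
$ diff -w file1.txt file2.txt  # ignore differences in whitespace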
40:04
All right, so that's a big list of commands that we looked at.
40:07
And
40:08
I recommend doing a little bit of practice with each of these,
40:12
looking at the man page and looking at the help in order to get yourself better prepared for the exam.
40:17
All right, that ends our section on processing text streams. Next, we will get into file management.
40:24
See you then. Thank you.


Instructed By

Dean Pompilio
CEO of SteppingStone Solutions
Instructor