WGet Lab

[toggle_content title="Transcript"] Leo Dregier here. I want to talk with you about a tool called wget, short for "web get." It is great for retrieving something from a web server when you know exactly what the target is and what you want to retrieve: it goes out to the internet, grabs whatever you ask for from a particular server, and copies it into your current working directory. If I print my working directory, you can see where we are. If we run wget with no arguments, it complains that the URL is missing and shows the format of the command: wget [options] URL. If you want to see the detailed parameters, run wget --help, and that will show you the full capabilities of this tool. We are not going to cover all of them in this video, but I want to illustrate that it is an extremely powerful utility, especially for scripting. You have basic options like showing the version or running in the background, and you have specific logging options. You can set the number of download tries, choose where to put the output, and control what to do with the server response. If you just want to check whether a file is there without actually downloading it, you can use this tool in spider mode. You can specify that it only go over IPv4 networks, or only over IPv6 networks. There are specific username and password options for FTP and HTTP purposes, so if you need something that requires credentials, you would add them here for scripting purposes — though be careful, because you do not want usernames and passwords sitting in cleartext. You can choose whether or not to grab a directory structure, and there is a handful of HTTP-specific options: again usernames and passwords, client settings, headers, cookies, POST strings, and options to add all of that to your logs.
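The options mentioned above can be sketched as concrete commands. This is a hedged reference, not part of the original lab: example.com is a placeholder target, and the credentials are dummies; the flags themselves are standard GNU Wget switches.

```shell
# Check that a file exists without downloading it (spider mode):
wget --spider http://example.com/robots.txt

# Limit retries and name the output file explicitly:
wget --tries=3 --output-document=robots.txt http://example.com/robots.txt

# Force IPv4-only or IPv6-only connections:
wget --inet4-only http://example.com/
wget --inet6-only http://example.com/

# Supply HTTP credentials for a protected resource
# (placeholder values -- avoid hard-coding real passwords in scripts):
wget --http-user=alice --http-password='secret' http://example.com/private/
```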
If you want to go over HTTPS, or transport layer security, there are options for which files and certificates to use, which versions, which private keys, and things like that. You also have FTP-specific options, and I would say use the FTP username and password parameters instead of the generic user parameters from before; if you are targeting FTP specifically, the FTP-specific switches will work much better. If you want to recurse over all of the files and directories, again you have some additional recursive options, like accept this or reject that. Okay, we are going to run a simple version of this: wget http://www.facebook.com/robots.txt. Hit Enter and it goes out to the internet to get robots.txt from Facebook. And there you go: it resolved facebook.com, connected to Facebook at that IP on port 80, sent the HTTP request, got an OK response, and then you can see it copied the file and saved it. So if I do an ls, I can see that robots.txt is now on the local hard drive. Now we can actually look at it, and you could do this a dozen different ways: head, tail, less, more, whatever you prefer. I am just going to tail the file — tail robots.txt — and you can see the last ten lines (the default) of that file, just to prove there is something in it. Or we can do something like less robots.txt and go through it line by line, and what is interesting here is that you can see all of the spidering that Facebook specifically does not allow. I have not done anything unethical here: a robots.txt file is available on just about everybody's site. It is extremely popular in the web world — just about every website has one, and it should always be publicly accessible for read access. But I can learn some things from it. I can see references to PHP, so they are probably running an Apache server.
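The FTP, recursion, and TLS options discussed above can also be sketched as commands. Again this is a hedged reference with placeholder hosts, paths, and credentials; the switches are standard GNU Wget flags.

```shell
# Use the FTP-specific credential switches rather than the generic HTTP ones:
wget --ftp-user=anonymous --ftp-password='guest' ftp://ftp.example.com/pub/file.txt

# Recurse over files and directories, two levels deep,
# accepting only .txt files and rejecting everything else:
wget --recursive --level=2 --accept='*.txt' http://example.com/docs/

# HTTPS: point wget at a CA bundle, or (for lab testing only)
# skip certificate verification entirely:
wget --ca-certificate=/etc/ssl/certs/ca-certificates.crt https://example.com/
wget --no-check-certificate https://self-signed.example.com/
```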
I can see some of the directories, like hashtag, photos, and checkpoint, and the file is specific about which spiders it is telling not to index which parts of the site. Here is Googlebot: no, don't check these things. Here is ia_archiver: don't check these things. Here is msnbot: disallow these activities. Here is Naverbot: disallow these activities. Here is Seznam's bot: disallow these activities. Slurp, which is Yahoo's common indexing program, and so on. Nonetheless, you get the basic idea of how to go to a website, grab a file, copy it to the local host, and then analyze that file. It is relatively simple, but for any sort of web application penetration testing or website reconnaissance, you definitely want to know how to use wget. So I hope you enjoyed the video, and I will see you in the next one. [/toggle_content] The final lesson in the Web Application series focuses on the actual web application/web server data. WGet is the utility we use to accomplish web data retrieval in penetration testing. The WGet lab demonstrates how to retrieve data from a web server when you know specifically what is there. WGet is an extremely powerful scripting tool that offers a wide selection of switch options for total customization, so you can scan for targeted information. You'll also learn how to use the WGet tool more precisely, including FTP-specific parameters for conducting targeted FTP scans and web application vulnerability searches as part of a thorough penetration testing strategy.
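The analysis step from the lab can be reproduced offline. This sketch substitutes a small, made-up robots.txt stand-in (the real one came from `wget http://www.facebook.com/robots.txt`, whose contents change over time) and then applies the same inspection commands.

```shell
# Create a stand-in robots.txt locally (hypothetical contents):
cat > robots.txt <<'EOF'
User-agent: Googlebot
Disallow: /checkpoint/
Disallow: /hashtag/
User-agent: msnbot
Disallow: /photos/
EOF

# tail shows the last ten lines by default -- enough here to see the whole file:
tail robots.txt

# Pull out which crawlers the file addresses, and what it disallows:
grep '^User-agent' robots.txt
grep '^Disallow' robots.txt
```

The same `grep` pattern works on any site's robots.txt and is an easy way to script the reconnaissance step instead of paging through the file with less.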