Download Whole Website or Directories by using wget in Linux
You may have searched for software to download an entire website or directory on either Windows or Linux. Yes, plenty of tools can do this for you. But on Linux we can do it with a single command, wget. It is highly customizable and works as a powerful crawler. You will find it fantastic and really cool. Let me show you how!
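Here is what such an invocation looks like. I am assembling it from the options explained below, so treat it as a sketch rather than the author's exact command line; the order of the flags does not matter to wget:

```shell
# Mirror the /Windows/ directory of techstroke.com.
# Each long option is explained in the list further down.
wget \
    --recursive \
    --domains techstroke.com \
    --no-parent \
    --page-requisites \
    --html-extension \
    --convert-links \
    --restrict-file-names=windows \
    --no-clobber \
    www.techstroke.com/Windows/
```

When it finishes, you get a local `www.techstroke.com/Windows/` directory you can browse offline in any browser.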
The command above lets you download the “Windows” directory of techstroke.com recursively, starting from the URL www.techstroke.com/Windows/
How do you like it? Pretty cool, right?
Finally, let me explain the parameters in a bit more detail. Of course, you can also refer to the wget documentation (`man wget`).
The options are:
--recursive: download the entire website.
--domains techstroke.com: don’t follow links outside techstroke.com.
--no-parent: don’t follow links outside the directory /Windows/.
--page-requisites: get all the elements that compose the page (images, CSS, and so on).
--html-extension: save files with the .html extension.
--convert-links: convert links so that they work locally, offline.
--restrict-file-names=windows: modify filenames so that they will work in Windows as well.
--no-clobber: don’t overwrite any existing files (used in case the download is interrupted and resumed).
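If you prefer typing less, most of these options have short equivalents (`--restrict-file-names` is long-form only), so the same download can be written as:

```shell
# Short-option equivalent of the command above:
# -r  = --recursive        -D  = --domains
# -np = --no-parent        -p  = --page-requisites
# -E  = --html-extension   -k  = --convert-links
# -nc = --no-clobber
wget -r -D techstroke.com -np -p -E -k -nc \
    --restrict-file-names=windows \
    www.techstroke.com/Windows/
```

Note that because the download is interruptible and `-nc` skips files you already have, you can simply re-run the same command to pick up where you left off.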