Thursday, June 4, 2009

Download a website

Here is a simple, effective way to download the files from a website recursively, without having to visit each and every link to the sub-pages by hand.

wget -r -p -k -E http://www.linuxdriver.co.il/ldd3/

...where
  • -r turns on recursive download, following links to sub-pages;
  • -p (--page-requisites) also fetches everything needed to display each page properly, such as images and stylesheets;
  • -k (--convert-links) rewrites the links in the downloaded files to point at the local copies, so the user can browse them easily once the download is completed; and
  • -E appends a .html extension to downloaded files of type application/xhtml+xml or text/html that lack one.
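A common variation of the same command, written with wget's equivalent long-form options for readability, might look like the sketch below; the added --no-parent and --wait options (standard wget flags, not part of the command above) are optional extras:

wget --recursive --page-requisites --convert-links --html-extension \
     --no-parent --wait=1 http://www.linuxdriver.co.il/ldd3/

Here --no-parent stops wget from wandering above the ldd3/ directory, and --wait=1 pauses one second between requests so the crawl does not hammer the server.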
