Thursday, June 4, 2009

Download a website

Here is a simple, effective way to recursively download a website's files without having to visit each and every sub-page link yourself.

wget -r -p -k -E http://www.linuxdriver.co.il/ldd3/

...where
  • -r enables recursive download of pages,
  • -p downloads all page requisites (images, stylesheets, and so on) needed to display each page properly,
  • -k converts the links in the downloaded documents to point at the local copies, so the user can browse them easily once the download is completed, and
  • -E appends an .html extension to downloaded HTML/XHTML files that lack one.
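As a rough sketch of what -E does under the hood, it renames files served as HTML (for example index.php) so they end in .html and open correctly in a browser. The helper below is purely illustrative, not wget's actual code:

```shell
# adjust_ext: mimic wget -E by appending .html when the name lacks it
# (hypothetical helper for illustration only)
adjust_ext() {
  case "$1" in
    *.html) printf '%s\n' "$1" ;;        # already fine, leave as-is
    *)      printf '%s.html\n' "$1" ;;   # append the .html extension
  esac
}

adjust_ext "index.php"    # prints index.php.html
adjust_ext "about.html"   # prints about.html
```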

Kill all process instances at one go

Sometimes you might need to urgently kill all the Java processes at once. Use the following pipeline to do so:

ps -ef | grep '[j]ava' | awk '{print $2}' | xargs kill -9 > /dev/null 2>&1

The awk step extracts the PID (second column of ps -ef) before it is handed to kill, and the [j]ava pattern stops grep from matching its own entry in the process list.
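To see what each stage of the pipeline contributes, you can run the grep and awk steps against a canned ps -ef listing (the process entries below are fabricated for illustration):

```shell
# Fabricated ps -ef output: two java processes and one unrelated daemon
sample='root      1234     1  0 10:00 ?        00:00:01 /usr/bin/java -jar app.jar
root      5678     1  0 10:01 ?        00:00:02 /usr/bin/java -jar web.jar
root      9012     1  0 10:02 ?        00:00:00 /usr/sbin/sshd'

# grep keeps only the java lines; awk picks out column 2, the PID
pids=$(printf '%s\n' "$sample" | grep '[j]ava' | awk '{print $2}')
echo "$pids"   # prints 1234 and 5678, one per line
```

On Linux systems with procps installed, `pkill -9 java` achieves the same result in one command; also note that kill -9 gives processes no chance to clean up, so try a plain kill (SIGTERM) first when you can afford to.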