


Monday, May 25, 2015

Download website with wget


I use this command to back up my Blogger site.


wget \
--recursive \
--no-clobber \
--page-requisites \
--html-extension \
--convert-links \
--domains glenewhittenberg.blogspot.com \
http://glenewhittenberg.blogspot.com

This command downloads the site http://glenewhittenberg.blogspot.com/.
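Because of --recursive, wget saves everything under a directory named after the host, so the backup can be browsed straight from disk. The path below assumes wget's default output layout and a Linux desktop with xdg-open available:

# Open the mirrored front page locally (assumes wget's default output layout)
xdg-open glenewhittenberg.blogspot.com/index.html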

The options are:

    --recursive: download the entire site.
    --no-clobber: don't overwrite existing files (useful in case the download is interrupted and resumed).
    --page-requisites: get all the elements that make up each page (images, CSS, and so on).
    --html-extension: save files with the .html extension.
    --convert-links: convert links so that they work locally, offline.
    --domains glenewhittenberg.blogspot.com: don't follow links outside glenewhittenberg.blogspot.com.

Two related options that aren't used above can also be handy: --no-parent stops wget from following links above the starting directory, and --restrict-file-names=windows rewrites filenames so the copy also works on Windows.
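Since I use this as a recurring backup, the same command can be wrapped in a small script and run from cron. The script below is only a sketch; the backup directory and the schedule are my own choices, not part of the original command:

#!/bin/sh
# Sketch: mirror the blog into a dated backup directory (paths are assumptions)
BACKUP_DIR="$HOME/blog-backups/$(date +%Y-%m-%d)"
mkdir -p "$BACKUP_DIR"
cd "$BACKUP_DIR" || exit 1
wget \
--recursive \
--no-clobber \
--page-requisites \
--html-extension \
--convert-links \
--domains glenewhittenberg.blogspot.com \
http://glenewhittenberg.blogspot.com

A crontab entry such as "0 3 * * 0 /home/user/bin/backup-blog.sh" (the path is hypothetical) would run it every Sunday at 3 a.m.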
