Now, assuming you have a working version of wget installed on your system, let's go through the parameters that are necessary to make wget do what we want. We'll use the following example, which takes a snapshot of all my posts on eeeGadgets:
wget -r -l1 -N -k -x "http://eeegadgets.blogspot.com/search/label/all posts"
(Note the quotes around the URL: without them, the shell would split the address at the space and wget would try to fetch two separate targets.)
-r wget crawls the target website recursively, following links into subfolders and linked pages
-l1 defines the recursion depth, i.e. how many levels of links wget follows from the start page (in most cases 1 or 2 is sufficient here)
-N wget uses timestamping: it only downloads files that are newer than the copies already on disk, which saves a lot of bandwidth when you refresh the same snapshot later
-k causes wget to convert "absolute" (real web) links into "relative" (snapshot file) links; this should always be enabled if you want the links in your snapshot to work while viewing it offline
-x wget recreates the original folder structure as found on the target website (there's also an opposite parameter, -nd, which causes all files to be written into just one big directory)
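If the single-letter switches are hard to memorize, the same call can also be written with wget's long option names; this is simply the spelled-out equivalent of the command above:

wget --recursive --level=1 --timestamping --convert-links --force-directories "http://eeegadgets.blogspot.com/search/label/all posts"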
When called with these parameters, wget will download all necessary files into the current folder, so you should create an appropriate one first ("mkdir /home/MySnapshotFolderName && cd /home/MySnapshotFolderName") before running the "wget ..." command.
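Putting it all together, a complete session could look like this - the folder name is just a placeholder, so pick whatever suits you:

mkdir -p ~/MySnapshotFolderName && cd ~/MySnapshotFolderName
wget -r -l1 -N -k -x "http://eeegadgets.blogspot.com/search/label/all posts"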
And voilà - a complete snapshot of your favorite website, stored on your personal hard disk for eternal backup...
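To verify the result, you can list the mirrored tree; since -x was used, wget puts everything into a folder named after the host:

ls -R eeegadgets.blogspot.com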