How to make static webpage snapshots with wget

Apr 19 2017

The Problem

For various reasons I needed an offline copy of a certain homepage. Well, everybody knows that your browser can save a single page with a few mouse clicks - but what if you want a copy of the whole site?

The Solution

I don't know how the Wayback Machine (Internet Archive) does it, but I came up with the following solution:

wget is a very powerful tool which lets you download arbitrary files from the internet. Besides that, it also has the capability to download web pages recursively.

In my case I wanted to back up the whole content of a website which was accessible at the domain www.pension-zadl.at.

This one-liner was everything I needed:

wget --recursive --no-clobber --page-requisites --html-extension --convert-links --restrict-file-names=windows --domains www.pension-zadl.at --no-parent http://www.pension-zadl.at/

Well - it downloads the whole website (recursively, all files which are on the domain www.pension-zadl.at) and rewrites the links within the downloaded files so that they point only to the downloaded local copies.
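That one-liner packs a lot of flags. Here is a sketch that spells them out one per line with a comment each; it only assembles and prints the command instead of executing it, so it is safe to run as-is and inspect before pointing it at your own site:

```shell
# Assemble the wget call flag by flag (POSIX sh, no network access).
set -- wget
set -- "$@" --recursive                     # follow links and download them too
set -- "$@" --no-clobber                    # don't re-download files that already exist
set -- "$@" --page-requisites               # also fetch CSS, images and scripts a page needs
set -- "$@" --html-extension                # save HTML documents with an .html suffix
set -- "$@" --convert-links                 # rewrite links so they work offline
set -- "$@" --restrict-file-names=windows   # avoid characters that are illegal on Windows
set -- "$@" --domains www.pension-zadl.at   # never follow links off this domain
set -- "$@" --no-parent                     # never ascend above the start directory
set -- "$@" http://www.pension-zadl.at/

echo "$@"   # show the assembled command; to actually run it, use:  "$@"
```

(Newer wget versions call `--html-extension` by its current name, `--adjust-extension`; the old spelling still works as an alias.)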

This is a handy way to create a simple (static) backup of a whole web presence.
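Because `--convert-links` makes everything relative, you can double-check the snapshot by serving the mirrored folder over a local web server. A minimal sketch, assuming wget saved the mirror into a folder named after the host (its default layout); the `mkdir`/`echo` lines only create a stand-in mirror so the sketch runs on its own - with a real snapshot, skip them and just `cd` into the folder:

```shell
# Stand-in mirror so the sketch is self-contained (skip with a real snapshot):
mkdir -p www.pension-zadl.at
echo '<h1>snapshot</h1>' > www.pension-zadl.at/index.html

# Serve the mirrored folder locally and fetch a page from it:
cd www.pension-zadl.at
python3 -m http.server 8043 >/dev/null 2>&1 &
server_pid=$!
sleep 1

page=$(curl -s http://localhost:8043/index.html)
kill "$server_pid"
echo "$page"
```

If the fetched page looks like the live site, the snapshot is self-contained and ready to archive.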


(c) 2024, by bytebang e.U.