How to Download an Entire Website for Offline Viewing

David Archivarix
3 min read · Feb 14, 2019


It’s easy enough to save individual web pages for offline reading, but what if you want to download an entire website? Well, it’s easier than you think! Here are four nifty tools you can use to download any website for offline reading, zero effort required.

1. Archivarix.com

Archivarix is an online downloader that recreates websites from the Wayback Machine (web.archive.org).

– Downloading and processing the content take place on our server, so you don’t need to waste time on it; we send you a ready zip archive containing the entire website.

– Recovered text files (HTML, CSS, JS) are kept in a separate folder to make search-and-replace easier. Internal linking is restored using mod_rewrite rules in .htaccess (a generic example of such a rule follows this list).

– You get a ready, workable website that contains no 404 pages, broken images, dead external links, scripts, or other garbage that no longer works. All broken files are replaced with placeholder files that you can edit, and all banners, counters, and other external scripts are removed using the AdBlock database.

– Most importantly, our service optimizes the restored content according to Google Developers recommendations. Image files are compressed and stripped of all EXIF data; our script also removes comments from the HTML, optimizes the CSS and JS, and does much more to make the website better than it was.
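To make the mod_rewrite point above concrete, here is a generic front-controller pattern of the kind such .htaccess rules follow. This is an illustrative sketch only, not the actual rules Archivarix generates:

```apache
# Illustrative only: route requests for URLs that don't match a real
# file or directory to a handler script that serves the restored page.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ index.php?url=$1 [L,QSA]
```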

You can restore up to 200 files per website for free. The first thousand files above that limit cost $5 in total (0.5 cents per file); every subsequent thousand costs only $0.50 (0.05 cents per file).
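As a worked example of that pricing, here is a small Python sketch. The per-file billing and the function name are our own assumptions; the service may round differently:

```python
def restore_cost(files: int) -> float:
    """Estimated cost in USD to restore `files` files from one site,
    under the pricing quoted above. Assumes strictly per-file billing:
    first 200 free, next 1,000 at 0.5 cents, the rest at 0.05 cents."""
    billable = max(files - 200, 0)      # first 200 files are free
    first_tier = min(billable, 1000)    # next 1,000 at 0.5 cents each
    rest = billable - first_tier        # remainder at 0.05 cents each
    cents = first_tier * 0.5 + rest * 0.05
    return cents / 100

print(restore_cost(2200))  # 2,000 billable files -> 5.5 (i.e. $5.50)
```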

2. WebCopy

WebCopy by Cyotek takes a website URL and scans it for links, pages, and media. As it finds pages, it recursively looks for more links, pages, and media until the whole website is discovered. Then you can use the configuration options to decide which parts to download offline.
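To illustrate what that discovery pass does, here is a minimal Python sketch of the same idea: fetch a page, collect its same-site links, and keep going until nothing new turns up. It is a toy model of the technique, not WebCopy’s actual code:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkParser(HTMLParser):
    """Collects the href value of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, limit=100):
    """Breadth-first discovery of pages on the same host as start_url."""
    host = urlparse(start_url).netloc
    seen, queue = set(), [start_url]
    while queue and len(seen) < limit:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url).read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip pages that fail to download
        parser = LinkParser()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link).split("#")[0]
            if urlparse(absolute).netloc == host:
                queue.append(absolute)
    return seen
```

A real downloader like WebCopy also fetches images, stylesheets, and scripts, and rewrites the links in each saved page to point at the local copies.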

The interesting thing about WebCopy is you can set up multiple “projects” that each have their own settings and configurations. This makes it easy to re-download many different sites whenever you want, each one in the same exact way every time.

3. HTTrack

HTTrack is an extremely popular program for downloading websites. Although the interface isn’t quite modern, it functions very well for its intended purpose. The wizard is easy to use and walks you through settings that define where the website should be saved, along with specifics such as which files should be excluded from the download.

For example, you can exclude entire sections of a site if you have no reason to download those portions.
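HTTrack also ships a command-line binary that accepts the same kind of include/exclude filters. Here is a sketch of driving it from Python; the URL, paths, and filter patterns are placeholders, and it assumes `httrack` is installed and on your PATH:

```python
import subprocess

# Mirror a site into ./example-mirror, staying on the site itself
# and skipping a section we have no reason to extract.
subprocess.run([
    "httrack", "https://example.com/",
    "-O", "./example-mirror",   # -O sets the output (mirror) directory
    "+*.example.com/*",         # +pattern: links to follow
    "-*/forum/*",               # -pattern: links to exclude
], check=True)
```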

4. SiteSucker

If you’re on a Mac, your best option is SiteSucker. This simple tool copies entire websites, maintains the same overall structure, and includes all relevant media files too (e.g. images, PDFs, style sheets).

Its clean interface could not be easier to use: you literally paste in the website URL and press Enter.

One nifty feature is the ability to save the download to a file, then use that file to download the same exact files and structure again in the future (or on another machine). This feature is also what allows SiteSucker to pause and resume downloads.

SiteSucker costs $5 and does not come with a free version or a free trial.

We don’t recommend downloading huge sites, because you’ll need thousands of megabytes to store all of the media files they use.

The best sites to download are those with lots of text and few images, and those that don’t regularly add new pages or change existing ones. Static information sites, online ebook sites, and sites you want to archive in case they go down are ideal.

http://techblogcorner.com/2019/02/13/how-to-download-an-entire-website-for-offline-viewing/
