Although Wi-Fi is available almost everywhere these days, you may still find yourself without it from time to time. And when you do, there may be websites you wish you had saved so you could access them while offline, whether for research or entertainment.

It's simple enough to save individual web pages for offline reading, but what if you want to download an entire website? Don't worry, it's easier than you think. But don't take our word for it. Here are several nifty tools you can use to download any website for offline reading without any hassle.

1. WebCopy

copying a website for offline use with webcopy

WebCopy by Cyotek takes a website URL and scans it for links, pages, and media. As it finds pages, it recursively looks for more links, pages, and media until the whole website is discovered. Then you can use the configuration options to decide which parts to download offline.

The interesting thing about WebCopy is that you can set up multiple projects, each with its own settings and configuration. This makes it easy to re-download many sites whenever you want, each one in the same way every time.

One project can copy many websites, so organize your projects with a plan in mind (e.g., a "Tech" project for copying tech sites).

How to Download an Entire Website With WebCopy

It's pretty simple to save an entire website using this tool. Here's how:

  1. Install and launch the app.
  2. Navigate to File > New to create a new project.
  3. Type the URL into the Website field.
  4. Change the Save folder field to where you want the site saved.
  5. Play around with Project > Rules… (learn more about WebCopy's Rules).
  6. Navigate to File > Save As… to save the project.
  7. Click Copy in the toolbar to start the process.

Once the copying is done, you can use the Results tab to see the status of each individual page and/or media file. The Errors tab shows any problems that may have occurred, and the Skipped tab shows files that weren't downloaded. But most important is the Sitemap, which shows the full directory structure of the website as discovered by WebCopy.

To view the website offline, open File Explorer and navigate to the folder you designated earlier. Open index.html (or sometimes index.htm) in your browser of choice to start browsing. As you can see, this is quite different from saving individual web pages through apps, another practice that is all too common nowadays.

Download: WebCopy for Windows (Free)

2. HTTrack

downloading a whole website with httrack

HTTrack is better known than WebCopy, and is arguably better because it's open source and available on platforms other than Windows. The interface is a bit clunky and leaves much to be desired, but it works well, so don't let that turn you away.

Like WebCopy, it uses a project-based approach that lets you copy multiple websites and keep them all organized. You can pause and resume downloads, and you can update copied websites by re-downloading old and new files.

How to Download a Complete Website With HTTrack

To save an entire website for offline use, follow the steps below:

  1. Install and launch the app.
  2. Click Next to begin creating a new project.
  3. Give the project a name, category, and base path, then click Next.
  4. Select Download website(s) for Action, then type each website's URL in the Web Addresses box, one URL per line. You can also store URLs in a TXT file and import it (see the sketch after these steps), which is convenient when you want to re-download the same sites later. Click Next.
  5. Adjust parameters if you want, then click on Finish.
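
If you go the TXT-file route, the list is just plain text with one URL per line. Here's a rough sketch of what such a file might look like (the exact sites listed are only examples):

        https://www.makeuseof.com/
        https://www.brainpickings.org/
        https://www.britannica.com/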

Once everything is downloaded, you can browse the site offline simply by going to where the files were downloaded and opening index.html or index.htm in a browser.

How to Use HTTrack With Linux

If you are an Ubuntu user, here's how you can use HTTrack to save a whole website:

  1. Launch the Terminal and type the following command:
            sudo apt-get install httrack
        
  2. It will ask for your Ubuntu password (if you've set one). Type it in, and hit Enter. The Terminal will download the tool in a few minutes.
  3. Finally, type in this command and hit Enter. For this example, we downloaded the popular website, Brain Pickings.
            httrack https://www.brainpickings.org/
        
  4. This will download the whole website for offline reading.
    downloading a whole website with httrack on ubuntu

You can replace the website URL here with the URL of whichever website you want to download. For instance, if you wanted to download the whole Encyclopedia Britannica, you'd tweak your command to this:

        httrack https://www.britannica.com/
    
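
If you want more control over where the mirror ends up and which links HTTrack follows, the command-line version also accepts an output path and URL filters. This is a rough sketch rather than a required invocation, and the /home/user/websites/britannica folder is just an example path:

        httrack "https://www.britannica.com/" -O "/home/user/websites/britannica" "+*.britannica.com/*" -v

Here, -O sets the folder the copy is saved into, the "+*.britannica.com/*" filter keeps the crawl on that domain, and -v prints progress as it runs.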

Download: HTTrack for Windows | Linux | Android (Free)

3. SiteSucker

downloading a complete website with sitesucker

If you're on a Mac, your best option is SiteSucker. This simple tool copies entire websites, maintains the same structure, and includes all relevant media files too (e.g., images, PDFs, style sheets). It has a clean and easy-to-use interface—you literally paste in the website URL and press Enter.

One of its nifty features is the ability to save an in-progress download to a file, then use that file to download the same files and structure again in the future (or on another machine). This feature is also what allows SiteSucker to pause and resume downloads.

SiteSucker costs $5 and does not come with a free version or a free trial, which is its biggest downside. The latest version requires macOS 11 Big Sur or higher. Older versions of SiteSucker are available for older Mac systems, but some features may be missing.

Download: SiteSucker for iOS | Mac ($4.99)

4. Wget

Wget is a command-line utility that can retrieve all kinds of files over the HTTP and FTP protocols. Since websites are served through HTTP and most web media files are accessible through HTTP or FTP, this makes Wget an excellent tool for downloading entire websites.

Wget comes bundled with most Unix-based systems. While Wget is typically used to download single files, it can also be used to recursively download all pages and files that are found through an initial page:

        wget -r -p https://www.makeuseof.com
    

Depending on the size, it may take a while for the complete website to be downloaded.
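
If you want the offline copy to be fully browsable, with links rewritten to point at the local files instead of the live site, a commonly used combination of Wget flags looks something like this:

        wget --mirror --convert-links --adjust-extension --page-requisites --no-parent https://www.makeuseof.com

Here, --mirror turns on recursion with unlimited depth, --convert-links rewrites links for local viewing, --adjust-extension adds .html extensions where needed, --page-requisites grabs images and style sheets, and --no-parent keeps the crawl from climbing above the starting directory.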

downloading a complete website with wget

However, some sites may detect and prevent what you're trying to do because ripping a website can cost them a lot of bandwidth. To get around this, you can disguise yourself as a web browser with a user agent string:

        wget -r -p -U Mozilla https://www.thegeekstuff.com
    

If you want to be polite, you should also limit your download speed, so you don't hog the web server's bandwidth. You can also add a pause between downloads, so you don't overwhelm the server with too many requests. Use this command:

        wget -r -p -U Mozilla --wait=10 --limit-rate=35K https://www.thegeekstuff.com
    

How to Use Wget on a Mac

On a Mac, you can install Wget with a single Homebrew command:

  1. If you don't already have Homebrew installed, install it with this command:
            /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
        
  2. Next, install Wget with this command:
            brew install wget
        
  3. After the Wget installation is finished, you can download the website with this command (the -P flag sets the folder where the local copy is saved):
            wget -r -p -P path/to/local.copy https://www.brainpickings.org/
        

On Windows, you'll need to use this ported version instead. Download and install the app, and follow the instructions to complete the site download.

5. Teleport Pro

teleport pro windows app

Get Teleport Pro. Seriously. Although the last version of the app was released back in 2015, it works just as well as it did back then.

Apart from simply downloading a whole website, the app packs a host of other features as well. When you download and install the app, you'll see these options to choose from in its main menu:

  • Create a copy of the website on a hard drive.
  • Duplicate an entire website along with its directory structure.
  • Look up a site for specific types of files.
  • Search a website for given keywords.
  • Navigate all the sites linked from a central site.

These and a slew of other similar features make this app a handy addition to your toolkit. The only downside we found was the limit on the free version: the unregistered version can only copy 500 pages in a single project, and you can only use it a total of 40 times, after which you'll have to purchase the premium version.

Download: Teleport Pro for Windows (Free trial, paid version available)

6. Archivarix

Archivarix is another interesting site downloader that will help you save websites for offline use later on. Apart from being a simple website downloader, it also helps you restore files from the Web Archive. The only downside, perhaps, is the free version's limit of 200 files per download.

It also offers a handy WordPress plugin that lets you import images from third-party websites.

Another feature that caught our attention was the ability to download Onion websites. So go ahead and try out the free version. If you like it, you can go for the premium version, which costs about $10 per thousand files, with every additional thousand after that costing $1. Not a bad deal, certainly!

Download: Archivarix (Free, paid version available)

Easily Download Entire Websites for Offline Usage

Now that you know how to download an entire website, you should never be caught without something to read, even when you have no internet access.

But remember: the bigger the site, the bigger the download. Therefore, we don't recommend downloading massive sites like MUO, because you'll need plenty of spare storage for all the media files such sites use. Only want to download a specific webpage for offline viewing? You can easily do that as well.