Recently, I came across some e-books that are HTML only (sucks, yeah), but they are good books and I really want to have them locally. So I need to download 'em.
I know, there are GUI tools for this. But what if you are stuck on a terminal-only server? I am behind a very strict proxy, but I have a server that I can FTP into, and that server is not behind the proxy. The server is terminal only, hence the wget option.
wget can download the whole internet if you so wish, and it's simple:
wget -r url
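For example, to grab one of those HTML-only books (the URL here is made up for illustration) while telling wget not to wander up into the rest of the site:

wget -r -np http://example.com/books/some-book/

-np (--no-parent) restricts the recursion to the book's own directory.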
Now, before you go, there are a few caveats.
The sites will be downloaded, but the links in the pages will still point at the live site, so they are not really suitable for offline viewing. To convert the links for local viewing, do
wget -rk url
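Spelled out with long options, that's the same as the following (again, the URL is just a placeholder):

wget --recursive --convert-links http://example.com/books/some-book/

Once the download finishes, -k rewrites the links in the saved pages so they point at your local copies instead of back at the server.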
The above converts the downloaded files as necessary for offline viewing. You might also want wget to keep the original, unconverted files:
wget -rkK url
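-K is --backup-converted: before rewriting a file's links, wget saves the untouched original alongside it with a .orig suffix, so a downloaded index.html gets a sibling index.html.orig. Long form, with the same placeholder URL:

wget --recursive --convert-links --backup-converted http://example.com/books/some-book/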
Another caveat: this mostly gets you the HTML files themselves. To tell wget to download all the files necessary to display a page properly (images, sounds, linked CSS, etc.), use
wget -rkp url
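While we're at it, -E (--adjust-extension) is worth knowing too: it renames downloaded files so HTML actually ends in .html, which helps when opening the copy in a browser. A fuller invocation might look like this (URL still a placeholder):

wget -rkpE -np http://example.com/books/some-book/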
Again, don't go yet. The default recursion depth is 5. This might be too much (or too little, in case your plan is to download the whole internet). You can specify the depth with -l, thus
wget -rkpl 5 url
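-l also accepts inf for no limit at all, and the wget manual notes that --mirror is a shortcut for -r -N -l inf --no-remove-listing, i.e. infinite recursion plus timestamping. So a full mirror with offline-friendly links could be (placeholder URL again):

wget --mirror -kp http://example.com/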
Finally, you might want wget to do all the hard work of downloading the internet and then delete the files immediately afterwards:
wget -r --delete-after url
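--delete-after is mostly useful for pre-fetching pages through a caching proxy: wget pulls everything (warming the cache on the way through) and throws the local copies away. wget picks the proxy up from the http_proxy environment variable, so with a made-up proxy address:

http_proxy=http://proxy.example.com:3128 wget -r --delete-after url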
man wget
is also a good place to start learning more about the things that wget can do.
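For a quicker reference there is also

wget --help

which lists every option with a one-line description.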