We surf the internet. We do it every day. Nowadays I have a 2 Mbps broadband connection, but life needs more.
A few months back I was on a slow 56 Kbps connection. Bandwidth was a big issue at that time (it's still an issue 😂). But India is "shining": we have cheap broadband connections now.
When I was on 56 Kbps, I had this habit of making an offline cache of useful pages. I usually used the HTTrack Website Copier:
"It allows you to download a World Wide Web site from the Internet to a local directory, building recursively all directories, getting HTML, images, and other files from the server to your computer. Links are rebuilt relatively so that you can freely browse the local site (works with any browser). You can mirror several sites together so that you can jump from one to another. You can also update an existing mirrored site, or resume an interrupted download."
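In practice, a mirror like the one described above comes down to a single command. A minimal sketch (the URL and output path below are placeholders, not a site I actually mirrored):

```shell
# Mirror one site into a local directory for offline browsing.
# -O chooses the output directory, the "+..." pattern is a scan
# filter that keeps HTTrack within the site, and -r4 caps the
# recursion depth at 4 links from the start page.
httrack "https://example.com/" -O "$HOME/mirrors/example" "+*.example.com/*" -r4
```

Re-running HTTrack later from the mirror directory with its `--update` or `--continue` options refreshes an existing mirror or resumes an interrupted one, which is exactly what made it so handy on a dial-up line.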
SpiderZilla is a Firefox and Mozilla Suite extension for offline browsing. It is only a front-end for the open-source command-line program HTTrack Website Copier.
SpiderZilla is no longer under active development. Its creators have stopped adding features and updating it for the latest browser versions. So I thought, why not do it myself? You can find it on my Labs page.