My father is a bit of a technology "illiterate": he's 60+, types with two fingers, and doesn't understand there's a middle mouse button. I love him very much, but I also know that I can't explain wget to him. He asked me how to save entire websites for archiving purposes. His hobby is old pictures of everything from France (roughly 1850-1945), and he wants to archive whole websites because there's a high chance someone will forget to pay for the server space and they will disappear forever.

Wget is too complicated for him, so I was thinking of HTTrack, but it is still not a one-click solution. SiteSucker is a Macintosh application that automatically downloads websites from the Internet; it does this by asynchronously copying the site's webpages. WAIL/WARC was also something I was considering, but it is also too complicated (at least for reading the websites afterward). The Chrome extension SingleFile is easy to use, but it only saves what's currently on the screen and is not a crawler that follows all the links.

Edit: Cyotek WebCopy was the answer. This software is really easy to use, and in five clicks you can set up the crawl depth, the save path, and the type of media you want to save.
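For reference, this is roughly the wget incantation I couldn't get him to use, a minimal sketch with a placeholder URL (the flags are standard GNU wget options):

```sh
# Mirror a whole site for offline browsing; the URL is a placeholder.
wget --mirror \
     --convert-links \
     --adjust-extension \
     --page-requisites \
     --no-parent \
     https://example.com/

# --mirror           recurse through the whole site with timestamping
# --convert-links    rewrite links so the local copy works offline
# --adjust-extension save pages with proper .html extensions
# --page-requisites  also fetch the images/CSS/JS each page needs
# --no-parent        never climb above the starting directory
```

Compare that to WebCopy's five clicks and you can see why a GUI won.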