
buonaparte's audio and text links

uisukii Wrote:Not to distract the thread too much, but is there a practical way for someone to mirror the site, or at least save some of the content for people to use in case the users.bestweb.net site were to go offline? Ideally for someone on limited bandwidth, using a low-powered computer such as a laptop.
This mirror automatically updates every week from the users.bestweb.net site:
http://rawtoast.eurybia.feralhosting.com..._mountain/

If you want the files locally, on Linux (and probably on a Mac too) you can use:

wget --mirror -w 2 http://users.bestweb.net/~siom/martian_mountain/

The -w 2 adds a two-second delay between each file request, which lowers the chance of the webhost blocking your connection. On Windows there is probably an equivalent application, or you can use Cygwin to run the Linux commands.

On a limited bandwidth connection, I'd use wget but only for a subfolder so you have some control. The entire site is somewhere around 18GB.
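To illustrate the subfolder approach: pointing wget at a subdirectory and adding --no-parent keeps it from climbing back up and pulling the whole 18GB site. The "audio/" subfolder name here is just an example, not a real path on the site:

```shell
# Mirror only one subfolder of the site.
# --no-parent stops wget from ascending above the given
# directory, so only files under audio/ are fetched.
# -w 2 keeps the polite two-second delay between requests.
wget --mirror --no-parent -w 2 http://users.bestweb.net/~siom/martian_mountain/audio/
```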

You could use wget's reject option to skip downloading the zips:

Quote:‘-R rejlist’
‘--reject rejlist’
‘reject = rejlist’
The ‘--reject’ option works the same way as ‘--accept’, only its logic is the reverse; Wget will download all files except the ones matching the suffixes (or patterns) in the list.

So, if you want to download a whole page except for the cumbersome mpegs and .au files, you can use ‘wget -R mpg,mpeg,au’. Analogously, to download all files except the ones beginning with ‘bjork’, use ‘wget -R "bjork*"’. The quotes are to prevent expansion by the shell.
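Putting that together with the mirror command above, a sketch of a full-site mirror that skips the large archives might look like this (assuming the archives are .zip files; add other suffixes to the -R list as needed):

```shell
# Mirror the whole site but reject zip archives to save
# bandwidth; -R takes a comma-separated list of suffixes
# (or patterns) that wget should refuse to download.
wget --mirror -w 2 -R zip http://users.bestweb.net/~siom/martian_mountain/
```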
I am considering creating some extra SFTP users so you can dump files on the box. I am not particularly fond of useful resources being held only on file-sharing sites; who knows what will happen to MediaFire et al.
